
The Need for Speed

[Photo: light trails from a long-exposure shot of a highway]
One of the best ways to lose your users is to provide a slow website experience. Studies have shown that even a few hundred milliseconds of extra load time can be the difference between retaining your users and losing them to a competitor, and Google uses site speed as part of its ranking algorithm. There's no better time to work on speeding up your site.

While some optimizations may take some doing, there are a number of quick, easy optimizations you can do which will make a big difference.

Cache Cache Cache

The quickest communication is that which doesn’t happen. By setting up caching layers and sending appropriate response headers to your users, you can make sure that as few items as possible are requested of your web server.

  • On the server side, you can use key/value stores or enable file-based caching to return data without having to query your database each time.
  • If using PHP, enable an opcode cache like APC to make sure that PHP doesn’t have to parse your scripts every time they’re needed.
  • Use a reverse proxy like Varnish Cache to drastically reduce the number of requests that even make it to the web service stack by serving data straight out of memory.
  • Queries making it to your database should be able to fall back on a good query cache.
  • Finally, make sure you are sending appropriate expiration headers with all responses, so that browsers and other proxies hold on to those files for as long as makes sense rather than asking for them on each page load (see the Nginx sketch after this list).
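
As a rough illustration of that last point, here is a minimal Nginx sketch; the file extensions and the 30-day lifetime are assumptions you should tune for your own site, and Apache users can do much the same with mod_expires:

    # Let browsers and proxies cache static assets for 30 days (adjust to taste)
    location ~* \.(css|js|png|jpg|jpeg|gif|ico|woff)$ {
        expires    30d;
        add_header Cache-Control "public";
    }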

Leverage a CDN

Using a CDN (content delivery network) to serve up things like images, CSS, and JavaScript will ensure that such static assets are delivered to your end user from a data center located as close to them as possible.

Not only does this result in a faster load time due to the shorter trip for those files, but you will free your web server up to deliver dynamic content more quickly. This also helps your browser parallelize download requests for the pieces of your webpage.

There are many players in this space; we've been making great use of Amazon CloudFront lately, and plenty of other providers solve the same problem equally well.
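
In practice, pointing your pages at a CDN is often just a matter of serving static assets from the CDN's hostname. Here is a hypothetical HTML sketch, where cdn.example.com and the file paths are placeholders:

    <!-- Static assets served from the CDN's edge network instead of your origin server -->
    <link rel="stylesheet" href="https://cdn.example.com/css/site.min.css">
    <script src="https://cdn.example.com/js/site.min.js"></script>
    <img src="https://cdn.example.com/img/hero.jpg" alt="Hero image">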

Keep your server kernel updated

If you run your own web servers, and they’re running Linux, make sure that they will speak to your users’ browsers as efficiently as possible by keeping your server kernel up to date. This ensures that data is sent as quickly as possible for every connection, and it gets up to that speed swiftly.

Older server kernels take longer to figure out how quickly they can send data to your users: every new connection starts with a small TCP congestion window, and the kernel ramps up gradually while it works out how much bandwidth the user's connection can accept. Newer kernels start with a larger window and reach full speed sooner, and over the course of many connections that difference adds up.

Combine/Minify CSS and JS

Frequently overlooked, this practice adheres nicely to the two tenets of sending as little as possible in as few round trips as possible between your server and your user's browser. Disparate JavaScript and CSS files should, if possible, be combined into a single CSS file and a single JavaScript file, and then minified beforehand (stripping whitespace, comments, and other unneeded characters) so they can be sent as quickly as possible.
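
For example, a page that pulls in several separate files can usually get by with one combined, minified file of each type; the filenames below are hypothetical:

    <!-- Before: four files, four round trips -->
    <link rel="stylesheet" href="/css/reset.css">
    <link rel="stylesheet" href="/css/layout.css">
    <script src="/js/carousel.js"></script>
    <script src="/js/forms.js"></script>

    <!-- After: one combined, minified file per type -->
    <link rel="stylesheet" href="/css/site.min.css">
    <script src="/js/site.min.js"></script>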

Enable Gzip Compression

You can achieve instant and significant bandwidth savings when sending text files to browsers that support gzip compression (which is nearly all of them) simply by enabling this feature in your web server. If you're using Apache, enable mod_deflate; if you're using Nginx, make sure the gzip directive is on. The only caveat is to gzip only non-binary assets: binary files such as images are generally already compressed and won't see much of a size reduction.
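
A minimal Nginx sketch of that idea; the MIME types and minimum length here are assumptions worth tuning for your own content:

    # Compress text-based responses only; very small responses aren't worth the CPU
    gzip            on;
    gzip_types      text/plain text/css application/javascript application/json text/xml;
    gzip_min_length 1024;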

Quality DNS

DNS is the service that takes care of resolving hostnames to IP addresses, and is usually the very first thing that happens when you connect to a resource on the Net. Use a good DNS service with speedy performance and set a reasonably large (but not too high, in case you need to make changes) TTL on your zone records. We like somewhere in the half-hour to one hour range.

This will help ensure that results are cached close to your users, so resolution happens as quickly as possible. Also try not to chain CNAME records together more than necessary, since each extra lookup delays getting your user to an IP address.
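
As an illustration, a zone-file sketch with a one-hour TTL and a single CNAME hop; all of the names and addresses here are placeholders:

    ; A 3600-second (one-hour) TTL keeps records cached near your users
    www.example.com.     3600    IN  A      203.0.113.10
    static.example.com.  3600    IN  CNAME  cdn-provider.example.net.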

Load JavaScript asynchronously

If you're loading JavaScript into your webpage, by default the page will patiently wait for all of that JavaScript to load and be interpreted before it paints. Remove this limitation by loading as much of your JavaScript asynchronously as possible. Doing so allows the browser to keep loading and displaying other assets while it fetches and interprets your scripts. This is especially helpful if you're pulling in remote JavaScript and don't want to have to worry about that third-party server slowing your own page down.
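
In modern browsers this can be as simple as adding the async or defer attribute to your script tags (defer preserves execution order, async does not); the script URLs below are hypothetical:

    <!-- Doesn't block rendering; runs as soon as it arrives -->
    <script src="https://thirdparty.example.com/widget.js" async></script>
    <!-- Doesn't block rendering; runs in document order after parsing finishes -->
    <script src="/js/site.min.js" defer></script>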

Lazy-load images

By not loading images until they enter your user's viewport, you can reduce your time to paint. This especially helps your mobile users by completing the initial load of the page with fewer assets. And those images: you're optimizing them, right?
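
Modern browsers can do this natively with the loading attribute, and a small JavaScript library can provide the same behavior where that isn't supported; the image below is a hypothetical example:

    <!-- Fetched only when it's about to scroll into view -->
    <img src="/img/gallery-01.jpg" loading="lazy" width="800" height="600" alt="Gallery photo">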

Just do it!

There are a lot of utilities that can help you in this endeavor; make use of them and follow their recommendations for a faster website!

Also, check out mod_pagespeed, put out by Google; it can get you going pretty quickly by rewriting your webpages for you with these tweaks in mind (though be careful: automatic optimizations can sometimes cause bugs).

By implementing at least some of these updates, you'll end up with a speedier site, a more relaxed web server, better search rankings, and happier users.

Photo credit: tobias.munich
