You could write a book on website optimization; in fact, people already have. But here are 10 really easy steps to get your sites optimized for speed. Hopefully this will get you started and make you want to investigate front-end optimization a little further.
1. Run your images through smushit.com.
While it’s obvious to keep your image sizes as low as possible, it’s not as obvious that some image editing programs, like Photoshop, can embed extra meta and formatting data in your images. It’s viewable if you open the image in a text editor. Smush.it is a Firefox plugin that strips out all of this extra formatting and can reduce your image sizes without sacrificing quality. Once the plugin is installed, simply go to the webpage you want to optimize, click the Smush.it icon in the bottom right of your Firefox window, and Smush.it will tell you how much you can save on each image as well as give you a zip file filled with the smushed images.
2. Use the Fastest CSS Selectors.
When styling elements with CSS there are many ways to specify what you’re styling, and some of those ways are faster than others. According to this study by John Sykes, you should first try to style by tag alone:
[sourcecode language="css"]a[/sourcecode]
The next fastest way to select something in CSS is by class:
[sourcecode language="css"].link[/sourcecode]
Then descendant selectors:
[sourcecode language="css"]div div div p a.link[/sourcecode]
Finally, the slowest are child selectors:
[sourcecode language="css"]div > div > div > p > a.link[/sourcecode]
3. Combine all of your non-repeating background images into one giant sprite.
Reducing HTTP requests is a huge part of website optimization, and images can be the biggest culprit. So one easy thing to do is combine all of your background images that don’t repeat into one big image. Most of the time it helps to not put any spaces between the images and to build the sprite horizontally to make the smallest image. Then in your CSS you can specify the background image like this:
[sourcecode language="css"]background: url(images/bigImageSprite.png) -12px 0 no-repeat;[/sourcecode]
("-12px 0" shifts the sprite 12px to the left and 0px down, so the part of the image that starts 12px in lines up with the element)
4. Run all of your CSS and JS files through compressors.
Keeping code file sizes down is another no-brainer, so run your CSS and JS through optimizers/compressors. For your CSS stylesheets there are plenty of optimizers out there; in fact, there is a great round-up and review of many of them here. I agree with the round-up that the Icey CSS Compressor does a great job. In fact, it reduced the stylesheet of this blog by 23%.
When it comes to compressing your JS, the best service is Yahoo’s YUI Compressor. Other services are either insecure or don’t do as good a job. You can download the YUI Compressor here, or you can use this online version from refresh-sf.com. While we’re on the subject of compressing your JS, Prototype and Scriptaculous could use some compressing, so if you use those frameworks get the compressed version, aka Protopacked.
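If you want a feel for what these tools do, here is a deliberately crude shell sketch (the filename `demo.js` is made up for the demo; real minifiers like the YUI Compressor are far more thorough and do things like safely renaming local variables):

```shell
# Crude illustration of minification: strip // comments, leading
# whitespace, and blank lines. Real minifiers are much smarter.
cat > demo.js <<'EOF'
// this comment disappears
function add(a, b) {
    return a + b;
}
EOF
sed -e 's|//.*$||' -e 's/^[[:space:]]*//' -e '/^$/d' demo.js > demo.min.js
wc -c demo.js demo.min.js
```

Even this naive pass shrinks the file; a real compressor run over a large library saves far more.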
5. When saving images, try JPG, PNG, and GIF.
There is no one image file type that always yields the smallest file size, so when possible try all of them. Of course, there are times when a JPG just won’t guarantee the correct colors that you need, but when you can, try exporting your images as each type and see which is the smallest. Photoshop’s “Save for Web & Devices” feature is great for this: it tells you what the file size will be before you actually export the images.
6. Place <script> tags at the bottom of the page.
This one isn’t always possible, but try as much as you can to put all your scripts at the bottom of the page. When a browser is downloading the assets for a page, it goes through your HTML file to find what to retrieve. It generally downloads multiple items in parallel (we’ll discuss that more later), but since a script can change anything and everything on the page, the browser stops everything else and downloads just that one script file. So if you put your scripts at the end, the browser will download everything else as fast as possible first, then download your scripts.
7. Attach external scripts without blocking parallel downloads.
If you can’t put your <script> tags at the bottom, there is a way to put them in your page without blocking parallel downloads. You can do this in a variety of ways, but each has its own pros and cons. My personal choice is to add the <script> tag via the DOM. This allows you to have a download progress indicator in Firefox at least (not IE, sorry), and to put your scripts on subdomains (see rule 8).
[sourcecode language="javascript"]var js = document.createElement('script');
js.src = 'scriptName.js';
var head = document.getElementsByTagName('head')[0];
head.appendChild(js);[/sourcecode]
For more options, see Steve Souders’s presentation on “Even Faster Websites”.
8. Put assets on sub-domains to increase parallel downloads.
The idea behind this tip is that browsers only download two items in parallel per hostname; this limit comes from the HTTP/1.1 spec. So if we can download 4 or even 8 things at a time, we’ll get faster page downloads. To get more downloads to run in parallel, we use the loophole in the “two downloads per domain” rule, namely the fact that it says “per domain.” So make some sub-domains and spread your assets out. I recommend putting your images on one and your HTML, CSS, and scripts on another.
However, keep in mind that it’s not advisable to have too many DNS lookups in your pages. Each sub-domain forces the browser to do a DNS lookup, which hurts speed. The Yahoo performance team did some research in this area and came to the conclusion that you should use at least 2 but no more than 4 host names.
9. Disable ETags
ETags are a way for servers to check that the file in your cache is the same file that’s on the server; the server applies a unique identifier to each asset to do this. The problem is that when your site is on a server farm and multiple servers are processing requests, the ETag will vary from server to server. It will match on the first server but fail on the rest, forcing the browser to re-download files it has already cached.
Disabling ETags only really helps sites that are on such server farms, so if you’re on a shared host it’s not always going to help, but it is still recommended, because shared hosts routinely change servers and you might change hosts as well. Plus, it’s drop-dead simple. Add the following line to your .htaccess:
[sourcecode language="c"]FileETag None[/sourcecode]
10. Gzip your components.
OK, so this one might not fit into the “Easy Steps” part of this list, but in some cases it will. The basic idea here is, once again, to reduce file sizes: by gzipping your components you’re reducing the size of the browser’s downloads.
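To get a feel for how much this matters, you can gzip a file locally and compare sizes. A quick sketch (the sample stylesheet here is generated just for the demo):

```shell
# Generate a repetitive CSS file, then compare raw vs gzipped size.
printf 'body { margin: 0; padding: 0; }\n%.0s' $(seq 1 200) > demo.css
raw=$(wc -c < demo.css)
gz=$(gzip -9 -c demo.css | wc -c)
echo "raw=${raw} bytes, gzipped=${gz} bytes"
```

Text assets like HTML, CSS, and JS are highly repetitive, so savings of 70% or more are common.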
There are several methods you can use to gzip your files, but depending on your server and host it might get tricky. The easiest way, if you’re running Apache, is to first verify which version of Apache you’re running. To do this you can use Firebug and look in the Net panel at the server response headers; you should see it there.
There is also a good chance your host has the Apache version stated in the control panel somewhere.
If your Apache is version 1, add the following to your .htaccess file in your root directory:
[sourcecode language="c"]mod_gzip_on Yes
mod_gzip_item_include mime ^application/json$
mod_gzip_item_include mime ^text/.*$
mod_gzip_item_include file .html$
mod_gzip_item_include file .php$
mod_gzip_item_include file .js$
mod_gzip_item_include file .css$
mod_gzip_item_include file .txt$
mod_gzip_item_include file .xml$
mod_gzip_item_include file .json$
Header append Vary Accept-Encoding[/sourcecode]
If you have Apache 2 add this to the .htaccess file instead:
[sourcecode language="c"]AddOutputFilterByType DEFLATE text/html text/css text/plain text/xml application/x-javascript
Header append Vary Accept-Encoding[/sourcecode]
There are many situations where your shared host will not allow you to edit your .htaccess file, or where editing it to enable gzip just won’t work. If that’s the case, follow this post from fiftyfoureleven.com; it shows a variety of ways to achieve the same thing, some using PHP.
This article just scratches the surface of what’s out there as far as website optimization goes. I hope it at least gets you thinking in terms of optimizing your sites. If you’d like to read more, I highly encourage reading Steve Souders’s blog, Website Optimization Secrets (O’Reilly), and the Yahoo optimization page (which used to be captained by Steve Souders). Also be sure to check out YSlow and the WebKit element inspector (also found in Chrome).