Top-3 Performance Optimizations (you should apply right away!)

16 February 2011 BY Bastian Grimm

Since Google officially introduced page speed as a ranking factor in 2010, I thought I'd put together some of the things we do when optimizing a website for better performance. As a starting point (besides actually browsing the site to get a feeling for how fast it really is), I'd usually go to the Google Webmaster Tools > Labs > Site performance tab:

GWT basically tells you the average load time of that specific page. In addition, Google compares this to other sites being monitored. If you scroll down on that page, you'll also see optimization suggestions on a per-URL basis. This might look like this:

So I suggest we try to get rid of those, shall we? All right, here we go:

1. Reduce number of HTTP requests

Every website consists of various components. Mostly these are (background) images as well as JavaScript and CSS files. Sometimes you might also serve iframes, Flash movies and other assets to your visitors. Each component has to be loaded via a separate HTTP request. Depending on the amount of content, that can add up to a lot of requests, as you can probably imagine. Okay, let's have a look at the "big three":

CSS: In general, CSS files should be referenced at the top of a page – this makes pages appear to load faster because of progressive rendering. You should also have just one single CSS file, not dozens of them. Remember: it's one request vs. X.
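In practice that means one combined stylesheet, linked inside the head. A minimal sketch (the filename is just an example):

```html
<head>
  <title>My page</title>
  <!-- one combined stylesheet, referenced in the head so the browser
       can start rendering progressively: one request instead of X -->
  <link rel="stylesheet" type="text/css" href="/css/all.min.css" />
</head>
```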

JavaScript: Again, you should have as few JS files as possible in order to reduce the number of HTTP requests. That being said, there are mainly two optimizations you can easily apply: either move these JS files (depending on the website's structure) to the bottom of the page so they won't block the loading process, or – even better – implement a JavaScript loader to load scripts in parallel without blocking the site. We got some awesome results using Head JS – it's an awesome tool, go see for yourself. Another thing which is quite obvious but happens a lot: make sure you load each script just once. Especially on sites that are continuously developed further, the risk of serving duplicates increases, so you really should make sure that doesn't happen.
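A sketch of what a Head JS setup might look like – the file names are placeholders, and you should check the Head JS documentation for the exact API of the version you use:

```html
<head>
  <!-- the tiny loader itself is loaded with a normal script tag -->
  <script src="/js/head.min.js"></script>
  <script>
    // fetch the remaining scripts in parallel without blocking
    // rendering (paths are hypothetical)
    head.js("/js/jquery.min.js", "/js/plugins.js", "/js/app.js");
  </script>
</head>
```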

Minify CSS & JavaScript: Minification means removing unnecessary characters (white space, tabs, comments and so on) from the code to reduce its size, which obviously improves load times thanks to the smaller file size. The most popular tool is JSMin – but if you don't want to download anything, I suggest you check out minifyjavascript.com, an online minifier.
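To illustrate what minification does, here is a hand-minified toy example (not actual JSMin output) – the behaviour stays identical while the bytes shrink:

```javascript
// Readable source: whitespace, a comment and a named temporary variable
function addTax(price, rate) {
    var tax = price * rate;   // compute the tax amount
    return price + tax;
}

// The same logic after minification: whitespace, the comment and the
// temporary variable are gone, but the function behaves identically
function addTaxMin(p,r){return p+p*r}

console.log(addTax(100, 0.19) === addTaxMin(100, 0.19)); // prints: true
```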

Images: Besides serving only the images you really need, there is one thing that has a massive impact on performance: using CSS sprites. I'm not going to dive too deep into this one because it's quite a complex topic. At a very high level: using sprites means combining an "unlimited" number of (small) images into just one and then displaying only parts of that big image using CSS (background positions). This technique saves the overhead of fetching multiple images – the browser only has to make a single HTTP request. There is a detailed explanation available over here.
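A minimal sketch of the sprite technique, assuming a hypothetical icons.png that stacks two 16×16 icons vertically:

```css
/* one combined image serves every icon – a single HTTP request */
.icon {
  background-image: url("/img/icons.png"); /* hypothetical sprite file */
  width: 16px;
  height: 16px;
}

/* shift the background position to reveal the right part of the sprite */
.icon-home   { background-position: 0 0; }
.icon-search { background-position: 0 -16px; }
```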

2. Cache & compress your files

To make sure that files which haven't changed since a visitor last downloaded them aren't fetched again, you need to work with the appropriate headers. Since web pages are getting more complex these days, the number of components that have to be downloaded with each page view keeps increasing. By sending the correct headers you make sure those components are cacheable and won't be re-downloaded on subsequent page views.

According to Yahoo!'s best practices for speeding up your website: for static components, implement a "never expire" policy by setting a far-future Expires header; for dynamic components, use an appropriate Cache-Control header to help the browser with conditional requests.
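On Apache with mod_expires enabled, a far-future Expires policy for static components might look like this in an .htaccess file – adjust the content types and lifetimes to your own site:

```apache
# requires mod_expires to be enabled
ExpiresActive On

# static components: effectively "never expire"
ExpiresByType image/png              "access plus 1 year"
ExpiresByType image/jpeg             "access plus 1 year"
ExpiresByType text/css               "access plus 1 year"
ExpiresByType application/javascript "access plus 1 year"
```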

Doing so, keep in mind that you have to change a component's filename whenever the component changes – otherwise visitors will keep the old (locally cached) file, which might result in broken functionality or other unwanted behavior.

Depending on a component's size, it takes a certain amount of time to download it to the visitor's computer. Using gzip compression, load times can be reduced significantly, which also means less traffic per request. Since we already have a great article on State of Search by fellow blogger Louis Venter, I'll just refer you to his post on how to set up compression in Apache or IIS.
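As a quick sketch, compressing text-based responses with Apache's mod_deflate can be as simple as the following – see Louis' post for the full setup and the edge cases:

```apache
# requires mod_deflate – compress the text-based content types;
# images are already compressed, so leave them out
AddOutputFilterByType DEFLATE text/html text/css application/javascript
```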

Another way to speed up file delivery is using a CDN (content delivery network). The idea behind a CDN is to distribute static files (like the above-mentioned CSS, images, Flash, etc.) across a large number of servers in different geo-locations. CDNs usually handle the caching and compressing anyway – so if you can't do that yourself, a CDN might be worth considering, even more so if you have a lot of international visitors.

3. The server-side

Another area to tap into might be the server side of your website. Obviously this very much depends on the setup you're running (Unix vs. Windows), but let's assume we're on a "classical" Unix box running Apache, PHP and MySQL.

Looking at this from the web server's perspective: of course Apache is pretty much the go-to solution if you want to keep it simple. However, if your site serves a lot of requests (or you're on a smaller box but don't want to change it for whatever reason), you should really have a look at nginx, an HTTP and reverse proxy server. A lot of high-traffic sites, including rambler.ru, wordpress.com and others, use it for a reason. Check out this article for a nice comparison.

Moving forward to PHP, you'd probably consider some kind of PHP accelerator to speed things up even more. Most of these accelerators work by caching the compiled byte code of PHP scripts, avoiding the need to parse and compile the code on every request. The byte code is stored in shared memory to reduce the number of (slower) disk reads. Check out this list – if I had to choose one, I'd go with APC.
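If you go with APC, a minimal php.ini configuration might look like this – treat the shared-memory size as a guess to be tuned for your codebase, and check the directive syntax against your APC version:

```ini
; load the extension and enable the byte-code cache
extension=apc.so
apc.enabled=1
; shared memory reserved for cached byte code (hypothetical value)
apc.shm_size=64M
```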

And last but not least, the database: a lot of self-developed database-driven sites suffer from poor performance. Mostly this happens because simple SELECT queries don't use appropriate indexes. For reference, go check out this top-10 list on mysql.com.
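For example, if a frequent query filters on a column that has no index, adding one often helps dramatically – table and column names below are made up:

```sql
-- EXPLAIN shows whether MySQL can use an index for this query
EXPLAIN SELECT * FROM articles WHERE author_id = 42;

-- add an index on the filtered column (hypothetical schema)
ALTER TABLE articles ADD INDEX idx_author (author_id);
```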

In addition, it might also be a good idea to set up a static file cache in front of the dynamic PHP scripts that access the database. If you're on WordPress, for example, go check out W3 Total Cache, which does exactly this.

Of course there is even more you can do: things like source code optimization (moving inline styles and JavaScript to external files), removing empty href attributes (often found in combination with JavaScript onclick events), not scaling images in HTML, etc. – but hopefully this article gives you an idea of where to start optimizing your site's performance.

AUTHORED BY: Bastian Grimm

Bastian Grimm is founder and CEO of Grimm Digital. He mainly works as an online marketing consultant with a strong focus on organic search engine optimization (SEO). Grimm specializes in SEO strategy consulting, website assessments and large-scale link building campaigns.
