We often work with clients on speeding up their websites. Great results are achievable, but the effort needs to go into the right areas, because there are myriad reasons why a website can be slow. Recently, we worked with a client to speed up a customised WordPress website by implementing heavy caching with Nginx and extensive tuning for HTTP2 and Google PageSpeed. The result is that real visitors at a significant geographical distance see the page in under 0.6s, and Google PageSpeed Insights and GTMetrix are both very happy. However, the work of chasing every possible millisecond for as many visitors as possible highlighted several important points.

Myth #1: Small images should be combined into sprites

A few years ago this was absolutely true. Today, almost every major browser supports HTTP2. Between a server configured for HTTP2 and a browser that supports it, many files can be downloaded simultaneously over a single multiplexed connection, and downloading several small files this way is typically faster than downloading one large combined sprite.
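For illustration, here is a minimal sketch of how HTTP2 is switched on in an Nginx server block (the domain and certificate paths are placeholders; exact requirements depend on your Nginx version and build):

```nginx
server {
    # HTTP2 is enabled with one extra word on the listen directive.
    # In practice it requires HTTPS, since browsers only speak HTTP2 over TLS.
    listen 443 ssl http2;
    server_name example.com;

    ssl_certificate     /etc/ssl/example.com.crt;
    ssl_certificate_key /etc/ssl/example.com.key;

    # With HTTP2 multiplexing, dozens of small image requests share this one
    # connection, so there is no per-request penalty left for sprites to avoid.
}
```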

Myth #2: Load x first, then y, then z

Also related to HTTP2: because files can be downloaded simultaneously, it is no longer the case that certain types of file should be loaded at a certain point (e.g. JavaScript last). A good developer should now aim to load as much as possible as early as possible, and make anything that would otherwise block the browser from rendering the page asynchronous. On a waterfall chart, whether in your browser's developer tools or in GTMetrix, you want to see several near-identical rows right at the top.
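As a sketch of what "make it asynchronous" means in practice (the file names here are placeholders), non-critical scripts can be marked so they download in parallel without blocking rendering:

```html
<head>
  <link rel="stylesheet" href="/css/site.css">
  <!-- defer: download now, execute in order once the page is parsed -->
  <script defer src="/js/app.js"></script>
  <!-- async: download now, execute whenever ready; fine for independent
       scripts such as analytics -->
  <script async src="/js/analytics.js"></script>
</head>
```

With HTTP2, all three requests start immediately over one connection, which is exactly the cluster of identical rows you want at the top of the waterfall.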

Myth #3: You need to spend more on hosting

I am disappointed that website speed is so closely associated with server size and hosting costs. We have successfully moved high-traffic websites to smaller servers and made them faster at the same time. Part of it is the configuration of the server, which should aim for efficiency savings, and part is the software itself. If your software (WordPress, Magento or anything else) is badly coded (think about plugins in particular, the ones you downloaded without paying much attention ;)), then the additional hosting cost needed to achieve good speed can be very high. It is better to fix the problems in the code so that the server doesn't have to do as much work.
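One example of the kind of efficiency saving the server configuration can make is microcaching. The sketch below assumes WordPress behind Nginx with PHP-FPM; the paths, socket and zone name are illustrative, and the cache rules would need exceptions for logged-in users and admin pages in a real deployment:

```nginx
# Cache rendered pages on disk for a short time, so slow PHP (or a badly
# written plugin) runs at most once per URL per minute, not once per visitor.
fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=wpcache:10m
                   max_size=256m inactive=60m;

server {
    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_pass unix:/run/php/php-fpm.sock;

        fastcgi_cache wpcache;
        fastcgi_cache_key "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 60s;   # serve the cached page for one minute
        fastcgi_cache_use_stale error timeout updating;
    }
}
```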

Myth #4: You need a CDN

I am generally in favour of using a CDN. However, it is not the secret sauce it is sometimes made out to be. A good rule of thumb: if your website is fast for local visitors but slow for international visitors (and you have enough international visitors for this to matter), then you should consider a CDN. Be aware that a CDN introduces more complexity, since it needs to cache your content: your website must tell the CDN when content has changed, and must reference the CDN on every page. A CDN is relatively unlikely to improve the speed of a low-traffic website, or one where the geographical distance between the server and the users is already fairly small.

Myth #5: Google PageSpeed Insights will tell me how fast my website is

Tools like Google PageSpeed Insights, Pingdom Tools and GTMetrix need to be interpreted properly. I despise the fact that they make it so easy for website owners to see what appear to be problems. Google PageSpeed Insights takes little, if any, account of HTTP2. There is debate over whether Google rankings correlate with its score, but as far as real visitors are concerned, I wouldn't worry about it much. GTMetrix is a very advanced tool that takes many different factors into account, and a very good one, but despite its bells and whistles it too needs to be interpreted properly. Your GTMetrix score is not always representative of how users experience your website.

On a side note, the Qualys SSL Labs test also needs to be interpreted properly. It gives you some (actually very good) information about the security of your HTTPS configuration. What it neglects to mention, however, is that not having a top score is not necessarily a problem. Why? Because in some cases, allowing only the latest protocols and ciphers will cause problems for users in locations where, for example, Windows XP is still widely used (there are some!). Equally, a very strong HTTPS setup can damage the performance of your website, as more CPU power is needed to handle the cryptography.
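A middle-ground TLS configuration in Nginx might look something like the sketch below. The protocol and cipher lists are illustrative rather than a recommendation; the point is the trade-off, not the exact values:

```nginx
# Keeping TLSv1 alongside newer protocols lowers the Qualys grade but
# keeps Windows XP-era clients working; dropping it would lock them out.
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_prefer_server_ciphers on;
# Fast modern AES-GCM suites first; the 3DES suite at the end is the
# legacy fallback old clients can actually negotiate.
ssl_ciphers ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:DES-CBC3-SHA;
# Session resumption avoids repeating the expensive full handshake,
# which is where most of the CPU cost of HTTPS sits.
ssl_session_cache shared:SSL:10m;
ssl_session_timeout 1h;
```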

Myth #6: Always enable Gzip compression

Following on from my aside about HTTPS encryption using CPU, gzip compression uses CPU power too. If your website is busy and your server's bottleneck is the CPU, enabling gzip will take CPU resources away from the server's normal work and has a good chance of making the site slower. If you simply want to tick the "gzip?" box on online tools (which I discourage until there is substantial evidence that Google rankings depend on it), enable it at level 1, the lowest level. Level 1 is substantially better than nothing in terms of page size, and not that much worse than the higher levels, yet it uses far less CPU time to compress and decompress.
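In Nginx that advice comes down to a few lines; the size threshold and type list below are illustrative choices rather than the only sensible ones:

```nginx
gzip on;
gzip_comp_level 1;               # lowest level: most of the saving, least CPU
gzip_min_length 1024;            # skip tiny responses not worth compressing
gzip_types text/css application/javascript application/json image/svg+xml;
# text/html is always compressed when gzip is on, so it is not listed.
```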
