Is it Time for You to Put Your Site on a Wait Loss Program?

Google has a need...a need for speed!

If you haven’t heard, Google announced yet another update to their guidelines on mobile site page load speed. Though it shouldn’t come as any surprise that Google is concerned about page load speed (especially since they use site speed as a ranking factor), the more interesting part of the announcement is that they’ve provided a benchmark for page load speed (for above-the-fold content): 1 second.  Why should you care?

Well, aside from being a user experience issue, page load speed can also affect your rankings, as stated above. So, if you care about your mobile site optimization, it makes sense to care about your page load speed. Even if you hired a web design company to make your site pretty, it doesn’t matter how pretty it looks if it isn’t loading quickly enough. Your mobile visitors may be too impatient to wait for it to load. Oh, and if you don’t have a mobile site, you should be even more concerned about page load speed, given that people accessing your site from a mobile device will be downloading and rendering the same code a desktop computer would.

Unfortunately, improving site speed is a lot easier said than done, and when you find yourself spending a lot of time fixing small bits of code for seemingly small gains, you may be tempted to scrap your efforts. Don’t give up! Though this issue may seem overwhelming and too technical to address, it’s still an opportunity you could be missing out on that your competitors may not be.

Some website owners may be more inclined to just hire an internet marketing agency to handle all of this. But if you are a DIYer, or have some technical help on standby ready to roll up their sleeves and pitch in, I hope this gives you some guidance on how to tackle the effort of putting your website on a “wait loss program” in the most efficient and effective way possible.

 

You Only Get to Work With a Fraction of that 1 Second

According to Google’s calculations, a single network round trip on a 3G connection (which the majority of mobile users will likely be on) takes about 200-300 milliseconds, and just reaching your server requires several of them: the DNS lookup, the TCP handshake, and the HTTP request and its response. Collectively, this adds up to about 600 milliseconds of overhead that you most likely won’t have any control over. That leaves you only about 400 milliseconds of your own to optimize. When working with numbers this small, even minor updates can potentially help you get closer to that 1-second-for-above-the-fold-content goal.

Nipping Delays in the Bud

Repair any broken links or scripts from the source

I suppose it goes without saying (but I’ll write it nevertheless) that an ounce of prevention is worth a pound of cure. In this case, it’s best to keep the links pointing to your site from incurring any additional delays caused by 404 errors or redirects. The key point here is to eliminate only the unnecessary. Sometimes broken links happen and you need to implement redirects because you have no control over where the link is hosted (for example, if one of your vendors is linking to your site with a broken link, you don’t have access to your vendor’s site to update the URL).

However, if your site is redirecting to the www-version URLs but the internal links are all pointing to the non-www URLs, you should update these links before anything else. Is this time-consuming and boring? You betcha! But it’s a chore that needs to be done regardless, so comb through your internal links to try to stop some of the delays at the source.

Avoid any unnecessary URL redirects

If you have any of the URLs on your site going through multiple redirections, go through and clean this up, too. If URL A is redirecting to URL B which is redirecting to URL C, then someone trying to access URL A is going through the delay of going through HTTP requests not only for URL A, but also URL B and then URL C. Get rid of unnecessary delays by cutting out the middle man (URL B in this case) and just have URL A redirect straight to URL C. Here’s an example:

URL A = (broken link)
URL B = (non-www version of the page you want to redirect to)
URL C = (www version of the page you want to redirect to)
→ = redirect

(Assuming URL A is a broken link and you have a redirect rule in place to redirect non-www URLs to their www counterparts)

Bad

URL A (broken link) → 404 Page

Not as Bad

URL A (broken link) → URL B (non-www URL) → URL C (www URL)

Ideal

URL A (broken link) → URL C (www URL)

In the example from Google, the “Not as Bad” scenario tacks on additional network round trips (each in the neighborhood of 200-300 milliseconds on a 3G connection) for every extra redirect the browser has to follow before the final page can even begin to load. These aren’t always the exact time frames you’ll be working with, and on a 4G network the penalty wouldn’t be as painful, but you should still expect that you won’t always have that luxury.
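
If your site runs on Apache, one way to collapse a chain like this is a rule in your .htaccess file that sends the old URL straight to its final www destination, placed ahead of your general non-www-to-www rule. The paths and domain below are hypothetical, and this is only a minimal sketch assuming mod_rewrite is enabled; adapt it to your own redirect setup:

    RewriteEngine On

    # Send the old, broken URL straight to the final www destination in a
    # single hop instead of letting it bounce through the non-www redirect.
    RewriteRule ^old-page$ http://www.example.com/new-page [R=301,L]

    # General non-www to www rule that all other traffic still passes through.
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]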

Improve your website’s server response time

Sometimes the problem is caused by the server itself being slow or struggling to render the HTML because it is bogged down handling too many requests or constantly updating files. As Google points out, there are several potential factors that could slow the response of your server (slow application logic, database queries, routing, frameworks, libraries, or memory starvation, just to name a few). In some cases, servers may be running slow simply due to limited memory.

If you’ve ever tried running a bunch of programs all at once on your desktop computer, you might notice how it slows down everything, right? Well this is a bit like what is going on with the server. If it is running too many requests at once (particularly during bouts of high traffic from visitors), the server may get bogged down (and depending on the severity of the overload, could end up rendering the site unresponsive, but that usually only happens when done maliciously so you shouldn’t have to worry about this unless you’ve upset a group of spammers).

One of the ways (but certainly not the only way) to potentially increase your server’s response speed is by upgrading to a faster server or having your site moved to a dedicated server (which you’d have to work out with your site’s host). A lot of people may skip this step on the basis that their host may charge them more. However, if your site is running slow or not at all because it can’t handle the volume of traffic that could potentially be driving you more business, you might want to reconsider coughing up more money to keep your site running smoothly.

Remove any “road blocking” scripts

Unfortunately, despite the best-intentioned programming, when an external script is referenced within the above-the-fold region, the browser has to stop and download that script before it can finish rendering. For smaller scripts, it is recommended to keep the script inlined within the HTML code (instead of externalizing it). This prevents the page from having to wait on the external script to be requested and downloaded while generating the above-the-fold content. Again, this should only be done if the script is small.

There is no one-size-fits-all solution for inlining or externalizing scripts, and where to place your site’s external scripts depends greatly upon your site. Externalizing scripts can save you from having to edit the same code across several pages (you’d only need to edit that one file, instead of every page on your site), but if those files are bulky, redundant, or full of unnecessary code, you should start trimming the fat ASAP (that’s where minification comes into play).
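
As a rough illustration of the trade-off (the file name and script contents below are made up), the two approaches look something like this:

    <!-- External: the browser must request and download small.js before it
         can finish rendering the content that depends on it. -->
    <script src="/js/small.js"></script>

    <!-- Inlined: the same few lines ship inside the HTML itself, so no extra
         request is needed while the above-the-fold content is generated. -->
    <script>
      document.getElementById('menu-toggle').addEventListener('click', function () {
        document.body.classList.toggle('nav-open');
      });
    </script>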

Use Asynchronous Scripts wherever possible

When you simply can’t avoid having certain external scripts on your site, it’s best if you can utilize the asynchronous version of that script wherever possible. Whether you’ve got Google Analytics, Google AdSense, Crazy Egg, or Quantcast tracking codes, you’ll want to make sure that you’re using an asynchronous script that can be downloaded in the background, instead of forcing users to wait for the script to finish downloading before the page is rendered.
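
For tags you add yourself, the simplest form of this is the async attribute on the script element. The URL below is just a placeholder, and each of the vendors mentioned above publishes its own official asynchronous snippet that you should copy verbatim:

    <!-- The async attribute lets the browser keep parsing and rendering the
         page while the tracking script downloads in the background. -->
    <script async src="https://stats.example.com/tracking.js"></script>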

Minimize HTTP Requests

Leverage browser caching through HTTP “expires” or “max-age” headers

After cleaning up as many unnecessary redirects (or errors) as possible, you can also set the Expires header with an expiration date. Though the Expires value needs to be denoted as an actual date (e.g. Expires: Sun, 15 Sep 2013 16:00:00 GMT), the Cache-Control: max-age value lets you set an “age” in seconds instead of a date (e.g. Cache-Control: max-age=2629000, which is roughly one month). The benefit of max-age is its simplicity: you denote how long from the time of the request the cached copy stays fresh, instead of a fixed date that would need updating over time.
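
On an Apache server, a minimal sketch of this in .htaccess (assuming the mod_expires module is enabled; the file types and lifetimes are just examples) might look like:

    <IfModule mod_expires.c>
      ExpiresActive On
      # Tell browsers to cache images for a month and CSS/JS for a week;
      # the module generates matching Expires and Cache-Control headers.
      ExpiresByType image/png  "access plus 1 month"
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType text/css   "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>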

Enable HTTP Keep-Alive

Don’t assume that just because many HTTP connections default to a keep-alive response header, you can depend on this happening every time. There may be scenarios where the server closes the keep-alive connection (particularly to improve performance), so explicitly requesting the keep-alive response header is your “insurance” against the server having to reopen a new connection for each file it tries to retrieve.
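
On Apache, keep-alive is normally controlled by the KeepAlive directive in the main server configuration, but if you only have access to .htaccess, one common approach (a sketch assuming mod_headers is enabled) is to set the Connection header explicitly:

    <IfModule mod_headers.c>
      # Ask that the connection stay open so the browser can reuse it for
      # subsequent files instead of opening a new connection for each one.
      Header set Connection keep-alive
    </IfModule>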

Removing Excess Code and Externalizing What You Can

Minifying HTML, CSS, and JavaScript Files

Whenever you have “duplicate” code across your site that renders the formatting for the rest of the content, externalizing that code is often the preferred method. However, externalizing bloated code is just moving a mess around instead of cleaning it up. Overlapping CSS rules, excess comments, and unused HTML or JavaScript are just occupying space unnecessarily. Excess code can be found not only on individual pages, but also across your entire site or specific groups of pages (like category pages that share a common template). Checking for common strings of code that can be eliminated, condensed, or externalized helps you minimize the amount of code needed to render the pages on your site.
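
To give a trivial sense of what minification does (the selectors here are made up), a few readable lines of CSS like these:

    /* Site header styles */
    .site-header {
        margin: 0 auto;
        padding: 10px 20px;
    }
    .site-header a {
        color: #336699;
    }

collapse down to a single line once the comments and whitespace are stripped out, with no change in how the page renders:

    .site-header{margin:0 auto;padding:10px 20px}.site-header a{color:#336699}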

Enabling Compression

If you have large files that can be compressed in order to improve their load times, then by all means compress those suckers. There are several file types that can be served in a compressed (gzipped) format. For Apache servers, this can be handled within the .htaccess file, and it can also be configured within a site’s settings in IIS.
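
Here is a minimal sketch of what enabling gzip compression might look like in .htaccess (assuming Apache with mod_deflate available; the MIME types listed are just the common text-based ones):

    <IfModule mod_deflate.c>
      # Compress text-based responses before sending them to the browser.
      AddOutputFilterByType DEFLATE text/html text/plain text/css
      AddOutputFilterByType DEFLATE application/javascript application/json
    </IfModule>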

Modifying Content

Make sure you have text based navigation

This is an obvious detail that would be brought up in any SEO-related analysis of a site. However, if you skip actual text-based navigation and rely on bigger, higher-resolution images (with alt attributes) to render the text in your navigation, you could be adding extra delays to your page load time. Though images are great for engaging your audience, keep in mind that too many images aren’t such a great alternative either. If possible, use the smallest, fastest images you can without sacrificing the aesthetics and user experience of your site. This brings us to…

Optimize the images on your site

So, there’s more than one way to “optimize” your images. Some people may say that optimizing images simply means adding keyword-rich alt attributes and a keyword-rich src attribute URL. However, there’s also something to be said for adjusting the size of the image (while maintaining as much of its quality as possible) so it fulfills its purpose without being excessively large and taking forever to load. The tricky part is that reducing an image’s file size can cause a loss in its quality.

Cropping the image, reducing its color depth, or saving it in a specific format can greatly affect its size (and load time), but there’s no one-size-fits-all approach here either. For thumbnails, you usually don’t need very large, high-quality images. However, the image that those thumbnails open up to may need to be higher quality. In a scenario like that, it is better to have two separate images: a smaller thumbnail that loads quickly (perhaps a GIF or PNG) that links to the larger, higher-quality file (most likely a JPG), as shown in the markup sketch after the list below. Here are some quick tips that Google provides for selecting the file format that best suits your needs:

  • Go with PNGs if possible, but keep in mind that some older versions of browsers don’t have full support for PNG files.
  • If you have a lot of visitors using older browsers, default to GIFs instead of PNGs.
  • Use GIFs for:
    • very small images
    • simple graphics (little color depth)
    • images containing animation
  • Use JPGs for all photographic-style images
  • Avoid using BMPs or TIFFs
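
In markup, the thumbnail-to-full-size pattern described above might look something like this (the file names, dimensions, and alt text are hypothetical):

    <!-- Small, fast-loading thumbnail that links out to the full-size photo. -->
    <a href="/images/full/beach-house.jpg">
      <img src="/images/thumbs/beach-house-thumb.png"
           alt="Beach house rental" width="150" height="100">
    </a>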

Prioritize visible content by reducing the size of Above-The-Fold (ATF) content

Since Google has put the emphasis on the 1-second load time for ATF content, it’s best to structure your page so that the code generating the ATF content is rendered before the rest of the page. Don’t put the cart before the horse: if you cannot avoid the full page taking longer than a second to load, make sure you “front load” the rendering of the above-the-fold content. You may also need to minify any resources that are essential to rendering the ATF content ASAP.
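
One common way to do this (a simplified sketch; the class names and file name below are hypothetical) is to inline just the few CSS rules the above-the-fold content needs and reference the full stylesheet lower in the document, so the top of the page can paint without waiting on that download:

    <head>
      <style>
        /* Only the rules needed to paint the header and first screenful. */
        .masthead { background: #fff; height: 80px; }
        .hero { font-size: 2em; margin: 20px 0; }
      </style>
    </head>
    <body>
      <!-- Above-the-fold markup renders using the inlined rules above. -->
      ...
      <!-- The full stylesheet is referenced near the end of the document so
           it doesn't block the initial render. -->
      <link rel="stylesheet" href="/css/site-full.css">
    </body>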

Tools:

You can’t give the proper treatment without first diagnosing your website, so here are some tools to help you analyze your page load speed and some other data points that can help you identify what files are carrying the most wait (time). Trying to apply a global solution to a local problem doesn’t always provide the most efficient and effective results, so don’t be afraid to do a little more digging around before you begin implementing any changes. Here are just a few tools you can use to identify issues with page load speed:

Pingdom – http://tools.pingdom.com/fpt/

Pingdom is a popular tool that provides a nice visual representation of the file sizes and load times of various files or paths found on a given webpage. Since the visualization is provided as a Gantt chart, it not only shows you the length of time each file takes to load, but also the order in which each file is loaded over the course of the entire page load. This can help you find glaring issues that might be holding up your page load speed.

Google PageSpeed Insights – http://developers.google.com/speed/pagespeed/insights/

Another free tool provided by the Big G, PageSpeed Insights provides a report on some of the issues listed above (thus the reason there are so many links above that are pointing to PageSpeed Insights) as well as examples of where the issues are coming from and how to go about fixing them.

Google PageSpeed Module – https://developers.google.com/speed/pagespeed/module

Google also provides a module (you’ll likely need your developer to help out with the installation) to help implement some of the optimization strategies listed above. Since this is a bit over my head, I’ll just stop myself here and let you (or your developer) do the rest of the research with the link above.

Google Webmaster Tools – http://www.google.com/webmasters/tools/

Believe it or not, you can use some of the data that Google already has in order to detect issues on your site. Not only can you identify Crawl Errors, like 404 “Not Found” response codes, you can also check your average page download speed in the Crawl Stats report and, in some cases, you can find some helpful user experience improvement suggestions in the HTML Improvements section as well.

Google Analytics – http://www.google.com/analytics/

You can actually access some average page load time data straight from Google Analytics. You can view this averaged data based on browser, country/territory, and page, and you can even get PageSpeed Insight suggestions straight out of a Google Analytics Site Speed Suggestions report (just go to Content > Site Speed > Speed Suggestions). Oh, and did I mention you can make a custom 404 error report in Google Analytics as well? Pretty cool, if you ask me.

Xenu Link Sleuth – http://home.snafu.de/tilman/xenulink.html

You can also crawl your site to detect redirected and broken links, large file sizes (categorized by file type, so you can see, for example, which images are the largest), and page load times.

Screaming Frog – http://www.screamingfrog.co.uk/seo-spider/

Much like Xenu Link Sleuth, Screaming Frog allows you to crawl your site, only it grabs a bit more data* than Xenu does. I included that asterisk because the free version of Screaming Frog is limited to 500 URLs (including files and images), whereas Xenu will crawl and scrape until you stop it or it has checked the entire site. The difference is that Screaming Frog pulls data from more HTML elements than Xenu Link Sleuth (e.g. rel=”canonical”, meta robots, meta description, and meta keywords tags). If I need to crawl a site with more than 500 URLs, I usually default to Xenu and supplement the missing data by using the SEO Tools Plugin for Excel to extract any additional data on a page-by-page basis. It’s not ideal to do it that way, but in the clutch, I’d rather have data that took a while to gather than no data at all.

References:

I am by no means an expert when it comes to website development, though I try to understand things as best as I can. Fortunately, there are a lot of other experts out there who have provided some really helpful guidance on this matter. Here are some great posts that I have either drawn upon for some of the examples above or that I think are super cool and insightful:

