Many large site owners assume that site speed is not an important factor in delivering a good user experience. They forget that there are still millions of dial-up users, mobile users, and users on low-quality connections who will not find a page that takes 1 or 2 seconds to load an adequate experience. Beyond a poor user experience, slow load times can hurt your SEO efforts as well.
A slow-loading site can mean the difference between getting all of your pages indexed and getting only a few. If Googlebot or other search engine spiders spend their limited time on your site waiting for pages to load, they may not be able to index all of your pages.
As I have said in a previous column, I also believe that latency is a ranking consideration in some search engine algorithms. Otherwise, it would not be shown in the Site Maps feature of Google's Webmaster Tools. I have often seen sites gain significant traffic lifts from Google after reducing their site's latency.
Thus, when you hear that a 1- or 2-second load time is acceptable for search engines, don't believe it. Make sure you set up Google Webmaster Tools and monitor the latency that Googlebot is experiencing. You can find this under the crawl rate meter.
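The crawl rate meter shows an average, but you can also measure what Googlebot is experiencing directly from your own access logs. Below is a minimal sketch, assuming an Apache combined log format with the response time in microseconds appended as the last field (Apache's %D directive); the function name and regular expression are illustrative, not part of any standard tool:

```python
import re
from statistics import mean

# Assumed format: Apache combined log with the response time in
# microseconds appended as the final field (the %D directive).
LOG_RE = re.compile(r'"[^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)" (\d+)$')

def googlebot_latency_ms(lines):
    """Return the average response time in ms for Googlebot requests."""
    times = []
    for line in lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group(2):  # group 2 is the user agent
            times.append(int(m.group(3)) / 1000.0)  # microseconds -> ms
    return mean(times) if times else None
```

Running this over a day's log and comparing the result with the crawl rate meter gives you a second, independent view of the latency spiders are seeing.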
Addressing slow load times begins with optimizing page size, which improves your site from both the search engine's point of view and the user's. Go through every image file on a page and make sure each one is as small as possible. Using simple images, and re-using the same images across pages, can make a big difference.
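One way to make that image audit routine is a small script that walks your static files and flags oversized images for review. This is a minimal sketch; the 50 KB threshold is an arbitrary assumption you should tune for your own site:

```python
import os

# Hypothetical threshold: flag any image over 50 KB for manual review.
MAX_IMAGE_BYTES = 50 * 1024
IMAGE_EXTS = {".gif", ".jpg", ".jpeg", ".png"}

def oversized_images(root):
    """Return (path, size) pairs for image files above the threshold,
    largest first, under the given directory tree."""
    flagged = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in IMAGE_EXTS:
                path = os.path.join(dirpath, name)
                size = os.path.getsize(path)
                if size > MAX_IMAGE_BYTES:
                    flagged.append((path, size))
    return sorted(flagged, key=lambda item: item[1], reverse=True)
```

The files this flags are your candidates for recompression, resizing, or replacement with simpler graphics.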
Your optimization efforts should also include the ads you display on your site. Are they too large? Some ads use large Flash movies or require other plugins, which can slow download time significantly. User testing can provide helpful insights into these types of problems.
A site's latency can easily be multiplied if the business has an international presence. If it's an option, maintaining hosting in the U.S. can help optimize connection speed for your core audience.
Depending on the kind of site and audience, it may also make sense to create a mobile version of your site. This stripped-down version can be served to users with slower connections, mobile phones, and users who depend on screen reading programs. That kind of accessibility can potentially provide a boost in your rankings as well.
One thing to make sure you don't do is try to improve latency by restricting or blocking search engine spiders. This is a common response from operations teams when crawlers consume too many resources, but it is a very bad move.
Restricting search engine spiders can cause a major loss of search engine traffic. Most throttling software packages send back 5xx errors, typically a 503 "service unavailable" response that tells the client the server is busy. When a spider gets this kind of response for a given page enough times, the page will be removed from the search engine's index. And as I said before, poor response time is likely a factor in the page's ranking as well.
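You can usually spot this problem in your logs before pages start dropping out of the index. Here is a minimal sketch, again assuming an Apache combined log format, that counts 5xx responses served to a few common crawler user agents; the bot list and function name are illustrative:

```python
import re
from collections import Counter

# A few common spider user-agent substrings; extend for your own logs.
BOT_NAMES = ("Googlebot", "Slurp", "msnbot")
STATUS_RE = re.compile(r'"[^"]*" (\d{3}) ')  # status follows the request

def crawler_5xx_counts(lines):
    """Count 5xx responses per crawler; any nonzero count is a warning sign."""
    counts = Counter()
    for line in lines:
        m = STATUS_RE.search(line)
        if m and m.group(1).startswith("5"):
            for bot in BOT_NAMES:
                if bot in line:
                    counts[bot] += 1
    return counts
```

If this report shows spiders regularly receiving 5xx responses, fix the capacity or throttling problem rather than letting the errors continue.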
In the end, if speed is still an issue, content delivery companies like Akamai can be a good option. These companies maintain data centers throughout the world, serving copies of your pages from locations as geographically close to your users as possible to keep them loading fast.
While more and more users are connecting by broadband every day, there will always be a good reason to pay attention to page load times. Keeping your site lean and mean can keep your customers happy, and it can also give you an edge in the search results.