Many large companies assume that site speed isn't very important to the overall user experience. This is usually because the site's operations department sets performance goals based on what it considers a reasonable range, and that range often stretches well beyond two to four seconds.
This may not sound like much, but add other network issues on top of this time and it can create a horrible user experience. For example, a visitor from a small country on a poor dial-up connection could see a load time of several minutes in an extreme case.
More often, you won't see the several minutes load time -- or even hear about it -- because the user on dial-up won't come back. You'll be lucky if they take the time to tell you the problem they had with your site.
These considerations are taken into account by most major search engines. If you look at short-tail, or head, keywords, such as news, cars, real estate, and others, you'll find these generic terms searched for in massive volume and at high frequency.
If we assume any one of these keywords receives 100,000 searches per day with an average click-through rate (CTR) of 75 percent, you would need to return this page in an average of 250 milliseconds to handle the load with optimal speed and efficiency. If you return it in more than 1,000 milliseconds, your site is too slow to deliver a reasonable experience.
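The traffic side of that arithmetic is easy to check. A rough sketch, using the article's illustrative numbers (100,000 daily searches, 75 percent CTR); note that real traffic is bursty, so peak load will sit well above this daily average:

```python
# Rough traffic arithmetic under the article's hypothetical numbers.
searches_per_day = 100_000
ctr = 0.75                     # click-through rate (illustrative)
clicks_per_day = searches_per_day * ctr
seconds_per_day = 24 * 60 * 60

avg_clicks_per_second = clicks_per_day / seconds_per_day
print(f"{clicks_per_day:.0f} clicks/day, about {avg_clicks_per_second:.2f} clicks/sec on average")
```

Even at under one click per second on average, peak-hour bursts and crawler traffic stack on top, which is why the per-request response budget matters.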
Don't assume search engines ignore this form of measurement when calculating the quality score for your pay-per-click (PPC) campaigns. It would be unwise for them to list a slow site in the top listings.
You can see what your average site speed is through the Google Webmaster Tools interface. If you're still skeptical about search engines looking at this data, carefully think about why a search engine would store this data.
It would be illogical to store data you're not going to use in your calculations. It's rare to see a slow Web site ranking for a competitive term, no matter how much link popularity and unique content it has.
Many operations departments find search engine bots to be a nuisance. Instead of making the site fast enough to handle the traffic, they'll install an Apache module such as mod_throttle, which spits back server error messages in the 500 range saying the server is unavailable.
Your users may never see this message, but don't kid yourself. If you give it to a search engine, its bots or spiders will assume it's the same message you're sending your users. They won't manually check every site to be sure it's responding quickly and accurately.
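A quick sanity check is to request your own pages with a bot-like User-Agent and confirm you aren't serving 5xx errors. A minimal sketch using Python's standard library; the URL and User-Agent string below are placeholders, not real crawler credentials:

```python
import urllib.error
import urllib.request

def status_for(url, user_agent):
    """Fetch a URL with the given User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx responses land here instead of raising through

# Example (placeholder URL and User-Agent):
# print(status_for("https://www.example.com/", "ExampleBot/1.0"))
```

If this ever returns 503 during normal hours, a throttling module may be turning crawlers away.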
Here are some other helpful tweaks to keep your site fast and clean.
Consider using a caching service like Akamai, or installing hardware to make sure your pages are stored in memory.
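Even without a CDN or dedicated hardware, caching expensive page fragments in application memory captures some of the same benefit. A minimal sketch using Python's standard library; `render_page` is a hypothetical, expensive page-building function, not part of any real framework:

```python
from functools import lru_cache

@lru_cache(maxsize=1024)
def render_page(path):
    """Hypothetical expensive rendering step; result is kept in memory after the first call."""
    # Imagine database queries and template rendering happening here.
    return f"<html><body>Rendered: {path}</body></html>"

first = render_page("/news")   # computed and cached
second = render_page("/news")  # served from memory
print(render_page.cache_info())
```

Repeat requests for hot pages then skip the expensive work entirely, which is exactly the effect a caching layer buys you at larger scale.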
Decrease the size of the images on your pages. The more images you have -- and the larger they are -- the slower your site will respond.
Many people still don't have broadband connections. Others use comparatively slow WiFi. Still others access the Internet on their mobile phones. These users will likely experience slower site load time.
Another great Google-inspired trick: use more white space in your design to load less in terms of visual effects. Or, if you're green and in favor of consuming less energy, use more black space, as Blackle does.
In your site's code, on the other hand, decrease the white space by eliminating unneeded tabs and spaces between elements. Stripping them shaves bytes off the total size of the page, speeding load time.
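A minimal sketch of the idea (real minifiers are more careful, for example around `<pre>` blocks and inline scripts): collapsing whitespace runs between tags already trims bytes.

```python
import re

def strip_excess_whitespace(html):
    """Collapse runs of tabs/spaces/newlines, then drop whitespace between tags."""
    collapsed = re.sub(r"\s+", " ", html)
    return re.sub(r">\s+<", "><", collapsed).strip()

page = """
<html>
    <body>
        <h1>Hello</h1>
        <p>World</p>
    </body>
</html>
"""
small = strip_excess_whitespace(page)
print(f"{len(page)} bytes -> {len(small)} bytes")
```

The savings compound across every request, and most sites automate this step in their build or deploy pipeline rather than editing by hand.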
There are also companies that specialize in monitoring sites and sending you alerts when the speed changes dramatically. These services are great for making sure you're returning appropriate levels of performance around the world.