While surfing the web, we’ve all encountered pages that take a long time to load. Sometimes network congestion causes slow load times; at other times, the slowness is caused by bloat.
If you have bloated pages on your site, there are several good reasons not to ignore them.
Users Abhor Bloat
Bloated pages can ruin the user experience. Site visitors want to accomplish their task and move on as quickly as they can. Research suggests that Web surfers may form an evaluation of a page in less than one second.
The Web version of the 8-second rule held that your page had better load in eight seconds or less, or the user will bail on you.
Today, the 8-second rule is obsolete:
- The new rule of thumb is that pages must load in five seconds or less.
- Not everyone has a high-speed Internet connection. Plenty of people still surf over dial-up modems.
In addition to caring about how users perceive your site, think of how bloat can affect search engine ranking and linking. The user you lose may have been a potential linker.
Bloat Raises Ranking Issues
I’ve talked to a number of Google engineers about problems caused by bloated pages. Googlers working on the Web spam team or search engine crawling and indexing have also addressed the issue at public events.
Both times, the answer was that cleaning up such pages and moving the unique text toward the top of the page would indeed result in better crawling, indexing, and ranking. One of the most notable proponents of this strategy was Dan Crow, of Google’s crawling and indexing team. Jonathan Hochman saw Dan speak at the SEMNE event in July 2007. Here is an excerpt from the article Jonathan wrote for Search Engine Watch on this topic:
I had a similar experience speaking with Shiva Shivakumar, Google distinguished entrepreneur and director of the Seattle-Kirkland R&D center. He told me in direct terms that streamlining code can have a direct impact on rankings.
When I interviewed Google search evangelist Adam Lasnik, he indicated there were numerous reasons for cleaning up code, but improving SERP rankings in Google would not be one of them. At best, the view from Google seems somewhat contradictory.
Yahoo! announced the robots-nocontent tag back in May 2007. The robots-nocontent marker tells the Yahoo! crawler which portions of the page are not part of the unique content, so the crawler can disregard them. The Yahoo! tag alone makes cleaning up your code and page size worthwhile: if there were no potential ranking issues, it’s doubtful the new tag would have been created.
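Yahoo!’s published syntax applies the marker as a class attribute on an existing element rather than as a standalone tag. A minimal sketch of how a page might flag its boilerplate (the surrounding markup and copy are illustrative, not from any real site):

```html
<!-- Main content: left unmarked, eligible for indexing and ranking -->
<div>
  <h1>Product Review: Acme Widget</h1>
  <p>Unique editorial copy goes here...</p>
</div>

<!-- Boilerplate carries Yahoo!'s robots-nocontent class, asking the
     Yahoo! crawler to disregard this block as page content -->
<div class="robots-nocontent">
  <p>Repeated navigation, copyright notice, legal disclaimers...</p>
</div>
```

The class can be applied to any block-level container, so templated regions such as headers, sidebars, and footers can each be flagged without restructuring the page.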
Bloat Weighs Down Web Analytics
Clean, Reduce, Resize
There are plenty of reasons to clean up your code and reduce the size of your pages: better user experience, improved search engine interactions, and more accurate analytics. What’s the right size for a page? A 100 KB file can take 14 seconds to download over a dial-up modem. The same page may take about 2 seconds to load via cable modem, provided the Internet doesn’t introduce other delays into the load time.
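The arithmetic behind those figures is simple bandwidth division. A quick sketch, assuming a 56 kbps dial-up modem and roughly 400 kbps of effective cable-modem throughput (both round-number assumptions that ignore latency, protocol overhead, and rendering time):

```python
def download_seconds(size_kb: float, speed_kbps: float) -> float:
    """Estimate transfer time for a page of size_kb kilobytes
    over a link running at speed_kbps kilobits per second."""
    return size_kb * 8 / speed_kbps  # 8 bits per byte

# Assumed link speeds in kbps; real-world throughput varies widely.
DIAL_UP = 56        # nominal 56k modem
EARLY_CABLE = 400   # rough effective cable-modem throughput

print(f"{download_seconds(100, DIAL_UP):.1f} s on dial-up")  # ~14.3 s
print(f"{download_seconds(100, EARLY_CABLE):.1f} s on cable")  # ~2.0 s
```

The same function also shows the payoff of trimming bloat: halving the page to 50 KB cuts the dial-up wait to about 7 seconds.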