As Danny mentions, it's good to see the total size war go away, at least for the time being. Danny also points out this page from Google that lays out their thoughts on comprehensiveness. I have a couple of quick comments, including wondering whether the result counts that every search engine shows should now go away.
From the page:
The basic test for search engine comprehensiveness is whether you can find uncommon information. Popular queries return millions of results, but even the most obsessive searcher isn't about to surf a few million pages, or even a tiny fraction of them; in most of these cases, you'll either quickly find what you're looking for or refine your search to be more focused.
Perhaps it's time to take a look at the usefulness (aside from their marketing value, which is likely the reason they don't point out this fact) of the page estimates that Google and others provide at the top of results pages.
Just how accurate are they? And what are they telling the typical searcher? It would be useful if all search companies (not only Google) would let the public (including many journalists) know that these numbers are just estimates and often far from accurate.
Yes, some people will refine their searches (if they know how; do they?). However, don't forget that even if you wanted to view all of the results, you couldn't. Most web engines will only show the first 1,000 results.
Are the estimates on web results pages going to be the next battleground? I wonder how many people even noticed the total that Google used to list on its home page, versus the estimates they see each and every time they run a search.
More from the Google page:
To see for yourself, try searching for something very specific, or try a query that previously returned very few results. For example, you could enter your name or hometown, along with your favorite color or animal. Navigate to the last page to see how many results the search engine really delivered. (On the last page, you may have to click the "repeat the search with the omitted results included" link to see all the results.) Do this on different search engines for several queries and see what you come up with. As you can imagine, we've run quite a few tests like this, and we expect your results will be very similar to ours.
Sure, you'll likely find a result for this type of query, but the real question is how useful the info is to the searcher. Is it a page simply scraping or reposting (possibly without permission) content from another page that's already in the index? Are random words (note the Google-suggested search above) simply appearing on a word list? Is it one of the thousands of versions (technically different pages) of the Open Directory Project appearing in the index? How about nearly identical pages for a book appearing at Amazon.com and many affiliates?
These pages will show up on results pages and be included in the total count but, in many cases, the material could prove to be of little value to most searchers.
Don't get me wrong: comprehensiveness can be a VERY good thing. However, larger indices can also be a challenge, especially for the unsophisticated searcher. That's why verticals and specialized search tools that focus on a specific type of material can be very valuable.
As I said yesterday, Google and all of the major engines would be doing all searchers a favor by using their notoriety to teach people, even in a small way, to use ALL the tools they offer to build better queries that offer more precise results.