About The Update
The Search Engine Update is a twice-monthly update of search engine news. It is available only to Search Engine Watch members. Please note that long URLs may break into two lines in some mail readers. Cut and paste, should this occur.
In This Issue
+ Search Engine Strategies Is Boston-Bound!
+ Vote In The Search Engine Watch Awards!
+ Yahoo Now Charging Annual Listing Fee
-- (full story online, link provided)
+ Google Gets Bigger, Fresher, Offers Better News
+ Google Launches Catalog Search
-- (full story online, link provided)
+ Gapster Security Hole Fixed
+ Excite To Be New Meta Search Site
+ Espotting To Announce Major Deal This Week
+ SearchDay Articles
+ Search Engine Articles
+ List Info (Subscribing/Unsubscribing)
Happy New Year, Everyone!
I hope all of you had a nice holiday break and that the New Year brings you great things!
Search Engine Strategies Is Boston-Bound!
The date is set! On March 4 & 5, Search Engine Strategies will arrive in Boston for two days' worth of sessions packed with information about search engine marketing. The program is now online and brings back many of the panels that proved popular from our last US show in Dallas, as well as some new ones.
The Search Engine Strategies conference is suitable for both those new to search engine marketing and those who are more advanced. Multiple "tracks" ensure there's always a session of interest to everyone. Both experts in search engine marketing and speakers from major search engines such as AltaVista, Google, FAST and the Open Directory will be presenting.
Those interested in sponsoring or exhibiting at the event should contact Frank Fazio Jr, [email protected], for more information. Those interested in attending can find an overview of tentative sessions and sign-up information via the URL below. If you sign up before Feb. 13, you'll save on the admission price.
Search Engine Strategies Boston 2002
Vote In The Search Engine Watch Awards!
It's official. Votes are now being accepted for the 2001 Search Engine Watch Awards. There are a variety of categories, ranging from "Outstanding Search Service" to "Best Search Feature." You can vote for any categories of interest to you or in all of them.
The form will be online for the next week. Then Search Engine Watch associate editor Chris Sherman and I will review what readers had to say and make our final decisions about which search engines deserve recognition for their work during 2001.
Search Engine Watch Awards Voting Form
Yahoo Now Charging Annual Listing Fee
Yahoo is now requiring that new sites seeking to be listed in its commercial areas pay an annual listing fee of $299, or $600 if they are adult sites. Previously, the fee had been a one-time charge. The change transforms Yahoo, to some degree, from a web guide into an online yellow pages. A look at how the new program works, as well as the potential impact on searchers and site owners. The full story can be found via the URL below:
Yahoo Now Charging Annual Listing Fee
The Search Engine Update, Jan. 7, 2002
Google Gets Bigger, Fresher, Offers Better News
In December, Google became the first crawler-based search engine to break the 1.5 billion web page mark. In addition, the service rolled out changes designed to improve the freshness of its results and the ability for users to find news.
The Google index now contains more than 1.5 billion web pages that have actually been visited by Google, as well as an additional half-billion pages that it knows about through links. There are also another 330 million image files and 700 million Usenet posts, which stretch back to 1981.
The enlargement of Google's Usenet information makes it a fantastic resource for researching the early days of the Internet, and Search Engine Watch's associate editor Chris Sherman takes a closer look at the enhanced Google Groups, in his story below.
Sherman's story also provides more details about Google's improved news search results. Since the middle of 2000, Google has provided links to news stories at the top of its results page, in response to certain queries. The news content was pulled from major wire services.
The latest changes now pull that content from hundreds of web sites that Google says it has identified as having news content. Google did not specify exactly how news sites are identified, nor is there any specialized news submission option. The company simply said that it has an automated process that it believes will find sites with good news content.
"If it looks even remotely like a news site, then it should be part of it," said Urs Hölzle, Google Fellow and member of Google's executive management team.
Google says that news links are now three times more likely to appear in results than in the past. When they do, news content shows up at the top of the standard Google results page, with the word "News" to the left of the links. Try a search for "euro" or "argentina," and you'll see examples of news links.
Users are also apparently pleased to get this news content, because the clickthrough rate on news links is five times better than before, Google said.
Unfortunately, the changes still leave Google weak in the news search arena. Competitors AltaVista and FAST offer dedicated news search services. There are also a variety of good newer news search sites such as Daypop and RocketNews, in addition to established ones such as Moreover. In any of these places, users who specifically want news content can be guaranteed to find it.
In contrast, there's no way to specifically perform a news-only search at Google, in the way you can perform an image search or a newsgroup search. Instead, you have to hope that the Google search algorithm manages to float news search results up in response to your query. To stay competitive, given the huge interest in news search, Google needs to finally make a dedicated news search option available.
Google did also roll out a "Headline News" service in December, but that's not the same thing. This service aggregates top headlines from more than 100 leading English-language newspapers into a single page, grouping them into six categories: World, US, Business, Entertainment, Technology and Sports.
Google is promising future changes, such as more news sources and interface enhancements. Hopefully, one of those enhancements will be the ability to do keyword searching against the Google news search index used to feed its main results page.
Google is also trying to improve the freshness of its web page index. Previously, Google updated its web page index on a roughly monthly basis. This meant that pages could be around a month old, if you used Google just before the latest refresh happened.
The monthly refresh is still continuing, but a new daily refresh now also runs. A few million pages identified as being time-sensitive are being spidered regularly, so that the latest information from them is available.
Google is even highlighting when a page has been refreshed recently through a new "Fresh!" tag that appears next to the page's URL. The tag shows the exact time the page was respidered.
For instance, search for "white house," and you'll see that the US White House site is noted as "Fresh!," having last been visited on January 6.
The Fresh notations are welcome, but even better would be if Google showed dates for all the pages it lists, in the way AltaVista used to. Then it would be extremely easy to know exactly when a page was last visited by the Google spider.
By the way, that long-standing page date option was available at AltaVista until recently. It now appears to have been pulled, probably because it made it so easy to understand how fresh -- or stale -- AltaVista's index was.
Google didn't explain exactly how pages are selected for regular visits, but there are some common factors that can help. A good PageRank is one. Pages deemed more important in Google's link analysis system have an improved chance of being visited more often. However, it's also important that the page is "relevant" for regular updating, Google says.
What's that mean? Google wouldn't go into more detail at the moment, but it's something I'm looking to revisit with them. I think it's fair to assume it means the page has shown some tendency to change frequently. However, "Just changing your page every day isn't going to help," Hölzle said.
Google also said that the daily refresh of web pages operates completely separately from the news search service. In other words, just because your URL shows up with a "Fresh" notation doesn't mean you've been selected as part of Google news search. However, it does mean that for the near future, that particular page will be revisited on a regular basis.
Overall, I wouldn't get too worried about trying to get your content "Fresh" noted. If your content changes often, then with luck, it will happen automatically. If it changes often and you seem to be overlooked, then use the general Google feedback form to alert them to this. And if your content doesn't change frequently, then don't waste time or effort trying to make it seem as if it does. The Fresh moniker does not come with any type of ranking boost.
Given all these crawling changes, it's also a good time to revisit the basic architecture of how Google serves its web pages. The company says it currently runs four data centers, two on the West Coast of the United States and two on the East Coast.
Each data center consists of around 2,000 to 4,000 separate computers, which store the Google web page index. They are roughly mirror images of each other, but there may be small differences. For example, changes to a master data center will propagate to the others, but there can be a short delay until this happens.
For the most part, users shouldn't notice any differences. When you do a query, Google tries to automatically route you to the closest data center. For example, European users would tend to get results from one of the East Coast data centers. Google also tries to ensure that you stay with a particular data center during a search session, so that if you repeat a search, you aren't surprised by minor differences.
Things are a bit different with Google Groups and Google Images searches. Because there is less demand on these indexes, they are not mirrored at every data center, Google says.
Overall, about 10,000 computers are involved in maintaining all of Google's search indexes. A fifth data center is currently being built on the East Coast, so expect that number to rise further.
Also, it's popular on the major search engine forums to talk about changes that happen on the Google "numbered" servers: "www2.google.com," "www3.google.com" and "www4.google.com." Google says these are "production" servers, not live servers that are automatically tied to a particular data center.
Google engineers use the production servers to test different things, so some people treat them as a guide to what may happen on Google in the future. Then again, what appears on them may not match what goes live on Google at all. About the only thing Google said they were useful for, from a webmaster perspective, is knowing whether a previously unindexed site has been visited.
In other words, if you aren't in the live Google results but see your site listed on one of the production servers, you'll probably appear in the live results in the near future. The caveat here -- and it is important -- is that while you may be listed in the live results, there's no guarantee that you'll rank for a particular term just as you did on one of the production servers.
Another popular topic on the forums at the moment is that some web site owners have found that their PageRanks have dropped. One reason behind the drops was a glitch in the Google Toolbar, which site owners can use to determine PageRank, Google said.
"As it turns out, while upgrading a server, we made a slight change that affected the toolbar. Once we discovered the issue, a fix was implemented," said Google spokesperson Nate Tyler.
I had asked about any potential problems last Thursday, after noticing that my toolbar was giving sites such as Yahoo and even Google itself no PageRank. Sure enough, the next day when the response came, I found that both sites were again getting top scores of 10 out of 10.
Finally, Google looks to be readying plans to syndicate its AdWords paid listings to other sites, or perhaps to include them in the search results it provides to partners and others. The FAQ page about AdWords recently got this new addition:
"AdWords advertisers already know the power of keyword targeted text ads. Google has made arrangements to extend the reach of these ads to a wider range of people conducting searches on Google's partner sites. Google's new syndication program will put your ad on several popular sites, which means more new users see your ad -- increasing your reach to a wider audience of potential customers across the Web. The Google syndication program launches in the next 1-2 months."
Google said it couldn't talk about the plans yet, but as soon as they can, I'll bring you more details.
Google Launches New Salvo in Search Engine Size Wars
SearchDay, Dec. 11, 2001
More details on Google getting bigger, enhancing its Google Groups area and making freshness changes.
Google Headline News
News Search Engines
A freshly-updated guide to major news search resources.
Google Production Server Search
Lets you conduct the same search against all of Google's production servers, at the same time.
The Google Toolbar makes it easy to see a page's precise web rank.
Google AdWords: About Syndication
More about Google's plans to syndicate its ads and how to opt-out, if you don't want this to happen automatically, when syndication begins.
Newly updated page that's an entertaining read of Google's Cinderella story.
Google Launches Catalog Search
In December, Google rolled out a completely unexpected offering: Google Catalogs. The new service allows you to search through the contents of catalogs from over 600 companies. A look at how the index was created, tips on using it, and how one of the ways Google hopes to make money from it might mix editorial content and cash at Google for the first time. That's not necessarily bad for the user, but it would be a new direction for Google. The full story can be found via the URL below:
Google Launches Catalog Search
The Search Engine Update, Jan. 7, 2002
Gapster Security Hole Fixed
The Gapster bid management tool has closed a security hole that left account information unprotected on the web for what Gapster-owner Did-It.com says was a "brief" period.
Kevin Lee, CEO of Did-It.com, says that only two people are known to have accessed the information -- myself and the reader who reported the hole to me, as indicated by server log data.
Gapster is a software-based tool that allows users to manage bids on Overture, Overture UK, FindWhat and Kanoodle. It can log in to your account and make changes, automatically.
Though the tool is software-based, Lee said that Gapster stores account information on Did-It's computers, which make the bid changes. This is done so that the software doesn't need to be changed if a paid placement search engine alters how its account management system works.
For example, Overture made changes to its DirecTraffic Center in December that required Gapster to alter how it interacted with the DTC. Because Gapster is managed through a central server, Did-It didn't need to ship software patches, Lee said.
The downside, of course, is that such a system means that your information is being routed through a third party, rather than directly to the paid placement search engine. If that's a concern, you should check with the makers of any bid management software, to find out where and how your data is stored.
Lee said it was a programming error that left a file with account data on a publicly-accessible web server. The error was corrected in mid-December, and the file moved to a protected location. At the time there were only a "handful" of Gapster beta users, he added.
"Programmers are human, and ours made a mistake. We are pleased that the only two IP address blocks that accessed the data were yours and the originator of the email to you," Lee said.
No credit card data was in the file, but it was possible to discover log-in details. If you were a Gapster user before Dec. 18 and this concerns you, you may wish to change your password with the paid placement search engines that you use.
I found a free Bid Management Software
SearchEngineForums, Dec. 13, 2001
The security hole was also raised in a public forum, though after the hole was blocked.
Excite To Be New Meta Search Site
Last newsletter, I wrote about how Excite had shifted over to using Overture-dominated listings. InfoSpace, the new owners of Excite, then got in touch to say this will change, in the future. The company says it plans to create a new meta search service for Excite that will be different from its existing Dogpile and MetaCrawler meta search engines.
"It will be a meta search solution, which is currently being worked on and designed for the savvy Excite search user," said InfoSpace spokesperson Steve Stratz. "Overture is assisting in providing a short-term solution, while the long-term meta search solution, [in] which Overture will be included, is being worked on and designed."
Plans are for the new meta search service to go live sometime in the next three months, Stratz said.
I also wrote in the last newsletter that Excite UK was to close. Now it has. Attempts to reach Excite.co.uk now lead to former rival Lycos UK.
Excite & WebCrawler Go To Overture
The Search Engine Update, Dec. 18, 2001
Previous article on changes to Excite.
Espotting To Announce Major Deal This Week
Espotting.com, the UK-based paid listings service, will greatly expand its European reach through a new deal to be announced this week. The news comes following partnerships made last month to place Espotting results on AltaVista France and Lycos Europe.
I can't name the exact European portal that's involved, because the news embargo hasn't been lifted. However, it's an important one that will add to the company's already impressive distribution list. Existing Espotting partners include Lycos UK, Ask Jeeves UK, LookSmart UK and UK Plus.
I'll bring more news in my next newsletter, or watch the SearchDay newsletter or the Espotting.com site itself.
Here are some recent articles that may be of interest, from Search Engine Watch's daily SearchDay newsletter:
2001's Most Wanted Search Terms
SearchDay, Jan. 2, 2002
What were the most popular search terms of the past year? It depends on which search engine you ask.
SearchDay, Dec. 27, 2001
One of the earliest web subject directories was developed by the people who created the web itself -- and somewhat remarkably, it's still online.
Internet History Archives
SearchDay, Dec. 26, 2001
The Internet's rich history is easily accessible via these outstanding searchable archives of original source documents.
MSN Search Testing Paid Listings with a Twist
SearchDay, Dec. 20, 2001
MSN Search is now including paid listings from Overture in some search results, but with a "twist" that makes them more useful than those served by other major search engines and portals.
How Search Engines Use Link Analysis
SearchDay, Dec. 19, 2001
Link analysis is the secret sauce used by Google and other search engines to determine relevance -- and webmasters ignore it at their peril.
On the archive page below, you'll find more articles like those above, plus the ability to sign up for the free newsletter.
Search Engine Articles
Marketers Use Invisible Words on the Web
New York Times, Dec. 24, 2001
Just when you think search engine marketing has become more mainstream, it only takes an article like this to help you realize misperceptions remain. [Begin Dr. Evil voice] Turns out there are these unique things called "meta tags" that can help you get better rankings with search engines. This article looks at how companies are using these "meta tags" to gain better rankings. But it warns: "The use of meta tags may become less important as some search engines shift away from keyword placement." [End Dr. Evil voice] The reality is that only two of the major search engines, AltaVista and Inktomi, give any significant weight to meta tags. Sure, you should use them, but don't expect too much from them. And that predicted "shift" happened several years ago -- something to do with analyzing links, I hear.
SEO and the Web Site Design Process
ClickZ, Dec. 19, 2001
Tips on making your site search engine friendly during the design process.
In Defense of Search
Semantic Studios, Dec. 7, 2001
Getting Them to What They Want
User Interface Engineering, Oct. 2001
Well-known usability expert Jared Spool could be described as anti-search. Reports from his company consistently warn that web site designers shouldn't force users to search for what they want. Instead, he tends to favor clearly-labeled navigational structures. I'd agree with much of what he says, but search still has a role to play. It's a feature users do expect, and it can be a useful one to offer, if done correctly. Usability analyst Peter Morville has more thoughts on why search shouldn't be ignored in his "In Defense of Search." The second URL is a recent report from Spool's company that discusses the importance of good navigation, rather than running a search-centric site. It costs $25, but that's a small price for plenty of good content and advice.
On searching the Usenet
Pandia, Nov. 11, 2001
Review of Gripe, a Usenet search service and online reader.
How do I unsubscribe?
+ Follow the instructions at the very end of this email.
How do I subscribe?
+ The Search Engine Update is only available to paid members of the Search Engine Watch web site. If you are not a member and somehow are receiving a copy of the newsletter, learn how to become a member at: http://searchenginewatch.com/about/subscribe.html
How do I see past issues?
+ Follow the links at:
Is there an HTML version?
+ Yes, but not via email. View it online at:
How do I change my address?
+ Send a message to [email protected]
I need human help with my membership!
+ Send a message to [email protected]. DO NOT send messages regarding list management or membership issues to Danny Sullivan. He does not deal with these directly.
I have feedback about an article!
+ I'd love to hear it. Use the form at