AltaVista Launches Trusted Feed Program

AltaVista has added a new component to its paid inclusion lineup, with the launch of its Trusted Feed program. The service allows businesses with large web sites to submit 500 or more URLs via an XML (Extensible Markup Language) feed directly to AltaVista's index.

Complementing its Express Inclusion program, which is designed for small to medium web sites, the Trusted Feed program provides an unprecedented degree of control over how web page listings appear in AltaVista. Webmasters have the option of submitting metadata, including custom titles, keywords and descriptions, for each URL. Information submitted in a Trusted Feed replaces or supplements the information ordinarily gathered by AltaVista's crawler.
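The article does not describe AltaVista's actual feed schema, but a per-URL metadata feed of this kind could be assembled along the following lines. This is a minimal sketch only: the element names (`feed`, `page`, `url`, `title`, `keywords`, `description`) and the sample data are illustrative assumptions, not AltaVista's published format.

```python
import xml.etree.ElementTree as ET

# Hypothetical records for each URL in the feed. In a real Trusted Feed,
# a large site would generate hundreds or thousands of these entries.
pages = [
    {
        "url": "http://www.example.com/catalog/item?id=1042",
        "title": "Example Widget - Product Details",
        "keywords": "widget, example, catalog",
        "description": "Specifications and pricing for the Example Widget.",
    },
]

# Build the XML document: one <page> element per URL, with one child
# element per metadata field. Element names here are assumed, not official.
feed = ET.Element("feed")
for page in pages:
    entry = ET.SubElement(feed, "page")
    for field, value in page.items():
        ET.SubElement(entry, field).text = value

# Serialize to UTF-8 bytes, ready to be delivered to the search engine.
xml_bytes = ET.tostring(feed, encoding="utf-8")
print(xml_bytes.decode("utf-8"))
```

The key point the sketch illustrates is that the webmaster, not the crawler, supplies the title, keywords, and description that accompany each URL.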

While the Trusted Feed program allows webmasters to influence key components of each URL, AltaVista insists that the underlying pages will still be subject to the same relevancy algorithms as all other pages in the index. The metadata contained in a Trusted Feed is just one of many factors used to compute relevance, according to Chris Kermoian, director of search services and web marketing services for AltaVista.

"With AltaVista Trusted Feed, we are giving businesses a new way to manage their URLs and content, while increasing the quality of our index and results," said Kermoian. "By analyzing the detailed web page information provided by our customers, AltaVista is offering users a more accurate snapshot of each search result."

Trusted Feed customers are provided with extensive reporting tools, designed to reveal both how much traffic particular pages receive from AltaVista, and where the traffic is coming from. Reports can be generated for overall traffic patterns, top queries, top URLs, "clicks related to this word," "clicks related to this URL," and other types of information.

Reports can be downloaded into Excel spreadsheets for further analysis. Since Trusted Feed sites are refreshed once a week, the reports will provide valuable feedback to webmasters, allowing them to check position and tweak pages to achieve higher traffic.

The program will particularly benefit sites that are traditionally difficult to crawl, such as those using frames or dynamically generated content. The program can even be used to submit URLs to AltaVista from sites that block search engine crawlers with the robots.txt protocol.
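As a point of reference, a site that shuts out all crawlers might carry a robots.txt file like the one below; its pages would ordinarily never appear in a search index, yet under this program they could still be submitted via a Trusted Feed. The file shown is a generic illustration, not taken from any particular site.

```text
# robots.txt -- blocks every compliant crawler from the entire site
User-agent: *
Disallow: /
```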

By indexing content from these types of sites, the Trusted Feed program represents a major step toward revealing significant parts of the Invisible Web.

Pricing for the Trusted Feed program is on a cost-per-click model. "The CPC component of the pricing for the Trusted Feed product varies from $0.15 to $0.60 depending on the category of content," according to AltaVista spokesperson Kristi Kaspar. "There is a minimum monthly spending for CPC prices in the lower end of the range."

Critics will note that the Trusted Feed program has many similarities with the practice of cloaking, or submitting customized pages to search engines that have nothing in common with the pages actually viewed by users. But AltaVista's Kermoian disputes the comparison, noting that a number of safeguards have been established to prevent abuse of the program.

For pages that are already indexed, AltaVista will compare the Trusted Feed metadata with the pages themselves. It will also conduct periodic spot-checks of pages, comparing them with Trusted Feed metadata. In both cases, if the underlying pages don't match the Trusted Feed metadata, a spam penalty will be applied.

Contractually, AltaVista can also reject any page if a Trusted Feed customer doesn't comply with AltaVista's policies for site submission, which are the same for both paying and non-paying webmasters.

On balance, AltaVista's Trusted Feed program looks like a major step toward easing what is often perceived as an arms race between search engines and webmasters.

AltaVista Trusted Feed Program

What Search Engines See Isn't Always What You Get
Cloaking is a technique used by some webmasters to deliver one page to a search engine for indexing, while serving an entirely different page to everyone else -- in short, the classic bait and switch technique applied to the web.


About the author

Chris Sherman is a frequent contributor to several information industry journals. He's written several books, including The McGraw-Hill CD ROM Handbook and The Invisible Web: Uncovering Information Sources Search Engines Can't See, co-authored with Gary Price. Chris has written about search and search engines since 1994, when he developed online searching tutorials for several clients. From 1998 to 2001, he was About.com's Web Search Guide.