A paper presented at the 10th International Conference on Extending Database Technology in Munich near the end of March, Indexing Shared Content in Information Retrieval Systems (pdf), jointly authored by employees of Yahoo, Google, and IBM, discusses how to limit the index sizes of search engines by reducing the amount of duplicate content contained in their indexes.
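To give a rough sense of what detecting shared content can involve (this is a generic illustration, not the method described in the paper), here is a minimal Python sketch that compares documents using word shingles and Jaccard similarity, a common way to spot near-duplicate pages:

```python
# Illustrative only: near-duplicate detection via word shingles
# and Jaccard similarity. The paper's actual approach may differ.

def shingles(text, k=3):
    """Return the set of k-word shingles in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (1.0 = identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

doc1 = "search engines try to limit duplicate content in their indexes"
doc2 = "search engines try to reduce duplicate content in their indexes"
doc3 = "an unrelated page about something else entirely"

sim_near = jaccard(shingles(doc1), shingles(doc2))
sim_far = jaccard(shingles(doc1), shingles(doc3))
print(sim_near > 0.4)   # near-duplicates share most of their shingles
print(sim_far < 0.1)    # unrelated pages share few or none
```

Two pages that differ by only a word or two share most of their shingles, so a search engine can flag them as likely duplicates and index just one copy.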
After reading it, I started considering and listing some of the problems that could cause search engines not to index a site's pages, or not to display them in search results. My list is in a post at SEO by the Sea, titled Duplicate Content Issues and Search Engines.