There is a good post by Adam Lasnik on the processing of spam complaints. Adam also talks a bit about duplicate content in his post.
Regarding spam complaints, Adam makes it clear that Google does not bury sites the way Digg buries articles, i.e., through a voting scheme where you get "voted off the island". For the most part, these complaints are used to help Google design improved algorithms that identify spammy behavior its current algorithms do not detect.
Adam's comments on duplicate content illustrate another basic search engine concept. Search engines try to respond to a search query in a manner that gives the user the best chance of finding an answer to their real question on the first page. Since user search queries almost always carry some level of ambiguity, the search engine tries to supply a sampling of varied results.
One thing they don't want to do is supply multiple copies of the same content. And this is the essence of the duplicate content filter. Find the most authoritative version of an article (hopefully, the original), present that result, and ignore the rest (or in Google's case, ship them off to the Supplemental results).
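To make the idea concrete, here is a minimal sketch of that kind of duplicate content filter. It is not Google's actual method, just an illustration under simplifying assumptions: pages with identical normalized text are treated as duplicates (real engines use near-duplicate detection), and each page carries a hypothetical `authority` score used to pick the version to show.

```python
from hashlib import sha256

def normalize(text):
    """Crude normalization: lowercase and collapse whitespace."""
    return " ".join(text.lower().split())

def dedupe(pages):
    """Group pages whose normalized content is identical, then keep only
    the most authoritative copy of each group; the rest are filtered out
    (analogous to being shipped off to the Supplemental results).

    `pages` is a list of dicts with 'url', 'content', and 'authority'
    keys -- all hypothetical field names for this sketch.
    """
    groups = {}
    for page in pages:
        fingerprint = sha256(normalize(page["content"]).encode()).hexdigest()
        groups.setdefault(fingerprint, []).append(page)

    primary, supplemental = [], []
    for copies in groups.values():
        copies.sort(key=lambda p: p["authority"], reverse=True)
        primary.append(copies[0])        # most authoritative version shown
        supplemental.extend(copies[1:])  # duplicates filtered from main results
    return primary, supplemental
```

In practice, engines can't rely on exact hashes, since duplicates usually differ in boilerplate, ads, or minor edits; techniques like shingling or simhash are used to catch near-duplicates instead.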
This whole process is detailed nicely in a post by Rand Fishkin titled The Illustrated Guide to Duplicate Content in the Search Engines. Rand uses some helpful images to capture the concepts visually.