Many people are discussing a recent patent Google was awarded for picking up ambient audio from your TV through your computer and serving up ads based on what you are watching (or something along those lines). Google research scientists Michele Covell and Shumeet Baluja described the technology as follows:
We showed how to sample the ambient sound emitted from a TV and automatically determine what is being watched from a small signature of the sound -- all with complete privacy and minuscule effort. The system could keep up with users while they channel surf, presenting them with a real-time forum about a live political debate one minute and an ad-hoc chat room for a sporting event in the next. And, all of this would be done without users ever having to type or to even know the name of the program or channel being viewed. Taking this further, we could collect snippets from the web describing the actors appearing in a movie or present maps of locales within the movie as it takes place (no matter if users are watching it as a live broadcast or as a recorded broadcast).
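The sampling-and-signature idea in that quote can be illustrated with a toy sketch. This is not Google's actual method (their paper uses robust hashing of spectrogram features); here each window of audio is reduced to its dominant frequency bin, and an ambient capture is matched against a small database of known broadcasts by counting agreeing windows. The channel names, signal shapes, and noise level are all invented for illustration.

```python
import cmath
import math
import random

def dominant_bin(window):
    """Return the index of the strongest DFT bin (a crude spectral peak)."""
    n = len(window)
    best_k, best_mag = 0, -1.0
    for k in range(1, n // 2):  # skip the DC component
        s = sum(window[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k

def signature(samples, window=64):
    """One dominant bin per window -- a tiny stand-in for an audio fingerprint."""
    return [dominant_bin(samples[i:i + window])
            for i in range(0, len(samples) - window + 1, window)]

def score(sig_a, sig_b):
    """Fraction of windows whose dominant bin agrees."""
    matches = sum(a == b for a, b in zip(sig_a, sig_b))
    return matches / min(len(sig_a), len(sig_b))

def tone(freq_bin, length=512, window=64):
    """Synthetic 'broadcast audio': a pure tone at a given spectral bin."""
    return [math.sin(2 * math.pi * freq_bin * t / window)
            for t in range(length)]

# Hypothetical database of known-broadcast signatures.
db = {"channel_9": signature(tone(5)), "channel_4": signature(tone(11))}

# Ambient capture: channel 9's audio plus room noise.
random.seed(0)
ambient = [s + random.gauss(0, 0.3) for s in tone(5)]
amb_sig = signature(ambient)
best = max(db, key=lambda name: score(db[name], amb_sig))
```

Even with added noise, the dominant-bin signature of the ambient capture lines up with the channel 9 entry, so `best` identifies the right broadcast. A real deployment would use far more robust features, but the shape of the system -- sample, fingerprint, look up -- is the same.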
There are two additional articles I am aware of with good coverage of this. The first is at Small Biz Pipeline and the second is at TechCrunch. I particularly like how TechCrunch pulled out the four main points of the paper, as follows:
+ Personalized information layers: Here's what Tom Cruise is wearing in the show you are watching, and here's where you can buy the same clothes in your zip code.
+ Ad hoc social peer communities: If you would like to chat about this show, ten of your college friends are watching it right now as well.
+ Real-time popularity ratings: Nielsen requires hardware, and the results aren't available in real time. You might want to know if there is a spike in viewers watching the show on channel 9 right now. Advertisers might want to know that too.
+ TV-based bookmarks: Click to save a show or clip into your video library, and there will be more than just a few shows available for watching later.