Earlier this week, ZDNet News published an article discussing the presence and availability of explicit content on video search sites like YouTube, Yahoo Video, and Google Video. "A weeklong review of some of the top user-generated video sites by CNET News.com unearthed scenes of beheadings, masturbation, bloody car accidents, bondage and sadomasochism," wrote the reporter, Greg Sandoval. He did note that the review found no child pornography.
There are a number of issues that the article directly and indirectly raises. (I spoke to Sandoval during his interview process.) Perhaps the primary issue for marketers and the video sites that want their ad dollars is a practical one. There has been considerable press and discussion about the reluctance of mainstream brands to associate themselves with user-generated video content that they can't control. And there have been celebrated cases, for example on MTV-owned iFilm, where "run of site" video ads for mainstream brands have appeared as pre-roll in front of adult content.
To attract more advertising and address this criticism and the hesitation from marketers, MySpace, for example, has recently created "safe content areas" (safe for marketers, that is) where no questionable content appears. Companies are champing at the bit to reach the massive MySpace audience but do not want their brands associated or juxtaposed with violent, pornographic or otherwise questionable content.
As of today, Google is testing advertising on premium video content but doesn't offer it where user-generated content is involved (to address this same issue).
It's not completely fair to lump all sites together. Not all video search sites have the same range and types of content and, again to be fair, on those video sites where adult content is available, it's typically behind a warning or "safe search" filter. But those filters can be easily changed. And, somewhat shockingly, violent content (e.g., beheadings) is not similarly gated.
Video sites need to determine whether and how to treat explicit or "over 18" content in terms of the advertiser proposition. But beyond this, there are practical "enforcement" issues as well. If you have many thousands of videos coming into your site daily, as YouTube does, there's real time and cost involved in screening all of them before they're posted. One approach would be to monitor the tags and flag those streams that indicate questionable content for later human editorial review.
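To make the tag-monitoring idea concrete, here is a minimal sketch of how such a triage step might work. The tag watchlist, the video records, and the queue names are all illustrative assumptions, not any site's actual system; a real pipeline would also handle misspelled or deliberately evasive tags, which simple matching misses.

```python
# Hypothetical sketch: incoming videos whose tags match a watchlist are
# routed to a human-review queue instead of being published immediately.
# FLAGGED_TAGS and the sample records below are invented for illustration.

FLAGGED_TAGS = {"explicit", "violence", "nsfw", "gore"}

def triage(video):
    """Return 'review' if any tag matches the watchlist, else 'publish'."""
    tags = {t.lower() for t in video.get("tags", [])}
    return "review" if tags & FLAGGED_TAGS else "publish"

incoming = [
    {"id": "a1", "tags": ["music", "live"]},
    {"id": "b2", "tags": ["NSFW", "prank"]},
]

queues = {"publish": [], "review": []}
for v in incoming:
    queues[triage(v)].append(v["id"])

print(queues)  # {'publish': ['a1'], 'review': ['b2']}
```

The appeal of this approach is that only flagged uploads consume editor time; the obvious weakness, as the column notes, is that it depends on uploaders tagging honestly.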
The simple approach, of course, would be to simply ban all "non-family friendly" content and thus create a video site that was safe for advertisers and kids. But then there's that little thing called the First Amendment of the U.S. Constitution.
Not all pornography is illegal; child pornography is. Yet pornography is offensive to many people. However, the discussion of what constitutes "pornography" takes us down a complicated and winding path that invariably invokes U.S. Supreme Court Justice Potter Stewart's famous 1964 quote in Jacobellis v. Ohio about the difficulty of defining pornography in the abstract: "I know it when I see it."
As a parent, I'm not eager for my two young daughters to discover explicit content online when, years from now and out of simple curiosity, they start entering sexually oriented words in a search box or video site. (I did the quaint equivalent as a kid in middle school, looking up "sex words" in the dictionary.) But as a former lawyer with sensitivity to the complexity of questions of censorship and free expression, I recognize that there's a practical and philosophical quagmire for Google, Yahoo and others around whether to show adult and other non-mainstream content on video sites. It's somewhat analogous to the question of whether to go into China and participate in the censorship of websites.
If you start "banning videos," what do you allow and what do you omit? Do you allow violence but not sex? I would just as much like to protect my daughters from beheading scenes. Do you allow sexual content but not extreme violence? Beheading videos from Al Qaeda are arguably "news content." And if you permit nudity, where is the line?
Monitoring and making judgments about the content of videos is not unlike the challenge of monitoring trademark infringement within paid search advertising: difficult, time consuming and inherently flawed.
I'm not suggesting there is no line and no limits but Google, YouTube, AOL, Yahoo and others need to find that line carefully, balancing the competing interests (legal, philosophical, financial) that weigh on this cluster of issues. I certainly don't have the answer and right now, apparently, neither do they.