Every major update or algorithm change from Google results in an uproar. That's understandable, because it's all about money: if a change to Google's algorithm cuts your website's traffic, and your revenue, in half, you're going to get upset. That's usually how it starts, with the people who are negatively affected beginning to complain.
The recent “Mayday update” is no exception. The New York Times jumped into the fray, quoting Gary Reback, the lawyer of Microsoft antitrust fame, who said Google is the “arbiter of every single thing on the Web, and it favors its properties over everyone else’s. What it wants to do is control Internet traffic. Anything that undermines its ability to do that is threatening.”
Will the Federal Government Regulate Google’s Search Algorithm?
This question was recently raised. Here’s the rationale people use:
- Google is secretive of its algorithm, so web publishers don’t know “what the rules are.”
- Google changes its algorithm without providing advance warning.
- Many businesses suffer greatly with each algorithm update.
- Search is a huge component in today’s economy, and Google is dominating the space, so its impact is unduly large.
- Therefore, its algorithm should be public knowledge.
There is some merit to these concerns, but when it leads to discussions of regulating the algorithm, or making it transparent, I kind of lose it.
Making the algorithm transparent would make the quality of the search experience on the web go to hell within a few short weeks.
Because there's so much money to be made in search, imagine the field day spammers would have. They would study the details of the algorithm more closely than anyone else, then manipulate them to force their sites to the top of the rankings. Legitimate businesses wouldn't have a chance of showing up for relevant searches, even if they were actually the best result, because spammers would dominate everything.
This spamming phenomenon was the core reason for the fall of AltaVista and the rise of Google in the first place. To put it in a single phrase: the entire search ecosphere would rapidly unravel.
Protecting Search Quality
This is why a lack of transparency (regarding the algorithm, not other matters) is critical to the current web. The basis of the algorithmic approach to search is that it makes it possible to catalog and index the entire web. It isn’t possible to do this manually because the costs are way too high.
The black box is critical to our existence. And from a search quality perspective, the blacker the box is, the better.
Regulation — Risk vs. Reward
Does that mean Google shouldn’t be regulated? Not necessarily.
Companies that dominate multiple markets start to develop conflicts of interest that may not always serve the public good. However, we need to approach this discussion very carefully, especially because it would be the U.S. government (and the governments of other countries) that would ultimately make these decisions. I'm not convinced that U.S. lawmakers are qualified to make them, and therefore there's a great risk that any decision they make could blow apart the web as we know it.