If you are a web site publisher, then you likely live in fear of Google’s anti-webspam chief, Matt Cutts.
Cutts is the guy in charge of adjusting Google’s search algorithm so that it downplays low-quality “scraper” sites (which copy content from legit sites and hope to earn ad money from the misplaced clicks that result).
If Cutts doesn’t like the content you’re publishing, he can make it virtually vanish from Google’s search results. He’s on a constant search for spammers who plagiarize other people’s web sites and somehow steal the search engine ranking that should have gone to the original author.
But yesterday, he got a taste of his own medicine. Here is how it went down:
In a tweet, Cutts asked the world to send him examples of scraper sites that appeared to outrank the original content in search results:
Almost immediately, e-business consultant Dan Barker replied:
The joke here is that Google has started showing its own “semantic” search results for certain queries. These are the black-and-white, mostly text boxes that often appear at the top of common search results; they are intended to deliver a brief but useful answer to the question so the user does not need to click a further link to reach the source. The results are often drawn from non-Google resources, as this scraper-site definition clearly is. The purpose is to build Google’s “Knowledge Graph.”
To give Google some credit, the scraper definition it supplies does include a link to the Wikipedia entry from which Google took its information. But still: Barker’s tweet got 13,000 retweets; Cutts’ tweet got only 407.