I thought I’d heard it all in the debate over Google and newspapers, but this week Wall Street Journal managing editor Robert Thomson took it up a notch. He accused Google of making people slutty. If we’re using sexual metaphors now, here’s another one. Why don’t the Wall Street Journal and News Corporation in general put a condom around all of their content, to protect themselves from Google? There’s a good brand called robots.txt that will help.
The accusation was lobbed at this week’s Web 2.0 Summit session called “Wither Journalism.” Be sure to watch the video, which I’ve listed at the end of this post; I’ll also link to key sections of it along the way. The session was a stunning contrast between two cultures, embodied by Thomson and Google’s vice president of search products and user experience, Marissa Mayer, who was also on the panel.
Google’s goal, Thomson said, isn’t really to send people from its site to other places. In reality, Google seems to have a nefarious plan to keep everyone on its own site:
Google wants to be the home page or wants to be the front page, and Marissa unintentionally encourages promiscuity. It’s about digital, the whole Google model is based on digital disloyalty. It’s about disloyalty to creators.
There was a murmur through the audience as the promiscuity reference was put out there. Even moderator John Battelle exclaimed, “Wow.” I’m not sure if the reaction was due to a new crass low point being reached in the overall debate about Google and newspapers or because so many people disagreed with Thomson.
Somehow, in Thomson’s mind, Google has trained people that they don’t need to read newspapers at their own sites. This is despite the fact that Google didn’t even have its own news product until 2002. That it wasn’t the first news aggregator out there. That it wasn’t even the first search engine out there.
Hearing stuff like this, it’s hard not to agree with Google CEO Eric Schmidt’s assertion that Google gets blamed for changes wrought by the internet itself (see my Google CEO Eric Schmidt On Newspapers & Journalism article for more on that).
Thomson had also said earlier:
Having a debate about it obviously is important, which is clearly what we’ve been doing, provoking and stimulating and maybe unintentionally insulting.
Congrats. You were insulting. Not only did you say that Google encourages people to be news sluts, but you also accused many good people of being “net neanderthals:”
There are three types of people in the world.
There are the net neanderthals who think everything should be free all the time.
There are people who pay for an ISP, 19.99 a month or whatever, and actually a lot of them think they’re getting access to a content buffet. [I think that’s the amount he said; it was hard to hear]
And there’s a third group of people who are already on top of that paying for content.
Let’s skip the fact that there’s some debate on whether neanderthals were actually idiot savages. Really, if I expect some content for free, I’m some type of brute?
Last night I watched an episode of The Simpsons, broadcast on Fox, which is owned by News Corp, which owns the Wall Street Journal. It didn’t cost me anything. Sure, I watched it through DirecTV, and I pay a subscription fee for that. But it’s also broadcast over the air for free. My TV with an ordinary antenna would have picked it up. Later, if I wanted, I could have watched news content from Fox for free on the same channel.
Why am I a highly valued viewer in the “real” world but on the internet, if I read something for free at the Wall Street Journal, I’m some type of savage taken down the wrong path by Google?
Why also is the Wall Street Journal itself feeding those savages? Earlier this year, over at Nieman Journalism Lab, WSJ executive editor Alan Murray explicitly talked about how and why the paper deliberately puts stuff outside its paywall. Be sure to watch that video and read the interview summary. At one point he says:
If it’s a big news story, if we report a takeover and — we could hold that behind the pay wall, but if we do, BusinessWeek or someone else will simply write a story saying ‘The Wall Street Journal is reporting x,’ and they’ll get all the traffic. Why would we do that?
Note that it’s BusinessWeek that gets mentioned there, not some blogger, not some aggregator but another mainstream media outlet that could potentially do what mainstream media outlets have long done to each other, report on what each other is saying. But no one calls them out on that these days. It’s the aggregators, the search engines and the bloggers that get attacked.
I don’t know what research Thomson has to back his “three types of people” model. I think things are more subtle than that. I think there are plenty of people who will pay for stuff even if they’re used to expecting lots to be free.
In the real world, we have free and paid. On the web, we can have both as well. Paywalls DO have a place, and I’m not opposed to them. But the puzzling thing is that the Wall Street Journal has used its own paywall very well, both to make money on the web and to keep getting visitors from Google. They have their cake and get to eat it too. It’s a head-scratcher. What exactly is Thomson so upset with Google for?
Let’s go back to Thomson’s suggestion that Google is trying to keep traffic within its own site. Just after talking about Google’s “disloyalty” to creators, he discussed a unique feature of Google News that shows quotes. He challenged the audience to search for Hamid Karzai to see how Google fails to promote clickthrough:
In tiny, tiny font, you’ll see where the origin of those stories are … there is absolutely no intention on that page to drive traffic.
Bring it on. First, most people are probably searching for hamid karzai at Google itself, not Google News. They’ll see this:
At the top of the page, three articles are listed. Those are smaller than the main link to the overall news results “News results for hamid karzai,” but they’ll still pull traffic. In fact, I challenge Thomson to give us a case study. It’s easy enough for him to know when he’s featured in that section. Is the WSJ really not getting traffic?
Also prominently featured are links to non-news sources. Fair to say that not everyone searching on Google for “hamid karzai” is after news content?
Now let’s drill into those news results. See? Google’s trying to keep people within its own site by doing that, driving them into Google News!
Sure, you can argue that. But you can also argue — and I will as someone who’s watched search habits for 14 years now — that the bigger issue is that people who DO want news content fail to actually go to a dedicated news search engine. So that link is part of the overall Universal Search change designed to help surface the right content for them.
On the news results for hamid karzai, you get this:
Now you can see what’s ticked Thomson off. To his eye, the top of the page is dominated by a quote, with the entire intent being to pull you from that into a further page of quotes (as I’ll show) rather than send you on to news stories. But that’s only to the eye. A search marketer would know that the actual effect is much different.
The quote feature is just over a year old. My guess is that most people bypass it, moving down to click on one of the actual news stories shown. I say that because typically, things that do not look like “real” results usually get little click through (ask anyone who runs search ads about this). As I’ll point out, Thomson — if he wants — can tell us if he really gets no traffic from these or not, in relation to the amount of traffic he gets overall from Google.
How about that font? Look below the quote, and you’ll see Reuters is credited (and overall, Google’s found 120 occurrences of that quote on other news sources). The credit isn’t “tiny, tiny” to me. It’s maybe one size lower than the quote itself.
Still, that’s not the actual font he’s so concerned about. See how there are two arrows in the screenshot above? The one to the right points to a link that lets you get to a page of quotes from Karzai:
This is the page that Thomson was so upset about, the one where, he says, there’s absolutely no intention to drive traffic, in part because of the “tiny, tiny” font used to source the publications. In reality, the source font is identical in size to that used for the quotes.
Now, I’ll readily admit, most people who make it into a quotes page like this probably aren’t clicking off to the news publications. But then again, I think most people at Google News aren’t heading to these pages at all. Most of them are probably clicking on actual news stories, since that was their main intention for doing a news search in the first place.
Also keep in mind that quotes like this tend to show high on a page mainly when you search for the name of a newsmaker. Let’s look for a story that was on the front page of the Wall Street Journal as I wrote this, about Nokia suing Apple. When I do that search, I get this:
See the arrow? There’s the Wall Street Journal listed at the top of the news unit. As I said, the WSJ can measure the traffic they’re getting off links like this. How much free traffic did they get?
Let’s drill in:
There’s no quote shown at the top of the page, because we’re not searching for a newsmaker. Instead, what we get is the Wall Street Journal ranking at the top of the search results. And trust me, anyone who runs a news site will tell you that spot sends you lots and lots of traffic.
If Google’s intention were really to keep traffic away from papers, papers would see a trickle, not the flood they get. And Thomson either knows this is true for his own publication or deliberately prefers to put out misleading information.
After hearing Thomson make more accusations against Google, I’d finally had it and thought (again) why doesn’t the Wall Street Journal (and all of News Corporation, for that matter), just block Google? It’s easy. You put up a robots.txt file, and your content is out of Google.
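For illustration, here is what such a block would look like under the standard robots exclusion protocol. This is a hypothetical sketch, not the WSJ’s actual file; the file just sits at the root of the site (e.g. wsj.com/robots.txt):

```
# Shut Google's main crawler out of the entire site
User-agent: Googlebot
Disallow: /

# Or, to block every crawler that honors the protocol:
# User-agent: *
# Disallow: /
```

Two lines, and a publisher that honestly wanted no Google traffic would get exactly that.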
At that exact moment, I cheered John Battelle from afar for asking that exact question:
If Murdoch’s so mad about this, why don’t you just put in robots.txt, don’t crawl?
And Thomson responded:
We can do that any time we like. But obviously we’re at the, the idea that the web has entirely evolved is ridiculous. You’re asking that question as if we’ve reached the epitome of our life, and we know that that’s not true. I think one of the questions for Google is, is the very definition of the verb to Google going to change over the next couple of years. Will it mean in the end you’re sort of rummaging around [something] content?
I couldn’t catch the exact word he used at the end of that quote. But the implication was that Google again had done something wrong. And the main point is that Thomson didn’t answer the question he was asked. At all.
Thomson took a simple question — why don’t you block — provided a non-answer and then spun things back on Google. You have to admit, as spin goes, it was masterful.
I say, enough of the hypocrisy. Let’s go back to the sex metaphor that Thomson kicked things off with.
A robots.txt block is like putting a condom around your entire web site. It keeps Google away. So if Thomson really thinks things are so promiscuous out there, have the WSJ practice a little safe news sex. Roll that robots.txt file on.
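To see how airtight that condom is, Python’s standard `urllib.robotparser` module can check what a given crawler would be allowed to fetch. This sketch parses a hypothetical rule set blocking Googlebot — it is an illustration of the protocol, not the WSJ’s real robots.txt:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that shuts Googlebot out of the whole site
rules = """
User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is barred from every URL on the site
print(parser.can_fetch("Googlebot", "https://example.com/article/123"))  # False

# A crawler with no matching rule is still allowed by default
print(parser.can_fetch("SomeOtherBot", "https://example.com/article/123"))  # True
```

Well-behaved crawlers, Google’s included, honor exactly this check before fetching a page — which is why the block really is “any time we like,” as Thomson himself admitted.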
Otherwise, stop prattling on about how wronged you are by Google. The WSJ and other News Corporation sites are knowingly having unprotected indexing relations with Google. Don’t then complain that you’ve caught some type of neanderthal disease from it. Especially when, it turns out, all that’s really happened is that Google keeps knocking you up with millions of visitors.
Sadly, and I do mean this with all sincerity, underneath there are signs that Thomson is actually a reasonable person. That he doesn’t have completely crazy views. That when he’s not singing from what appears to be the hostile Murdoch songbook, he could actually be a partner for sensible change rather than a poster child for the “newspapers are dinosaurs” crowd.
At one point in the session, he said:
We need to have a rational debate that’s almost platform neutral about professional journalism. Because in the end it’s not so much the means of delivery but the means of creation. And it’s that very issue that is the sort of thing that we should all be talking about. And it doesn’t mean that professional journalism is the antithesis of citizen journalism. There’s almost a false dichotomy there.
Agreed. So let’s have that, please. A rational debate, without accusations that aren’t backed up and don’t hold water.
For more, see these reports of the session:
- Live Blog: Marissa Mayer & “Wither Journalism” At Web 2.0, Search Engine Land
- Live-Blogging the ‘Whither Journalism’ Panel With Google, HuffPo, NYT and WSJ, Wall Street Journal
- Newspaper, Internet titans duel at Web 2.0 Summit, AFP
You can also watch the entire video here:
Danny Sullivan is editor-in-chief of Search Engine Land. He’s a widely cited authority on search engines and search marketing issues who has covered the space since 1996. Danny also oversees Search Engine Land’s SMX: Search Marketing Expo conference series, maintains a personal blog called Daggle and can be followed on Twitter here.
This story was originally published on Danny’s personal blog, Daggle, and is re-published here with permission.
Read more at: http://www.huffingtonpost.com/danny-sullivan