Just in time for Facebook earnings, Forrester is actively trying to get journalists to cite a
‘damning’ report about how Facebook is failing marketers. It’s a hatchet job with supporting ‘evidence’ that can only be described as bulls—.
As a former research analyst at Jupiter (acquired by Forrester long after I left), I know how tempting it is to find data that fits a predefined opinion, and to build a straw man that supports your position. When I saw the Forrester report, I posted a comment online noting that the methodology of the quoted data looked questionable, even though I couldn’t see the full report. Within an hour, three people had offered to send me the actual Forrester report, including someone who works at Forrester (and who said they thought the report was flawed).
VentureBeat’s John Koetsier hit the nail on the head, writing that “Forrester does great work, by and large … but this ‘report,’ frankly, is an embarrassment” and that the work was “attention whoring and distasteful”.
In the unfortunately all-too-common style of broad, high-level, spurious insights offered without any detailed methodology, most of the actual ‘data’ sits behind a paywall. But enough was publicly available to spot some of the major issues, and reading the full report compelled me to speak out here.
There are 13 types of marketing Forrester compares in the chart BI shared and, unfortunately, with a spread in marketer satisfaction of just 8 per cent from lowest to highest, the differences are likely not statistically significant. I can’t tell for sure, since the sample is at most 395 “executives” for the question, and is likely lower per response: not all of these executives will have tried all 13 marketing types (Google+ marketing, anyone? Hello, who are you?!). Data in the same report suggests less than 50 per cent of the sample uses Facebook paid advertising, for example, calling the significance of an 8 per cent difference even further into question. And having a separate category for “mobile marketing” ignores the fact that Facebook is the world’s most popular mobile application/advertising platform by some margin.
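To see why an 8-point spread is suspect, consider a rough margin-of-error calculation. This is a sketch only: Forrester hasn’t published per-category sample sizes or question wording, so the full n = 395 and a worst-case proportion of 0.5 are assumptions, and the true intervals are likely wider.

```python
import math

Z_95 = 1.96  # z-score for a 95% confidence level
n = 395      # assumed sample size; per-category n is likely smaller
p = 0.5      # worst-case proportion (maximizes the standard error)

# 95% margin of error for a single satisfaction percentage
moe_single = Z_95 * math.sqrt(p * (1 - p) / n)

# 95% margin of error for the DIFFERENCE between two independent
# percentages (standard error of a difference of two proportions)
moe_diff = Z_95 * math.sqrt(2 * p * (1 - p) / n)

print(f"single estimate: +/- {moe_single:.1%}")  # roughly +/- 4.9 points
print(f"difference:      +/- {moe_diff:.1%}")    # roughly +/- 7.0 points
```

With an honest margin near ±7 points for any single pairwise comparison, and before any multiple-comparison correction across 13 categories, an 8-point spread between the lowest and highest categories barely clears the bar.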
The Forrester analyst who produced this appears to have had an axe to grind long before ever getting the “data” quoted in this report. The report says: “A handful of notable brands have drawn first blood, announcing they’re leaving Facebook entirely.” Yet the analyst’s endnotes cite only one company, General Motors, which (a) did NOT say in May 2012 that it was leaving entirely, only that it was stopping Facebook paid media, and (b) said more than six months ago that it was returning to buy Facebook ads.
In May 2012, a few days before Facebook went public, this Forrester analyst pushed out a blog post saying “marketing on Facebook doesn’t work very well, and marketers can’t count on things improving anytime soon.” The only proof offered for that statement was a link to a different blog post he wrote in 2011, which cited no data and merely said: “But I haven’t spoken to many companies that are thrilled with their Facebook programs.”
Unlike the report itself, which is devoid of any quotes from marketers, marketers spoke out about the report in various places:
“You need to look up: standard deviation, and confidence interval.” — Steve Walker
“I’ve never seen Forrester make such broad claims. It’s odd frankly. Our consumers have spoken. They’ve chosen Facebook.” — Marko Z Muellner
“You get out of Facebook what you put into it. All the rich targeting and ad capabilities in the world don’t make up for crappy marketing.” — Robin Cangie
Firms like Forrester have always lived off insinuating their opinions and data into the press — as Wired Magazine chronicled during the first Internet bubble, summed up well at the time by Yankee Group founder Howard Anderson: “If you’ve done good research, it shouldn’t go to BusinessWeek first – the clients need to see it first.”
Amid a push by the Forrester public relations group to get this report to the press rather than to clients, three leading journalists who cover online advertising separately asked for my feedback on information that seemed to them ‘deeply flawed’, ‘rehashed nonsense’ [about GM] and, yes, ‘bulls—‘.
The report’s methodology section reveals that the 395-person executive survey comes from questions posed to a panel run by a company called e-Rewards / Research Now, where “respondents were motivated by receiving credits that could be redeemed for a reward” and, of course, “data is not guaranteed to be representative of the population, and, unless otherwise noted, statistical data is intended to be used for descriptive and not inferential purposes”.
Executive surveys are prone to problems – Forrester itself has said that “the incentives are much larger for B2B surveys – $US2 for B2C versus $US20 or more for B2B — this is even greater in B2B technology areas because there are fewer qualified potential respondents and a greater desire to survey them … resulting in a greater potential of people claiming they have roles and responsibilities they do not have to harvest the incentives.”
This wasn’t a report that was randomly picked up by the press — there was a concerted PR campaign to push this report and its problematic data via outreach that uses colourful words to vilify Facebook like “harsh reality”, “dead last” and “results will be dire”. I have a lot of respect for Forrester — our company is currently being assessed as a vendor by some of their analysts, and during that exercise we’ve been very impressed by the attention to detail and work they do.
But while informed by our opinions, marketing is really just maths, and this Forrester analyst’s fuzzy maths just doesn’t add up.
Rob Leathern is the founder and CEO of Optimal, a Brand Networks company. Optimal is a social media marketing company.