Britain is still reeling from the shock of its Brexit referendum, in which the UK voted 52-48% in favour of leaving the EU. The shock is largely down to the pollsters, who consistently predicted a Remain majority right up until the result went to Leave.
Now those pollsters have come up with an explanation for why they were wrong. The details are covered in a fascinating blog post by YouGov research director Anthony Wells.
The pollsters got the Scottish independence referendum wrong, too, when they predicted a close result. Just days before the vote, polls suggested the pro-independence “Yes” camp might be in the lead. In the end, there was a solid 55-45% majority for “No.”
One of the big takeaways from Wells’ analysis is just how inaccurate polls have become. That has worrying consequences for those of us who like political stability. When elections are predictable, it helps society — investors, job seekers, employers and retirees — plan for the future. The EU Referendum caused both the pound and the FTSE 100 to collapse immediately, because no one was expecting Leave to win.
Wells suggests all this is happening because more voters are appearing at the ballot box who are not detected by opinion polls.
And it is vital reading for anyone mystified as to why Donald Trump keeps winning in the US when all the experts say he should be losing, and anyone who puzzles over where Jeremy Corbyn gets his votes from.
We’re going to summarise Wells’ analysis here, but it’s well worth reading the whole thing for the nitty-gritty. Basically, Wells says, the pollsters got six things wrong:
- Phone polls don’t work anymore, and even online polls are inaccurate. Both types of poll predicted a Remain majority, but the telephone polls were further off, and even the online polls, which came closer, understated the Leave vote. This was crucial because the experts assumed the accurate prediction lay somewhere between the phone and online results. In reality, the Leave vote was above even the online prediction.
- Polls undercount voters who are hard to reach. Most polls are done over three days of research in which pollsters try to reach a sample of voters. Wells suggests that period needs to stretch to six days so that hard-to-reach voters are included.
- Graduates are over-represented in polls, while people with fewer qualifications are undercounted. “There need to be enough poorly qualified people in younger age groups, not just among older generations where it is commonplace,” Wells says.
- Polls fail to add “attitudinal weights.” Some voters say they don’t know how they will vote, but their votes can be predicted if you know more about their attitudes to connected issues. In the Brexit poll, pollsters tried to weight their results with information about voters’ views on race and immigration. But even that weighting was flawed.
- Turnout models are wrong. Pollsters’ models of who would actually show up to vote misfired. “In almost every case the adjustments for turnout made the polls less accurate, moving the final figures towards Remain,” Wells says.
- Models for reallocation of “don’t knows” are wrong. You can’t vote “don’t know” in an election, so those responses need to be reallocated to one of the voting options. “In every case these adjustments helped remain, and in every case this made things less accurate,” Wells says.
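To see the mechanics behind two of the adjustments above — demographic weighting and reallocating “don’t knows” — here is a minimal sketch in Python. All the numbers, group names, and the 50/50 reallocation split are invented for illustration; they are not Wells’ or YouGov’s actual figures or methodology.

```python
# Hypothetical sketch of two poll adjustments: down-weighting an
# over-represented group (e.g. graduates) and reallocating
# "don't know" responses. All figures are invented.

def weighted_shares(responses, weights):
    """Compute weighted vote shares.
    responses: list of (choice, demographic_group) tuples
    weights:   dict mapping group -> weight (groups over-represented
               in the sample get a weight below 1)."""
    totals = {}
    for choice, group in responses:
        totals[choice] = totals.get(choice, 0.0) + weights[group]
    grand = sum(totals.values())
    return {choice: t / grand for choice, t in totals.items()}

def reallocate_dont_knows(shares, split):
    """Redistribute the 'dont_know' share to the named options
    according to a fixed split (fractions summing to 1)."""
    dk = shares.pop("dont_know", 0.0)
    return {c: s + dk * split.get(c, 0.0) for c, s in shares.items()}

# Invented sample: graduates are over-sampled, so they are down-weighted.
sample = ([("remain", "graduate")] * 30 + [("leave", "graduate")] * 20
          + [("remain", "non_graduate")] * 15 + [("leave", "non_graduate")] * 25
          + [("dont_know", "non_graduate")] * 10)
weights = {"graduate": 0.7, "non_graduate": 1.2}

shares = weighted_shares(sample, weights)
# Wells' point is that the pollsters' reallocation choices pushed the
# figures towards Remain; here we just show the mechanics with a 50/50 split.
final = reallocate_dont_knows(shares, {"remain": 0.5, "leave": 0.5})
```

The takeaway is that the headline figure depends heavily on these modelling choices: change the weights or the reallocation split and the same raw responses produce a different predicted winner.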
Here are the polls from April 1 through June 23, with the final data point being the actual result: