No one could have missed that there’s a General Election next month. There’s blanket coverage across the media, picking over manifestos and cross-examining politicians on numbers and affordability. And, inevitably, there are predictions of the final outcome. A whole industry has grown up around polling – according to this Monday morning’s press reports (15 May), the Conservatives have a lead of some 14 to 15 points over Labour’s 32%. There is talk of a “towering lead” and a “hefty majority” in June.
Casting our minds back a whole two years to May 2015, here’s what was being said the day before the polling stations opened…
“What we can be sure of is that no single party will win an outright majority” (The Guardian)
“Pollsters predict a Tory lead but a Miliband government” (The Independent)
“Parliament will be hung” (The Telegraph)
The BBC Poll of Polls produced this handy summary so we could see how close to call it was going to be …
And so it came to pass that the Conservatives took 36.9% of the vote and Labour 30.4%, giving the Conservatives a 12-seat majority – no neck-and-neck result, no hung parliament, no coalition. Goodbye Ed Miliband and Nick Clegg, and much mocking of the pollsters for getting it so wrong (along with the beating of chests and howls of ‘what went wrong?’ from the pollsters themselves).
Confidence in the polling sector was shaken. The British Polling Council sprang into action and immediately set up an inquiry into how pollsters had underestimated the Conservative lead over Labour. Its overall conclusion: unrepresentative samples that over-estimated Labour support and under-estimated Conservative support, plus a good deal of methodological advice for the sector.
But that wasn’t the end of it – we’ve also had Brexit and the US Presidential Election, where the polls got it wrong too, although to be fair the Brexit polls predicted a neck-and-neck outcome to the very last. On 23 June 2016 the Times had Leave at 46% and Remain at 48%; YouGov had 45% Leave, 45% Remain; Opinium 45% Leave, 44% Remain. The final result was Leave 51.9%, Remain 48.1% (though notable congratulations to What UK Thinks, which predicted 52% Leave, 48% Remain).
On the eve of the US Presidential Election, the NY Times modelled all the polling results and gave Clinton an 85% chance of winning. Ouch. The Telegraph summed it up the day following the election – its headline:
“How wrong were the polls in predicting the US election and why did they fail to see Trump’s win?”
Is getting it wrong new? Are we witnessing a new trend in what voters tell pollsters? Are they making up their minds only when staring at the ballot paper? Or is it down to the mysteries of sampling bias and last-minute voter swing? Perhaps the greater likelihood of the ‘olds’ registering and voting while the can’t-be-bothered ‘youngs’ stay at home? Or the problem of using online and phone panels to ask people how they intend to vote, rather than the old-fashioned approach of knocking on doors?
All these explanations have been put forward, but it seems getting it wrong is nothing new – in fact it has been happening for a long time (remember the 1992 ‘It’s The Sun Wot Won It’/War of Jennifer’s Ear General Election, when everyone predicted a hung Parliament?). Leaving the last word to John Curtice (the chap who accurately predicted the outcome of Brexit): “In historical terms, the 2015 polls were some of the most inaccurate since election polling first began in the UK in 1945. However, the polls have been nearly as inaccurate in other elections but have not attracted as much attention because they correctly indicated the winning party.”