Why we should not believe anything the polls tell us.

Another day and another opinion poll, this time from ComRes in The Sun. The details of the poll are not yet available on their website, but ComRes usually do phone polls and I think this is one of those. The referendum headline outcome is 45% for Remain and 38% for Leave. The sample size is 1002. However, The Sun leads with the story that there has been a big spike in the number of people who don’t know – from 6% in a similar poll last month to 17% now. The suggestion is that voters are becoming confused by the conflicting claims.

Opinion polls in British elections are well known to be inaccurate, and the most notorious occasion was the last General Election, in 2015. Then, nearly all the polls showed Labour and the Conservatives level pegging in vote share, which pointed to a narrow Labour victory; apparently, David Cameron had even drafted a speech acknowledging defeat. As it turned out, there was a small but sufficient majority for the Conservatives. The polls had underestimated the Conservative vote by 6.6 percentage points. Not for the first time, the “shy Tory” was blamed for misleading the pollsters – a hypothesis first mooted after the 1992 election, in which the polls had likewise underestimated the Conservative vote. The polling companies acknowledged that, whilst the polls have been comparably inaccurate in most elections since 1945, in 2015 the errors were large enough to indicate completely the wrong result. In other elections, despite the big errors, the pollsters at least managed to predict correctly which party would win.

As a result of these errors, the British Polling Council commissioned an inquiry into the inaccuracies in its members’ polling. Their full report is here. The major finding of Professor Sturgis and his team was this:

“Our conclusion is that the primary cause of the polling miss in 2015 was unrepresentative samples. [Their emphasis] The methods the pollsters used to collect samples of voters systematically over-represented Labour supporters and under-represented Conservative supporters. The statistical adjustment procedures applied to the raw data did not mitigate this basic problem to any notable degree. The other putative causes can have made, at most, only a small contribution to the total error.”

So the errors were systematic sampling errors. Every textbook on basic statistics starts off with a chapter on sampling and sampling errors; and here we find that the pollsters are continuing to make these basic errors. It is acknowledged that the changes recommended by this report will not be put into place until 2017, so we can conclude that these systematic errors will continue to be made for the duration of the referendum campaign. Another surprising conclusion of the report is this:
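The report’s central finding – that unrepresentative samples produce errors which larger samples cannot fix – can be illustrated with a toy simulation. The 37% Conservative share and the 20% “reluctance” factor below are illustrative assumptions of mine, not figures from the report, and `biased_poll` is a made-up helper, not the pollsters’ actual mechanism:

```python
import random

random.seed(2016)

POP_CON = 0.37  # hypothetical true Conservative share, roughly the 2015 result

def biased_poll(n, reluctance=0.2):
    """Sample n voters from a frame in which Conservative voters are
    20% less likely to be reached -- an illustrative mechanism only."""
    sample = []
    while len(sample) < n:
        is_con = random.random() < POP_CON
        if is_con and random.random() < reluctance:
            continue  # this Conservative voter never enters the sample
        sample.append(is_con)
    return sum(sample) / n

# The bias does not shrink as the sample grows
for n in (1000, 10000, 100000):
    print(n, round(biased_poll(n), 3))
```

Under these assumptions the simulated poll understates the Conservative share by around five points whether it interviews one thousand people or one hundred thousand: systematic sampling error, unlike random error, does not average away.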

“We reject deliberate misreporting as a contributory factor in the polling miss on the grounds that it cannot easily be reconciled with the results of the re-contact surveys carried out by the pollsters and with two random surveys undertaken after the election.”

In other words, the “Shy Tory” effect is probably a myth: people are not lying to the pollsters when they respond. The idea of shy Tories lying to pollsters has become so entrenched in the thinking of the political punditry that the Guardianista view is that the Left should “stop shaming shy Tories”. But we can now reject that hypothesis.

Another suggestion that is frequently made is that telephone polling is more accurate than online polling. Once again, Sturgis et al. reject this hypothesis:

“There was no difference between online and phone modes in the accuracy of the final polls. However, over the 2010-2015 parliament and in much of the election campaign, phone polls produced somewhat higher estimates of the Conservative vote share (1 to 2 percentage points). It is not possible to say what caused this effect, given the many confounded differences between the two modes. Neither is it possible to say which was the more accurate mode on the basis of this evidence.”


To return to the ComRes poll at the start of this post, The Sun suggests that the “don’t knows” are increasing because of growing doubt in the minds of voters. I suggest that this is wrong: as we approach the referendum, more people will have made up their minds one way or the other – i.e. the “don’t knows” should be decreasing as a percentage. So the increase reported across these two successive ComRes polls suggests to me that the margin of error is at least 17% − 6% = 11%.
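For comparison, the textbook sampling margin of error for a poll of 1,002 respondents can be sketched as follows (assuming simple random sampling, which real quota-based polls do not actually achieve; `moe` is my own helper name, and I am assuming last month’s poll had a similar sample size):

```python
import math

def moe(p, n, z=1.96):
    """Textbook 95% margin of error for a sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

N = 1002  # ComRes sample size

print(f"don't-knows at 17%: ±{moe(0.17, N):.1%}")  # roughly ±2.3%
print(f"don't-knows at  6%: ±{moe(0.06, N):.1%}")  # roughly ±1.5%

# Sampling noise on the *difference* between two independent polls
se_diff = math.sqrt(0.17 * 0.83 / N + 0.06 * 0.94 / N)
print(f"swing explainable by sampling noise alone: ±{1.96 * se_diff:.1%}")
```

Pure sampling noise on the difference between the two polls comes to under ±3 percentage points, so an 11-point swing in the “don’t knows” cannot be random variation alone – consistent with the argument that the real uncertainty in these polls is far larger than the figures the textbooks would suggest.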

The only thing we can say with any certainty at all is that the polls are wrong and that they will continue to be wrong right up to the day of the referendum. With a little less certainty, we can assume that they will be understating the “leave” vote by a very large margin.
