Report by Sky News.
An independent inquiry is to be carried out into the accuracy of election polls after they consistently underestimated the Conservatives’ lead over Labour.
Predictions of a neck-and-neck race, a near-balanced parliament, and a potential constitutional crisis following the General Election, put forward by all major pollsters during the campaign, have been proved drastically wrong.
My impression is that, relative to NZ, there’s more information available about individual poll methodologies in the UK. This should assist the inquiry.
The British Polling Council (BPC), which acts as the association for opinion pollsters, will look into the causes of the “apparent bias” and make recommendations for future polls.
The BPC, which counts all major UK pollsters among its members, said in a statement: “The final opinion polls before the election were clearly not as accurate as we would like, and the fact that all the pollsters underestimated the Conservative lead over Labour suggests that the methods that were used should be subject to careful, independent investigation.
It would be wrong to assume political preference is stable during the week before an election, so ‘apparent bias’ is the correct term.
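The BPC's reasoning can be made concrete with a back-of-the-envelope check (a minimal sketch, not the inquiry's method — the sample size of 1,000 and the count of nine final-week polls are assumed illustrative values). If poll errors were pure sampling noise, each poll would be about equally likely to over- or under-state the Conservative lead, so all of them missing on the same side is very unlikely:

```python
import math

n = 1000   # assumed typical poll sample size
p = 0.34   # a vote share in the mid-30s

# Standard error of one party's share under simple random sampling
se_share = math.sqrt(p * (1 - p) / n)
moe = 1.96 * se_share  # 95% margin of error
print(f"95% margin of error on a single share: +/-{moe * 100:.1f} pts")

# If errors were only sampling noise, each poll would be equally likely
# to err on either side of the true lead. The chance of k independent
# polls all missing on the same side:
k = 9  # assumed number of final-week polls
same_side = 0.5 ** k
print(f"chance of {k} polls all erring the same way: {same_side:.4f}")
```

A per-poll margin of error near three points cannot explain a shared miss of that size in one direction; a common methodological bias can, which is why the BPC's wording points at methods rather than bad luck.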
Survation said it conducted a telephone poll on Wednesday evening which showed the Tories on 37% and Labour on 31% but “chickened out” of publishing it as it appeared so out of line with other surveys.
Case in point.
Meanwhile, ICM director Martin Boon appeared to sum up the mood among Britain’s pollsters, tweeting “oh s**t” after the publication of the exit poll showing the Tories would be by far the largest party.
Heh… that’s what he said in public.
UPDATE: Article by Survation. (Thank you Matthew Beveridge)
Survation conducted a voting intention telephone poll the day before the election (Wednesday) with three specific attributes:
This was conducted over the afternoon and evening of Wednesday 6th May, as close as possible to the election to capture any late “swing” to any party – the same method we used in our telephone polls during the Independence Referendum that produced a 54% and a 53% figure for “no”.
This poll produced figures of:
Conservatives 37%
Labour 31%
Others (including the SNP) 6%
Which would have been very close to the final result.
We had flagged that we were conducting this poll to the Daily Mirror as something we might share as an interesting check on our online vs our telephone methodology, but the results seemed so “out of line” with all the polling conducted by ourselves and our peers – what poll commentators would term an “outlier” – that I “chickened out” of publishing the figures – something I’m sure I’ll always regret.
So in short, there will be no internal review of polling methodology for Survation post this General Election result.
Really? Based on the result of a single poll that was a better predictor? Let’s hope their model holds true for the next election.
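That scepticism can be illustrated with a toy simulation (this is not Survation's methodology; TRUE_LEAD, SHARED_BIAS, NOISE_SD and N_POLLS are all illustrative assumptions). When many polls share one flawed method, the single poll that happens to match the result can simply be the lucky outlier, not evidence of a better method:

```python
import random

random.seed(7)

TRUE_LEAD = 6.5      # approx. Con-Lab lead at the 2015 result, in points
SHARED_BIAS = -6.0   # hypothetical common methodological error
NOISE_SD = 3.0       # rough sampling sd on a lead for n ~ 1000
N_POLLS = 10         # assumed number of competing final polls

# Every simulated poll uses the SAME flawed method; only sampling
# noise differs between them.
polls = [TRUE_LEAD + SHARED_BIAS + random.gauss(0, NOISE_SD)
         for _ in range(N_POLLS)]

mean_lead = sum(polls) / N_POLLS
best = min(polls, key=lambda lead: abs(lead - TRUE_LEAD))
print(f"average poll lead: {mean_lead:+.1f} pts")
print(f"closest single poll: {best:+.1f} pts (truth: {TRUE_LEAD:+.1f})")
```

On average the simulated polls stay badly wrong, yet the best of the ten can land near the truth by chance alone — which is why one accurate (and unpublished) poll is thin grounds for skipping a methodology review.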
Online polling is very effective and useful for a wide range of reasons; however, we’re now clearer than ever that this method of telephone polling as described above will be our “Gold Standard” for voting intention going forward.
I think online polling can be effective for political polling. However, the survey mode is just one of many things that contribute to a poll’s accuracy.