How did the polls do? The final outcome.

4 10 2014

Now that we have the final election result, I’ve updated the table from my previous post. In addition, I’ve included a similar table for the polls-of-polls, and a pretty graph!

UPDATE: I’ve revised the chart and first table with UMR’s pre-election poll result, published by Gavin White on SAYit Blog. I’ve checked all my numbers fairly carefully, but if any pollster, pundit, or media organisation spots any errors please let me know and I’ll update this post accordingly.

Final result chart

How I calculated the above results.

Polls

Poll of polls
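
If anyone wants to reproduce the comparison, one simple way to score each poll (or poll-of-polls) against the final result is the mean absolute deviation across the main parties. Here’s a rough sketch in Python, with made-up shares standing in for the published figures:

```python
# Score each final poll (or poll-of-polls) against the election result.
# All of the shares below are placeholders, not the published numbers.

final = {"National": 47.0, "Labour": 25.1, "Green": 10.7, "NZ First": 8.7}

sources = {
    "Poll A":          {"National": 46.0, "Labour": 26.0, "Green": 12.5, "NZ First": 8.0},
    "Poll B":          {"National": 48.5, "Labour": 24.0, "Green": 12.0, "NZ First": 7.5},
    "Poll-of-polls C": {"National": 46.5, "Labour": 25.5, "Green": 12.8, "NZ First": 7.8},
}

for name, shares in sources.items():
    # Mean absolute deviation: the average size of the miss across parties.
    mad = sum(abs(shares[p] - final[p]) for p in final) / len(final)
    print(f"{name}: mean absolute deviation = {mad:.2f} points")
```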

The overall picture remains similar.

  1. Well done DigiPoll and DPF (Curia poll-of-polls)
  2. Still no evidence, this election, of the ‘National bias’ that some people talk about.
  3. If there is any poll bias, it appears to be toward the Green Party.
  4. The landline bias/non-coverage issue is a red herring – the polls that came closest only call landlines. It’s just one of many potential sources of error that pollsters need to consider. Here’s another post about this, if anyone is interested in finding out why it’s not such a big deal.




How good was my prediction? (updated)

21 09 2014

It was okay. Not happy with my Green Party prediction.

UPDATE: Now updated with final election result. Things are a little better when it comes to the Green Party result, so I’m a little happier.

Prediction





How did the polls do?

21 09 2014

This table shows the provisional election result and all the final pre-election poll results.

Table 1

This table shows deviations from the final result, for each party and poll.

Table 2
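
For what it’s worth, a deviation table like this is just a subtraction for each cell: the poll’s party share minus the result’s share. A rough sketch, with placeholder shares rather than the published numbers:

```python
# Build a deviation table: each cell is poll share minus (provisional) result
# share, so a positive number means the poll overstated that party.
# All shares below are placeholders, not the published figures.

provisional = {"National": 48.1, "Labour": 24.7, "Green": 10.0, "NZ First": 8.9}

polls = {
    "Poll A": {"National": 47.0, "Labour": 26.0, "Green": 12.0, "NZ First": 8.0},
    "Poll B": {"National": 49.0, "Labour": 24.0, "Green": 11.5, "NZ First": 7.5},
}

for name, shares in polls.items():
    row = {party: round(shares[party] - provisional[party], 1) for party in provisional}
    print(name, row)
```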

A few initial thoughts:

  1. Well done DigiPoll.
  2. Looking at these results, I see no evidence of the ‘National bias’ that some people talk about.
  3. If there is any poll bias, it appears to be toward the Green Party.
  4. The landline bias/non-coverage issue is a red herring.




Poll-of-Polls-of-Polls

20 09 2014

We have more polls-of-polls now than we have actual polls. I thought it would be interesting to put all their predictions in one place, and also to calculate a basic average.

Please note that this is not my election prediction. I posted that yesterday.

Polls-of-polls
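
The basic average is nothing fancy: just the mean of each party’s share across the published polls-of-polls. A quick sketch, with made-up shares for illustration:

```python
# A basic average of polls-of-polls: the unweighted mean of each party's
# share. The figures are made up for illustration.

polls_of_polls = {
    "Average A": {"National": 46.5, "Labour": 25.5, "Green": 12.5, "NZ First": 6.5},
    "Average B": {"National": 47.5, "Labour": 24.5, "Green": 13.0, "NZ First": 6.0},
    "Average C": {"National": 46.0, "Labour": 26.0, "Green": 12.0, "NZ First": 7.0},
}

basic_average = {
    party: round(sum(pp[party] for pp in polls_of_polls.values()) / len(polls_of_polls), 1)
    for party in ["National", "Labour", "Green", "NZ First"]
}
print(basic_average)
```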





My election prediction

19 09 2014

So here is my election night prediction. Call it an educated guess.

2014 prediction





Final pre-election polls have begun – what about the Dotcom bombshell?

13 09 2014

Final pre-election polls either began recently, or will begin in the next day or two.

Pollsters all do things a bit differently, but my guess is that most media polls are probably carrying out the majority of their interviews over the next three days, leaving the final couple of days for call-backs and appointments, and/or to fill hard-to-reach quotas.

What does Kim Dotcom have to reveal on Monday?

Who knows…? If his bombshell has a major influence on voting decisions, the polls may pick some of this up in the last couple of days of fieldwork. The polls won't get that good a read on it though, because most interviews will be carried out prior to Dotcom's announcement.

A previous poll did reveal that more people believed Dotcom than the PM on the question of when the PM first came to know about Dotcom, and that didn't seem to matter a great deal to National Party supporters at the time. For Dotcom's bombshell to influence National's support, it probably needs to be a WMD-level bombshell.

Why not wait until after Dotcom's announcement to begin fieldwork?

You could do this, but you'd have a very short fieldwork period, meaning limited call backs and a low response rate. You might pick up the influence of Kim Dotcom's announcement, but the results could very well be all over the place due to low response. In addition, you would have changed your methodology, so your results would not be comparable to your previous poll.

Personally, I wouldn't take this risk. But that's just me.





What are polls-of-polls telling us?

6 09 2014

James Green at The Distracted Scientist has blogged an interesting set of graphs showing, among other things, that the Roy Morgan poll tends to closely reflect a line fitted to all polling firms’ results (this line is effectively a poll-of-polls average, similar to those you can find at Occasionally erudite, Dim-Post, Polity, Curiablog, and Colin James’ Poll of Polls). James notes the following:

So why does the Roy Morgan profile seem to share the shape with the same overall trend? Something I’ve known, but hadn’t really considered as an influence before, is that Roy Morgan is by far the most regular and frequent poll. This means unless that is weighted out, it will always dominate the shape of the trend. And as far as I’m aware, none of the poll averaging strategies do that.

Around half of all the data points in a basic poll average will come from the Roy Morgan poll.

Essentially, Roy Morgan polls more frequently, so it is a more important contributor to a polling average. All of the other polls have less of an influence on a basic average calculation. Actually, the Roy Morgan poll has such a large influence on the average that it would almost be fair to look at a basic poll-of-polls average as the Roy Morgan result ‘adjusted’ by the results of other polls. (Okay, now I’m just baiting :) )
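
To make the point concrete, here’s a rough sketch comparing a naive average of every poll release with one that gives each polling firm a single ‘vote’ (all the shares are made up):

```python
# Why a frequent pollster dominates a naive average, and one way to weight
# that out: average within each firm first, then across firms.
# Made-up National party-vote shares, for illustration only.

polls = [
    ("Roy Morgan", 45.0), ("Roy Morgan", 46.5), ("Roy Morgan", 44.5),
    ("Roy Morgan", 45.5), ("Firm B", 49.0), ("Firm C", 48.5), ("Firm D", 50.0),
]

# Naive average: every poll release counts equally, so the most frequent
# pollster contributes most of the data points.
naive = sum(share for _, share in polls) / len(polls)

# One-vote-per-firm average: collapse each firm to its own mean first.
by_firm = {}
for firm, share in polls:
    by_firm.setdefault(firm, []).append(share)
firm_means = [sum(shares) / len(shares) for shares in by_firm.values()]
per_firm = sum(firm_means) / len(firm_means)

print(f"naive average: {naive:.1f}, one-vote-per-firm average: {per_firm:.1f}")
```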

Poll averages are better than individual polls for observing trends over time. This is because they ‘smooth out’ the annoying sample variation and methodology differences, making it easier to see trends. However, in New Zealand they also suffer from some of the same problems regular polls do. There are only five regular polls in New Zealand, so one outlying poll result, one big change in an odd direction, or one polling firm with a strong ‘house effect’ can have a large influence on the average.
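
To illustrate how much one result can move a five-poll average, a quick back-of-the-envelope example with made-up numbers:

```python
# One outlying result in a small set of polls shifts a simple average a long
# way. Made-up Green party shares, for illustration only.

typical = [11.0, 11.5, 10.5, 11.0]   # four polls clustered around 11%
outlier = 15.0                        # one poll well above the pack

without_outlier = sum(typical) / len(typical)                 # 11.0
with_outlier = (sum(typical) + outlier) / (len(typical) + 1)  # 11.8

print(f"average without the outlier: {without_outlier:.1f}%")
print(f"average with the outlier:    {with_outlier:.1f}%")
```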

UPDATE: Was just reading through Nate Silver’s tweets, and noticed a similar comment about some of the poll averages in the US.

[Screenshot of Nate Silver’s tweet]

(As an aside, it’s worth noting that in 2011 some polling firms’ final results were closer to the actual election result than poll-of-polls results.)

I’ve mentioned on this blog before that Gavin White at UMR has examined poll results industry-wide to show that, on average, the final mainstream media polls prior to elections have tended to get a result:

  • 2.4% too high for National
  • 0.5% too low for Labour
  • 1.5% too high for Green
  • 1.1% too low for NZ First

That’s an average across all polls, so some will have results in the opposite direction, or more strongly in the same direction, for some parties. It could be useful to factor these in when interpreting polling averages.
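
One crude way to factor these in is to subtract the historical average bias from a current polling average. A rough sketch, where the corrections are the figures listed above and the poll-average shares are made up:

```python
# Adjust a polling average using the historical final-poll biases quoted
# above. The poll_average figures are placeholders, for illustration only.

historical_bias = {"National": +2.4, "Labour": -0.5, "Green": +1.5, "NZ First": -1.1}

poll_average = {"National": 48.0, "Labour": 25.0, "Green": 13.0, "NZ First": 6.5}

adjusted = {
    party: round(share - historical_bias[party], 1)
    for party, share in poll_average.items()
}
print(adjusted)  # e.g. National 45.6, Labour 25.5, Green 11.5, NZ First 7.6
```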







