James Green at The Distracted Scientist has blogged an interesting set of graphs showing, among other things, that the Roy Morgan poll tends to closely reflect a line fitted to all polling firms’ results (this line is a bit like a poll-of-polls average, like those you can find at Occasionally erudite, Dim-Post, Polity, Curiablog, and Colin James’ Poll of Polls). James notes the following:
So why does the Roy Morgan profile seem to share the shape with the same overall trend? Something I’ve known, but hadn’t really considered as an influence before, is that Roy Morgan is by far the most regular and frequent poll. This means unless that is weighted out, it will always dominate the shape of the trend. And as far as I’m aware, none of the poll averaging strategies do that.
Around half of all the data points in a basic poll average will come from the Roy Morgan poll.
Essentially, Roy Morgan poll more frequently, so they contribute more data points, and therefore carry more weight, in a polling average. All of the other polls have correspondingly less influence on a basic average calculation. In fact, the Roy Morgan poll has such a large influence on the average that it would almost be fair to look at a basic poll-of-polls average as the Roy Morgan result ‘adjusted’ by the results of other polls. (Okay, now I’m just baiting :) )
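To make the frequency effect concrete, here is a minimal sketch with invented numbers (the poll counts and results below are hypothetical, not real data): one firm polling four times as often as the others supplies half the data points, and a simple average is pulled toward its figures unless each firm is weighted equally.

```python
# Hypothetical poll counts and results, chosen only to illustrate the
# frequency effect: "Roy Morgan" here polls four times as often as each
# of the other four firms.
polls = {
    "Roy Morgan": [47.0] * 20,   # 20 results in the window
    "Firm A": [50.0] * 5,
    "Firm B": [50.0] * 5,
    "Firm C": [50.0] * 5,
    "Firm D": [50.0] * 5,
}

# A basic average treats every data point equally, so the frequent
# pollster dominates.
all_results = [r for results in polls.values() for r in results]
simple_average = sum(all_results) / len(all_results)

# Averaging each firm's mean instead gives every firm equal weight,
# removing the frequency effect.
firm_means = [sum(r) / len(r) for r in polls.values()]
equal_weight_average = sum(firm_means) / len(firm_means)

share_rm = len(polls["Roy Morgan"]) / len(all_results)
print(f"Roy Morgan share of data points: {share_rm:.0%}")    # 50%
print(f"Simple average: {simple_average:.1f}")               # 48.5, pulled toward 47
print(f"Equal-weight average: {equal_weight_average:.1f}")   # 49.4
```

With these made-up figures, the frequent pollster supplies exactly half the data points and drags the simple average more than a full point away from the equal-weight version.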
Poll averages are better than individual polls for observing trends over time, because they ‘smooth out’ the annoying sample variation and methodology differences between firms, making it easier to see trends. However, in New Zealand they also suffer from some of the same problems individual polls do. There are only five regular polls in New Zealand, so one outlying poll result, one big change in an odd direction, or one polling firm with a strong ‘house effect’ can have a large influence on an average poll result.
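The smoothing effect can be illustrated with simulated data (the numbers below are generated, not real polls): individual ‘polls’ scatter around a true support level because of sampling variation, while a rolling average of the last five sits much closer to it.

```python
import random

# Toy simulation: 30 "polls" of a party whose true support is 45%,
# each with random sampling noise (standard deviation of 2 points).
random.seed(1)
true_support = 45.0
polls = [true_support + random.gauss(0, 2.0) for _ in range(30)]

# Rolling average of the most recent five polls.
rolling = [sum(polls[i - 4:i + 1]) / 5 for i in range(4, len(polls))]

# The worst single-poll miss is larger than the worst rolling-average miss.
max_poll_error = max(abs(p - true_support) for p in polls)
max_avg_error = max(abs(a - true_support) for a in rolling)
print(f"Worst single-poll error:     {max_poll_error:.2f}")
print(f"Worst rolling-average error: {max_avg_error:.2f}")
```

The averaging shrinks the noise, which is the whole appeal of a poll of polls; but note that it does nothing about a shared bias, which is the weakness described above.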
UPDATE: I was just reading through Nate Silver’s tweets, and noticed a similar comment about some of the poll averages in the US.
(As an aside, it’s worth noting that in 2011 some polling firms’ final results were closer to the actual election result than poll-of-polls results.)
I’ve mentioned on this blog before that Gavin White at UMR has examined poll results industry-wide, showing that the final mainstream media polls prior to elections have, on average, produced results:
- 2.4% too high for National
- 0.5% too low for Labour
- 1.5% too high for Green
- 1.1% too low for NZ First
That’s an average across all polls, so individual polls may err in the opposite direction, or more strongly in the same direction, for some parties. It could be useful to factor these tendencies in when interpreting polling averages.
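One rough way to factor those tendencies in is to subtract each party’s average historical error from a current poll average. This is only a sketch: the poll-average figures below are invented, and applying an industry-wide average error to any one average is a blunt adjustment.

```python
# Average historical errors from Gavin White's industry-wide analysis
# (positive = polls tended to overstate the party's result).
historical_error = {
    "National": +2.4,
    "Labour": -0.5,
    "Green": +1.5,
    "NZ First": -1.1,
}

# A hypothetical current poll-of-polls average (invented figures).
poll_average = {
    "National": 46.0,
    "Labour": 32.0,
    "Green": 11.0,
    "NZ First": 4.5,
}

# Subtract each party's average historical error to get an adjusted estimate.
adjusted = {party: poll_average[party] - historical_error[party]
            for party in poll_average}
for party, value in adjusted.items():
    print(f"{party}: {value:.1f}")
# National 43.6, Labour 32.5, Green 9.5, NZ First 5.6
```

On these made-up numbers, the adjustment shaves National and the Greens down and nudges Labour and NZ First up, which is the direction the historical record suggests.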