Final pre-election polls have begun – what about the Dotcom bombshell?

13 09 2014

Final pre-election polls either began recently or will begin in the next day or two.

Pollsters all do things a bit differently, but my guess is that most media polls will carry out the majority of their interviews over the next three days, leaving the final couple of days for call-backs and appointments, and/or to fill hard-to-reach quotas.

What does Kim Dotcom have to reveal on Monday?

Who knows…? If his bombshell has a major influence on voting decisions, the polls may pick some of this up in the last couple of days of fieldwork. The polls won’t get that good a read on it, though, because most interviews will be carried out before Dotcom’s announcement.

A previous poll found that more people believed Dotcom than the Prime Minister about when the PM first learned of Dotcom, and that didn’t seem to matter a great deal to National Party supporters at the time. For Dotcom’s bombshell to influence National’s support, it probably needs to be a WMD-level bombshell.

Why not wait until after Dotcom's announcement to begin fieldwork?

You could do this, but you’d have a very short fieldwork period, meaning limited call-backs and a low response rate. You might pick up the influence of Kim Dotcom’s announcement, but the results could very well be all over the place due to the low response. In addition, you would have changed your methodology, so your results would not be comparable with your previous poll.

Personally, I wouldn't take this risk. But that's just me.





What are polls-of-polls telling us?

6 09 2014

James Green at The Distracted Scientist has blogged an interesting set of graphs showing, among other things, that the Roy Morgan poll tends to closely reflect a line fitted to all polling firms’ results (this line is a bit like the poll-of-polls averages you can find at Occasionally erudite, Dim-Post, Polity, Curiablog, and Colin James’ Poll of Polls). James notes the following:

So why does the Roy Morgan profile seem to share the shape with the same overall trend? Something I’ve known, but hadn’t really considered as an influence before, is that Roy Morgan is by far the most regular and frequent poll. This means unless that is weighted out, it will always dominate the shape of the trend. And as far as I’m aware, none of the poll averaging strategies do that.

Around half of all the data points in a basic poll average will come from the Roy Morgan poll.

Essentially, Roy Morgan poll more frequently, so they are a more important contributor to a polling average. All of the other polls will have less of an influence on a basic average calculation. Actually, the Roy Morgan poll has such a large influence on the average that it would almost be fair to look at a basic poll-of-polls average as the Roy Morgan result ‘adjusted’ by the results of other polls. (Okay, now I’m just baiting :) )
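To make that concrete, here’s a minimal sketch in Python of the difference between a naive poll average and one that weights each firm equally. The firm names are real, but every number below is invented purely for illustration.

```python
from collections import defaultdict

# Hypothetical party-vote results (%). The firm names are real NZ pollsters,
# but every number here is invented purely for illustration.
polls = [
    ("Roy Morgan", 46.5), ("Roy Morgan", 48.0), ("Roy Morgan", 45.0),
    ("Roy Morgan", 47.5), ("Roy Morgan", 46.0),   # frequent poll: many points
    ("Colmar Brunton", 50.0),
    ("Reid Research", 49.0),
    ("Ipsos", 50.5),
    ("DigiPoll", 50.0),
]

# Naive poll-of-polls: every poll counts once, so Roy Morgan contributes
# 5 of the 9 data points and dominates the result.
naive = sum(result for _, result in polls) / len(polls)

# Firm-weighted alternative: average within each firm first, then across
# firms, so each firm contributes equally however often it polls.
by_firm = defaultdict(list)
for firm, result in polls:
    by_firm[firm].append(result)
firm_means = [sum(rs) / len(rs) for rs in by_firm.values()]
weighted = sum(firm_means) / len(firm_means)

print(f"naive average:         {naive:.1f}%")     # 48.1% - pulled toward Roy Morgan
print(f"firm-weighted average: {weighted:.1f}%")  # 49.2%
```

Averaging within firms first is the crudest fix; real poll-of-polls models typically also weight by recency and sample size.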

Poll averages are better than individual polls for observing trends over time, because they ‘smooth out’ the annoying sample variation and methodology differences, making underlying movement easier to see. However, in New Zealand they also suffer from some of the same problems regular polls do. There are only five regular polls in New Zealand, so one outlying result, one big change in an odd direction, or one polling firm with a strong ‘house effect’ can have a large influence on the average.

UPDATE: I was just reading through Nate Silver’s tweets and noticed a similar comment about some of the poll averages in the US.

[Image: Nate Silver’s tweet on US poll averages]

(As an aside, it’s worth noting that in 2011 some polling firms’ final results were closer to the actual election result than poll-of-polls results.)

I’ve mentioned on this blog before that Gavin White at UMR has examined poll results industry-wide and shown that, on average, the final mainstream media polls before elections have tended to be:

  • 2.4% too high for National
  • 0.5% too low for Labour
  • 1.5% too high for Green
  • 1.1% too low for NZ First

That’s an average across all polls, so some polls will have erred in the opposite direction, or more strongly in the same direction, for some parties. It could be useful to factor these tendencies in when interpreting polling averages.
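As a rough sketch of what that adjustment might look like: the corrections below are the industry-wide averages listed above, while the poll-of-polls figures are invented for illustration. The big caveat is that this assumes past bias persists, which it may not.

```python
# Gavin White's industry-wide average errors for final pre-election polls,
# expressed as (polled minus actual), in percentage points.
avg_error = {
    "National": +2.4,
    "Labour":   -0.5,
    "Green":    +1.5,
    "NZ First": -1.1,
}

# A hypothetical poll-of-polls reading (invented numbers for illustration).
poll_average = {
    "National": 48.0,
    "Labour":   26.0,
    "Green":    12.0,
    "NZ First":  5.5,
}

# Subtracting the historical average error gives a crude "history-adjusted"
# figure. Remember the average hides firm-to-firm differences in both the
# direction and size of the error.
for party, polled in poll_average.items():
    adjusted = polled - avg_error[party]
    print(f"{party:8}: polled {polled:4.1f}%, history-adjusted {adjusted:4.1f}%")
```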





Is it okay to poll advance voters and to conduct exit polls?

28 08 2014

Back in 2012 I sent some queries to the Electoral Commission, just to confirm a few things for my records. Here is their reply, received on 23 May 2012:

Dear Andrew

Thank you for contacting the Electoral Commission following the Political Polling Forum organised by AMRO, held at Parliament on 9 May 2012.

One of the panel members representing a polling company at the event commented that where a voter was surveyed prior to an election and they indicated they had already voted, their voting preferences were still recorded in the poll.

Your query to the Commission is whether it is legal to include people who have already voted in advance in pre-election polling.

Section 197(1)(d) of the Electoral Act 1993 makes it an offence for a person at any time before the close of the poll, to conduct in relation to the election a public opinion poll of persons voting before polling day.

This provision makes it clear that it is not permissible to conduct an exit poll of voters prior to election day.  A person who commits an offence under section 197 is liable to a fine of up to $20,000.

Advance votes are cast in the 17 days before election day.  There has been an increase in advance voting over recent elections, with 334,558 (or 14.7%) of voters voting before election day at the 2011 General Election.  Consequently, care needs to be taken if random polling of electors is conducted during the advance voting period to ensure section 197(1)(d) is not contravened. 

In the Commission’s view the questions put to individuals polled and the reporting of any results need to be worded in a way that ensures that they do not ask or disclose how people have actually voted. For example, people could be asked ‘which party they would vote for if they had to vote now?’. In the alternative, a polling company may choose not to poll any person who has already voted.

I hope this answers your query. Please feel free to circulate this response to other members of the panel who were present at the recent political polling forum event.

So essentially, exit polls are not allowed.

You can poll advance voters, as long as you don’t ask how they voted. To my knowledge, most polling companies ask ‘who would you vote for if an election was held today/tomorrow’, or something very similar. They do this because the purpose of a poll is to measure voter sentiment at the time of the poll, not to predict it on Election Day.

That may seem like a small distinction – but it’s really quite a big one when you think about it.





Political party Opening Statement evaluations (parties on the left)

24 08 2014

Polling is really a very small part of what I do. One of my main jobs is measuring the effectiveness of different public sector social marketing campaigns.

I thought it might be interesting to think about each Opening Statement from a comms-development perspective. Okay, so I’ve not gone out and actually evaluated the Opening Statements among each target audience (which is what I’d usually do), but the work I’ve done over the past 10 years has given me a pretty good feel for the strengths and weaknesses of different campaigns, messages and executions.

Below I’ve rated how well I think each Opening Statement (of the left) would perform against norms for engagement, message relevance, message believability, and brand ascription among their target audiences. If I get some more time, I’ll look at Opening Statements from parties on the right.

Labour

Engagement: Below average (passive positive)
Message relevance: Above average
Message believability: Average
Brand ascription: Above average

Overall assessment – Labour’s messages are right on target, and there’s no mistaking that this was the Labour Party. However, the message is let down by the execution – it’s what we call a passive execution, which (unless highly enjoyable) does not hold people’s attention. Viewers will either not notice many of the key messages or will forget them. The execution also felt very scripted and unnatural, which lowers message believability.

Green

Engagement: Above average (clever mix of active positive and active negative)
Message relevance: Above average
Message believability: Above average
Brand ascription: Above average

Overall assessment – Messages right on target and clearly conveyed. Very good use of imagery – there’s no mistaking this was the Green Party. Moving between emotive positive and negative messaging was a really clever way to hold people’s attention for a long time. The negative National Party messaging probably made sense to the Greens, who presumably don’t see potential National voters as their target audience. They need to be careful here, though: they still need to be a viable alternative for more centrist voters who don’t want to keep supporting National but who don’t currently see Labour as an option (I digress – this is more about their strategy than their Opening Statement).

Internet-Mana

Engagement: Above average (active positive)
Message relevance: Average
Message believability: Average to below average
Brand ascription: Average

Overall assessment – The animation, the cat, and the whole Jetsons thing were pretty clever. This definitely encourages the audience to look and to keep watching. I do wonder, though, whether young people will find the message patronising. I’m not sure the ‘We will fix everything! Cool! Radical! Awesome!’ message will fly with a lot of young potential voters.

The branding was fairly average too. Sure, they show the logo, and they mention Internet-Mana, Laila, and Hone, but the creative execution isn’t tied to the brand as well as it is in the other Opening Statements (which you instantly know are for Labour or the Greens). Strong brand ascription is REALLY important for a new brand, so this should have been much stronger.





What is a push poll? (HINT – It’s not a poll.)

22 08 2014

The AAPOR has a good discussion of this.

A so-called “push poll” is an insidious form of negative campaigning, disguised as a political poll. “Push polls” are not surveys at all, but rather unethical political telemarketing — telephone calls disguised as research that aim to persuade large numbers of voters and affect election outcomes, rather than measure opinions. This misuse of the survey method exploits the trust people have in research organizations and violates the AAPOR Code of Professional Ethics and Practices.

The main thing to note is that a ‘push poll’, despite its name, is not actually a poll at all. It is a form of campaigning under the guise of a poll. Essentially, push polling involves making very short telephone calls to a very large number of people, specifically to influence their views. For it to be effective, you’d need to call far more people than would typically be called for a random political poll.

The fact that a poll contains negative information about one or more candidates does NOT in and of itself make it a ‘push poll.’ Political campaigns routinely sponsor legitimate “message-testing” surveys that are used by campaign consultants to test out the effectiveness of various possible campaign messages or campaign ad content, often including negative messages. Political message-testing surveys may sometimes be confused with fake polling, but they are very different.

If it’s a random survey by an established company, and/or the results are made public, it’s probably not a push poll.





Polls and cell phones… again…

7 07 2014

Why don’t you poll cellphones?

This question, or variations on it, is the one I’m asked most frequently. I’ve answered it before on this blog, but this time I thought I’d share some data to help explain my view.

Firstly, let me state that the company I work for does call cellphones. We just don’t randomly dial them for the political poll. As I’ve mentioned before, this has very little to do with the actual cost of calling cells. For a polling company, the cost isn’t much more than it is for landline calls.

I’d like to start by addressing the misconception that it is just low-income or ‘young’ households (for lack of a better term) that don’t have a landline telephone.

Please look at the chart below, which I created using data from Statistics New Zealand’s 2012 Household Use of Information and Communications Technology Survey. This is a very robust door-to-door survey of New Zealand households. You can find out more about the methodology here. As you can see in the chart, relative to all NZ households there is a greater proportion of non-landline households in the lower-income (and likely younger) groups. However, it’s also clear that there are substantial proportions of non-landline households in higher-income groups too.

[Chart: proportion of households without a landline, by household income group]





What’s the actual margin of error?

2 07 2014

Thomas Lumley over at StatsChat has used Peter Green’s polling-average code to estimate the actual margin of error for political polls after adjusting for design effects. I had no idea how this could be attempted across non-probability samples (EDIT: to be fair, I had no idea how this could be attempted across multiple polls at all).

If the perfect mathematical maximum-margin-of-error is about 3.1%, the added real-world variability turns that into about 4.2%, which isn’t that bad. This doesn’t take bias into account — if something strange is happening with undecided voters, the impact could be a lot bigger than sampling error.

That last point is a fairly important one: there are many potential sources of error in a poll other than sampling error.
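For reference, here’s the arithmetic behind those two figures as a small sketch. The 3.1% is the textbook maximum margin of error for a sample of about 1,000; the design effect below is simply back-solved from Lumley’s 4.2%, not taken from his code.

```python
import math

n = 1000   # a typical NZ political poll sample size
z = 1.96   # 95% confidence

# Textbook "maximum" margin of error: worst case at p = 0.5, assuming a
# simple random sample (SRS).
p = 0.5
moe_srs = z * math.sqrt(p * (1 - p) / n)    # ~0.031, i.e. +/-3.1%

# Real polls aren't simple random samples; a design effect (deff) inflates
# the sampling variance. Back-solving from Lumley's 4.2% figure:
deff = (0.042 / moe_srs) ** 2               # ~1.8

moe_adjusted = moe_srs * math.sqrt(deff)    # +/-4.2% by construction

print(f"SRS maximum MoE:       +/-{100 * moe_srs:.1f}%")
print(f"implied design effect:    {deff:.2f}")
print(f"adjusted MoE:          +/-{100 * moe_adjusted:.1f}%")
```

And, as the quote says, none of this accounts for bias: the design effect only inflates sampling error, not systematic error.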







