Q+A: What should we make of political poll results?
Ahead of yet another poll, people could be forgiven for wondering whether to even tune in.
Thursday night's 1News-Colmar Brunton poll had Labour on 44 per cent support, ahead of National on 40 per cent and the minor parties fading fast. But just two days ago, Newshub's poll had National 10 points ahead with potentially enough support to govern alone.
The week before, polls showed Labour with a four percentage point lead over National. As political editor Tracy Watkins wrote: "Either support is swinging wildly between Labour and National as we hit the final straight of this election campaign — or the polls are wrong."
TOP's Gareth Morgan is going with the latter. On the AM Show on Friday, he said polls showing he wouldn't make it into Parliament were wrong because they relied on the opinions of voters with landlines.
"When I ask the question in the town hall shows I do every night, 'Who's got a landline?', about 10 to 15 per cent [put their hands up]," he said. "What's wrong with these polling companies?"
Despite polls not putting the party above 2 per cent, Morgan said he expected the party to get between 5 and 10 per cent.
With the election just around the corner (in fact, tens of thousands of people have already voted), the varied results have people asking what's happened, and why. Here, we highlight previous content on political polling in order to make sense of it all.
FIRST, HOW MANY POLLS ARE THERE AND WHO'S SAYING WHAT?
Currently, there are only three public political polls: Roy Morgan, Colmar Brunton, and Reid Research. The Stuff Poll of Polls, developed with help from Massey University's Professor Malcolm Wright, provides an average of the most recent polls from each of these, with some time-weighting if polls start to get out of date.
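The arithmetic behind such a time-weighted average can be sketched in a few lines of Python. The exponential half-life weighting and the parameter values below are illustrative assumptions for this sketch, not the actual Stuff Poll of Polls methodology.

```python
def poll_of_polls(polls, half_life_days=30.0):
    """Time-weighted average of poll results for one party.

    polls: list of (support_percent, age_in_days) pairs, one per polling
    company. Older polls count for exponentially less; half_life_days is
    an assumed decay parameter, not the published methodology.
    """
    weights = [0.5 ** (age / half_life_days) for _, age in polls]
    weighted_sum = sum(w * support for w, (support, _) in zip(weights, polls))
    return weighted_sum / sum(weights)

# Illustrative only: three recent polls for one party, 2, 4 and 9 days old.
average = poll_of_polls([(44.0, 2), (40.0, 4), (43.0, 9)])
```

Because every weight is positive, the average always lands between the lowest and highest individual poll, which is what smooths out the company-specific quirks Wright describes below.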
Parties also carry out internal polls, but the results of those aren't public - although they are occasionally leaked.
SO WHY ARE THE LATEST RESULTS SO DIFFERENT?
In short: sampling error (which gives rise to the margin of error) and non-sampling error. Wright explains: "The first is just the ability to get a good estimate of what a group of people think, by using a sample. Those samples aren't perfect, and that's where a margin of error comes in (more on that later)."
Non-sampling error is "all the things other than the size of the sample you've drawn".
"There are biases such as landlines versus mobile phones, the way you have worded the question, the fact you're doing it online, everything that goes into designing a poll that might affect who you're targeting, who you're reaching from that population, their willingness to respond, and so on. The great thing about the Poll of Polls is that it evens out some of those non-sampling errors."
Of course, on the day, voter turnout will also determine accuracy.
AND WHAT OF THE LANDLINE DEBATE?
Pollsters are maintaining it doesn't really matter if you don't own a landline. Yes, some, such as Colmar Brunton, which has been running the 1News poll for more than 20 years, use only landlines for political polls. A methodology called "random digit dialling" allows them to pick numbers well distributed throughout the country.
Jason Shoebridge, chief executive of Kantar Insight New Zealand, which operates market research agencies TNS New Zealand and Colmar Brunton, told us that while it's getting harder to reach people on landlines, they're still able to get the sample they need.
"We think using landlines is still the best way to get the most representative sample of New Zealand's population."
Others try to correct landline bias by having "top up samples" with mobile phone numbers, and then weighting those results accordingly, Wright explains. "We're never absolutely sure what are the right methodologies. Even if they're all good, they might attack different parts of the problem."
WHAT'S THE MARGIN OF ERROR?
Andy Fyers does a pretty good job of explaining this: The margin of error is the pollsters' way of conveying the level of uncertainty in a poll result. When they give you a result of Labour on 33 per cent, for example, what they are really saying is: we are very confident (19 times out of 20) that the actual support for Labour falls within the margin of error of that figure.
Shoebridge says there are a lot of misconceptions about the margin of error. "If there's a margin of error of 3.1 per cent, in 19 out of 20 cases ... if we went out and surveyed everyone in the population that proportion would be within 3.1 per cent — plus or minus — of what we're getting within our sample," he said.
But that doesn't mean that every party's result could be out by up to 3.1 per cent.
"I think it's often misunderstood. That margin of error (3.1 per cent) applies for parties that are polling around 50 per cent. If a party is polling at around 10 per cent then the margin of error is plus or minus 1.9 per cent.
"What you'll hear reported is one of the minor parties is polling within the margin of error. You can't poll within the margin of error ... as the percentage gets smaller the margin of error gets smaller as well."
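Shoebridge's figures can be checked with the standard formula for the 95 per cent margin of error of a proportion, 1.96 × √(p(1 − p)/n). The sample size of 1000 below is an assumption on our part, chosen because it reproduces the quoted 3.1 per cent figure; it is typical of New Zealand political polls.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95 per cent margin of error for a proportion p estimated
    from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Assumed sample size of 1000 respondents (not stated in the article).
moe_major = margin_of_error(0.50, 1000)  # about 0.031: the quoted 3.1 per cent
moe_minor = margin_of_error(0.10, 1000)  # about 0.019: the quoted 1.9 per cent
```

This is why, as Shoebridge says, the headline margin of error only applies to parties polling near 50 per cent: the p(1 − p) term shrinks as support moves away from 50 per cent, so minor parties have a narrower band.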
HOW MUCH OF A BIG DEAL IS THIS POLL RESULT, REALLY?
"Well it certainly bucked the trend," Wright says, "but it is not the biggest or most surprising change seen over the last two months. Really, it just emphasises that the electorate is still fluid and waiting to be convinced."
On Kiwiblog, Curia Market Research's principal David Farrar, who has provided polling services to three prime ministers and four opposition leaders, wrote that it was indeed "puzzling" that the polls had diverged not just once but "consistently in the last few weeks". He added: "My consistent advice is not to cherry pick the poll you like but look at the average of the polls."
HOW ACCURATE HAVE POLLS BEEN IN PREVIOUS ELECTIONS?
Pretty accurate. A look at the past three elections reveals that the pollsters have, collectively at least, done a good job at predicting the outcome. We looked at support for each of the four main parties in the Stuff Poll of Polls right before election day in 2008, 2011, and 2014, and compared that to the actual result. With a couple of exceptions, the Poll of Polls had the support for each party right to within 1 to 1.5 percentage points.
The biggest error in this group of elections came in 2011, when it over-estimated support for National by about 4 points (51.5 per cent versus 47.3 per cent) and under-estimated support for NZ First by 2.5 points (4 per cent versus 6.6 per cent).