Liars, damned liars, and people who respond to opinion polls

Jun 01, 2015

“If this exit poll is right, I will publicly eat my hat on your programme,” said Paddy Ashdown on the BBC’s election night coverage. People on all sides struggled to believe the exit poll and came up with all sorts of explanations, until it turned out it was the earlier opinion polls that had got it wrong. There are several possibilities: people lied in all the polls before the exit poll, loads of people changed their minds once they had the pencils in their hands, or people voted tactically. Maybe exit polls are just better, but they’re definitely different.

What are exit polls?

Exit polls (http://www2.warwick.ac.uk/fac/sci/statistics/staff/academic-research/firth/exit-poll-explainer) are not just a proxy count of votes: they are modelled along with the results of previous elections, previous local exit polls, and swings in party support. Results of previous elections matter because some voter intransigence is assumed, that is, people are assumed to be likely to vote for the same party as last time. In one sense, an exit poll is a panel survey and inter-election exit poll comparisons are essentially trackers. They are therefore subject to the same concerns about sampling and about retaining the same participants; in this case the ‘participant’ is the polling station rather than the individual voter. The sample for an exit poll is between 100 and 200 voters at each of about 100 polling stations, around 20,000 in total. Most polling locations are retained from election to election, but there can be changes, for example to reflect changes in the electorate. As with all opinion polls, exit polls assume some error and build it into the regression models. The models then produce probabilities of each candidate winning each seat, with confidence intervals around predictions of both final percentages and seat numbers.
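
To make the mechanics a little more concrete, here is a minimal, purely illustrative sketch in Python. The numbers and the `stations` list are made up, and this is nothing like the real exit-poll model (which uses regression on many covariates); it only shows the basic idea of measuring swings at retained polling stations, applying them to each seat’s previous result, and turning the spread of swings into a win probability.

```python
import random

# Purely hypothetical data: for a handful of retained polling stations,
# Party A's vote share at the last election and in today's exit-poll interviews.
stations = [(0.42, 0.45), (0.38, 0.40), (0.51, 0.49), (0.33, 0.37), (0.47, 0.50)]

# Observed station-level swings towards Party A since the last election
swings = [now - prev for prev, now in stations]

def seat_win_probability(previous_share, threshold=0.5, n_sims=10_000):
    """Crude two-party simplification: apply a randomly drawn station swing
    to the seat's previous share and count how often it clears the threshold."""
    wins = sum(previous_share + random.choice(swings) > threshold
               for _ in range(n_sims))
    return wins / n_sims

# A seat Party A won with 48% last time: the spread of observed swings
# yields a probability of holding the seat rather than a single yes/no call.
print(f"P(Party A holds the seat) ~ {seat_win_probability(0.48):.2f}")
```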

The important difference between exit polls and most opinion polls lies in the questions posed. Exit polls can ask about actual votes for actual candidate names, whereas national-level opinion polls tend to ask about parties. Party-prompt polls seem better suited to systems using electoral lists, where votes are cast for parties whose candidates are ranked, so the higher a candidate sits on the list, the higher their probability of being elected. Some countries’ rules allow the popular vote to influence the order in which a party’s candidates are listed, so candidate-specific voting is possible, but it is less powerful than in First-Past-the-Post (FPP) or proportional representation.

Curiously, one outlier poll actually got very close to the actual results, but it was so different from the preceding polls that Survation (http://survation.com/snatching-defeat-from-the-jaws-of-victory/?utm_source=dlvr.it&utm_medium=twitter) “chickened out” of publishing it. There’s nothing magic about what Survation did: it was a 1,000-respondent nationally representative telephone poll, conducted the day before the election, that listed the candidates in each respondent’s constituency and asked how they planned to vote. At that juncture, there was less scope for confounding and, possibly, more social desirability pressure to keep one’s word. However, FiveThirtyEight (http://fivethirtyeight.com/liveblogs/uk-general-election-2015/?#livepress-update-21944073) modelled both candidate prompts and party prompts and concluded that generic questions about party-level voting intention were more accurate than specific ones.

So, were lots of people just fibbing?

An oft-repeated explanation in the past couple of weeks has been that Conservative and UKIP voters were “shy” in their responses to opinion polls. Eric Kaufmann (http://blogs.lse.ac.uk/politicsandpolicy/the-shy-english-nationalists-who-won-it-for-the-tories-and-flummoxed-the-pollsters) points to other research indicating that UKIP supporters are reluctant to answer questions about class and show lower levels of trust in others. Evidence from polls of Scottish voters can, at least, eliminate the simple shy-Tory explanation: estimates of Conservative votes were consistent with the election result and it was the SNP that was under-estimated.

Tactical voting is another peculiarity of the FPP system that may have had an impact on the gap between the opinion and exit polls. The process might go something like this: I’d like to vote for Party X, so that’s what I’ll tell the pollsters, but I live in a constituency where Party Y holds the seat and where only Party Z can challenge them. I don’t want Party Y to win, so I’ll vote for Z instead. (If it helps, insert UKIP, LAB, and CON for X, Y, and Z.) An attempt at mass vote-switching was orchestrated by a certain right-wing daily newspaper, which ran a headline a couple of weeks before the election instructing citizens in 50 constituencies how to vote tactically to “help keep Labour out of Number 10”. Whether it worked or not is of less concern than whether a campaign of this sort should be considered a confounding variable in the analysis of polls.
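
As a throwaway illustration of that switching rule (a hypothetical function and the same X/Y/Z party labels, not a model of actual voter behaviour), it boils down to something like this:

```python
def ballot_choice(preferred, incumbent, main_challenger):
    """Vote sincerely if the preferred party is locally viable; otherwise
    switch to the challenger best placed to unseat the disliked incumbent."""
    if preferred in (incumbent, main_challenger):
        return preferred
    return main_challenger

# The story from the paragraph above: a UKIP supporter in a Labour-held
# seat where only the Conservatives can win switches their vote.
print(ballot_choice("UKIP", "LAB", "CON"))  # -> CON
```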

It’s very confusing, this exit poll business. People might have lied, or been shy, or voted tactically, but the main difference is methodological. Think about the difference between becoming engaged and getting married, each based on a different question. About 15% of engagements (http://content.time.com/time/magazine/article/0,9171,490683,00.html) are called off each year, and about 15% of those polled got cold feet, or changed their minds, or lied. If an opinion poll is “Will you marry me?”, an exit poll is “Do you take this man to be your lawful wedded prime minister?”
