Corporate Reputation

Can we depend on pollsters?

Why did pollsters get the results of the 2015 general election so wrong and what lessons have they learned?

The date for the European Union referendum has been set. Battle lines have been drawn. Unlikely alliances have been formed across political parties, with many Conservative grandees abandoning the Government’s ‘staying in’ line. British companies are already preparing themselves for a potential exit, the effect of which would be ‘seismic’, if former EU commissioner Neil Kinnock is to be believed.

As 23 June looms, an inevitable flurry of polls will be conducted to provide a temperature check of public opinion. But following the debacle around last year’s General Election, Britain’s polling companies know that they have a lot to do to rebuild their reputation. Their advance polls last year pointed to a Hung Parliament, leading pundits to speculate on the type of deal the Labour party might strike with the Scottish National Party.

But the final result revealed a Conservative majority, with 330 seats against the pollsters’ predictions of 290, prompting red faces across the polling industry and a formal inquiry commissioned by the industry’s regulatory body, the British Polling Council, into what went wrong.

Ipsos MORI, the UK’s second largest market research company, recently hosted a debate on whether the results of the General Election signalled The Death of Polling, assessing what effect the misjudgements had had on the industry globally and on public trust. There, chief executive Ben Page concluded that, fortunately for them, the mistakes made by pollsters last May did not signal the death of polling, rather that polls were more likely to be looked at with ‘healthy scepticism’.

‘People are taking them with a little more understanding of limitations, which is not necessarily a bad thing,’ agrees Gideon Skinner, head of political research at Ipsos MORI. ‘We would never say that polling is without limitation.

‘What polling can do is provide data about attitudes to parties, to certain issues, what the public are most worried about. What polling is doing is asking opinions on something in a snapshot of time. We’re dealing with human beings. It is difficult to measure how people are going to behave.’

John Curtice, president of the British Polling Council, says that he knew from the moment the election results were revealed that there would need to be an independent inquiry into what went wrong. He says: ‘It was obvious that we had to respond to what was a substantial failure. Its importance was driven home after the interim conclusions of the inquiry were the second item on the BBC news homepage.’

The results of that inquiry, which was conducted by Patrick Sturgis, professor of research methodology at the University of Southampton, were released this month, but its provisional findings were disclosed in January. Sturgis concluded that the samples the pollsters drew did not adequately represent the electorate. In the case of Ipsos MORI, it was not ‘shy Tories’ [Conservative voters unwilling to pledge their allegiance] that skewed the polls, but too many Labour supporters, who, in the end, did not turn out and vote.

Other pollsters had the same problem. ‘The samples of people who were answering surveys from all the different polling companies did not adequately represent the voting public,’ writes Stephan Shakespeare, chief executive of YouGov, in a retrospective blog post on the company’s website. ‘They may have looked representative in terms of age, gender, income and so on but they had too few Tories in them.

‘In particular, the groups we were measuring contained too many politically engaged people. For most of our work this is not a concern – if you want to find out what toothpaste people prefer, it makes no difference whether they take an interest in politics or not. But for an election poll, the very fact that they agree to answer a survey may make them more likely to be political.’

Curtice adds: ‘The number of people interested in politics doesn’t really change so perhaps the question should be about how to get people interested. [Pollsters need to] take more into account. Make more effort about getting older voters. Doing things quickly isn’t necessarily getting things done effectively.’

He points out that Conservative voters are a lot harder to get hold of and that more effort needs to be made in reaching out to people who might not answer their phone the first time around. But budget constraints make this difficult. ‘It takes longer and it’s more expensive. Newspapers are a dying industry: they don’t have the money to be spending lots on polling. We need to end this presumption by journalists that if a poll is more than two days old, it’s yesterday’s fish and chip paper. Public opinion actually doesn’t change all that much.’

There are, perhaps, bigger problems with the media’s relationship with polls, however. ‘We need to be clear about what research does and doesn’t mean,’ says Katharine Peacock, managing director of ComRes. ‘People will run away with certain statistics.’

Speaking at the Death of Polling event, Page concurred, saying that Ipsos had tried producing results charts with shading that showed the margins of error, but they did not prove popular. ‘[Journalists] want the headline,’ he added.
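
For readers unfamiliar with those shaded margins: the margin of error on a single poll figure can be approximated from the sample size alone. The sketch below is a simplified Python illustration with made-up numbers; real polls also adjust for weighting and design effects.

```python
import math

def margin_of_error(share: float, sample_size: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion from a simple random sample."""
    return z * math.sqrt(share * (1 - share) / sample_size)

# Illustrative figures only: a party on 34% in a poll of 1,000 respondents
moe = margin_of_error(0.34, 1000)
print(f"34% +/- {moe * 100:.1f} points")  # roughly +/- 2.9 points
```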

‘Polls help provide society with a mirror on itself. It’s a check for politicians and journalists,’ says Curtice. ‘Polls should inform politicians and help keep them honest. Ideally, they should improve the quality of democratic debate. They should inform political discourse.’

However, whilst such coverage may exacerbate the perception that polls are getting things wrong, the responsibility for addressing the problem falls to the pollsters.

‘It is for us to earn their trust, it’s not someone else’s fault if they don’t trust us. The onus is on us,’ acknowledges Skinner. ‘Part of that might be increasing understanding and pointing to the range of work we do.’

Peacock agrees. ‘We have to look at the way we work with the media. We can’t underestimate the value of consultancy, of research consultants interpreting the data and providing genuine insight.’

Companies themselves have been looking into the problem and coming up with their own ways of addressing it. ‘We weren’t going to necessarily wait for the recommendations of the inquiry,’ adds Peacock. ‘We’ve got to make sure we’re constantly moving forward. There are always challenges but that’s the point, that we amend our methods and tackle them.’

‘We’re looking at our approaches, getting better at things like sampling and distinguishing between those who genuinely do turn out and those that don’t,’ says Skinner.

Ipsos MORI has been working on improving the representation of politically disengaged and non-voters in its samples. The company has introduced newspaper weighting to reduce the proportion of broadsheet readers. In the four months from September to December 2015, this reduced the proportion of claimed likely voters by an average of three percentage points per month, primarily at the expense of Labour’s share. The company will continue trialling quotas as its experiments progress.
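
Ipsos MORI has not published the mechanics of its newspaper weighting, but the general principle of cell weighting can be sketched as follows; the readership categories and figures below are hypothetical, chosen only to illustrate how an over-represented group is scaled down.

```python
# Minimal sketch of cell weighting by newspaper readership (hypothetical figures).
# Each respondent in a group gets weight = population share / sample share, so an
# over-represented group (here, broadsheet readers) counts for proportionally less.

sample_counts = {"broadsheet": 400, "tabloid": 350, "none": 250}        # assumed sample
population_share = {"broadsheet": 0.20, "tabloid": 0.40, "none": 0.40}  # assumed targets

total = sum(sample_counts.values())
weights = {
    group: population_share[group] / (count / total)
    for group, count in sample_counts.items()
}

for group, weight in weights.items():
    print(f"{group}: weight {weight:.2f}")
# broadsheet readers are 40% of the sample but 20% of the target, so weight 0.50
```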

For YouGov, the future of opinion polling is making surveys more accessible online, specifically on mobile. ‘The way to get more ‘normal’ young people taking part – a large part of the polling error in 2015 – is surely to build a mobile-based experience that makes taking part easier and more social, a more natural part of their online lives,’ says Shakespeare.

‘We are already doing this. We need to innovate faster to keep pace, not pretend the Internet never happened.

‘For now we need to go the extra mile to make sure to include the parts of society that are less plugged in, but within decades we could be seeing instant mass participation in everyday public decisions that will make the polling debacle of 2015 seem like ancient history.’

ComRes, too, has been innovating. ‘Our review will be ongoing but, with the release of our first post-election voting intention poll for the Daily Mail, we have applied a newly developed, innovative ComRes Voter Turnout Model,’ it confirmed in a blog post.

‘By simulating turnout among different demographics we have developed a means of weighting the results to the expected turnout profile. The advantage of this model is that demographic turnout patterns are more consistent from election to election than the demographics of party support.’
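ComRes has not released the model itself, so the sketch below illustrates only the general principle it describes: scaling each demographic group’s stated voting intention by that group’s expected turnout before combining them. The age bands, turnout rates and vote shares are assumptions for illustration, not ComRes figures.

```python
# Sketch of turnout weighting: scale each demographic's stated voting intention by
# that group's expected turnout, then renormalise. All figures are hypothetical.

groups = {
    # age band: (share of sample, expected turnout, Conservative share, Labour share)
    "18-34": (0.30, 0.55, 0.30, 0.45),
    "35-54": (0.35, 0.70, 0.38, 0.36),
    "55+":   (0.35, 0.80, 0.45, 0.30),
}

def turnout_weighted_share(party_index: int) -> float:
    voters = sum(size * turnout for size, turnout, *_ in groups.values())
    votes = sum(size * turnout * shares[party_index]
                for size, turnout, *shares in groups.values())
    return votes / voters

print(f"Con {turnout_weighted_share(0):.1%}, Lab {turnout_weighted_share(1):.1%}")
# higher-turnout older groups pull the Conservative share up relative to the raw sample
```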

ComRes tested its new model by applying it retrospectively to the General Election, finding that it produced a five-point Conservative lead, closer to the actual result than the company’s prediction last year.

Transparency is another key component of rebuilding trust in the polls, and was among the recommendations of the British Polling Council inquiry’s provisional findings. ‘Transparency and engaging with people about the polls has always been a key part of our company’s ethos,’ says Skinner.

Indeed, both ComRes and Ipsos MORI emphasise that they are founding members of the British Polling Council and that all results and methodologies of their polls are shared online for the public to engage with.

Of course, it is in pollsters’ interests to make their research known; as Curtice notes, political polling is part of what makes up a research company’s brand. ‘It’s a tough task. Polling companies don’t actually make much money from predicting the outcome of General Elections but it gets their brand out into the mainstream. But there’s a risk that if they get it wrong they end up with egg on their face. If they get it manifestly wrong, potential clients will ask, “Are these any good?”’

‘We’ve got to be accurate – we’re creating what we’re going to be judged on,’ adds Skinner. ‘We do it because we want to get it right,’ asserts Peacock. ‘It’s a competitive industry, we want to get it the most accurate. Nobody was more distressed by the [General Election] outcome than us. It is the one time that market research is held to account.’

But the changes in sampling and modelling may not be enough to improve the pollsters’ performance in the EU referendum. ‘It’s not something you can just snap your fingers and solve,’ says Skinner. ‘There will be more tests soon. The Scottish elections are coming up. The EU referendum is coming up. But it’s a long-term project as well.’

‘The inquiry suggests there’ll be further changes and experimentation by companies themselves,’ says Curtice. ‘The EU referendum is not going to prove they’ve solved the problem. You can still get the General Election wrong and the EU right. There’s not much opportunity to test methods before 2020. Companies are going to engage in further tweaking and experimentation. It’s likely to still be in flux for another year at least.’

Even the inquiry’s provisional findings were cautious. ‘There will be no “silver bullet”: the risk of polling misses in the future can be reduced, not removed,’ it concluded.

Mark Pack, associate director at Blue Rubicon, believes that, despite the limitations of polling, there are few better alternatives. ‘The polling industry faces a twin pressure: its most high profile output (General Election polling) performed badly last time out and its reason for being is under threat from the rise of big data and digital monitoring, both of which provide alternative routes to insight and understanding.

‘However, its lifeline is that the very sample problem which undid the 2015 polls – not sampling a representative cross section of the population – is also a problem which digital listening and big data analysis often encounter. The polls may not have got to a representative sample, but then neither did Twitter.

‘What the pollsters need to restore their reputation is to fix the fundamentals by changing their methodology; to win a right to be heard again through transparency, and in the British Polling Council they have one of the best industry trade bodies in this respect; and then to build trust through demonstrating success at predicting future elections.’

With the outcome of the EU referendum far from clear, and with less than six months to go until the deciding votes are cast, pollsters, it seems, will be awaiting the results with bated breath alongside the rest of the country.

This article first appeared in issue 104