Can brands trust polls after election failures?

Political polls have been wide of the mark in numerous elections around the world over the past 12 months, raising the question of whether surveys can still be trusted and, if so, how brands can ensure market research results are relevant and reliable.

How did it all go so wrong? It is a question that pollsters have had to face since the Conservatives rolled into Downing Street after winning a majority in May’s general election – a result that none of the final polls predicted.

YouGov’s CEO Stephan Shakespeare admits the entire industry “has got a lot to answer for”. The UK was not the only case in recent history where the polls proved wide of the mark, though it is perhaps the most embarrassing. As UK academics noted in a blog for the Washington Post last month, criticism has also emerged following the US mid-term congressional elections, the Israeli general election and recent referendums in Greece and Scotland.

Suzanne Lugthart, head of research and planning at property website Rightmove, says: “The pre-election polls were trying to accurately predict a future reality. They failed because you can’t do that with an online survey of 2,000 people who choose to be on research panels, no matter how good your sampling and predictive algorithms are.”

Carol Sheppard, customer experience research manager at brewer Molson Coors, points out that political polling and brand surveys are “two very different entities”, but the challenges are often the same: budget constraints, time and unpicking the difference between what consumers say and what they actually do.

‘Shy Tories’ and claimed behaviour

The inconsistency between claimed and actual behaviour could be one of the factors that affected polls in the run-up to the general election, according to Shakespeare. But it is unclear to what extent or how it relates to other possible problems, such as sampling errors and the differing turnout between each party’s supporters.

“We do surveys because they are a quick and relatively cheap way of getting close enough to the truth”

Suzanne Lugthart, Rightmove

The problem undoubtedly occurs too in market research surveys. For example, Dŵr Cymru Welsh Water has been running campaigns to help prevent blockages in the country’s sewage systems caused by flushing more than paper down the toilet or pouring fats and oils down the sink. According to head of brand and campaign Morgan Lloyd, when people are asked in a survey situation if they ever do any of these things “the vast majority say no”. He continues: “But what’s clear to us is that a percentage must do. Much like it is socially acceptable to vote in a certain way, it is socially unacceptable to flush certain things down the toilet.”

People’s unwillingness to admit a behaviour or preference to a researcher is a source of polling error. The so-called ‘shy Tories’, for example, are believed to have told UK election pollsters their vote was undecided when they actually planned to vote Conservative.

In some cases, commercial surveys can be even further off the mark than the general election polls. A margin of error of plus or minus 4% can be “perfectly acceptable” as part of a decision-making process, according to Rightmove’s Lugthart. However, she adds: “You could argue that the decisions they affect are not as important as who gets into 10 Downing Street. We do surveys because they are a quick and relatively cheap way of getting close enough to the truth to inform decisions about brands and marketing.”
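To put that figure in context – a standard back-of-the-envelope calculation, not one taken from the article – a genuinely random sample of 2,000 respondents carries a 95% margin of error of only about plus or minus 2.2% on a single yes/no question:

```latex
\mathrm{MoE}_{95\%} \;=\; 1.96\,\sqrt{\frac{p(1-p)}{n}}
\;\approx\; 1.96\,\sqrt{\frac{0.5 \times 0.5}{2000}}
\;\approx\; \pm 2.2\%
```

Anything beyond that theoretical floor comes from the problems Lugthart describes – self-selecting panels and the gap between claimed and actual behaviour – and a bigger sample does nothing to shrink it.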

Indeed, under constraints of budget and, more often, time, marketers must regularly make trade-offs between accuracy, cost and speed. Their willingness to do so will depend on the scale of the decision in hand. “The bigger the stakes, the more ways you’re going to look at it,” says Adrian Shaw, senior consumer insight manager at Camelot Group.

The Football Association performs textual analysis of free-form survey responses to understand Wembley visitors’ views better

The need for speed may also have played a part in the general election polling fiasco. Rather than asking questions relating to political issues first – so-called ‘warm-start’ polling – pollsters often went straight for voting intention. “You’re trying to get at a very complicated piece of information, and to get at it with only two or three questions,” Populus founder Andrew Cooper explained recently.

With purse strings tightened at many of the media groups that commissioned the polls, this approach saved money. For the same reason, online polling also came to the fore; it often produced different results from phone polling, though not necessarily more or less accurate ones.

The effects of online polling

Online has, of course, created an opportunity to produce results quickly and for less money. This appeals to marketers and pollsters alike. The downside can be low engagement from consumers, which can affect accuracy. The latest industry report from GreenBook, published in June, found “quality of data, quality of respondents and quality of insights” to be the biggest challenge for 14% of respondents. There was a feeling that “respondents are bored and disengaged” and “inundated with surveys, [which] leads to dubious responses”.

Lloyd at Welsh Water says that surveying online can offer a “quick and painless” experience for people. The fact that the online experience feels more anonymous also has its advantages. He explains: “If you ask someone in a face-to-face interview whether or not they speed, there’s a lot of pressure to give the answer that conforms to the social norm. On the phone, there will be a similar pressure. But the more you de-personalise, [for example online or through mobile], the more accurate the result will be.”

Mobile surveying is used by 52% of suppliers, according to the GreenBook report. More than a quarter (27%) also use mobile ethnography, or ‘lifelogging’. Methods that capture behaviour in the moment, such as mobile diaries or communities, can offer a more accurate and holistic reflection of what consumers do and how they feel along their purchasing journeys.

“In conducting ethnographic market research, looking for contradictions, where people say one thing and think another, is a key approach,” says Keith Goffin, professor of innovation and new product development at Cranfield School of Management. “When you know this, the political polls are not surprising.”

Lifelogging is not the only technique helping marketers get closer to the truth. The GreenBook report catalogues a wealth of emerging methods – from social media analytics and crowdsourcing to text analytics and research gamification. “We have to be smarter in terms of the technology we use,” says Lloyd at Welsh Water.

Rightmove has been using neuroscience for its ‘find your happy’ campaign to capture how people feel

New methodologies

Jane Frost, CEO at the Market Research Society, says the UK research sector “leads the world in methodological, technological and creative research innovation”. Sophisticated eye-tracking, used to ascertain how consumers subconsciously navigate anything from food packaging to film trailers, is one example.

There is also increased interest in neuroscience, from implicit reaction time to facial coding to investigating unconscious decision-making. An advocate of the method is Lugthart at Rightmove. “Whether you’re in B2B or B2C, one of the challenges is eliciting the truth from people,” she says. “Large-scale neuroscience is a much better way of capturing how people feel about brands and advertising than surveys or direct questioning.”

The approach has been particularly useful for the company’s ‘find your happy’ campaign. “We’re trying to get closer to the memories embedded in people’s minds – especially the little memories that are particularly emotive. People are notoriously bad at reporting back in surveys, but with some of these new techniques we can get closer to these memories,” she adds.

According to GreenBook, technology is the top challenge (44%) for those in market research. It is also one of the top opportunities (32%). It is hardly surprising, then, that handling data is also regarded with similar ambivalence as both a challenge (30%) and an opportunity (33%).

Free-form feedback

The Football Association (FA) has decided to add textual analysis of free-form answers to its polling of visitors to Wembley Stadium. Its survey of supporters coming to watch England play regularly achieves between 4,000 and 6,000 responses – most of them based on ‘rating’ scales for food, cleanliness, staff and so on.

However, according to FA head of customer insight Ross Antrobus: “In the past two years, we have got free-form text feedback from more than 40,000 visitors. Their combined feedback amounts to nearly one million words.”

To code and analyse this unstructured data, Antrobus turned to research agency Simpson Carpenter, which cuts it into sentences or clauses before allocating each of these snippets to a topic, such as ‘food and drink’ or ‘the family enclosure’. Combined with the rating data, this tool allows Antrobus and his team to comprehend how visitors feel about certain topics.

He explains: “Using the interactive dashboard, I can select ‘families visiting for the first time’ together with ‘food and drink’. I can then see all the positive and negative comments, and look at what went well and any areas needing improvement. I can then feed this back to individual teams. It has allowed us to bring the voice of our customers into our reporting. And if my stakeholders are doubtful about what people are saying, I can [show them].”
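Simpson Carpenter’s actual tooling is not described in the article, but the core coding step Antrobus outlines – cutting comments into clauses and allocating each snippet to a topic – can be sketched in a few lines of Python. The topic keywords below are invented purely for illustration:

```python
import re
from collections import defaultdict

# Hypothetical coding frame -- illustrative only; the agency's real
# topic list and matching rules are not described in the article.
TOPICS = {
    "food and drink": ["food", "drink", "beer", "queue"],
    "family enclosure": ["family", "child", "kids"],
    "staff": ["staff", "steward", "helpful"],
}

def split_clauses(comment: str) -> list[str]:
    # Crude split of free-form feedback into clauses on . ; , and 'but'
    return [c.strip() for c in re.split(r"[.;,]|\bbut\b", comment) if c.strip()]

def code_comment(comment: str) -> dict[str, list[str]]:
    # Allocate each clause to every topic whose keywords it mentions
    coded = defaultdict(list)
    for clause in split_clauses(comment):
        lower = clause.lower()
        for topic, keywords in TOPICS.items():
            if any(k in lower for k in keywords):
                coded[topic].append(clause)
    return dict(coded)

print(code_comment(
    "The stewards were really helpful, but the queue for food was far too long."
))
# {'staff': ['The stewards were really helpful'],
#  'food and drink': ['the queue for food was far too long']}
```

Once each snippet carries a topic code, it can be aggregated alongside the rating data and filtered by visitor segment, which is essentially what the dashboard Antrobus describes makes possible.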

It is clear that brands are using a number of new tools both to enhance the quality of their survey data and to get closer to the truth. The GreenBook report suggests that 55% of respondents treat market research as “just one of many types of data” informing decision-making. “For many decisions a quick survey combined with deep knowledge of your consumers plus a dash of instinct is a perfectly acceptable solution,” says Lugthart. “But if it’s important enough, researchers need the time and budgets to do the job properly.”

These were the two things that many general election polls seem not to have had.

How to ensure reliable research results

Reduce the pressure on people

The more sensitive a question is for the respondent, the higher the stakes. So use questions that take the heat out of the situation, says Welsh Water’s head of brand and campaign Morgan Lloyd. “Rather than ask ‘do you put things you shouldn’t down the toilet’, we have started asking ‘do you think people flush things they shouldn’t down the toilet’.”

Combine data sources

Meshing different data sources together to find out what customers are thinking is both a challenge and an opportunity, but the insight can be invaluable. For example, Watchfinder is about to conduct a large-scale customer survey of watch buyers. “We already know quite a lot about our customers owing to the information they share with us when they visit our website, complete a purchase or sell us their watch,” says digital marketing manager Sean Reilly.

“What we are hoping to get out of [the new survey] is a bit more about their interests outside watches, for example do they collect anything other than watches such as wine or cars. We can map this feedback from the customer to data from previous brand partnership campaigns such as the Classic Car Show sponsorship, to see if we are focusing the marketing team’s time and budgets in the right areas.”
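As a minimal sketch of the mapping Reilly describes – all column names and figures below are hypothetical, since the article says nothing about Watchfinder’s actual data – survey answers can be joined onto existing customer records and cross-tabulated against past campaign behaviour:

```python
import pandas as pd

# Hypothetical customer records: purchase behaviour around a past brand
# partnership (the Classic Car Show sponsorship is the article's example;
# the flag below is invented for illustration)
customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "bought_after_car_show": [True, False, True, False],
})

# Hypothetical answers from the planned survey on interests outside watches
survey = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "collects_cars": [True, False, True, False],
})

# Join survey feedback onto known behaviour, then cross-tabulate to see
# whether car collectors over-index among car-show-driven buyers
merged = customers.merge(survey, on="customer_id")
print(pd.crosstab(merged["collects_cars"], merged["bought_after_car_show"]))
```

A table like this is one simple way to check whether marketing time and budget are going to the right partnerships.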

Keep questionnaires short and simple

The more data you have on your customers, the shorter you can keep your surveys. Carol Sheppard, customer experience research manager at brewer Molson Coors, is working with B2B International on customer surveys. She explains: “You also need to ask yourself what is the objective of having the customer on the phone.”
