Following the general election, where does research go from here?

There is no point hiding from the fact that the last general election was bad for political polling. But what does all of this mean for marketers who want to know where public opinion research goes from here?

Stephan Shakespeare

Any marketer following the research industry over the past eight months (especially in light of the British Polling Council report into the election) would have seen plenty of mud-slinging, recrimination, self-justification and panic. Most of the statements from interested parties since have fallen into two basic camps.

The first claimed they got it right all along and published polls after the fact as “proof.” But if the results weren’t solid enough to release before election night, what credence do they have? Would they also hide the right result from commercial clients looking to launch a new product?

The second tried to claim that knocking on doors is the only way to get an accurate result, pointing to a study carried out between July and November that showed the correct Conservative margin of victory. But those boosting door-knocking overlook some important points that are less to do with how people were sampled and more to do with when they were asked.

It is no surprise that the early respondents – asked before it became clear that Jeremy Corbyn would replace Ed Miliband – were more likely to say they voted Labour, and those sampled after the left-winger won were more likely to say they voted Conservative. Furthermore, this time-consuming and expensive exercise under-represented smaller parties.

In other words, door-knocking is a good way to spend a long time burning through a lot of money to find out a headline you already know and getting the rest of the story wrong. If marketers took this approach to their market research they would just be finding out now that online is massive and would be looking to advertise to twentysomethings on Friends Reunited or Bebo.

The fact is that all research methods – phone, online and door-to-door – got something wrong. Most notably, we spoke to too many people who were already politically engaged. For our commercial work this particular issue doesn’t tend to matter – if you want to find out what toothpaste people prefer, it makes no difference whether they are interested in politics or not. But for an election poll, the very fact that they agree to answer a survey may make them more likely to be political.

The key thing in any piece of research – be it political polling, audience testing, market segmentation or anything else – is to reach a representative sample that gives a 360-degree view of the world as it is, not the world as it was. But it’s getting increasingly difficult to reach the full spread of people in the UK, especially if your approach is based on hoping that the right people will be in the right place at the right time.

Any practical solution aimed at hitting the correct range of increasingly hard-to-reach respondents will not involve knocking on doors that won’t be opened or making calls that won’t be answered. Instead, researchers have to go where the people are, and where they have a good idea of who those people are, so they can get a representative sample. Logic and common sense dictate this will be online.

For example, following the election YouGov knew we didn’t have enough politically unengaged young people on our panel and so we have recruited more of them. Ensuring we have the right blend of respondents is only possible because we understand a lot about who is taking our surveys – from the type of car they drive to their favourite ads.

But while the issue thrown up by the election currently affects only political polling, in the near future reaching consumers – especially young ones – on their terms will be a key challenge of research for marketing. The most sensible way to engage a representative sample of young people is to build a mobile-based experience, as we are doing. Such an approach takes the game to them, meaning taking part is easier and more social – making it a natural part of their online lives.

But for a lot of researchers and marketers, asking “what went wrong with political polling?” is less interesting and important than “where does research go from here?” Marketers need accurate, up-to-date data at their fingertips so they can make better-informed decisions about their audiences and general consumer behaviour. But they also need to know that research firms are thinking beyond the present and asking how they can undertake robust, representative research in the coming decade and beyond.

In a connected world the only way forward is connected data from online research.

Stephan Shakespeare is CEO and founder of YouGov




1 Comment




  1. Greg 8 Apr 2016

    While companies like YouGov ask their panellists to spend up to 15 minutes filling out surveys in return for a paltry £0.50 (which you can only cash out upon reaching £50 / completing 100 surveys), it’s hard to imagine them being able to create truly representative samples of any population. It seems baffling that the industry-wide review of what went wrong at the election seems to have completely missed this flaw in most research companies’ methods.
