Emotion monitoring

The video ‘view’ has seemingly become the measure of how successful a piece of digital video is. Sainsbury’s CEO Mike Coupe, when discussing the supermarket’s Christmas ad campaign featuring Mog the cat on a recent press call, used YouTube views to back up his claim that it was the most successful supermarket campaign ever.

The campaign has indeed reached millions of people. Almost 30 million have watched ‘Mog’s Christmas Calamity’ since it went live on 12 November. But is this really a good measure of success? Just because people watched a video, does that mean it resonated with them?

That is the question that the new generation of emotion tracking technologies is trying to answer.

Reassessing ad effectiveness

The latest news in this area comes from Realeyes. It has signed a deal with MediaCom to give its clients, including Coca-Cola, Volkswagen and Procter & Gamble, the ability to measure people’s emotions through webcam recordings. The tie-up comes hot on the heels of Apple’s acquisition of Emotient – an artificial intelligence technology that similarly analyses facial expressions and uses the data to determine people’s emotional response to ads.

There are already a number of services available for tracking the thoughts and feelings of people that have viewed an ad. Brain Juicer, for example, uses consumer panels to determine emotional reaction. More recently, viral video service Unruly has launched its ‘Future Video Lab’ which aims to provide real-time data on the emotional triggers that drive an effective video distribution strategy.

RealEyes’ technology tracks people’s emotions as they watch video ads

But Realeyes’ and Emotient’s newer offerings have been hailed as landmark deals for emotion tracking. Previously it could only be used on a small scale but Apple’s and MediaCom’s involvement opens it up to a whole new audience.

David Carr, strategy director at digital agency DigitasLBi, says: “It is interesting to see these solutions being used at scale. If they can get a large body of evidence and then allow brands to mine and benchmark their own campaigns it could be very powerful.”

Using emotion data to create ads that resonate

The deals are part of a wider movement aimed at reassessing what drives ad effectiveness. Carr adds that they aim to finally provide the data that the ‘mad men’ of the 1970s were looking for when they shifted marketing from being about product to being about brand – as Coca-Cola did with its ‘I’d like to buy the world a Coke’ ad.

There appear to be two main uses for emotion data. The first is in pre-testing ad campaigns and is the main focus for Realeyes at the moment.

For example, brands that have created longer-form content could test which sections would work best as a teaser to get people watching the full video. It could also be used to work out if different soundtracks produce different emotional responses or if a particular scenario produces a bigger emotional response in one market than another.

“This technology enables us to predict how well assets will do. It enables us to stop campaigns and gives us the knowledge to know if a given asset is even good enough to air.”

Palle Finderup Diederichsen, head of EMEA, MediaCom Beyond Advertising

“It has already saved our clients quite some money in terms of us pulling the plug on campaigns. And it helps us by knowing how good an asset is so we know how much we can rely on organic reach or if we need to pay our way through. That’s how it works in the media planning phase.”

The technology could also help brands work out which consumers to seed content with, or when to stop showing the ad.

“An advertiser’s goal is never to serve as many ads as possible but to serve them only to people who might be receptive. Most digital platforms use algorithms to judge which ads users don’t like, and stop running those ads. This sort of technology is an extension of that system: if you don’t like my ad, I don’t want to show it to you anymore,” explains Alistair Dent, head of product strategy at iProspect.

Uses beyond digital

While there are clear uses for emotion tracking technology within the digital space, in future it could be used across media.

That has already started in outdoor advertising. At Birmingham’s main train station, New Street Station, Ocean Outdoor has erected digital screens that track the age and sex of passers-by in order to show more relevant advertising.

Ocean Outdoor’s digital display in Birmingham can monitor passers-by

Catherine Morgan, head of creative solutions at Ocean Outdoor, says the technology is 98% accurate on gender and between 85% and 90% accurate on age groups. Advertisers can then set a threshold to say they only want their ad shown if the audience is 60% female or 40% over the age of 30.
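The threshold rule described above can be sketched in a few lines of code. This is purely illustrative – the function and field names are assumptions for the sake of the example, not Ocean Outdoor's actual system:

```python
# Hypothetical sketch of an audience-threshold check for a digital screen.
# All names and data are illustrative assumptions, not Ocean Outdoor's API.

def should_show_ad(audience, min_share, predicate):
    """Show the ad only if at least `min_share` of the detected
    audience matches `predicate` (e.g. female, or over 30)."""
    if not audience:
        return False
    matching = sum(1 for person in audience if predicate(person))
    return matching / len(audience) >= min_share

# Example: a campaign shown only when the audience is at least 60% female.
audience = [
    {"gender": "female", "age": 25},
    {"gender": "female", "age": 34},
    {"gender": "male", "age": 41},
]
show = should_show_ad(audience, 0.60, lambda p: p["gender"] == "female")
print(show)  # 2 of 3 passers-by match, so the ad is shown
```

In practice the classifier's own uncertainty (98% on gender, 85–90% on age) would also feed into such a rule, but the core logic is a simple share-of-audience check.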

The technology can also be used to work out how many people are looking at the screen and for how long. It is not hard to imagine a future where technology can track whether passers-by are tired and offer them cups of coffee, or stressed and offer them some light entertainment.

However, amid all the excitement about emotion tracking technology and what the data could tell brands about the effectiveness of their creative, marketers should not forget to trust their instinct, says Carr.

“Hopefully marketers will still occasionally go with their guts. The most famous adverts – Gorilla from Cadbury, Surfers from Guinness – would all have failed in pre-testing.

“The technology might have a big body of data behind it but it has to learn over time. Nothing is 100% foolproof. This data provides useful benchmarks and comparisons and the data to prove the effectiveness of what we feel in our gut. But as with all good things the true test will be in how it’s used.”

How the new emotion tracking technology works

People’s facial reactions are recorded through their webcams with their consent. These are streamed to Realeyes’ cloud servers, where they’re securely processed – low-quality recordings are filtered out and the expressions are analysed by extracting data from 49 key facial points.

Six key emotions are measured – happiness, surprise, sadness, disgust, fear and confusion – which are then also used to measure engagement as well as overall positivity and negativity.

The results are aggregated and reported in an online dashboard in near-real time, enabling clients to make better and quicker business decisions.
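The roll-up from six per-viewer emotion scores into aggregate positivity, negativity and engagement figures can be sketched as follows. The grouping of emotions and the formulas are assumptions for illustration, not Realeyes' actual model:

```python
# Illustrative sketch of aggregating per-viewer emotion scores into the
# dashboard metrics described above. Weightings are assumptions, not
# Realeyes' actual methodology.

POSITIVE = {"happiness", "surprise"}
NEGATIVE = {"sadness", "disgust", "fear", "confusion"}

def summarise(viewers):
    """viewers: list of dicts mapping each of the six emotions to a 0-1 score."""
    n = len(viewers)
    # Average each emotion across the panel.
    avg = {e: sum(v[e] for v in viewers) / n for e in POSITIVE | NEGATIVE}
    positivity = sum(avg[e] for e in POSITIVE) / len(POSITIVE)
    negativity = sum(avg[e] for e in NEGATIVE) / len(NEGATIVE)
    # Treat engagement as overall emotional intensity, positive or negative.
    engagement = sum(avg.values()) / len(avg)
    return {"positivity": positivity,
            "negativity": negativity,
            "engagement": engagement}

panel = [
    {"happiness": 0.8, "surprise": 0.4, "sadness": 0.1,
     "disgust": 0.0, "fear": 0.0, "confusion": 0.1},
    {"happiness": 0.6, "surprise": 0.2, "sadness": 0.2,
     "disgust": 0.1, "fear": 0.0, "confusion": 0.2},
]
print(summarise(panel))
```

A real system would compute these per second of video rather than per clip, which is what lets planners see exactly which scene triggered which reaction.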

Each video is usually tested on a couple of hundred people, and panels can be run across 60 countries.