We mustn’t let the seductive siren call of AI lure the industry back in time

Work must be done to clean up the bias in the data fed into many AI tools, or the industry risks going backwards.

From personalising content and predicting customer behaviour, to agencies optimising advertising campaigns and brands training proprietary large language models suited to their specific identities, the use of AI in marketing is booming. So much so that the global market size of AI in marketing is projected to reach $72.1bn by 2030, a six-fold increase compared to 2022.

As ever, few booms are capitalised on without financial and reputational risk. Under Armour, for example, created a furore earlier this year when it unveiled an ad billed as AI-generated, featuring British boxer Anthony Joshua in a black-and-white, fast-cutting montage that used recycled footage made for previous commercials.

Critics unsurprisingly questioned the ethics of repurposing old work. Then there are the many brands concerned about mistakenly putting out work that infringes copyright, or worried that feeding their customer information into an AI system could help train a competitor’s model.

The risks are enough that marketers have begun adding clauses to agency contracts to prohibit the use of AI of any kind without prior authorisation. To explore these sorts of challenges, the Advertising Association wisely launched an AI task force last autumn aimed at helping the UK industry navigate both the promise and the perils of the technology.

I wonder, though, if we’re seeing enough specific focus on how marketers can leverage the best of what AI has to offer while also avoiding the gender bias and distortions inherent in its models. Bias that can and does skew AI-driven insights and recommendations. And if we’re not moving fast enough and working hard enough to de-bias AI and how it’s deployed in marketing – and it is indeed hard work – do we risk seeing some of our industry’s recent progress on how women are marketed to come undone?

The problem is most immediately noticeable when it comes to generating visual content. Many AI image generation models are trained on datasets scraped from the internet, which often perpetuate unrealistic and stereotypical portrayals of women. “These datasets frequently over-represent young, conventionally attractive women and depict them in sexualised or subordinate roles. As a result, when marketers use these AI tools to create visual content, they may inadvertently reinforce harmful gender stereotypes,” Rhonda Hadi, associate professor of marketing at Saïd Business School, University of Oxford, told me. (Hat tip here to Dove and the brand’s recent pledge to not use AI-generated imagery to represent women in its advertising and communications).

“The data bias problem is real,” adds Candina Weston, an AI specialist, consultant, and former CMO at Microsoft. “The first step in leveraging AI tech at scale must be to ensure the work is done to get the target audience data to the right level to get to the right output. This is an ongoing process, never a one-and-done, and is true with or without AI.”

And that’s a challenge because if, like me, you believe that too much of the data that’s supposed to be representative of women today is at best incomplete and at worst reductive, then a lot of what AI is being used for in our industry is going to be off-base from the get-go. Somewhat ironically, perhaps, one solution for correcting for non-existent data (eg in the financial sector – remember, it wasn’t until 1975 that British women could open a bank account in their own name, so that’s a whole heap of important historical data about women’s creditworthiness that’s missing) is synthetic data: data artificially generated by algorithms or simulations that can be used to train machine learning models.
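To make the idea of synthetic data concrete, here is a minimal, hypothetical sketch in Python of what generating such records might look like. Everything in it (the column names, the distributions, the thresholds) is invented purely for illustration and is not drawn from any real dataset or tool mentioned in this piece.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)

def generate_synthetic_credit_records(n: int) -> pd.DataFrame:
    """Simulate credit-history records for a group missing from historical data.

    All column names and distribution parameters are illustrative assumptions.
    """
    income = rng.lognormal(mean=10.2, sigma=0.4, size=n)   # simulated annual income (£)
    years_banked = rng.integers(low=0, high=30, size=n)    # years of account history
    utilisation = rng.beta(a=2, b=5, size=n)               # credit utilisation ratio (0-1)
    # A simple, arbitrary rule stands in for the unknown "true" repayment behaviour.
    repaid_on_time = (utilisation < 0.6) & (income > 20_000)

    return pd.DataFrame({
        "annual_income": income.round(2),
        "years_banked": years_banked,
        "credit_utilisation": utilisation.round(3),
        "repaid_on_time": repaid_on_time,
    })

# 5,000 synthetic records that could be blended with real data to
# rebalance a training set before fitting a credit-scoring model.
synthetic_records = generate_synthetic_credit_records(5_000)
print(synthetic_records.head())
```

In practice, teams tend to use far more sophisticated generators (statistical copulas or generative models) and to validate that the synthetic records match whatever is known about the real population, but the principle is the same: fill the gap deliberately rather than let a model learn from its absence.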

Evidently, then, AI as a tool can both deepen existing gender bias and help to close crucial data gaps that result from gender discrimination – a stark reminder that there’s nothing black and white about the arguments for or against AI in marketing. Like many of her peers, Tamara Rogers, global CMO at Haleon, believes marketing’s relationship with AI should be a delicate dance between ambition and prudence.

Her teams, for example, have used AI tools such as CreativeX to drive improved marketing evaluation and analysis at scale, across hundreds of creative assets. “This rich level of insight allows us to ensure that our marketing and advertising investment is focused on the best areas and is engaging and resonating with people in the right place, at the right time,” she says. It’s a benefit that clearly tracks Haleon’s ambition to drive greater health inclusion globally. Yet there are limits to the parts of the campaign journey Haleon will deploy AI for. “It’s difficult, for example, for AI to replicate original idea generation, a core piece of creativity that human minds and skillsets can bring when designing content, particularly when needing to reflect local cultural contexts and other nuances within campaigns,” Rogers says.

So, if we read from this that generative AI isn’t yet quite the large-scale threat some creative agencies may have feared, there’s an argument our industry should spend less time obsessing over a threat that for now remains existential and more time having essential conversations about the real bias in the data currently feeding many of marketing’s AI tools. Not only because of the moral imperative, but also because if we can fix the problem we’re one step closer to our industry’s holy grail – the ability to create bespoke marketing for individuals at a global scale.

“Improving data will allow us to get closer to hyper-personalisation – this is the piece I don’t think is spoken of enough but is so critical in enabling us to stop the unwanted stuff and get to the helpful stuff,” says Weston.

And let’s face it: as part of a historically underserved and ignored target audience, and as a working mother with a mental load heavier than those dumbbells sitting unused in my bedroom cupboard, I wouldn’t mind a bit of hyper-personalised, HELPFUL marketing coming my way.

Until data sets are rounded out and de-biased, the ultimate guard rail that marketers and their agencies must build in and elevate is that last-mile audit process: a level of human review for anything going directly to a customer (assuming those humans don’t also carry the same bias). I’d also like to think that, as exemplified by Dove, we’re evolved enough – as an industry – to know when to limit where we choose to deploy AI in marketing campaigns. We mustn’t let the seductive siren call of AI lure the industry back in time, undoing its hard (and ongoing) work to de-bias and unstereotype all that we do and produce.
