Time is running out for self-policing tech

Tech companies need to show consumers, government and activists they are taking their concerns seriously when it comes to data, illegal content and censorship. Failure to do so will result in them being regulated to an uncomfortable degree.

Trust is one of our most fundamental values. At the most basic level, as human beings we have to trust others for our very survival. If we don’t, we can’t be confident about what we eat, where we live or whether we are safe. You can’t have love without trust, and you can’t have commerce without it.

Whether you are saying ‘I will’ to your soon-to-be spouse, buying a pair of shoes or purchasing a new phone contract, you have to trust that the other party will live up to the bargain.

Alas, at a time when politicians from all sides are failing to honour their promises and principles, public trust in institutions is declining – nowhere more so than around who people trust with their data. In part, this is thanks to our old friend GDPR, which has raised consumers’ understanding of their data privacy and rights.

In a recent Ipsos Mori Sustainable Business Monitor study from October 2018, more than 50% of people said that, thanks to GDPR, they are better able to control how their personal data is collected and used by companies or organisations. This is pretty amazing, and I have to say more than a little gratifying to those of us who were calling out the importance of the new regulations way back when.

These new levels of consumer understanding have, not surprisingly, increased concerns about data, particularly how tech companies are using it. In Ipsos Mori’s Influencing Customers survey, also from last October, more than half of those who responded had concerns about how their data is collected, shared and used by companies, or safeguarded against hackers and government surveillance. We are also seeing increasing concern around content, especially extremist content.

Is this the beginning of a tech backlash? These companies have, after all, effectively gone from bright idea to multibillion-dollar ubiquity in the blink of an eye. Until recently, they haven’t had to manage regulation, reputation or crises as others have.

It doesn’t mean they don’t have good people working for them – far from it, they have managed to recruit some very impressive people with heaps of experience and nous. But as organisations they haven’t experienced the crucible of turmoil until now, and they have suddenly found themselves spurned.

One of my first clients as a consultant was a confectionery company. I got lots of free chocolate but it was also the late 1990s and I had to tell the board that all those lovely products were coming to the attention of policy makers who were concerned about growing childhood obesity.

It wasn’t a good meeting – the board members were genuinely shocked, then furious before going into denial. It took many months to get them to understand the new world order and there persisted a feeling that they held me personally responsible for the fact they were no longer universally loved.

But they did take my advice and they did start making changes; not just in product formulation but also in how they responded to government, regulation and consumer challenges.

I think the tech companies are at the same stage as my chocolatiers were 20 years ago. They have a choice between facing the concerns government, activists and consumers are expressing around data; or finding themselves regulated to an uncomfortable degree. The recently published consultation paper on online harms reveals a real government appetite for scrutiny.

So what can tech companies do to assure consumers that they take their concerns seriously? Well, loads of things, obviously. My top four would be:

  1. Be utterly clear about how they use data and how it is shared – use simple language that a 12-year-old can understand about what they will do and won’t do with data.
  2. Fulfil the commitments already made on illegal content – I don’t want the companies to become self-appointed censors but I do want them to do more to stop their platforms being used to share proscribed material like child pornography.
  3. Anticipate the next issue better – the tech companies will know better than me but for what it is worth I would be looking at trust in the age of AI and machine learning.
  4. Work as a sector to agree a robust and fair regulatory framework – do it to yourselves before they do it to you.

Of course, all of these have costs attached, but without significant investment I don’t think enough will change, or quickly enough. And as brands we need them to act. They are an important part of the media mix, on which many of us depend.

Whatever they decide to do, they had better get on with it. I think I can hear the closing bell beginning to ring in the Last Chance Saloon.