Back in May I wrote about the growing unease we’re witnessing over what seems to be increasingly covert (and revealing) forms of data capture. Shortly after that post, following suspicious activity on my Facebook account and one fake page that had been set up by an imposter, I deleted both my profiles and filed an official complaint. Several days later the news broke about Facebook’s little experiment.
Whatever side of the line you’re on, I suspect that this recent, public foray into the intimate lives of online users is only a glimpse of the kind of experiments that monolithic companies such as these are running without our knowledge every day. The problem is that this kind of Big Data play is not just about number-crunching and correlations – in this case it’s about subjecting an unsuspecting sample of users to emotional contagion. And if you think that sounds insidious, that’s because it is.
Defined by New York psychoanalyst Gerald Schoenewolf¹ as “a process in which a person or group influences the emotions or behaviour of another person or group through the conscious or unconscious induction of emotion states and behavioural attitudes”, emotional contagion can fundamentally alter the emotional state of those to whom it is introduced. Now, leaving aside for the moment the effects that such manipulation could have on the more vulnerable members of an online community, let’s look at how this might impact a normative group.
Given that our decision-making processes rely heavily on our emotions² and the primary role of platforms such as Facebook is to serve up valuable user data to advertisers, it makes sense that emotional manipulation of this kind could be a useful tool in shaping the purchase decisions of the people who populate that platform. Basically, if a platform can change the way you feel, it can also change the decisions you make.
Now, as it turns out, the effects Facebook's study revealed were relatively small. The crucial point here, however, is not the results themselves – it is that Facebook would think it acceptable to test something so potentially disruptive on people without first asking for their consent.
Treating your users, the lifeblood of your business, like oblivious lab rats (which, incidentally, experience complex emotions such as regret³) is not only unethical, it is indicative of a fundamental disregard for their emotional and psychological well-being.
For a psychologist, one of the golden rules of research is to always acquire informed consent from whoever is participating in the study. Asking your users to tick a box after inflicting upon them a punishingly long Data Use Policy that no one will ever read simply doesn’t cut it. If you are one of the few people who has ever bothered to read Facebook’s Data Use Policy, the only vaguely related clause you might find would be around allowing your data to be used for “internal operations, including troubleshooting, data analysis, testing, research and service improvement”. This hardly constitutes informed consent.
It is a strong statement to make, but in my opinion this kind of clandestine experimentation constitutes a breach of our human rights; namely, that “everyone has the right to life, liberty and security of person”. Given that so much of who we are as people is inextricably linked to our online selves and the relationships we create around them, surely our liberty and security of person should extend to this domain too.
The Facebook debacle comes at a perfect time. It hints at the future we risk stumbling towards unless we wake from our somnambulism and start having a serious discussion about what kind of internet we want to create, and what kind of values we would like our relationships with brands to be centred around.
Don’t get me wrong; I’m for data use and personalisation. But only the kind that stems from a reciprocal, transparent exchange between a customer and the brands with which they have consensually elected to share data.
¹ Schoenewolf, G. (1990). Emotional contagion: Behavioral induction in individuals and groups. Modern Psychoanalysis, 15, 49–61.
² Bechara, A., Damasio, H., & Damasio, A. R. (2000). Emotion, decision making and the orbitofrontal cortex. Cerebral Cortex, 10(3), 295–307.
³ Steiner, A. P., & Redish, A. D. (2014). Behavioral and neurophysiological correlates of regret in rat decision-making on a neuroeconomic task. Nature Neuroscience, 17, 995–1002.