Pressure mounts on Facebook following emotion experiment
Pressure is mounting on Facebook after both the data regulator and the body that represents the UK ad industry raised concerns over an experiment conducted by the social network that manipulated users’ emotions.
The Information Commissioner’s Office in the UK has said it is investigating whether Facebook broke data protection laws. The data regulator will be speaking with Facebook about the issue, but says it is too early to tell whether any laws were broken.
“We’re aware of this issue and will be speaking to Facebook, as well as liaising with the Irish data protection authority, to learn more about the circumstances,” the regulator said in a statement.
ISBA, the self-proclaimed voice of UK advertisers, says the idea that the social network can monitor and manipulate the emotions of users “is of concern”, although it will not jump to any conclusions.
David Ellison, marketing services manager at ISBA, says: “The investigation into whether Facebook has broken data protection laws hasn’t even started, so we must not jump to conclusions. However, the idea that the emotions of users can be monitored and manipulated, without their prior knowledge or consent, is of concern. While Facebook has claimed that there was ‘no unnecessary collection of people’s data’, this is a psychological study that Facebook admits could have been handled better.
“We are encouraged that Richard Allen, Facebook’s Director of Policy for EMEA, said: ‘It’s clear that people were upset by this study and we take responsibility for it. We want to do better in the future and are improving our process based on this feedback.’ ISBA will be tracking the result of the investigation into the study.”
In response, Richard Allen, Facebook’s director of policy in Europe, says the social network takes responsibility for any upset caused by the study and is happy to answer any questions from regulators.
“We want to do better in the future and are improving our process based on this feedback. The study was done with appropriate protections for people’s information and we are happy to answer any questions regulators may have,” he adds.
Facebook conducted research in 2012 in conjunction with two US universities, Cornell and the University of California at San Francisco, on nearly 700,000 users to see if it could alter the emotional state of users and prompt them to post more positive or negative updates by manipulating the news feed. The results found that people who saw more negative content were more likely to post negative updates, with the same result for positive content.
Facebook has defended the research, with Adam Kramer, who co-authored the report on the research, saying the social network thought it was “important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out”. However, he has admitted the firm did not “clearly state its motivations” in the paper and has apologised for any anxiety caused.