Facebook is facing heavy criticism after it was revealed the company deliberately manipulated users’ emotions without their knowledge in a newsfeed test.

The experiment, which affected around 700,000 people, was conducted in conjunction with researchers from Cornell University and the University of California, San Francisco.
Despite the lack of users’ consent, the academics were allowed to change the order in which posts from friends appeared in those users’ newsfeeds.
The test subjects were split into two groups and, depending on which group they were in, would see an increased number of positive or negative posts. They were then studied to see if the type of posts they saw most frequently affected their own mood.
Although the project was carried out in 2012, it has only now come to light following the publication of the researchers’ paper in the Proceedings of the National Academy of Sciences (PNAS).
The researchers claim their data collection was justified as it “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research”.
Facebook, meanwhile, has said there was “no unnecessary collection of people’s data” and that “none of the data used was associated with a specific person’s Facebook account”.
Ethical concerns
However, many people have called into question the ethics of the project, including Labour MP Jim Sheridan, a member of the Commons media select committee, who has called for an investigation into the matter.
James Grimmelmann, a professor of law at the University of Maryland, tweeted that the “application should have set off more alarms (at Cornell and at PNAS) than it did”.
Indeed, even the editor of the paper, Susan Fiske, a professor of psychology at Princeton University, said she “was concerned” about the ethics of the experiment, according to The Atlantic.
But, upon speaking to the researchers, she was satisfied the project was ethical because “their local institutional review board (which reviews researchers’ conduct in experiments that involve humans) had approved it – and… on the grounds that Facebook apparently manipulates people’s newsfeeds all the time”.
However, the claim that a university review board approved the study has been called into question by Forbes.
Citing “a source familiar with the matter”, the outlet said the study was “approved through an internal review process at Facebook, not through a university institutional review board”.
Adam D. I. Kramer, a data scientist at Facebook and one of the paper’s authors, has sought to defend the experiment, saying “at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it”.
“I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused,” he added.