A study detailing how Facebook secretly manipulated the news feed of about 700,000 users to study “emotional contagion” has prompted anger on social media.
For one week in 2012, Facebook tampered with the algorithm used to place posts into users’ news feeds to study how this affected their mood.
The study, conducted by researchers affiliated with Facebook, Cornell University, and the University of California at San Francisco, appeared in the June 17 edition of the Proceedings of the National Academy of Sciences.
The researchers wanted to see whether the number of positive or negative words in the messages users read affected the content of their own status updates.
Indeed, they found that the users whose feeds had been manipulated began using more negative or more positive words, depending on which they had been exposed to.
You think #FacebookExperiment is the first time a corp. has manipulated ppl’s emotions? Have a Coke and a smile.
— Gregory Belmonte (@jevanyn) June 29, 2014
Reconsider using Facebook, use my information without my consent=unethical I might remove my account! #FacebookExperiment
— Brandon (@bhulmer) June 29, 2014
Results of the study spread when the online magazine Slate and The Atlantic website wrote about it on Saturday.
“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” the study authors wrote.
“These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
While other studies have used metadata to examine trends, this one appears to be unique in that it manipulated what users saw in order to test for a reaction.
The study was permitted under Facebook’s terms of service, but questions are being asked about whether it was ethical.
“#Facebook MANIPULATED USER FEEDS FOR MASSIVE PSYCH EXPERIMENT… Yeah, time to close FB acct!” read one Twitter posting.
Other tweets used words like “super disturbing,” “creepy” and “evil,” as well as angry expletives, to describe the experiment.
Susan Fiske, a Princeton University professor who edited the report for publication, told The Atlantic she was concerned about the research and contacted the authors.
They in turn said their institutional review boards approved the research “on the grounds that Facebook apparently manipulates people’s News Feeds all the time”.
Fiske admitted to being “a little creeped out” by the study.
Facebook told The Atlantic that the company “carefully consider[s]” its research and has “a strong internal review process”.
Facebook, the world’s biggest social network, says it has more than one billion active users.