Facebook manipulated 70,000 user posts for research paper

  • 29 June 2014
  • 7 replies
  • 52 views

Userlevel 7
Badge +52
Facebook conducted a secret study of 70,000 users back in 2012, according to a research paper published in the Proceedings of the National Academy of Sciences (PNAS).

The study took place from 11 to 18 January 2012 and involved secretly manipulated news feeds. According to the research paper, Facebook engineers adjusted the emotional content of the posts shown in the 70,000 users' news feeds to see whether "emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness."
Full Article

7 replies

Userlevel 7
Badge +56
This is a terribly unethical experiment.  I'm going to stop using Facebook over it until such time as they issue an apology and censure the researchers involved.
Userlevel 6
I agree with you @ . Maybe I will post a link to the article on my page or start a petition demanding an apology.
Userlevel 7
The following are updated articles on Facebook's manipulation of 70,000 users' posts.
 
"Quote" Facebook emotional manipulation test turns users into 'lab rats'
 
 
Anger grows even as Facebook researcher posts apology for causing users anxiety
 
Computerworld - Users and analysts were in an uproar over news that Facebook manipulated users' News Feeds to conduct a week-long psychological study that affected about 700,000 people.
News reports said that Facebook allowed researchers to manipulate the positive and negative information they saw on the social network in order to test the emotions of users. The study, which was conducted Jan. 11 to Jan. 18, 2012, was published in the Proceedings of the National Academy of Sciences.
 
ComputerWorld/ Full Read Here/ http://www.computerworld.com/s/article/9249462/Facebook_emotional_manipulation_test_turns_users_into_lab_rats_
 
Userlevel 7
By Simon Phipps | InfoWorld | Posted on June 30, 2014
 
Conducting psychology experiments on users without their knowledge isn't a matter of terms of use, privacy, or security -- it's a question of decency and ethics
 
When you agreed to Facebook's terms and conditions, did you know you were agreeing to become a subject in a psychology experiment? This weekend, we learned that Facebook permitted an academic research team to conduct an experiment on a huge number of Facebook's users back in 2012.
The researchers adjusted the contents of Facebook timelines for nearly 700,000 users so that either positive or negative news dominated. They found that positive news spread positive responses, and negative news spread negative responses.
 
Infoworld/ Full Read Here/ http://www.infoworld.com/t/technology-business/facebooks-big-problem-ethical-blindness-245281
 
Userlevel 7
Badge +56
Apparently their TOS didn't include the term "research" until 4 months after the study.  Busted!
http://www.forbes.com/sites/kashmirhill/2014/06/30/facebook-only-got-permission-to-do-research-on-users-after-emotion-manipulation-study/
Userlevel 7
Summary: News of Facebook experimenting on its users' emotional states has rattled everyone. Worse, the tool used to perform the experiments is so flawed there's no way of knowing if users were harmed.

By Violet Blue for Pulp Tech | July 1, 2014 -- 12:42 GMT (13:42 BST)
 
If there's one thing we've learned from zombie movies, it's that when the word "contagion" is associated with humans getting experimented on without their knowledge at the hands of a cold, massive corporation -- things never end well. On June 2, the Proceedings of the National Academy of Sciences published "Experimental evidence of massive-scale emotional contagion through social networks." It made headlines last weekend, which can be succinctly described as a 'massive scale contagion' of fury and disgust. In "Experimental evidence" Facebook tampered with the emotional well-being of 689,003 unknowing users to see how emotional contagion could be controlled; basically, how to spread, or avoid the spread of, its users' feelings en masse.
 
ZDNet/ Full Read Here/ http://www.zdnet.com/facebook-unethical-untrustworthy-and-now-downright-harmful-7000031106/
Userlevel 7
By Kelly Fiveash, 4 Jul 2014
 
Facebook's ethical standards do not meet those of most researchers who conduct studies on human subjects, the journal which published the "secret", emotion-manipulative research on nearly 700,000 of its users has said.
The Proceedings of the National Academy of Sciences has now made a statement about its publication of the controversial paper, Experimental evidence of massive-scale emotional contagion through social networks. The study was co-authored by Facebook data scientist Adam Kramer and researchers from Cornell University and the University of California, San Francisco.
The journal did not apologise for publishing the research, which has come under fire from privacy groups and data regulators in the UK and Ireland because the data subjects, Facebook users, were allegedly not explicitly asked for their consent.
 
The Register/ Full Read Here/ http://www.theregister.co.uk/2014/07/04/pnas_concerned_over_facebook_emotion_contagion_study/
