Facebook reveals news feed experiment to control emotions
Last Updated: 30 June, 2014
Category: News and Updates / Internet
Facebook, the world's biggest social networking site, is facing a storm of protest after it revealed it had discovered how to make users feel happier or sadder with a few computer key strokes.
 
It has published details of a vast experiment in which it manipulated information posted on 689,000 users' home pages and found it could make people feel more positive or negative through a process of "emotional contagion".
 
In a study with academics from Cornell and the University of California, Facebook filtered users' news feeds – the flow of comments, videos, pictures and web links posted by other people in their social network. One test reduced users' exposure to their friends' "positive emotional content", resulting in fewer positive posts of their own. Another test reduced exposure to "negative emotional content" and the opposite happened.
 
The study concluded: "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."
 
Lawyers, internet activists and politicians said this weekend that the mass experiment in emotional manipulation was "scandalous", "spooky" and "disturbing".
 
On Sunday evening, a senior British MP called for a parliamentary investigation into how Facebook and other social networks manipulated emotional and psychological responses of users by editing information supplied to them.
 
Jim Sheridan, a member of the Commons media select committee, said the experiment was intrusive. "This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people," he said. "They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it."
 
A Facebook spokeswoman said the research, published this month in the US journal Proceedings of the National Academy of Sciences, was carried out "to improve our services and to make the content people see on Facebook as relevant and engaging as possible".
 
She said: "A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow."
 
But other commentators voiced fears that the process could be used for political purposes in the run-up to elections, or to encourage people to stay on the site by feeding them happy thoughts and so boosting advertising revenues.
 
In a series of Twitter posts, Clay Johnson, the co-founder of Blue State Digital, the firm that built and managed Barack Obama's online campaign for the presidency in 2008, said: "The Facebook 'transmission of anger' experiment is terrifying."
