There is no doubt that Facebook has become a social media giant, but no one expected it to play with people's emotions. Yes! Facebook has revealed that it played with the emotions of 689,003 of its users through a vast experiment.
Facebook is the best human research lab ever. There's no need to get experiment participants to sign pesky consent forms, as they've already agreed to the site's data use policy.
A team of Facebook data scientists is constantly coming up with new ways to study human behavior through the social network. When the team releases papers about what it's learned from us, we often learn surprising things about Facebook instead, such as the fact that it can keep track of the status updates we never actually post.
Facebook has played around with manipulating people before, getting 60,000 people to rock the vote in 2012 who theoretically wouldn't have otherwise, but a recent study shows Facebook playing a whole new level of mind games with its guinea-pig users. As first noted by The New Scientist and Animal New York, Facebook's data scientists manipulated the News Feeds of 689,003 users, removing either all of the positive posts or all of the negative posts to see how it affected their moods. If there was a week in January 2012 when you were only seeing photos of dead dogs or incredibly cute babies, you may have been part of the study. Now that the experiment is public, people's mood about the study itself would best be described as "disturbed."
The researchers, led by data scientist Adam Kramer, found that emotions were contagious. "When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred," according to the paper published by the Facebook research team in PNAS. "These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."
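To make the experimental design concrete, here is a toy sketch of the feed-filtering idea: suppress posts of one sentiment from a feed, then tally what a user in that condition would see. The posts, sentiment labels, and function names are all made up for illustration; this is not Facebook's actual code, and the real study classified word usage in posts rather than using pre-labeled items.

```python
# Hypothetical sketch of the study's feed-filtering design.
# The feed data, labels, and helpers below are invented for illustration.
from collections import Counter

def filter_feed(posts, suppress):
    """Drop posts carrying the suppressed sentiment, mimicking one
    experimental condition (suppress='positive' or 'negative')."""
    return [p for p in posts if p["sentiment"] != suppress]

def sentiment_counts(posts):
    """Tally how many posts of each sentiment a user would see."""
    return Counter(p["sentiment"] for p in posts)

feed = [
    {"text": "Great day!", "sentiment": "positive"},
    {"text": "Lost my job.", "sentiment": "negative"},
    {"text": "Cute baby photos", "sentiment": "positive"},
    {"text": "Flight delayed again", "sentiment": "negative"},
    {"text": "Meeting at 3pm", "sentiment": "neutral"},
]

# Condition 1: positive expressions reduced.
print(sentiment_counts(filter_feed(feed, suppress="positive")))
# Condition 2: negative expressions reduced.
print(sentiment_counts(filter_feed(feed, suppress="negative")))
```

The researchers then compared the sentiment of users' own subsequent posts across the two conditions, which is how a "contagion" effect could be measured at scale.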
The experiment ran for a week, January 11–18, 2012, during which the hundreds of thousands of Facebook users unknowingly participating may have felt either happier or more depressed than usual, as they saw either more of their friends posting '15 Photos That Restore Our Faith In Humanity' articles or despondent status updates about losing jobs, getting screwed over by X airline, and already failing to live up to New Year's resolutions. "*Probably* nobody was driven to suicide," tweeted one professor linking to the study, adding a "#jokingnotjoking" hashtag.
The researchers, who may not have been thinking about the optics of a "Facebook emotionally manipulates users" study, jauntily note that the study undermines people who claim that looking at our friends' good lives on Facebook makes us feel depressed. "The fact that people were more emotionally positive in response to positive emotion updates from their friends stands in contrast to theories that suggest viewing positive posts by friends on Facebook may somehow affect us negatively," they write.
They also note that when they took all of the emotional posts out of a person's News Feed, that person became "less expressive," i.e., wrote fewer status updates. So prepare to have Facebook curate your feed with the most emotional of your friends' posts if it feels you're not posting often enough.
So is it okay for Facebook to play mind games with us for science? It's a cool finding, but manipulating unknowing users' emotional states to get there puts Facebook's big toe on that creepy line. Facebook's data use policy, which I'm sure you've all read, says Facebookers' information will be used "for internal operations, including troubleshooting, data analysis, testing, research and service improvement," making all users potential experiment subjects. And users know that Facebook's mysterious algorithms control what they see in their News Feed. But it may come as a surprise to users to see those two things combined like this.

When universities conduct studies on people, they have to run them by an ethics board first to get approval, boards that were created because scientists were getting too creepy in their experiments: getting subjects to think they were shocking someone to death in order to study obedience, and letting men live with syphilis for study purposes. A 2012 profile of the Facebook data team noted, "Unlike academic social scientists, Facebook's employees have a short path from an idea to an experiment on hundreds of millions of people."
In its initial response to the controversy around the study, a statement sent to me late Saturday night, Facebook doesn't seem to really get what people are upset about, focusing on privacy and data use rather than the ethics of emotional manipulation and whether Facebook's TOS lives up to the definition of "informed consent" usually required for academic studies like this. "This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account," says a Facebook spokesperson. "We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."
Ideally, Facebook would have a consent process for willing study participants: a box to check somewhere saying you're okay with being subjected to the occasional random psychological experiment that Facebook's data team cooks up in the name of science, as opposed to the commonplace psychological manipulation cooked up by advertisers trying to sell you stuff.
The post Facebook used its news feed to control users’ emotions: Forbes appeared first on Hack Read.