A Facebook study conducted in 2012 and published only recently revealed that the company’s data scientists ran a massive experiment, “tweaking” over 600,000 users’ news feeds to see how they would respond to seeing more positive or negative posts than usual. The experiment, conducted over the course of a week, was designed to test “emotional contagion”: the psychological concept that interacting with the moods of others affects your own mood, either positively or negatively.
“In these conditions, when a person loaded their News Feed, posts that contained emotional content of the relevant emotional valence, each emotional post had between a 10% and 90% chance (based on their User ID) of being omitted from their News Feed for that specific viewing.” the paper said.
“We also observed a withdrawal effect: People who were exposed to fewer emotional posts (of either valence) in their News Feed were less expressive overall on the following days, addressing the question about how emotional expression affects social engagement online. This observation, and the fact that people were more emotionally positive in response to positive emotion updates from their friends, stands in contrast to theories that suggest viewing positive posts by friends on Facebook may somehow affect us negatively.”
The result was that people who saw fewer positive posts became less positive themselves, while a reduction in negative posts made users more positive. The paper has been published for public viewing, and the experiment has been widely criticized. When people hear that their free social media platform is experimenting on users, there is bound to be a backlash. However, the social media giant has defended its research and maintains it is within its legal rights to conduct it for data analysis purposes, under terms users agree to when they first sign up for Facebook.
A Facebook spokesperson reportedly told Forbes:
“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account,” says a Facebook spokesperson. “We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”