
How Facebook conducts experiments on your emotions

Some 689,003 lucky Facebook users were unwittingly part of an experiment in which their news feed was altered to make it more or less positive. Was this ethical?

Chris Matyszczyk

Can Facebook secretly make you more miserable than Morrissey? TheAmbitiousOutsider/YouTube screenshot by Chris Matyszczyk/CNET

Remember those days when your lover pretended to be in a bad mood just to see how you'd react?

It seems that your most intimate virtual friend has been doing the same thing. No, not that virtual friend to whom you bare all late at night. I'm talking about the other virtual friend to whom you bare all throughout the day -- Facebook.

Recently, I wrote about a study in which people who were confronted by happier Facebook posts ended up writing happier Facebook posts too.

As with much research of this kind, the findings had their humorous elements. One can't deduce that just because someone has written a cheery Facebook post, they are themselves cheery. However, these researchers -- from Cornell, Facebook, and UC San Francisco -- seemed keen on doing so.

What has now emerged, however, is that the "participants" in this study had no idea they were participating. More importantly, they had no idea that their news feeds were being manipulated to manipulate their emotions.

As the Telegraph reports, Facebook relied on a box that you happily tick when you're overjoyed to be joining the site.

With that blithe tick, you agree to Facebook performing "internal operations, including troubleshooting, data analysis, testing, research and service improvement."

The paper, called "Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks," explained that the researchers didn't actually see the 689,003 Facebookers' postings. They merely relied on observing keywords such as "excited."
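For the curious, here's roughly what that keyword observation amounts to: posts were classified by automated word counting (the paper cites the LIWC word lists), with no human reading them. Below is a minimal illustrative sketch of that kind of scoring in Python -- not the study's actual code, and the tiny word lists are made-up stand-ins for the real LIWC dictionaries.

    # Minimal sketch of keyword-based emotion classification, in the spirit
    # of the word-count method the paper describes. The word lists here are
    # illustrative stand-ins, not the real LIWC dictionaries.
    POSITIVE = {"excited", "happy", "great", "love"}
    NEGATIVE = {"sad", "angry", "miserable", "hate"}

    def classify_post(text):
        # Lowercase and strip basic punctuation so "excited!" still matches.
        words = {w.strip(".,!?").lower() for w in text.split()}
        has_pos = bool(words & POSITIVE)
        has_neg = bool(words & NEGATIVE)
        if has_pos and not has_neg:
            return "positive"
        if has_neg and not has_pos:
            return "negative"
        return "neutral or mixed"

    print(classify_post("So excited for the weekend!"))  # -> positive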

Some, though, are questioning whether this sort of research is remotely ethical. The idea that Facebook and its academic cohorts had the specific intention of making some of its users miserable is a thought to behold.

Could they be sure that none of these unwitting guinea pigs were in an emotionally vulnerable state already?

As the Atlantic reports, Susan Fiske, a Princeton psychology professor who edited the study, admitted she had qualms.

She said: "I was concerned until I queried the authors and they said their local institutional review board had approved it -- and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time. ... I understand why people have concerns."

It's a touching logic. Because Facebook is always messing with your News Feed, why worry if it's messing with your mind?

I contacted Facebook to ask whether the company thought the researchers' approach marginally underhand. I also asked whether these results might assist in the creation of certain types of advertising.

After all, if you can actively change your customers' moods to the negative, you might also offer them the precise commercial pick-me-up to make them feel better.

Adam D. I. Kramer, a member of Facebook's core data science team and a co-author of the study, addressed concerns about the study in a Facebook post Sunday afternoon.

"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," Kramer wrote. "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper."

Updated at 7:30 p.m. PT with Kramer comment.