
Sneaky Facebook Study on Users' Emotions Draws Ire

Research 'may not have justified all of this anxiety,' study lead admits

If you use Facebook and found yourself momentarily feeling either better or worse in early 2012, an algorithm may have caused your shift in mood. And that's what has some social media users upset today.

A controversial research study published by the Proceedings of the National Academy of Sciences (PNAS) on June 17 started to gain digital traction over the weekend. It revealed that for one week in January 2012, Facebook worked with Cornell University and the University of California-San Francisco to test the emotional reactions of nearly 700,000 users to pieces of content. The users weren't notified of their participation and unknowingly helped the researchers learn that people who saw fewer positive words went on to write more negative posts, while the reverse occurred when users were exposed to fewer negative sentiments.

The information-gathering practice isn't likely to be illegal since Facebook users sign away many privacy rights when they agree to participate on the social platform. And the study's gray ethical issues can probably be debated ad nauseam.

Blogs such as AnimalNewYork.com began posting about the study on Friday, and consumers began expressing disdain for the sneaky research practice via social media channels, which of course led media outlets to pounce on the development.

The brouhaha built to the point where Adam Kramer, the digital giant's research lead on the study, defended the work on his Facebook page late Sunday afternoon.

"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," Kramer wrote. "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."

Kramer also acknowledged the hullabaloo head-on.

"In hindsight, the research benefits of the paper may not have justified all of this anxiety," he stated. "Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone."

It's unclear whether Kramer had sign-off from the Menlo Park, Calif.-based social media company's marketing execs to make a public statement. Facebook hadn't responded to inquiries regarding Kramer's actions at press time.

Susan Fiske, a professor of psychology at Princeton University who edited the PNAS study, informed The Atlantic that the research appeared questionable when she first read it.

Fiske told the publication that she "was concerned until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time... I understand why people have concerns. I think their beef is with Facebook, really, not the research."

Meanwhile, Facebook's 1.2 billion users seem to have accepted much of the interest-level data collection that comes with the advertising that keeps the site free to use. But manipulating their emotions appears to go too far.

In short, all of this commotion could have been avoided had Facebook invited consumers to opt in to the research, though that probably would have produced less-than-ideal scientific results, since people may have reacted self-consciously to posts had they known their behavior was being recorded.

Rest assured, if Facebook ever figured out a way to segment audiences in real-time according to mood, advertisers would wake up early to get in line. 
