Facebook executives continued to respond to the controversy over the recent study by social scientists from the social network, Cornell University, and the University of California-San Francisco, in which the researchers randomly selected 689,003 Facebook users and tinkered with the number of positive or negative stories that appeared in their News Feeds to gauge the effects on those users’ moods. But the latest to chime in, Head of Global Policy Management Monika Bickert, was not as apologetic as Data Scientist Adam Kramer, one of the study’s co-authors, or Chief Operating Officer Sheryl Sandberg. Bickert said:
You’ve pointed out a couple of interesting issues, and one is the tension between legislation and innovation. In the specific incident that you’re referring to — although I’m not really the best expert, and probably our public statements are the best source for information there — I believe that was a week’s worth of research back in 2012. And most of the research that is done on Facebook, if you walk around campus and you listen to the engineers talking, it’s all about, “How do we make this product better? How do we better suit the needs of the population using this product? And how do we show them more of what they want to see and less of what they don’t want to see?”
And that’s innovation. That’s the reason why when you look at Facebook or YouTube, you’re always seeing new features. And that’s the reason why if you have that annoying friend from high school who always posts pictures of their toddler every single day, you don’t see all those photos in your News Feed.
So it’s concerning when we see legislation that could possibly stifle that sort of creativity and that innovation. At the same time, if we want to make sure we don’t see that legislation, it’s incumbent upon us to make sure we’re transparent about what we’re doing and that people understand exactly why we’re doing what we do.
Sandberg had said:

This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated. And for that communication, we apologize. We never meant to upset you.
We take privacy and security at Facebook really seriously because that is something that allows people to share.
And Kramer wrote in a Facebook post earlier this week:
The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.
Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04 percent of users, or 1 in 2,500), for a short period (one week, in early 2012). Nobody’s posts were “hidden,” they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ Timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.
Readers: What is your take on this controversial study?