Facebook Promises Revamped Research Policies

Facebook chief technology officer Mike Schroepfer issued the social network’s strongest response to date to the controversy over a 2012 study in which the News Feeds of 689,003 randomly selected Facebook users were manipulated to show more positive or negative stories in order to gauge the emotional effects on those users. In a Newsroom post, Schroepfer promised changes to the way Facebook conducts research, including clearer guidelines, review teams, training, and a portal for all of the company’s research.

In the 2012 study, social scientists from Facebook, Cornell University and the University of California-San Francisco randomly selected 689,003 Facebook users and tinkered with the number of positive or negative stories that appeared in their News Feeds, finding that the “emotional contagion” effect worked both ways. The researchers complied with Facebook’s data-use policy and did not have access to the actual posts.

The study was widely reported on this past June, sparking widespread controversy and criticism of the social network.

Schroepfer wrote in the Newsroom post:

I want to update you on some changes we’re making to the way we do research at Facebook.

Facebook does research in a variety of fields, from systems infrastructure to user experience to artificial intelligence to social science. We do this work to understand what we should build and how we should build it, with the goal of improving the products and services we make available each day.

We’re committed to doing research to make Facebook better, but we want to do it in the most responsible way.

In 2011, there were studies suggesting that when people saw positive posts from friends on Facebook, it made them feel bad. We thought it was important to look into this, to see if this assertion was valid and to see if there was anything we should change about Facebook. Earlier this year, our own research was published, indicating that people respond positively to positive posts from their friends.

Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism. It is clear now that there are things we should have done differently. For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.

Over the past three months, we’ve taken a close look at the way we do research. Today we’re introducing a new framework that covers both internal work and research that might be published:

Guidelines: We’ve given researchers clearer guidelines. If proposed work is focused on studying particular groups or populations (such as people of a certain age) or if it relates to content that may be considered deeply personal (such as emotions) it will go through an enhanced review process before research can begin. The guidelines also require further review if the work involves a collaboration with someone in the academic community.

Review: We’ve created a panel including our most senior subject-area researchers, along with people from our engineering, research, legal, privacy and policy teams, which will review projects falling within these guidelines. This is in addition to our existing privacy cross-functional review for products and research.

Training: We’ve incorporated education on our research practices into Facebook’s six-week training program, called bootcamp, that new engineers go through, as well as training for others doing research. We’ll also include a section on research in the annual privacy and security training that is required of everyone at Facebook.

Research website: Our published academic research is now available at a single location and will be updated regularly.

We believe in research, because it helps us build a better Facebook. Like most companies today, our products are built based on extensive research, experimentation and testing.

It’s important to engage with the academic community and publish in peer-reviewed journals, to share technology inventions and because online services such as Facebook can help us understand more about how the world works.

We want to do this research in a way that honors the trust you put in us by using Facebook every day. We will continue to learn and improve as we work toward this goal.

Readers: What did you think of Schroepfer’s post?
