Facebook Engineering Director Arturo Bejar On Conflict Resolution, ‘Compassion Research’

By David Cohen

Facebook Engineering Director Arturo Bejar spoke with the San Jose Mercury News, but the conversation had nothing to do with coding or new features on the social network. Instead, they discussed conflict resolution.

Bejar told the newspaper that he is in charge of what Facebook calls “compassion research”: using social science tools to help the social network deal with posts that make its users feel ill at ease.

He discussed changes Facebook has been making to its reporting process for content that disturbs or offends users. In the past, he said, users’ options were limited to defriending or blocking other users, or asking the social network to remove the content. Facebook has since been working to foster dialogue between the parties involved, allowing each side to state its case.

Highlights from Bejar’s Q&A with the Mercury News follow:

We were looking at things that were getting reported as violations of our community standards, which cover things like hate speech and pornography. But what we found were things that had nothing to do with that. A great majority were just people playing around, having their photograph taken in all kinds of social settings.

We didn’t understand why people were reporting them, until we realized the person submitting the report was in the picture and the person who uploaded the picture is usually their friend. So that made us realize there was a social thing happening, and there is something we can do to help people resolve that.

First, we want to capture the emotion of the person raising the issue. Second, we want to provide messages that incorporate everything we know about politeness.

It turns out that if you want to have polite communication, you should address somebody by their name, and you make it a little bit indirect. Instead of saying, “This photo is incredibly annoying to me,” you say: “There’s something about this photo that I don’t like.”

We tested different phrases and found that words make a huge difference. “Would you please take it down” has a higher response rate than “Would you mind taking it down.”

One thing we learned is that many youths would rather reach out to an older teen, maybe a sibling or a cousin, who can relate to their situation. We give them the option of sending a message (electronically) to a trusted older person, and we encourage them to do that. We also get a copy of that report, and we do look at it in the context of our community standards, and we will take action accordingly.

Facebook is about connecting people to each other. Sometimes those connections result in misunderstandings, and we want to provide tools for when that happens. If you post a photo of me that’s upsetting to me, having Facebook resolve that does not do anything for our friendship.

Readers: Have you found yourself in a situation where you felt the need to report content to Facebook? If so, how was the situation handled?