Trust is a tricky thing. Do you trust that someone is telling you the truth? Do you trust them with your secrets? Do you trust that they will continue to behave in a similar way tomorrow as they do today? Do you trust that they are trying to do the right thing?
If there were any doubt that Facebook is in a trust crisis, it was wiped away by a series of defensive, apologetic blog posts from the company on Monday. In the most frank of the posts, product manager Samidh Chakrabarti identifies foreign interference, misinformation and political polarization as three aspects of social media that threaten trust not just in Facebook, but in democracy. All three, Chakrabarti admits, played an outsized role in the 2016 US presidential election, and they pose a danger for democratic elections to come.
On Tuesday, COO Sheryl Sandberg told a Brussels audience that the company would do more to boost privacy and prevent abuse on its site, including hiring 20,000 people by the end of the year to monitor and remove harmful content, plus a corresponding investment in artificial intelligence.
“We know that tech companies need to do better and that we at Facebook need to do better. We have a lot to improve,” Sandberg said. “We have not done enough to stop abuse of our technology.”
Trust in media platforms is collapsing
Why is Facebook concerned? A new Edelman study argues that trust in media has eroded in countries worldwide. The sharpest year-over-year decline in overall trust in institutions has been in the United States, which is in a “trust collapse.” Social platforms like Facebook and search platforms like Google are viewed as part of the media in general and as the least trustworthy part.
While platforms were briefly seen as more trustworthy than journalism, platforms now have a widening credibility gap, as users worry about false information and their own ability to sort good sources from bad. In all but a few countries, when it comes to trust in media, social media has gone from being part of the solution to the core of the problem.
Edelman’s trust study illuminates the dilemmas faced by Facebook and other social media platforms. Increasingly, they’re identified with the news and entertainment media they distribute; they provide information to an audience that’s increasingly disconnected from the news and a connected audience that’s increasingly skeptical of all media. They’re vulnerable to manipulation by bad actors, who exploit news discovery algorithms for their own ends. They take the blame for news failures and the losses of trust that result. Facebook’s changes to its News Feed could be seen as a version of Edelman’s suggested remedy for this trust collapse: guard information quality, protect consumers, and safeguard privacy. Facebook’s defensive crouch might be its best strategy as it seeks to regain its users’ trust.
To be fair, Edelman’s 2018 Trust Barometer has little good news for anyone, apart from isolated sectors in individual countries. (Edelman, as a company, is known for its skill at crisis response and management; small wonder it’s equally capable of identifying a crisis that demands a response.) But the picture of trust in social media platforms is especially bleak.
Media is now the least trusted institution worldwide, and is distrusted by a majority of the population in 22 of the 28 countries surveyed. Among respondents, 63 percent say they can’t tell the difference between good journalism and rumor or falsehood, and 59 percent say that distinction is becoming harder to make. Only 36 percent say that media is doing a good or very good job of guarding information quality, with 59 percent saying that because of media’s failures, they’re no longer sure what’s true and what’s not.
But perhaps surprisingly, trust in journalism and journalists has risen. It’s trust in media platforms like Facebook and Google that’s collapsed, including an 11 percent drop in the United States, the largest decline worldwide. And in the US, even high-information, high-education, high-income citizens, who are historically more likely to trust official institutions, now trust media and media platforms at the same low 42 percent rate as the rest of the population. People trust journalism where they find it, but fewer Americans trust how they find it.
Facebook’s response to a trust crisis
In this context, Facebook’s public apology tour and transformation of its News Feed make a good deal more sense. The platform is de-emphasizing its role as a media provider and emphasizing its role in maintaining personal communications between users. It’s a familiar refrain: Facebook is a tech company, not a media company. (Edelman’s study suggests that tech companies, on the whole, are still well-trusted.)
Facebook is also trying to put in place community-based mechanisms to make media on its platform more trusted. The company knows that its own users don’t trust it to be a fair arbiter of media credibility; Facebook’s own interests will always push it towards what gets shared more, what generates more engagement, what drives more time on site. And the company is deathly afraid of being accused, as it was in 2016, of putting its thumb on the scales in favor of one kind of news (liberal or conservative, print or digital, happy or sad) over another.
Self-regulation to rebuild trust and avoid government oversight
Meanwhile, the ISBA, a group of top British advertisers, is calling for Facebook and Google to submit to an independent body that would monitor their platforms and create common policies for removing inappropriate content.
“This would build confidence in the platforms themselves and would be good for their reputations,” says Phil Smith, the ISBA’s director general. It would also, the advertisers hope, keep at bay the threat of direct government regulation.
This is the core of the trust problem. Facebook is trying to build a global marketplace for local commerce, advertising, and interpersonal communication. Media supports that marketplace by giving users something worthwhile to read and share.
To the extent that media has become hijacked for political gain, it has to be neutralized if possible, cauterized if necessary. Otherwise, it becomes toxic for Facebook.
To the extent that Facebook becomes associated with disinformation, it can never be a commerce platform.
To the extent that it becomes associated with bad actors, it can never be a safe place for personal information.
And to the extent that those actors can trigger political changes offline, it can never be certain that governments won’t take a heavier hand in regulating everything that happens on the site.
That’s what makes “fake news” an existential problem for Facebook, Google, and anyone else who deals in information. It’s why Facebook is taking the steps it is taking, and explaining them so deliberately. It’s a PR offensive to solve a substantial problem. But when it comes to trust, perception is not far from reality.