The social network is also introducing easier ways for users to report fake news and to flag questionable posts as disputed. Disputed stories can still be shared, but they will carry warnings and are not eligible to appear in promoted posts or ads.
Facebook is also targeting the publishers behind fake news stories and hoaxes: it is eliminating domain spoofing so that these publishers cannot pass their content off as coming from legitimate news sources, and it is analyzing publishers’ sites to determine whether further action is necessary.
We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully. We’ve focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third-party organizations.
Mosseri described the changes to reporting questionable stories:
We’re testing several ways to make it easier to report a hoax if you see one on Facebook, which you can do by clicking the upper-right-hand corner of a post. We’ve relied heavily on our community for help on this issue, and this can help us detect more fake news.
On flagging and sharing disputed stories, he added:
We believe providing more context can help people decide for themselves what to trust and what to share. We’ve started a program to work with third-party fact-checking organizations that are signatories of Poynter’s International Fact Checking Code of Principles. We’ll use the reports from our community, along with other signals, to send stories to these organizations. If the fact-checking organizations identify a story as fake, it will get flagged as disputed, and there will be a link to the corresponding article explaining why. Stories that have been disputed may also appear lower in News Feed.
It will still be possible to share these stories, but you will see a warning that the story has been disputed as you share.
Once a story is flagged, it can’t be made into an ad and promoted, either.
We’re always looking to improve News Feed by listening to what the community is telling us. We’ve found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way. We’re going to test incorporating this signal into ranking, specifically for articles that are outliers, where people who read the article are significantly less likely to share it.
Facebook is a new kind of platform, different from anything before it. I think of Facebook as a technology company, but I recognize that we have a greater responsibility than just building technology that information flows through. While we don’t write the news stories you read and share, we also recognize we’re more than just a distributor of news. We’re a new kind of platform for public discourse–and that means we have a new kind of responsibility to enable people to have the most meaningful conversations and to build a space where people can be informed.
With any changes we make, we must fight to give all people a voice and resist the path of becoming arbiters of truth ourselves. I believe we can build a more informed community and uphold these principles.
Readers: What are your thoughts on Facebook’s new steps to eliminate fake news and hoaxes?