The social network said in a Newsroom post that it will continue to enforce its policies with a priority on preventing and disrupting harm across its platform.
As the company has stated repeatedly over the past week, it will rely more heavily on automated systems to detect and remove content and disable accounts, and it cautioned that this will likely mean more mistakes and longer review periods.
Facebook wrote, “Reviewing content can be challenging, and working from home presents new obstacles in providing support to our teams, but we’re working to ensure that our content reviewers have the resources and help they need during this time.”
When users report content that potentially violates the social network’s policies, they will receive a message explaining that fewer content reviewers are available and that content with the greatest potential to harm Facebook’s community will be prioritized. As a result, some reported content will not be reviewed quickly, and some may not be reviewed at all.
Facebook also detailed changes to the appeals process, writing, “Normally, when we remove content, we offer the person who posted it the option to request that we review the content again if they think we made a mistake. Now, given our reduced workforce, we’ll give people the option to tell us that they disagree with our decision and we’ll monitor that feedback to improve our accuracy, but we likely won’t review content a second time.”
On the advertising front, Rob Leathern, director of product management for business integrity, responded to reports earlier this week that ads for face masks continued to appear on the social network despite the ban it put in place earlier this month.
Leathern said in a tweet, “In addition to masks, we’re now also banning hand sanitizer, surface disinfecting wipes and Covid-19 test kits in ads and commerce listings. This is another step to help protect against inflated prices and predatory behavior we’re seeing. We’ll be ramping up our automated enforcement for ads and commerce next week. If we see abuse around these products in organic posts, we’ll remove those, too.”
Facebook concluded in its Newsroom post, “We’re working hard to minimize any impact on people as they use Facebook, Instagram and Messenger during this time, but we know some may feel this impact either when reporting content to us or appealing content we remove. We’re doing everything we can to keep our global teams and the community that uses our applications safe while continuing to provide the services people and businesses rely on.”