The social network wrote in a blog post, “Facebook groups are a place for people to not only share their interests with others, but share their experiences and seek support. Many people are reaching out to their communities right now, both online and offline, to discuss racial injustice, share personal experiences and organize ways to support. Some admins may be unsure of how to manage these important discussions in their communities, especially for groups originally created around a topic unrelated to social issues.”
Facebook suggested that group admins take the time to educate themselves and their teams on the issues before engaging with or shutting down conversations. It added that admins should apply the same care they would to other sensitive topics, conducting research and determining how the community should approach these discussions.
Opportunities should be created for members of impacted communities to hold admin or moderator roles, if they do not already, either from within the group or by reaching out to diverse organizations.
Facebook wrote, “Make space for, and offer, diverse organizations and advocates to talk to your group or host live trainings, if they are willing, and be sure your content reflects a myriad of diverse people and voices.”
The social network also suggested that community leadership teams consider updating their rules and moderation principles to protect all members of the community, sharing specific examples of content that is or isn’t allowed, as well as clear lists of permitted and prohibited topics.
Group admins may want to consider adding a post outlining how the group will handle these issues and reiterating community rules, providing a forum for discussion and feedback.
Facebook suggested that group admins and moderators decide whether to use post approval to head off potential conflicts before they erupt and to spot members who may be bad actors.
Another idea posed to group leaders is hosting a discussion on a single thread to be posted and moderated by an admin or moderator, with Facebook writing, “Starting a discussion yourself will allow people to talk about these issues, but in a direction that adheres to your group rules and keeps discourse respectful between members. If your community responds well to this idea, you can begin to post this weekly as a place for people to come together and discuss issues important to them.”
If a topic comes up that members indicate interest in discussing, but that topic doesn’t really fall under the community’s stated mission, Facebook suggested opening up a chat or subgroup, or starting a new group entirely. The social network wrote, “If you do decide to open up a new group, be sure your team is all aligned and you have a moderation plan in place.”
Facebook stressed that group admins must be agile and respond to changes, as well as willing to change discussion topics and rules to reflect those changes.
The social network also reminded group admins that all of the responsibility shouldn’t fall on one person, recommending that they compile a team of admins and moderators and establish a schedule. Comments on posts that may require active moderation can also be temporarily disabled when no one is available to moderate, such as overnight for the bulk of the group’s members, and the hours when moderators are available should be shared with the group. Facebook wrote, “Admin burnout is real, especially during highly active moments within your groups.”
Finally, Facebook recapped some of the tools that are available to group admins and moderators:
- Announcements can be used to share information on how posts and conversations on topics such as racial injustice, the news, legislation and links to organizations will be handled, and to set clear conversation boundaries. Facebook said, “Remember to pin your post to the top of all announcements, so that all members coming in the group see it first.”
- Rules should be established if they haven’t already been crafted, or revisited if they have.
- The keyword alerts tool alerts admins and moderators when certain words appear in posts or comments, and the social network wrote, “While many admins may use keyword alerts for profanity or other terms that may need immediate attention, this tool is great to use for any relevant topics your team wants to keep track of.”
- Post approval gives the leadership team time to review posts, align on a response and ensure that related posts can be actively moderated.
- The moderation team should discuss when muting, removing or banning members is warranted, and that policy should be clearly posted where members can see it.
- Group leadership should consider using Facebook Live or Messenger Rooms to host regular “office hours” for members to join live and discuss topics and rules.
- As mentioned above, temporarily turning off comments when moderators are unavailable can help preserve healthy discussion.