Instagram Tweaked Its Policies on Removing Accounts for Content Violations

New notifications will enable affected users to appeal decisions to delete their posts

The new notifications that people with potential violations will start seeing (Instagram)

Instagram is changing its policy on disabling accounts to mirror the rules already in place at parent company Facebook.

The photo- and video-sharing network’s current policy is to disable accounts with a certain percentage of violating content.

Instagram will now remove accounts that accrue a certain number of violations within a certain period of time. It did not reveal the exact thresholds, in an effort to prevent bad actors from gaming the system by stopping just short of the point that triggers action.

Instagram said in a blog post that the change will enable it to more consistently enforce its policies and hold people accountable for the content they post to its platform.

People who are at risk of having their accounts disabled will now be notified via a new process.

The new notifications will give those people the opportunity to appeal content that is deleted, with topics initially covered including nudity and pornography, bullying and harassment, hate speech, drug sales and counterterrorism. Instagram said more categories will be added in the coming months.

If the appeal process finds that content was removed in error, those posts will be restored, and the violations will be wiped from the account’s record.

Instagram said appeals have always been available via its Help Center, and the experience will be added directly to its mobile applications over the next few months.