Last week, Instagram announced new AI-powered tools to help combat online bullying, the latest in a series of efforts by the Facebook-owned service to limit the effects of bullying on its platform. Today, the photo-sharing service unveiled further changes regarding its content violation and account disabling policies.
To start off, accounts that accumulate a certain number of content violations within a given timeframe will now be disabled as well. Previously, only accounts that posted a certain percentage of violating content were removed. The company believes this change will hold people accountable for what they post on Instagram, while also making its policies consistent across platforms.
Moreover, users whose accounts are at risk of being disabled will now be informed through a warning notification, as can be observed in the image above. This gives them an opportunity to avoid further violations and keep their accounts intact. Users who believe their account has been wrongfully disabled can currently only appeal the decision through Instagram's Help Center; however, this appeal option will become available directly within the app in the coming months.
Additionally, users who receive a content violation notification will now be able to appeal the removal of the offending content. A successful appeal may restore the deleted content, and presumably remove the associated violation from the user's account record as well. Although appeals are initially limited to Instagram's nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies, the service intends to expand their scope soon.