Facebook Will Stop Bullying

  • Samanta Blumberg

Facebook is rolling out a new strategy to curb online bullying and unacceptable behavior.

Facebook's moderators have received a new arsenal of admin tools that will let them monitor and control public communities more efficiently. The primary purpose appears to be mitigating online conflicts.

One such tool is dubbed Conflict Alerts. It is an AI-based algorithm that detects inappropriate and toxic language. Once “contentious or unhealthy conversations” happen in a community, it will promptly raise an alarm.
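Facebook has not published how Conflict Alerts works under the hood, but the general idea of scoring a conversation and alerting moderators past a threshold can be sketched in a few lines. Everything below is a hypothetical stand-in, not Facebook's actual system: `score_toxicity` is a toy keyword heuristic in place of a trained model, and the `ConflictMonitor` class, window size, and threshold are all assumptions.

```python
# Minimal sketch of a Conflict Alerts-style monitor. The real system is
# proprietary; a hypothetical score_toxicity() stands in for the model.

from dataclasses import dataclass, field

TOXIC_TERMS = {"idiot", "hate", "shut up"}  # toy stand-in for a trained model


def score_toxicity(comment: str) -> float:
    """Hypothetical scorer: fraction of known toxic terms present."""
    text = comment.lower()
    hits = sum(term in text for term in TOXIC_TERMS)
    return hits / len(TOXIC_TERMS)


@dataclass
class ConflictMonitor:
    threshold: float = 0.3            # average-toxicity cutoff (assumed)
    window: int = 10                  # recent comments treated as one conversation
    recent: list = field(default_factory=list)

    def observe(self, comment: str) -> bool:
        """Return True when the conversation looks contentious enough to alert."""
        self.recent.append(score_toxicity(comment))
        self.recent = self.recent[-self.window:]
        avg = sum(self.recent) / len(self.recent)
        return avg >= self.threshold


monitor = ConflictMonitor()
for c in ["nice post!", "shut up, idiot", "I hate this thread"]:
    if monitor.observe(c):
        print("Conflict Alert: notify the moderators")
```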

Another tool, nameless so far, will work as a freezing spell that keeps potential conflicts and overly heated debates at bay. Suppose, for instance, that a community has a notoriously toxic commenter who posts hateful things all the time.

To keep conflicts from flaring up, moderators can limit how often this person posts: instead of 10 posts a day, they may be allowed just one a week, and so on, until their behavior becomes more respectful.
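Mechanically, this is a per-user rate limit. Here is a rough sliding-window sketch of how such a quota could work; the names (`PostLimiter`, `set_limit`, `allow_post`) and the quota values are assumptions made for the example, since Facebook's moderation internals are not public.

```python
# Minimal sketch of per-user post limits, as a moderator might configure them.
# All names here are illustrative; Facebook's actual API is not public.

import time
from collections import defaultdict, deque
from typing import Optional


class PostLimiter:
    def __init__(self):
        self.limits = {}                   # user -> (max_posts, period_seconds)
        self.history = defaultdict(deque)  # user -> timestamps of recent posts

    def set_limit(self, user: str, max_posts: int, period_seconds: float):
        """Moderator action: e.g. one post per week for a toxic commenter."""
        self.limits[user] = (max_posts, period_seconds)

    def allow_post(self, user: str, now: Optional[float] = None) -> bool:
        if user not in self.limits:
            return True                    # unrestricted user
        now = time.time() if now is None else now
        max_posts, period = self.limits[user]
        posts = self.history[user]
        while posts and now - posts[0] > period:
            posts.popleft()                # drop posts outside the window
        if len(posts) < max_posts:
            posts.append(now)
            return True
        return False


limiter = PostLimiter()
limiter.set_limit("troll42", max_posts=1, period_seconds=7 * 24 * 3600)
print(limiter.allow_post("troll42"))  # True: first post this week
print(limiter.allow_post("troll42"))  # False: weekly quota exhausted
```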

This entire arsenal will be conveniently stored in Admin Home, a visual tool hub that gives FB admins immediate access to these features.

As icing on the cake, administrators will also learn more about every bad poster: Facebook will compile a dossier of every user’s posting activity, troublemakers included.

Knowing how often a person posts and how often their comments are reported or removed, moderators can sketch out a “psychological profile” and keep a closer eye on them.
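For illustration, such a dossier boils down to a handful of per-user counters plus a flagging rule. The record below is a toy sketch under that assumption; the field names and the 20% removal threshold are invented for the example, not taken from Facebook's data model.

```python
# Toy sketch of the per-user activity record the article describes; all field
# and method names are assumptions, not Facebook's actual data model.

from dataclasses import dataclass


@dataclass
class ActivityRecord:
    posts: int = 0      # total posts/comments in the group
    reported: int = 0   # how many were complained about
    removed: int = 0    # how many moderators removed

    def removal_rate(self) -> float:
        return self.removed / self.posts if self.posts else 0.0

    def flagged(self, rate_threshold: float = 0.2) -> bool:
        """Mark a user for closer attention if removals exceed the threshold."""
        return self.removal_rate() >= rate_threshold


record = ActivityRecord(posts=50, reported=15, removed=12)
print(record.flagged())  # True: 24% of this user's posts were removed
```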

These features are expected to help make Facebook a healthier online platform by tracking the numerous trolls, hateful commenters, fake-news posters, and the like.

The new features were shaped by feedback from FB moderators, whose ranks are currently estimated at 70 million people.