
Facebook saw a drop in hate speech prevalence during Q4 2020

Facebook routinely faces huge pressure to better police and manage the content on its platform, largely because of how widely it is used. The company periodically experiments with different types of content, the latest move being a reduction in political posts appearing in the News Feed. Now, it has shared its Community Standards Enforcement Report for Q4 2020.

Image: an iPhone on a wooden table displaying an angry emoji (via Pexels)

The report can be viewed in detail here; it contains data on various enforcement metrics for the period between October and December 2020. According to Facebook, hate speech prevalence declined from 0.10-0.11% in the previous quarter to 0.07-0.08% in this one, which means there were seven to eight views of hate speech for every 10,000 views of content. Similarly, violent content dropped to 0.05% (from 0.07% in Q3 2020) and adult nudity slipped to 0.03-0.04% (from 0.05-0.06% in Q3 2020).
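For readers curious how the prevalence figures map onto the "views per 10,000" framing used in the report, the conversion is straightforward arithmetic. The sketch below is purely illustrative and is not taken from Facebook's report or tooling; the function name and values are assumptions for demonstration.

```python
# Illustrative only: convert a prevalence percentage (e.g. 0.07%)
# into "views per 10,000 views of content", the framing Facebook
# uses in its Community Standards Enforcement Report.
def views_per_ten_thousand(prevalence_percent: float) -> float:
    """E.g. 0.07 (%) -> 7 views per 10,000 views."""
    return prevalence_percent / 100 * 10_000


if __name__ == "__main__":
    for pct in (0.07, 0.08, 0.10, 0.11):
        print(f"{pct}% prevalence = {views_per_ten_thousand(pct):.0f} views per 10,000")
```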

The firm attributes these declines to changes in the processes it uses, which rely on a variety of signals to determine whether a given piece of content is problematic. This proactive approach allows Facebook to take action against problematic content before users report it, and its investments in AI have also driven the improvement.

Other highlights include the action Facebook took against:

  • 6.3 million pieces of bullying and harassment content, up from 3.5 million in Q3, due in part to updates to its comment-detection technology
  • 6.4 million pieces of organized hate content, up from 4 million in Q3
  • 26.9 million pieces of hate speech content, up from 22.1 million in Q3, due in part to updates to its technology for Arabic, Spanish, and Portuguese
  • 2.5 million pieces of suicide and self-injury content, up from 1.3 million in Q3, due to increased reviewer capacity

On Instagram in Q4 2020, the company took action against:

  • 5 million pieces of bullying and harassment content, up from 2.6 million in Q3, due in part to updates to its comment-detection technology
  • 308,000 pieces of organized hate content, up from 224,000 in Q3
  • 6.6 million pieces of hate speech content, up from 6.5 million in Q3
  • 3.4 million pieces of suicide and self-injury content, up from 1.3 million in Q3, due to increased reviewer capacity

Facebook indicated that its content review workforce is gradually being brought back, but will likely remain scaled down until COVID-19 vaccines are more widely available. As a result, content related to self-harm and suicide is being prioritized for review for the time being.

In 2021, Facebook will provide more metrics about content moderation, enabling greater transparency. It will also have its figures validated by a third party that will audit its content moderation systems.
