Meta launches new content moderation tool HMA that can scan for terrorist content

Meta, the parent company of Facebook, has developed an open-source tool that it claims can fight terrorist and violent extremist content online.

The tool, known as Hasher-Matcher-Actioner (HMA), can identify copies of images or videos that violate platform guidelines or have been flagged as inappropriate, and then act against all of that matching material at once.

Meta says HMA can be adopted by any company that wants to stop the spread of terrorist content on its platform. It is particularly useful for smaller companies that don't have the vast resources of bigger ones.

Firms that want to use the tool can run their content through the shared hash database and act on any matches, helping both themselves and Meta keep violent and abusive content off the internet.
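To make the hasher-matcher-actioner idea concrete, here is a minimal, hypothetical Python sketch of the pattern. It is not Meta's actual HMA API: it uses an exact SHA-256 digest as a stand-in, whereas the real system relies on perceptual hashes (such as PDQ) so that near-duplicates of flagged media still match, and the shared database is an industry-wide service rather than an in-memory set.

```python
import hashlib


def hash_content(content: bytes) -> str:
    """Hasher: turn a piece of content into a fingerprint.

    Illustrative only: SHA-256 matches exact copies, while production
    systems use perceptual hashing to catch slightly altered copies.
    """
    return hashlib.sha256(content).hexdigest()


# Stand-in for the shared hash database, seeded with one item that
# moderators have already flagged as violating policy.
flagged_item = b"<bytes of an image previously flagged by moderators>"
shared_hash_db = {hash_content(flagged_item)}


def moderate(content: bytes) -> str:
    """Matcher + Actioner: compare against the shared database and decide."""
    if hash_content(content) in shared_hash_db:
        return "remove_and_report"  # act on known violating material
    return "allow"


print(moderate(flagged_item))     # exact copy of flagged content -> "remove_and_report"
print(moderate(b"benign photo"))  # unknown content -> "allow"
```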

The decision to make the tool available publicly comes shortly before the California-based company assumes the chair of the Global Internet Forum to Counter Terrorism (GIFCT) board in January.

The GIFCT brings together member companies, governments, and civil society organizations to tackle terrorist and violent extremist content online. Meta is one of its founding members.

Meta reports that it spent around $5 billion globally on safety and security last year, and has a dedicated team of hundreds of people, including experts with backgrounds in law enforcement and national security, working around the clock on counter-terrorism. The new tool will strengthen that work further.

Recently, Meta also took steps to combat data scraping.
