
Meta fined by US government over discriminatory housing ad algorithms


The US government has sued Meta in New York City for blocking certain users from seeing online real-estate ads based on their nationality, race, religion, sex, and marital status. The complaint says that Meta violated the US Fair Housing Act (FHA), which protects people looking to buy or rent properties from discrimination.

Under the act, it is illegal for homeowners to refuse to sell or rent their houses to specific demographics, to advertise homes selectively to them, or to evict tenants on that basis. Prosecutors sued Meta alleging that its algorithms discriminated against users by targeting housing ads based on their "race, color, religion, sex, disability, familial status, and national origin."

Damian Williams, US Attorney for the Southern District of New York, said:

"When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the FHA, just as when companies engage in discriminatory advertising using more traditional advertising methods.

Because of this ground-breaking lawsuit, Meta will—for the first time—change its ad delivery system to address algorithmic discrimination. But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this Office will proceed with the litigation.”

Meta has agreed to pay a fine of $115,054 to end the matter, and has also agreed to make the necessary changes to its ad-targeting algorithms. The US government could not impose a heftier fine, as that amount is the maximum civil penalty for violating the act.

The settlement says that Meta will stop using a housing-ad advertising tool called "Special Ad Audience," which relies on an algorithm to find users who "look like" other users based on FHA-protected characteristics.

Meta has also agreed to develop a new system over the next six months to address racial and other disparities caused by its use of personalization algorithms. The case marks the first time that Meta has been taken to court over its ad targeting and delivery system.


Source: US DoJ via The Register
