
Facebook reportedly knew its algorithms were dividing people yet did nothing

A new report from The Wall Street Journal claims that Facebook's top executives knew about the problems with the platform's algorithms but took no action to fix them.

According to the report, an internal study conducted by Facebook found that its algorithms weren’t bringing people together; they were driving them apart. The study was presented to senior executives at Facebook, who chose to ignore it despite clear warnings about the effects the algorithms could have on society. “Our algorithms exploit the human brain’s attraction to divisiveness. (...) If left unchecked," the company would be feeding "more and more divisive content in an effort to gain user attention & increase time on the platform,” noted one of the slides presented to the company's executives in 2018. A separate internal report from 2016 reached a similar conclusion, noting that 64% of users who joined extremist groups on Facebook did so because the algorithm recommended those groups to them.

The report further identifies Joel Kaplan, Facebook’s vice president of global public policy and a former chief of staff under President George W. Bush, as the person who shifted Facebook's focus away from polarization and downplayed concerns about the algorithms. Kaplan took on a larger role at Facebook around the 2016 elections, and critics have accused him of favouritism towards conservatives. Kaplan is also believed to be responsible for Facebook's controversial political ad policy, which prevented fact-checkers from vetting campaign ads. Facebook has said that the decision to let political ads run without fact-checking was made to protect free speech.

The Wall Street Journal blamed Kaplan for weakening or killing off proposals designed around the social good. The WSJ also noted that Kaplan actively blocked proposals to reduce the ability of “super-sharers” to drown out less-active users. According to Facebook's own metrics, super-sharers are people who interact with more pieces of content and hence have more influence on the platform than less-active users. Even so, Kaplan pushed back and did not allow the proposed changes crafted by News Feed integrity lead Carlos Gomez Uribe.

A Facebook spokesperson gave the following statement to The Verge in response to The Wall Street Journal's report:

We’ve learned a lot since 2016 and are not the same company today. We’ve built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform’s impact on society so we continue to improve. Just this past February we announced $2M in funding to support independent research proposals on polarization.

In 2016 and 2017, Chris Cox, Facebook’s chief product officer at the time, established "Common Ground", a project that sought to promote neutral content on the platform. However, a document presented by the Common Ground team in mid-2018 indicated that the project would reduce engagement and would require Facebook to “take a moral stance.” Common Ground did not receive approval from Facebook, as the company considered it to be biased against conservatives. The Common Ground team has since been disbanded, and many senior officials from the "Integrity Teams" have left the company.

Source: The Wall Street Journal (paywall)
