Facebook has been under fire recently for deleting or censoring content that showed nudity or violence, but which otherwise would have been deemed to be of public interest. Now, the company is looking at changing the way it displays stories and images to account for such content.
Last month Facebook repeatedly deleted content containing the Pulitzer Prize-winning image Terror of War, which depicted a group of children, including a naked 8-year-old, fleeing a napalm strike in Vietnam. Soon afterwards the social network removed an animated cancer-awareness video that used two yellow circles to depict breasts. The company also recently removed a video of an unarmed black man being shot by police, though it claims that takedown was a glitch.
In each case, the company came under heavy criticism for mishandling the global editorial role that comes with having billions of monthly users on its network. Since then, Facebook has been gathering feedback and is now ready to try a different approach to such content.
Going forward, the social network says it will allow more content that is explicit, violent, or otherwise in violation of its guidelines if it is deemed to be "newsworthy, significant, or important to the public interest".
The company hasn't specified who will actually have the power to decide what counts as newsworthy, though it did say it is looking to work with law enforcement, journalists, publishers, and other experts to ensure the content is appropriate.