
Bumble set to introduce AI to warn you before you open lewd pictures

The dating app Bumble is set to launch a new AI-powered feature in June that will help users avoid unexpected lewd pictures. The app already blurs incoming images by default, but in the future it will also display a warning message when the AI deems an image inappropriate.

When Bumble receives the new feature this summer, the other dating apps belonging to the same parent group, namely Badoo, Chappy, and Lumen, will get it too. Bumble has dubbed the feature Private Detector, and it will give users the option to view the image, block the sender, or report the image to moderators. The tool reportedly has a 98 percent accuracy rate, so very little should get through undetected.
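Bumble hasn't detailed how Private Detector works under the hood, but the behaviour the company describes (blur every incoming image by default, show a warning when the classifier flags one, then let the recipient view, block, or report it) maps onto a simple moderation flow. The Python sketch below illustrates that flow; the threshold, names, and functions are all hypothetical assumptions for illustration, not Bumble's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical confidence cutoff; Bumble has not published its model's threshold.
LEWD_THRESHOLD = 0.9

@dataclass
class IncomingImage:
    image_id: str
    lewd_score: float        # classifier output in [0, 1] (assumed interface)
    blurred: bool = True     # every image arrives blurred by default
    warning_shown: bool = False

def moderate(image: IncomingImage) -> IncomingImage:
    """Attach a warning to any image the classifier deems inappropriate."""
    if image.lewd_score >= LEWD_THRESHOLD:
        image.warning_shown = True  # recipient sees a warning before unblurring
    return image

def handle_choice(image: IncomingImage, choice: str) -> str:
    """The recipient can view the image, block the sender, or report it."""
    if choice == "view":
        image.blurred = False
        return f"{image.image_id}: unblurred at the recipient's request"
    if choice == "block":
        return f"{image.image_id}: sender blocked"
    if choice == "report":
        return f"{image.image_id}: forwarded to moderators"
    raise ValueError(f"unknown choice: {choice!r}")

if __name__ == "__main__":
    img = moderate(IncomingImage("msg-123", lewd_score=0.97))
    if img.warning_shown:
        print(handle_choice(img, "report"))  # -> msg-123: forwarded to moderators
```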

In conjunction with this feature, Bumble CEO Whitney Wolfe Herd has been working with Republican Texas State Representative Morgan Meyer to draft a bill that would make it an offence to send unsolicited lewd images online. If the bill becomes law, those caught breaking it would face a fine of up to $500. Justifying the bill, Meyer said:

“Something that is already a crime in the real world needs to be a crime online.”

The tool could give Bumble an advantage over its biggest rival, Tinder, which currently doesn't allow matches to send any pictures at all. If users know they can send images on Bumble while still being protected from lewd ones, that could become another selling point Bumble uses to grow its user base.

Source: The Verge
