Bumble has announced it will launch a “Private Detector” feature that uses AI to automatically detect inappropriate and lewd photos. The feature works in real time with a claimed 98-percent accuracy. When such a photo is shared in a chat, it is automatically blurred and the recipient is alerted that they have been sent something inappropriate. The user can then choose whether to view the image, block it, or report it to the app’s moderation team.
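The described flow can be sketched in a few lines. This is purely illustrative, assuming a classifier that returns a confidence score; the article does not describe Bumble's actual implementation, and names like `nudity_score` and the threshold value are hypothetical stand-ins.

```python
def handle_incoming_photo(nudity_score: float, threshold: float = 0.5) -> dict:
    """Sketch of the described flow: blur a flagged image, alert the
    recipient, and offer the view/block/report choices.

    `nudity_score` stands in for the output of a hypothetical AI
    classifier; the article reports only that detection happens in
    real time with a claimed 98-percent accuracy.
    """
    if nudity_score >= threshold:
        return {
            "blurred": True,
            "alert": "You've been sent a potentially inappropriate image.",
            "options": ["view", "block", "report"],
        }
    # Below the threshold, the image is shown normally.
    return {"blurred": False, "alert": None, "options": []}


print(handle_incoming_photo(0.9)["options"])
```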
Whitney Wolfe Herd, founder and CEO of Bumble, co-created the women-centric app with Andrey Andreev back in 2014 as a safe online dating space for women. The new feature will launch on Bumble, Badoo, Chappy and Lumen — each of these apps is under the same dating parent company founded by Andreev. According to The Verge, Bumble is one of the few dating apps that allow images to be sent back and forth in a chat, and it already blurs all images by default.
In an official press release, Andreev shares more about the new feature:
“The safety of our users is without question the number one priority in everything we do and the development of ‘Private Detector’ is another undeniable example of that commitment. The sharing of lewd images is a global issue of critical importance and it falls upon all of us in the social media and social networking worlds to lead by example and to refuse to tolerate inappropriate behaviour on our platforms.”
Alongside the new feature, Herd has been working with Texas legislators to develop and pass a bill that would make the sharing of unsolicited nude photos a crime. The Verge reports that, if passed, the bill would make sending such photos punishable by a fine of up to $500 USD.
The “Private Detector” feature will officially launch on Bumble in June.