Bumble blurs out unwanted pictures with new AI tool

Bumble has launched a new safety feature, and in this article we'll take a look at its new AI tool, called Private Detector.

Private Detector

Private Detector works by using AI to automatically blur a potential nude image shared within a chat on Bumble or Bumble For Friends. It then notifies you that you've been sent something detected as inappropriate, and it's up to you to decide whether to view or block the image. You can also easily report the image to Bumble, which says it doesn't tolerate bad behavior, including sending unsolicited obscene photos.
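To make that detect-and-blur flow concrete, here is a minimal Python sketch of the pattern described above. The nudity_score stub, the threshold, and the data structures are hypothetical placeholders for illustration; Bumble has not published the details of its production pipeline, so this is only a sketch of the general approach.

```python
from dataclasses import dataclass
from typing import Optional
from PIL import Image, ImageFilter

# Hypothetical threshold; Bumble has not published the cut-off it uses.
BLUR_THRESHOLD = 0.5

def nudity_score(image: Image.Image) -> float:
    """Placeholder for a lewd-image classifier.

    A real system would run a trained image-classification model here;
    this stub just returns a fixed score so the flow below is runnable.
    """
    return 0.9  # pretend the classifier flagged this image

@dataclass
class ScreenedImage:
    original: Image.Image  # full image, hidden until the recipient opts in
    preview: Image.Image   # what the chat UI actually shows
    flagged: bool          # whether the recipient gets a warning prompt

def screen_incoming_image(path: str) -> ScreenedImage:
    """Blur the preview of an incoming chat image if the classifier flags it."""
    img = Image.open(path)
    if nudity_score(img) >= BLUR_THRESHOLD:
        blurred = img.filter(ImageFilter.GaussianBlur(radius=30))
        return ScreenedImage(original=img, preview=blurred, flagged=True)
    return ScreenedImage(original=img, preview=img, flagged=False)

def handle_choice(msg: ScreenedImage, action: str) -> Optional[Image.Image]:
    """The recipient decides what to do with a flagged image: view, block, or report."""
    if action == "view" or not msg.flagged:
        return msg.original  # reveal the un-blurred image
    if action in ("block", "report"):
        return None          # keep it hidden; "report" would also alert moderators
    raise ValueError(f"unknown action: {action}")
```

Keeping both the original image and a blurred preview in the sketch means the app can show the warning prompt immediately and still reveal the picture instantly if the recipient chooses to view it.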

The Private Detector feature joins a roster of safety initiatives Bumble has rolled out since its founding in 2014 to help keep you safe while using Bumble Date, Bizz, and BFF. These include a ban on guns and other weapons in profile pictures, a ban on hate speech, and video chat and voice calls within the Bumble app so you can meet new people without sharing your phone number or email before you're ready. Bumble also uses photo verification to help confirm that a person looks like the photos on their profile.

While Private Detector is designed to help keep Bumble's community safe from unsolicited nudes within the app, the internet at large can feel like the wild west, with online harassment all but openly tolerated everywhere from social media DMs to AirDrop. What's more, when Bumble originally researched the issue, it found there was no legislation in place to deter this sort of digital indecent exposure, also known as "cyberflashing."

Share your thoughts on this new AI tool in my Telegram community, and let me know if you found this article helpful!
