Bumble Makes Nude-Detecting AI Open Source To Wipe Out More Unsolicited Pics

Since 2018, Bumble has been tackling a social phenomenon known as 'cyberflashing': the sending of unsolicited nude images online.

To address this issue, the company launched its Private Detector AI feature in its app a year later. The feature works by automatically blurring potentially nude images within chats; recipients can then choose to view or block the image, and can report the sender.
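The moderation flow described above, in which an incoming image is scored by a classifier, blurred when the score crosses a threshold, and left to the recipient to reveal or report, can be sketched in Python. This is a minimal illustration, not Bumble's actual code: the `nude_probability` stub stands in for a real trained classifier (Bumble's open-source Private Detector is a deep image model), and the 0.8 threshold and naive box blur are illustrative choices.

```python
# Sketch of a classify-then-blur moderation flow.
# The classifier stub and threshold are hypothetical placeholders;
# a production system would run a trained CNN on the image bytes.

def nude_probability(image):
    """Placeholder classifier: returns a score in [0, 1].

    Here the score is simply read from the test image dict; a real
    system would run model inference on the pixel data instead.
    """
    return image.get("score", 0.0)

def box_blur(pixels):
    """Naive 3x3 box blur over a 2D grid of grayscale values."""
    h, w = len(pixels), len(pixels[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            neighborhood = [
                pixels[ny][nx]
                for ny in range(max(0, y - 1), min(h, y + 2))
                for nx in range(max(0, x - 1), min(w, x + 2))
            ]
            out[y][x] = sum(neighborhood) // len(neighborhood)
    return out

def moderate(image, threshold=0.8):
    """Blur the image if the classifier flags it.

    The recipient's choice to reveal, block, or report would be
    handled by the chat UI, outside this function.
    """
    if nude_probability(image) >= threshold:
        return {"blurred": True, "pixels": box_blur(image["pixels"])}
    return {"blurred": False, "pixels": image["pixels"]}
```

Under this sketch, an image scoring below the threshold passes through untouched, while a flagged one arrives blurred and the unblurred original is never shown unless the recipient opts in.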

Hoping to protect more people online from these unwanted pictures, Bumble's data science team recently released a whitepaper detailing the inner workings of Private Detector, along with an open-source version that others in the technology community can now use.

