Bumble’s new AI safety feature, Private Detector, was officially rolled out to the social app over the weekend.
The feature, announced in April, is designed to protect users from unsolicited sexual images.
Pictures sent via Bumble’s in-app chat that are believed to contain exposed male or female genitalia are now automatically blurred. The recipient can then choose to view, report or ignore the image.
During initial testing, Private Detector was reported to work with 98% accuracy.
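The detect-and-blur flow described above can be sketched in a few lines. Everything here is an illustrative assumption — the classifier score, the `0.5` threshold, and the `Message` structure are invented for the example and are not Bumble’s actual implementation:

```python
# Minimal sketch of a detect-then-blur moderation flow, as described in the
# article. All names and the threshold are assumptions, not Bumble's code.
from dataclasses import dataclass

THRESHOLD = 0.5  # assumed confidence cutoff for flagging an image


@dataclass
class Message:
    image_id: str
    lewd_score: float        # hypothetical classifier output in [0, 1]
    blurred: bool = False
    status: str = "pending"  # pending -> viewed / reported / ignored


def deliver(msg: Message) -> Message:
    """Blur the image before delivery if the classifier flags it."""
    if msg.lewd_score >= THRESHOLD:
        msg.blurred = True
    return msg


def recipient_action(msg: Message, choice: str) -> Message:
    """The recipient of a flagged image chooses what happens next."""
    if choice == "view":
        msg.blurred = False
        msg.status = "viewed"
    elif choice == "report":
        msg.status = "reported"
    elif choice == "ignore":
        msg.status = "ignored"
    return msg


flagged = deliver(Message("img_1", lewd_score=0.97))
print(flagged.blurred)   # the image arrives blurred
handled = recipient_action(flagged, "report")
print(handled.status)    # the recipient's choice is recorded
```

The key point the article makes is that the decision stays with the recipient: the system only blurs, and viewing, reporting or ignoring remains a user choice.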
Founder and CEO Whitney Wolfe Herd said in a statement: “We were always cognizant that the sending of unsolicited nude images was a critical problem, especially for young women.
“The ‘Private Detector’ was designed to deter and prevent such behavior in the same manner that our other safety features are designed to combat issues like abuse, catfishing, and ghosting, and you can expect us to continue to invest heavily in technologies that champion user safety above all else.”
The new feature is expected to be introduced to the Badoo and Chappy apps at some point in the future. Lumen, the final product in MagicLab’s portfolio, doesn’t allow images to be sent.
A 2018 UK government survey found that 41% of millennial women have received an unwanted sexual image.
The sending of unsolicited lewd images is now classified as a Class C misdemeanour in Texas, after Wolfe Herd helped introduce a new bill that took effect on 1 September 2019.