
Badoo Group Creates AI Software to Combat Unwanted Sexual Images

Badoo and Bumble have been working together to create AI-powered software which automatically detects when users are trying to send sexual images to one another.

The tool, known as ‘Private Detector’, is designed to blur any pictures of exposed male or female genitalia. The recipient can then choose to view the image, ignore it, or report it to one of Badoo’s 5,000 human moderators.

Private Detector will be made available on the Badoo, Bumble, Chappy and Lumen platforms in June. Badoo says initial testing showed the tool to be 98% accurate.
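Badoo has not published how ‘Private Detector’ works internally, but the flow described above (a classifier flags a likely explicit image, the app delivers it blurred, and the recipient decides what happens next) can be sketched roughly as follows. Everything in this sketch is hypothetical: the names, the placeholder classifier and the 0.9 threshold are illustrative assumptions, not details from Badoo.

```python
from dataclasses import dataclass
from enum import Enum, auto


class RecipientAction(Enum):
    VIEW = auto()     # unblur the image
    IGNORE = auto()   # leave it blurred and move on
    REPORT = auto()   # escalate to a human moderator


@dataclass
class IncomingImage:
    image_id: str
    pixels: bytes
    blurred: bool = False


# Assumed cut-off for flagging. Note the 98% figure Badoo cites is an
# accuracy claim, not a published decision threshold.
EXPLICIT_THRESHOLD = 0.9


def explicit_probability(pixels: bytes) -> float:
    """Stand-in for the real model: Private Detector's classifier is not
    public, so this placeholder simply flags everything for the demo."""
    return 1.0


def receive_image(image: IncomingImage) -> IncomingImage:
    """Blur the image before delivery if the classifier flags it."""
    if explicit_probability(image.pixels) >= EXPLICIT_THRESHOLD:
        image.blurred = True
    return image


def apply_choice(image: IncomingImage, action: RecipientAction) -> None:
    """Carry out the recipient's decision on a flagged image."""
    if action is RecipientAction.VIEW:
        image.blurred = False
    elif action is RecipientAction.REPORT:
        send_to_moderation_queue(image)
    # IGNORE: the image simply stays blurred


def send_to_moderation_queue(image: IncomingImage) -> None:
    """Hypothetical hand-off to the human moderation team."""
    print(f"Image {image.image_id} queued for human review")


if __name__ == "__main__":
    img = receive_image(IncomingImage("img-001", b"\x89PNG..."))
    print(img.blurred)  # True: delivered blurred
    apply_choice(img, RecipientAction.REPORT)
```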

Badoo Founder Andrey Andreev said in a statement: “The safety of our users is without question the number one priority in everything we do and the development of ‘Private Detector’ is another undeniable example of that commitment.

“The sharing of lewd images is a global issue of critical importance and it falls upon all of us in the social media and social networking worlds to lead by example and to refuse to tolerate inappropriate behavior on our platforms.”

Bumble CEO Whitney Wolfe Herd has been working with US legislators for several months to try to outlaw unsolicited sexual images, likening them to public nudity.

At the beginning of March, a new bill was introduced in Texas, the home of Bumble’s main office, which would make sharing sexually explicit content without consent a Class C misdemeanour. From 1st September 2019, anyone caught breaking the law could be fined up to $500.

A statement from Wolfe Herd read: “The digital world can be a very unsafe place overrun with lewd, hateful, and inappropriate behavior. There’s limited accountability, making it difficult to deter people from engaging in poor behavior.

“[‘Private Detector’ is] something that’s been important to our company from the beginning – and is just one piece of how we keep our users safe and secure.”

The UK Government is also reportedly planning to review whether “online flashing” should be considered a criminal offence. This comes after a government-funded study found that 41% of UK women aged 18-36 have received an unwanted sexual image.

Only 5% of men who took part in the same survey admitted to sending an unsolicited naked photo.

Instagram introduced a similar photo-blurring policy in early 2017. Posts may be blurred if they are reported by a user and moderators deem them to be sensitive content, though not in direct violation of the community guidelines.

The LGBTQ niche has had its own problems with unsolicited sexual images in the past. Adam Cohen Aslatei, an advisor to Badoo-backed Chappy, recently said that Grindr needs to clean up its overly sexualized community if it is to reach its potential as a leader in the modern gay dating landscape.


Dominic Whitlock

Dominic is the Editor for Global Dating Insights. Originally from Devon, England, he holds a BA in English Language & Linguistics from the University of Reading. He enjoys a variety of sports and has a further passion for film and music.
