Bumble has made its cyberflashing detection tool open source, so that other technology services can adopt it themselves. The tool uses artificial intelligence to detect obscene images and automatically blurs them before alerting the recipient.
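The detect-then-blur flow described above can be sketched in a few lines. This is a hypothetical illustration, not Bumble's actual Private Detector code: `nsfw_score` stands in for the real image classifier, and a simple box blur stands in for whatever obfuscation a production app would apply.

```python
def box_blur(pixels, radius=1):
    """Blur a 2D grid of grayscale values with a simple box filter."""
    h, w = len(pixels), len(pixels[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            # Average every in-bounds neighbor within the radius.
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += pixels[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out

def moderate(pixels, nsfw_score, threshold=0.5):
    """Blur the image and flag it when the classifier score is high.

    `nsfw_score` is a placeholder for a trained lewd-image classifier;
    the returned flag is what would trigger the alert to the recipient.
    """
    if nsfw_score(pixels) >= threshold:
        return box_blur(pixels), True   # blurred, recipient alerted
    return pixels, False                # delivered unchanged
```

A caller would pass each incoming image through `moderate` before display; only images the classifier scores above the threshold are obscured, so ordinary photos pass through untouched.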
The tool, known as ‘Private Detector’, is now freely available so that other companies can use it to tackle abuse, Mashable reported.
“Open-sourcing this feature is about remaining firm in our conviction that everyone deserves healthy and equitable relationships, respectful interactions, and kind connections online”, said Rachel Haas, Bumble’s Vice President of Member Safety.
Bumble’s campaign against online sexual harassment goes beyond sharing this tool. The company has been lobbying for laws to make cyberflashing illegal in both the UK and the US.
A 2018 YouGov survey found that 4 in 10 women aged 18-34 had been sent unsolicited sexual images.