Burn survivors and individuals with facial differences have come forward to share their experiences of censorship online. They report that AI tools automatically take down their photos, flagging them as “graphic content”.
Tonya Meisenbach is a burn survivor who decided to share her story. Despite earning brand deals with L’Oréal among others, she noticed that her TikTok videos were being taken down almost immediately after being posted.
Refinery29 investigated and found TikTok’s justification was that it does “not allow content that is excessively gruesome or shocking, especially that promotes or glorifies abject violence or suffering”. Understandably, this was distressing for Meisenbach to hear.
“We’ve been aware of people having their photos and accounts removed or blurred, dating back about five years now,” said Phyllida Swift, the president of Face Equality International, which works towards equality for individuals with facial differences.
Swift explained that when AI models are trained to identify human faces, they are presented with thousands of examples, but few, if any, depict burn survivors or individuals with other facial differences.
But the problem isn’t limited to TikTok; it can happen with any AI-powered moderation tool.
Refinery29 reports that Bumble and Tinder have also removed photos in which facial differences are visible. Joshua Dixon, who lost 80% of his face in a childhood incident, shared that both dating apps had removed photos he posted on his profile.
Not only has Dixon faced abuse from users on dating apps, but to have the platforms themselves censor him is adding insult to injury.
To be fair, Bumble has done positive work to highlight the issues caused by AI technology. Still, a spokesperson for the dating app acknowledged that errors happen: “Bumble prioritizes a safe and empowering community through a combination of automated features and a dedicated human support network. We strive to continuously improve our systems. Despite our efforts, mistakes may occur.”