Social Media Speech Codes Have ‘Consistency’ Problems
Attempts to censor content on platforms like Facebook and YouTube have come under fire this year for a lack of transparency and consistency.
YouTube, for example, recently removed a video debunking a Sandy Hook massacre conspiracy theory while leaving the theory itself online.
Engadget says such decisions are “causing legitimate concerns about censorship and denying the media the tools they need for accurate reporting.”
Content from pro-ISIS or Neo-Nazi channels is easy to find on YouTube, as is pornographic content, but profiles of some specific criminals have disappeared after they received publicity.
Facebook has a history of evading questions about its removals policy, notably around decisions to restrict certain livestreams.
Bumble is known for intervening against users who insult women on its dating service, publicly shaming or banning them.
It is often unclear whether the motivation for censoring content is to stop radical online behaviour or to protect a company’s reputation, Engadget notes.
The article concludes by focusing on murderers’ online presence: “By controlling access to the posts and videos made by society’s monsters, social media companies are effectively controlling the narrative of what transpired and presenting a sterilized version of events wherein they are never taken to task over the role that their services played in the shooters’ development and radicalization.”
Read more here.