Ofcom Consults on Rules for Cyberflashing, Self-Harm Content
Ofcom has launched a public consultation proposing updates to its regulatory guidance and codes of practice to address two newly designated priority offences under the UK’s Online Safety Act: cyberflashing (unsolicited nude images) and encouraging or assisting serious self-harm.
In December 2025, the Government added these offences to the list of over 130 priority offences under the Act. As a result, regulated online services – including social media, dating, gaming, and messaging apps – must now assess the risk of such content appearing on their platforms and implement appropriate safety measures to mitigate it.
Ofcom is consulting on revisions to its Risk Assessment Guidance, Risk Profiles, Register of Risks, Illegal Content Judgements Guidance, and Illegal Content Codes of Practice. The proposals suggest applying several existing measures to these new offences, such as:
- Easy-to-use reporting and complaints systems for illegal content.
- Properly resourced and trained content moderation teams.
- Swift removal of illegal content once detected.
- Algorithm testing to evaluate how design changes affect the risk of harmful content being recommended.
- Options for users to block, mute, or disable comments.
- Crisis prevention information in response to self-harm-related searches.
- Easy reporting of problematic predictive search suggestions.
The consultation is open until 5pm on Friday, April 24, 2026. Responses can be submitted via email or the online form. Ofcom expects to publish final decisions in summer 2026.

Many platforms already fall under the user-to-user service rules and must now explicitly evaluate risks related to cyberflashing and self-harm encouragement. Failure to conduct adequate risk assessments or implement sufficient protections could lead to enforcement action, including fines or service restrictions.