Ofcom has issued new guidelines calling on social media platforms to take stronger action against online abuse directed at women and girls, including tougher measures to prevent coordinated “pile-ons.”
The recommendations, released on Tuesday under the Online Safety Act (OSA), outline practical steps for tech companies to curb misogynistic harassment, coercive control and the non-consensual sharing of intimate images.
Although the guidance is voluntary, Ofcom warned that failure to comply could lead to formal proposals to strengthen the law.
A key measure urges platforms such as X to introduce limits on the number of replies a post can receive, aiming to reduce situations where a user is overwhelmed by abusive responses.
Ofcom also wants platforms to deploy hash-matching technology to swiftly detect and remove intimate images shared without consent, including revenge porn and explicit deepfakes.
The technology works by comparing reported images against a database of digital hashes, enabling platforms to automatically remove harmful content.
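For illustration only (the article does not describe any platform’s actual implementation), a minimal sketch of that matching step might look like the following, using an exact SHA-256 digest of the image bytes; real deployments typically rely on perceptual hashes that survive resizing and re-encoding, and the class and method names here are hypothetical.

```python
# Illustrative sketch only: exact-match hashing of reported images.
# Real systems use perceptual hashes that tolerate minor edits; the
# names below are hypothetical and not any platform's real API.
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Compute a digest of an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()


class HashMatcher:
    """Stores digests of reported images and flags re-uploads."""

    def __init__(self) -> None:
        self._known_hashes: set[str] = set()

    def register_reported_image(self, image_bytes: bytes) -> None:
        # Called when an intimate image is reported as shared without consent.
        self._known_hashes.add(image_hash(image_bytes))

    def should_block(self, uploaded_bytes: bytes) -> bool:
        # Called on each new upload before it is published.
        return image_hash(uploaded_bytes) in self._known_hashes


if __name__ == "__main__":
    matcher = HashMatcher()
    reported = b"example reported image bytes"
    matcher.register_reported_image(reported)
    print(matcher.should_block(reported))          # True: exact re-upload caught
    print(matcher.should_block(b"other content"))  # False: not in the database
```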
Other recommendations include prompts discouraging users from posting abusive comments, time-outs for repeat offenders, blocking misogynistic users from earning ad revenue, and tools that let users mute or block multiple accounts at once.
Dame Melanie Dawes, Ofcom’s chief executive, said she had heard shocking accounts of abuse: “We are sending a clear message to tech firms to step up and act to protect their female users against the very real online risks they face today.”
Ofcom will publish a compliance report in 2027 and has warned that the OSA could be tightened if platforms fail to meet expectations.
However, Internet Matters, a children’s online safety nonprofit, cautioned that many companies may ignore the guidance unless it becomes mandatory.

