The Charity Commission has updated its guidance on charities’ use of social media following the introduction of the Online Safety Act 2023 (OSA).
The changes focus on charities that operate online forums, messaging services or other digital platforms where users can communicate with one another and share content.
Many charities now use these channels to support beneficiaries, provide advice and build online communities. Services such as live chat, peer support forums and instant messaging are becoming increasingly common, particularly where users may prefer digital communication over telephone or face-to-face contact.
However, platforms that allow user interaction also create risk. Charities may need to deal with harmful content, safeguarding concerns, abusive behaviour or unlawful material posted by users.
The updated guidance makes clear that charities must actively consider these risks and ensure appropriate policies, moderation and reporting processes are in place.
What charities need to consider
Charities operating online forums or messaging services should first assess whether the OSA applies to their platform.
Ofcom’s online regulation checker can help organisations determine whether their service falls within scope.
Where the legislation applies, charities may need to
- carry out an illegal content risk assessment
- assess whether children are likely to access the service
- introduce age assurance or access controls where appropriate
- implement measures to protect children from harmful content
- maintain records and regularly review safety measures
These obligations may apply to closed community forums as well as publicly accessible platforms.
Reviewing social media and moderation policies
The Charity Commission expects charities to ensure trustees, staff and volunteers understand how online channels are managed and moderated.
Policies should clearly set out
- acceptable use standards
- reporting and escalation procedures
- moderation responsibilities
- safeguarding measures
- when content should be removed or users blocked
Charities should also consider whether additional moderation tools are needed. This may include
- pre-approval settings for comments or posts
- keyword filtering
- restricting who can comment
- automated moderation tools
- user reporting functions
For some organisations, additional staff training may also be appropriate, particularly where teams are regularly exposed to distressing or abusive content.
Safeguarding and reputation
Online spaces can provide valuable support for beneficiaries and service users. They can also expose charities to reputational, regulatory and safeguarding risks if not properly managed.
Trustees should be satisfied that
- online activity supports the charity’s purposes
- users are kept safe
- policies remain up to date
- rules on campaigning and political activity are understood
- complaints and harmful content are dealt with appropriately
The Charity Commission’s wider guidance on social media continues to apply, including the requirement for trustees to act in the charity’s best interests and comply with relevant laws.
How we can help
Our charities team advises charities, trustees and not-for-profit organisations on governance, safeguarding, regulatory compliance and risk management.
If you would like advice on reviewing your social media policies or understanding your obligations under the Online Safety Act, please contact the team.