Top Tips for Reporting Harmful Content Online
Social media has transformed communication, offering many benefits not only for young people but for all internet users. However, it can also expose users to harmful or worrying content such as cyberbullying, harassment, misinformation, and impersonation. As a partner in the UK Safer Internet Centre, SWGfL operates the Report Harmful Content service, which supports individuals in the UK aged 13 and over with reporting online harm.
This article looks at Report Harmful Content's top tips for anyone reporting harmful content online.
Stay Calm
We know that seeing harmful content online can be distressing, but the best thing you can do is remain calm. This helps you assess the situation carefully and determine the next steps. If the content involves someone being in immediate danger, a life-threatening situation, or forced participation in illegal behaviour, contact the police immediately by calling 999.
Gather Evidence
If the behaviour is harmful and worrying but not illegal, you can begin by gathering evidence.
Collecting evidence is crucial, especially since harmful content can be quickly deleted. Screenshots, saved messages, timestamps, URLs, and platform information are all helpful when making reports and seeking help.
This evidence is particularly important on end-to-end encrypted platforms such as WhatsApp, where the platform or police forces may not be able to access private messages. The more context you provide, the better services like Report Harmful Content can assist.
Use the Correct Reporting Channels
After gathering evidence, report the harmful content using the appropriate channels. You should start with the platform where the content was found, as most social media platforms have built-in reporting tools. If you're unsure how to report, Report Harmful Content offers platform-specific guidance on the best ways to report the different types of content you may see.
Choosing the right reporting category at this stage is important, as initial reports may be processed by automated systems.
If your report is unresolved after 48 hours, you can escalate the issue through Report Harmful Content, which will review your case and, if necessary, raise it with the platform. However, removal can't always be guaranteed and is at the discretion of the platform hosting the content.
Take Care of Yourself
Even after harmful content is addressed, you may still feel worried or upset. Report Harmful Content advises ceasing communication with the perpetrator, blocking accounts, and talking to someone you trust for support.
If you feel like you need to talk to someone about what you have seen or experienced online, you can find support from dedicated organisations here.
Use the Report Harmful Content Button
Schools and organisations can add the Report Harmful Content Button to their websites, enabling users to easily access expert advice and report harmful content. If you’re interested in downloading the Button for your organisation, you can find out more here. For more information, visit the Report Harmful Content website.