This is why we launched Report Harmful Content

06 Dec 2019 UK SIC

Report Harmful Content (RHC) is the UK’s national reporting centre for harmful online content. Thanks to funding from the EU Commission and the work of practitioners from UK Safer Internet Centre partner SWGfL, we have been trialling, testing, and fine-tuning the service since January 2019 and, as of December 2019, the service is officially launched and fully functional!

It has taken a lot of hard work, lessons learned, and development, but we are delighted with the service and proud to have helped hundreds of clients over the last year.

We had a brilliant time at the official launch of Report Harmful Content, sharing the results of our year in beta testing with representatives from industry, government, and other NGOs.

There was unanimous agreement after the event that the highlight was the anonymised victim impact statement shared by Elena, one of our helpline practitioners. This testimonial from one service user moved everyone in the room and demonstrated the real-world value of the service.

Sonny’s story – the power of removing harmful content

Sonny* was 2 when I first saw his video online. His case, and his mum’s continued fight to rid the internet of harmful content relating to him, is the epitome of why Report Harmful Content exists.

Sonny’s mum had previously come to the Professionals Online Safety Helpline (POSH), but there was a limit to what we could do to support her. We didn’t want to turn her away, but she didn’t fall within the remit of our work or funding. We encouraged her to reach out to the professionals working alongside Sonny (teachers, social workers) and ask them to contact us on her behalf, but what followed was a back-and-forth about reporting processes that only Sonny’s mum herself could complete. We only heard updates second-hand, whenever a third party had the chance to pass them on, and nobody was giving or getting what they wanted. We wanted to do better, and we knew we could.

Sonny is now 6, and content showing him resurfaces every year in response to a specific event, following him throughout his life. I cannot begin to imagine the impact this has on him or his family.

Every year, Sonny’s mum reached out to children’s workforce professionals, who would in turn contact POSH on her behalf, and each time we spent months going back and forth on her case.

This year, that changed thanks to the launch of RHC. Using the RHC website, Sonny’s mum was able to find the correct privacy reporting forms for every social media platform she needed, and to escalate cases to us when content wasn’t removed. We could finally support her directly, and this resulted in much quicker takedowns of all the content. I don’t think this will be the last time I see Sonny’s video but I know, thanks to RHC, it will be the last time it takes months to remove it.

We are living in a time of great change and even bigger consequences: climate change, privacy violations, Brexit (to name just a few). It’s only right that one of the greatest changes of the last few decades would bring with it enormous consequences.

* Name changed to protect privacy

Research reveals 52% of content reported to RHC involved harassment or bullying

Sonny’s story is moving, but it wasn’t the only point of interest on the day.

Having operated for almost a year in its beta phase, RHC supported hundreds of clients, and its practitioners were able to observe and analyse patterns and common themes in the cases they received.

An analysis of the cases from the last year painted a fascinating picture of the state of online harms and the way digital content reflects this.

Some of the highlights from the research include:

Most common harms reported

  • Bullying and harassment – 52% of reports
  • Impersonation – 32% of reports
  • Online abuse – 23% of reports
  • Threats – 23% of reports

Least common harms reported

  • Suicide and/or self-harm content – 3% of reports
  • Unwanted sexual advances – 1% of reports
  • Violent content – 1% of reports
  • Pornographic content – 1% of reports

Demographics

  • Female – 59%
  • Male – 35%
  • Not stated – 6%

  • 13-18 years old – 23%
  • 19-30 years old – 31%
  • 31-50 years old – 43%
  • Over 50 years old – 3%

Cases involving several harms

The research into Report Harmful Content case data revealed two clusters of related harms.

A remarkable 70% of all reports to the service could be categorised into just two clusters.

Impersonation, privacy violations, and bullying and harassment

  • Comprised 52% of cases
  • Most commonly occurred on Instagram
  • 65% of reports made by females
  • Perpetrator was known to our client in 82% of cases

Abuse, threats, and hate speech

  • Comprised 18% of cases
  • Most commonly occurred on Facebook
  • 50/50 gender split
  • 92% of reports were made by bystanders, not victims, with the perpetrator often unknown to our client

What’s next for Report Harmful Content – and how can you help?

World domination!

Or maybe just continuing to support victims of harmful content online.

Now that Report Harmful Content is out of beta testing and fully operational, we will be helping more people with removing and reporting content that contravenes platforms’ community guidelines.

Our partnerships with industry are getting stronger and stronger, and we have the support of so many fantastic organisations and people – and you’re one of them.

So we need you to keep on supporting us. Spread the word: signpost to us from your websites, in your staffrooms, and on your social media pages.

However you do it, we need you to tell the world about Report Harmful Content!

You might reach someone like Sonny’s mum.

And you might turn their life around.
