Report Harmful Content Releases Annual Report
Report Harmful Content (RHC) has just released its annual report on harmful content online. The report presents the results of mixed-methods research covering all cases handled during the service’s first year of operation (January–December 2019). In the year analysed, the RHC website received 9,282 visitors and practitioners dealt with 164 unique cases. The service’s popularity increased rapidly in September, following the official service launch, and continued to grow until the end of the year.
What was found?
Cases involving bullying and harassment were most common, followed by impersonation, abuse and threats.
RHC found that online harassment and abuse disproportionately affected women and were often perpetrated by ex-partners.
Three common trends were identified:
- A combination of impersonation, bullying and harassment, and privacy violation. This trend disproportionately affected women and intersected with offline domestic violence and coercive control.
- A combination of abuse, threats and hate speech. Within this trend, the most common type of hate speech reported was racism/xenophobia.
- Clients inadvertently viewing harmful content (e.g. violence or pornography) rather than being victims of, or witnesses to, targeted harmful behaviour.
Strengths of the service
- In the majority of instances, practitioners were able to directly assist clients in reporting harmful content online.
- In the remaining cases, the content was either deemed criminal or hosted on platforms with which RHC does not have partnerships. In these instances, practitioners provided advice and onward signposting.
- Of the content escalated to industry, 92% was successfully actioned (e.g. removed, restricted, or access regained), and 62% was actioned within 72 hours, demonstrating a high level of speed and efficiency.
- The service offered vital emotional support, alongside signposting to other agencies and services for either additional emotional support or practical assistance.
- The report identifies multiple ways in which the RHC service can be developed to respond to the growth and diversification of the reports it receives.
Emerging issues for the service
- Law enforcement action: 19% of RHC clients reported content which was deemed criminal and therefore referred to law enforcement. Of those clients, however, 47% (around 9% of all clients) got back in touch with RHC, often reporting that the police had dismissed their case and incorrectly informed them that the issue was non-criminal.
- Inconsistency: Responses from industry platforms often showed a lack of clarity around what type of content would be removed. This commonly occurred in cases involving a clash between characteristics protected under the Equality Act 2010, in particular gender reassignment and sex.
- Cultural and religious context: RHC dealt with a number of clients from particular cultural and religious backgrounds who reported the exposure of private and/or intimate material. This type of content often did not meet legal or platform thresholds for harmful content and, as such, there were issues in securing its removal and safeguarding clients.
- Mental health: One of the most significant issues identified was the widespread impact of online harms on mental health; 32% of RHC clients reported negative mental health impacts as a result of viewing or being the victim of harmful content online, with 13% reporting suicidal ideation.