Reporting Racism Online
Following the Euro 2020 Final on Sunday 11th July, the internet saw a huge range of support and celebration for the players who worked extremely hard to reach a major tournament final. Unfortunately, there was also a wave of negative and hate-filled messages, shared both online and offline, directed towards Black players. The racist abuse aimed at the England players on social media is utterly appalling and completely unacceptable.
Our mission is to make the internet a safe and great place for everyone, and the behaviour we have seen on social media needs to change. Social media platforms need to work harder to stop the spread of racism online, and more needs to be done to hold those sharing these abhorrent views to account.
“Protect equal rights and opportunities online and offline: Everyone should feel that they are welcome, celebrated, fairly represented and given a safe space to be themselves online. In the last month, 4 in 10 UK young people have seen people bullying or attacking someone online because of their sexuality, race, religion, disability or gender identity. But this isn’t just an online issue. Government should ensure all children are given equal rights online and offline.”
We are hopeful that the Government’s Online Harms Bill will play a significant part in tackling racist behaviour online. However, this step needs to be matched by equally significant efforts: ensuring users have the knowledge and skills to use these services safely, being clear about what is and is not acceptable on them, and making sure users know how to report online hate when they see it, where to seek support, and what to expect from these processes. We want to make sure that education is not an afterthought in this area, or in online safety as a whole.
What you can do if you see racism online
As users of the internet, we all have a part to play in reporting inappropriate or harmful behaviour online, including racist comments or accounts. 80% of teens have seen online hate aimed at a particular group in the past year, so it is vital that all of us know how to report it and can help young people to do the same. Each platform has its own mechanism for doing this, but most racist content on social media can be reported as hate speech. Find out more about how to report below:
Report Harmful Content platform
Report Harmful Content is a national reporting centre designed to assist everyone in reporting harmful content online. It empowers anyone who has come across harmful content to report it, by providing up-to-date information on community standards and direct links to the correct reporting facilities across multiple platforms.
Reporting racism on Twitter:
To report a tweet or reply for racism on Twitter, click on the three dots to the right of the post. From here you can select the reason for the report, such as ‘it is abusive or harmful’, and then the option ‘it directs hate against a protected category’.
Reporting racism on Facebook:
The best way to report abusive content or spam on Facebook is by using the Report link near the content itself. This should appear as three dots on the right of the comment or post you want to report. From there, select ‘find support or report post’ and then ‘hate speech’.
Reporting racism on Instagram
To report a comment on Instagram, press and hold the comment you want to report and slide it to the left. Once you have done this, a speech bubble with an exclamation mark should appear. From here you can tap ‘report this comment’ and choose the reason for the report.
To report a Story or post on Instagram, click on the three white dots in the upper right-hand corner while viewing it and select ‘report’. The reporting options include ‘hate speech or symbols’.
Reporting racism on TikTok
To report a video or comment on TikTok, press and hold your finger on the content you want to report. A report option should appear, with reasons including ‘hate speech’ and ‘harassment and bullying’.
If you feel that your report hasn’t been actioned
If you have reported racist content directly to a platform and feel your report has not been actioned, you can escalate it through the Report Harmful Content platform described above.
Places of support or resources
In the UK there are a number of great charities and initiatives doing work around online hate. Find out more about these here:
Kick It Out – English football’s equality and inclusion organisation, working throughout the football, educational and community sectors to challenge discrimination, encourage inclusive practices and campaign for positive change.
Glitch – a UK charity that wants to make the Internet a safer place for everyone. Through campaigns, advocacy and impactful educational programmes, their aim is to transform the narrative around online abuse.
Show Racism the Red Card – the UK’s largest anti-racism educational charity, delivering educational workshops to young people and adults in schools, workplaces and at events held in football stadiums.
The Anti-Bullying Alliance – a unique coalition of organisations and individuals working together to achieve their vision: to stop bullying and create safer environments in which children and young people can live, grow, play and learn.
Stop Hate UK – working to challenge all forms of Hate Crime and discrimination, based on any aspect of an individual’s identity. Stop Hate UK provides independent, confidential and accessible reporting and support for victims, witnesses and third parties.
Talk it over – a research-led resource designed to support educators in facilitating empathetic, honest and evidence-based conversations with secondary-aged pupils about online hate and how to tackle it.