Meta Announces New Measures to Protect Teens from Harmful Content Online

22 Jan 2024 UK SIC

Meta has announced new content policies to help protect teenagers from viewing harmful content across Facebook and Instagram. The updates will place all users under 18 into the most restrictive content control settings on Instagram and Facebook. Additionally, harmful search terms will be restricted on Instagram Search, and teenagers will receive notifications prompting them to strengthen their privacy settings on Instagram.

These changes aim to make it harder for young people to access sensitive content through the apps, with Meta specifically focusing on keeping self-harm and suicide-related content out of individuals' feeds. Meta announced in their latest blog post:

“While we permit individuals to share content discussing their own struggles with suicide, self-harm, and eating disorders, our policy is not to recommend this content, and we have been concentrating on ways to make it more difficult to locate. When people search for terms related to suicide, self-harm, and eating disorders, we will begin hiding these related results and redirect them to expert resources for help.”

Report Harmful Content

The recent updates from Meta mark a positive step in reinforcing efforts to protect young people and adults from harmful online content. Anyone in the UK aged 13 and over who has encountered harmful content online can use Report Harmful Content, a national service operated by SWGfL, to report content across major social media platforms. The service also provides information about additional support available to those affected by harmful content.

For additional guidance on updating Facebook and Instagram privacy settings, SWGfL’s Social Media Checklists offer step-by-step instructions to help keep social media accounts secure.
