Appropriate filtering

Establishing appropriate levels of filtering

Guide for education settings and filtering providers about establishing ‘appropriate levels of filtering’

Accredited Filtering System

The UK Safer Internet Centre is pleased to announce that, for the new year, we are releasing a new accreditation process for filtering providers. Given the critical role that filtering and monitoring services play in maintaining a safe and secure online environment in educational institutions, a comprehensive accreditation process is essential. The aim is to offer a credible, robust, and transparent benchmark against which to assess and validate the platforms schools may choose to implement. This accreditation seeks to validate the performance of providers’ systems against the UK Safer Internet Centre’s definitions. This information should help schools to audit their current provision, make informed decisions, and review market solutions.

Schools in England (and Wales) are required to ensure children are safe from terrorist and extremist material when accessing the internet in school, including by establishing appropriate levels of filtering. Furthermore, the Department for Education’s statutory guidance ‘Keeping Children Safe in Education’ obliges schools and colleges in England to “ensure appropriate filters and appropriate monitoring systems are in place and regularly review their effectiveness”, and states that they “should be doing all that they reasonably can to limit children’s exposure to the above risks from the school’s or college’s IT system”. However, schools will need to “be careful that ‘over blocking’ does not lead to unreasonable restrictions as to what children can be taught with regards to online teaching and safeguarding”. Ofsted concluded as far back as 2010 that “Pupils in the schools that had ‘managed’ systems had better knowledge and understanding of how to stay safe than those in schools with ‘locked down’ systems. Pupils were more vulnerable overall when schools used locked down systems because they were not given enough opportunities to learn how to assess and manage risk for themselves.”

To further support schools and colleges in England to meet digital and technology standards, the Department for Education published its Filtering and Monitoring Standards in March 2023 (as part of a broader suite of educational technology standards and guidance). As well as covering aspects of both filtering and monitoring systems, these standards detail the allocation of roles and responsibilities, and state that schools and colleges should be checking their filtering and monitoring provision at least annually. These standards were incorporated into Keeping Children Safe in Education in 2023.

The Welsh Government has published a common set of agreed standards for internet access, which gives schools the tools to make informed choices about filtered provision, whether delivered by the local authority or another provider.

As previously set out in the Scottish Government’s national action plan on internet safety, schools in Scotland are expected to “have policies in place relating to the use of IT and to use filtering as a means of restricting access to harmful content.”

The aim of this document is to help education settings (including early years settings, schools and FE) and filtering providers understand what should be considered ‘appropriate filtering’.

It is important to recognise that no filtering system can be 100% effective; every system needs to be supported by good teaching and learning practice and effective supervision. As such, filtering systems should be recognised as one of the tools used to support and inform the broader safeguarding provision in settings.

Illegal online content

The Online Safety Act now sets out the kinds of illegal content and activity, which include content relating to:

Child sexual abuse: Content that depicts or promotes sexual abuse or exploitation of children, which is strictly prohibited and subject to severe legal penalties.

Controlling or coercive behaviour: Online actions that involve psychological abuse, manipulation, or intimidation to control another individual, often occurring in domestic contexts.

Extreme sexual violence: Content that graphically depicts acts of severe sexual violence, intended to shock or incite similar behaviour, and is illegal under UK law.

Extreme pornography: Pornographic material portraying acts that threaten a person’s life or could result in serious injury, and is deemed obscene and unlawful.

Fraud: Deceptive practices conducted online with the intent to secure unfair or unlawful financial gain, including phishing and scam activities.


Racially or religiously aggravated public order offences: Content that incites hatred or violence against individuals based on race or religion, undermining public safety and cohesion.

Inciting violence: Online material that encourages or glorifies acts of violence, posing significant risks to public safety and order.

Illegal immigration and people smuggling: Content that promotes or facilitates unauthorised entry into a country, including services offering illegal transportation or documentation.

Promoting or facilitating suicide: Material that encourages or assists individuals in committing suicide, posing serious risks to vulnerable populations.

Intimate image abuse: The non-consensual sharing of private sexual images or videos, commonly known as “revenge porn,” intended to cause distress or harm.

Selling illegal drugs or weapons: Online activities involving the advertisement or sale of prohibited substances or firearms, contravening legal regulations.

Sexual exploitation: Content that involves taking advantage of individuals sexually for personal gain or profit, including trafficking and forced prostitution.

Terrorism: Material that promotes, incites, or instructs on terrorist activities, aiming to radicalise individuals or coordinate acts of terror.


Schools should satisfy themselves that their filtering system manages this type of content, specifically that the filtering providers:

  • Are IWF members and use IWF services to block access to illegal Child Sexual Abuse Material (CSAM)
  • Integrate the police assessed list of unlawful terrorist content, produced on behalf of the Home Office

Schools should ensure that these blocklists (IWF and CTIRU) are included within their filtering system and that no one in the school or college (including any system administrator) is able to disable these blocklists or remove items from them.
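
As a minimal illustration of this requirement (a sketch only, not any vendor’s implementation; the domain names and list entries below are hypothetical), the statutory blocklists should sit above, and be checked independently of, any local overrides, so that an administrator’s allow-list can never unblock an entry on them:

```python
# Sketch: statutory blocklists take precedence over all local overrides.
IWF_CTIRU_BLOCKLIST = {"example-illegal.invalid"}  # placeholder; real lists arrive via secure feeds
ADMIN_ALLOW_LIST = {"example-illegal.invalid"}     # even if an admin allow-lists it...
ADMIN_BLOCK_LIST = set()

def is_blocked(domain: str) -> bool:
    """Statutory blocklists are consulted first and cannot be overridden."""
    if domain in IWF_CTIRU_BLOCKLIST:
        return True  # non-overridable: local allow-lists are never consulted
    if domain in ADMIN_BLOCK_LIST:
        return True
    if domain in ADMIN_ALLOW_LIST:
        return False
    return False  # a real system would fall through to category-based filtering

print(is_blocked("example-illegal.invalid"))  # True, despite the allow-list entry
```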

Inappropriate Online Content

Recognising that no filter can guarantee to be 100% effective, schools should be satisfied that their filtering system additionally manages the following inappropriate content (and web search), including ‘Primary Priority Content’ and ‘Priority Content’ (as described by the Online Safety Act):

  • Gambling – enables gambling
  • Hate Speech/ Discrimination – Content that expresses hate or encourages violence towards a person or group based on something such as race, disability, religion, sex, or sexual orientation, or promotes the unjust or prejudicial treatment of people with protected characteristics under the Equality Act 2010
  • Harmful Content – Content that is bullying, abusive or hateful.  Content which depicts or encourages serious violence or injury.  Content which encourages dangerous stunts and challenges; including the ingestion, inhalation or exposure to harmful substances.
  • Malware / Hacking – promotes the compromising of systems including anonymous browsing and other filter bypass tools as well as sites hosting malicious content, including ransomware and viruses.
  • Mis/Disinformation – Promotes or spreads false or misleading information intended to deceive, manipulate, or harm, including content undermining trust in factual information or institutions
  • Piracy and copyright theft – includes illegal provision of copyrighted material
  • Pornography – displays sexual acts or explicit images and text
  • Self-Harm and eating disorders – content that encourages, promotes, or provides instructions for self-harm, eating disorders or suicide
  • Violence against women and girls (VAWG) – Promotes or glorifies violence, abuse, coercion, or harmful stereotypes targeting women and girls, including content that normalises gender-based violence or perpetuates misogyny.

This list should not be considered exhaustive, and providers will be able to demonstrate how their system manages this content and many other aspects.

Regarding the retention of logfiles (internet history): as the data controller, schools should understand their filtering provider’s data retention policies, including the duration for which all data is retained, and should have associated data sharing agreements in place. Logfiles (internet history) should include the identification of individuals and/or devices.

Providers should be clear about how their system avoids over-blocking access, so that it does not lead to unreasonable restrictions. The Welsh Government highlights that “It is critical that filtering standards are fit for purpose for 21st century teaching and learning, allowing the access schools require whilst still safeguarding children and young people.”

Given the extent of personal data involved with some filtering systems, schools and colleges should consider undertaking a Data Protection Impact Assessment and ensure that this aligns with their organisational policies.

Filtering system features

Additionally, and in the context of their safeguarding needs, schools should consider the required features of their filtering system:

  • Context appropriate differentiated filtering, based on age, vulnerability and risk of harm – including the ability to vary filtering strength as appropriate for staff
  • Circumvention – the extent and ability to identify and manage technologies and techniques used to circumvent the system, for example VPNs, proxy services, DNS over HTTPS (DoH) and Encrypted Client Hello (ECH); a minimal DoH sketch follows this list.
  • Control – the ability, and ease of use, for schools to control the filter themselves to permit or deny access to specific content. Any changes to the filtering system should be logged, enabling an audit trail that ensures transparency and that individuals are not able to make unilateral changes.
  • Contextual Content Filters – in addition to URL or IP based filtering, schools should understand the extent to which (http and https) content is dynamically analysed as it is streamed to the user in real time and blocked. This would include AI or user generated content, for example being able to contextually analyse text and dynamically filter the content produced (for example, by ChatGPT). Where a school’s strategy or policy allows the use of AI or user generated content, understanding the technical limitations of the system, such as whether it supports real-time filtering, is important.
  • Deployment – filtering systems can be deployed in a variety (and combination) of ways (e.g. on device, network level, cloud, DNS).  Providers should describe how their systems are deployed alongside any required configurations and/or limitations.  As technology and security standards evolve, relying solely on network-level filters may become increasingly challenging and less effective. Schools might consider combining network-level filtering with device-level configurations tailored to school-owned and managed devices.
  • Filtering Policy – the filtering provider publishes a rationale that details their approach to filtering with classification and categorisation as well as how the system addresses over blocking
  • Group / Multi-site Management – the ability for deployment of central policy and central oversight or dashboard
  • Identification – the filtering system should have the ability to identify users and devices to attribute access (particularly for mobile devices) and allow the application of appropriate configurations and restrictions for individual users.  This would ensure safer and more personalised filtering experiences.
  • Mobile and App content – mobile and app content is often delivered via entirely different mechanisms from content delivered through a traditional web browser. Providers should be clear about the extent to which their filtering system blocks inappropriate content delivered via mobile and app technologies (beyond typical web browser delivered content), and about any configuration or component requirements needed to achieve this.
  • Multiple language support – the ability for the system to manage relevant languages
  • Remote devices – with many children and staff working remotely, the ability for school owned devices to receive the same or equivalent filtering to that provided in school
  • Reporting mechanism – the ability for users to report content that they believe should be blocked or unblocked
  • Reports – the system offers clear granular, historical information on the websites users have accessed or attempted to access
  • Safe Search – the ability to enforce ‘safe search’ when using search engines; a DNS-level sketch of this follows this list
  • Safeguarding case management integration – the ability to integrate with school safeguarding and wellbeing systems to better understand context of activity
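
On circumvention (see the ‘Circumvention’ feature above), one common approach, sketched here under an assumed deny-list model rather than any particular product’s method, is to block the hostnames of well-known public DNS-over-HTTPS resolvers so that browsers fall back to the filtered local resolver. The resolver names below are real public services; the function itself is illustrative:

```python
# Sketch: DoH deny-list check. Real systems combine this with SNI
# inspection, IP blocking and managed-device (MDM) browser policy.
KNOWN_DOH_RESOLVERS = {
    "dns.google",
    "cloudflare-dns.com",
    "mozilla.cloudflare-dns.com",
    "dns.quad9.net",
}

def should_block_doh(hostname: str) -> bool:
    """Return True for DNS queries or TLS connections to known DoH endpoints."""
    hostname = hostname.lower().rstrip(".")
    return hostname in KNOWN_DOH_RESOLVERS or any(
        hostname.endswith("." + resolver) for resolver in KNOWN_DOH_RESOLVERS
    )

print(should_block_doh("mozilla.cloudflare-dns.com"))  # True
print(should_block_doh("www.example.org"))             # False
```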
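For Safe Search, the major search engines publish special hostnames, for example forcesafesearch.google.com (Google) and strict.bing.com (Bing), that a school’s resolver can map search traffic onto, typically via a CNAME-style rewrite. The sketch below mimics such a rewrite table; verify current hostnames against each search engine’s own documentation before deploying:

```python
# Sketch: DNS rewrite table for enforcing safe search; in practice this is
# configured on the school resolver (e.g. via RPZ or CNAME rules).
SAFE_SEARCH_REWRITES = {
    "www.google.com": "forcesafesearch.google.com",
    "www.bing.com": "strict.bing.com",
    "www.youtube.com": "restrict.youtube.com",  # YouTube Restricted Mode
}

def rewrite_query(qname: str) -> str:
    """Return the hostname the resolver should answer for instead."""
    return SAFE_SEARCH_REWRITES.get(qname.lower().rstrip("."), qname)

print(rewrite_query("www.google.com"))  # forcesafesearch.google.com
```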

Schools and Colleges should ensure that there is sufficient capability and capacity in those responsible for, and those managing, the filtering system (including any external support provider). The UK Safer Internet Centre Helpline may be a source of support for schools looking for further advice in this regard.

Filtering systems are only ever a tool in helping to safeguard children when online and schools have an obligation to “consider how children may be taught about safeguarding, including online, through teaching and learning opportunities, as part of providing a broad and balanced curriculum”. To assist schools and colleges in shaping an effective curriculum, UK Safer Internet Centre has published ProjectEVOLVE.

Risk assessment

UK Safer Internet Centre recommends that those responsible for Schools and Colleges undertake (and document) an online safety risk assessment at least annually or whenever any substantive changes occur, assessing their online safety provision that would include filtering (and monitoring) provision. The risk assessment should consider the risks that both children and staff may encounter online, together with associated mitigating actions and activities.

A risk assessment module has been integrated into 360 degree safe, where schools can identify and record the risks posed by technology and the internet to their school, children, staff and parents.

Checks and documentation

Schools and Colleges should regularly check that their filtering and monitoring systems are effective and applied to all devices. Checks should be conducted when significant changes take place (for example, technology, policy or legislation), in response to incidents and at least annually. These checks should be recorded, including details about the location, device and user alongside the result and any associated action.

SWGfL testfiltering.com enables users to test fundamental capabilities of their filtering system and to inform improvements.
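
As a minimal sketch of how such checks might be recorded, using the fields suggested above (the file name, location, device, account and test URL here are hypothetical):

```python
# Sketch: append one filtering-check result to a simple CSV audit log.
import csv
from datetime import datetime, timezone

def record_check(path, location, device, user, url, blocked, action=""):
    """Record the location, device, user, result and action for one check."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),
            location, device, user, url,
            "blocked" if blocked else "NOT blocked",
            action,
        ])

# Example: a check run in a classroom on a school-managed tablet.
record_check("filtering_checks.csv", "Room 12", "iPad-031", "testaccount",
             "http://example.test/adult-content", blocked=True)
```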

Filtering on mobile devices

Schools and colleges should satisfy themselves that filtering systems are working correctly across all their devices and across all internet connections, including their mobile devices. If your school owns mobile devices such as iPads or other tablets as part of your teaching strategy, then consider the following practices to ensure filtering is in place (you may need the help of your ICT support to do this):

For schools and colleges in England, the following DfE guidance is relevant:

  • Audit the mobile device estate by detailing all the mobile devices they have.
  • Understand and detail the applications (apps) they use and how these are managed (installed and deleted). Specifically, ensure that apps can be centrally, and routinely, removed from mobile devices. This is best achieved through the use of a Mobile Device Management (MDM) solution.
  • Identify who is responsible for mobile devices as well as filtering and monitoring solutions at the school, ensuring that the DSL is also aware (if different).
  • Test to provide confidence that the school’s filtering and monitoring solution is working across all mobile devices, across installed apps (not just internet browsers) and in various physical locations. Does filtering continue when away from the school network? Schools can use testfiltering.com to help in this regard.
  • Identify any vulnerable users of mobile devices, paying particular attention to ensure harmful content is not accessible on specific devices

Generative AI

New technology is enabling users to generate personalised content in real time based on prompts, and schools are being encouraged to exploit these potential advantages for “faster planning and record-keeping”[1]. The real-time nature and proliferation of these systems present a challenge to schools when it comes to filtering this type of content. Filtering systems should effectively and reliably prevent access to harmful and inappropriate content generated by generative AI systems. Schools should reflect on the following, as part of any risk assessment, when considering their systems and deciding which generative AI systems they allow students and staff to use:

– The level to which your filtering system can block content in real time (a minimal streaming-filter sketch follows this list)
– Which generative AI systems the school wishes to approve for use, after assessing their safety features and data protection
– Developing a policy around the use of generative AI systems
– Your ability to generate reports on the usage of generative AI systems within school
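
To illustrate why real-time blocking of generated content is difficult, here is a sketch assuming a simple keyword deny-list (real products use far richer contextual analysis, and the deny-list term is a placeholder). Generated text arrives as a stream of chunks, so the filter must re-check the accumulating text as each chunk lands, and may have emitted part of a harmful phrase before the match completes:

```python
# Sketch: keyword filtering over a streamed generative AI response.
DENY_TERMS = {"example-harmful-term"}  # placeholder deny-list

def filter_stream(chunks):
    """Yield chunks until the accumulated text matches a deny-list term."""
    seen = ""
    for chunk in chunks:
        seen += chunk
        if any(term in seen.lower() for term in DENY_TERMS):
            # Earlier chunks containing a prefix of the term were already
            # emitted; this is why real systems hold back a small buffer.
            yield "[content blocked by filter]"
            return
        yield chunk

# Simulated token stream from a generative AI system:
for piece in filter_stream(["This is ", "example-h", "armful-term", " text"]):
    print(piece, end="")
print()
```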


Further government considerations for adopting generative AI technologies in schools:

England – Generative artificial intelligence (AI) in education – GOV.UK (Jan 2025): The DfE’s Generative AI: Product Safety Expectations sets out clear guidance for ensuring AI tools used in schools are safe by design, including expectations for risk assessment, content moderation, transparency and reporting. This provides a helpful benchmark when determining which generative AI platforms should be accessible through school filtering systems.

This detail has been developed by the SWGfL, as a partner of the UK Safer Internet Centre, and in partnership and consultation with the 80 national ‘360 degree safe Online Safety Mark’[1] assessors and the NEN Safeguarding group (www.nen.gov.uk).
