2024 Appropriate filtering and monitoring definitions published

28 May 2024 UK SIC

Following a public consultation, the UK Safer Internet Centre (UKSIC) has published its ‘appropriate’ filtering and monitoring definitions for 2024. The definitions help both schools and providers understand what is considered ‘appropriate’.

UKSIC first published its filtering and monitoring definitions in 2016, alongside the DfE’s introduction of statutory guidance. The DfE published further guidance setting out new Filtering and Monitoring Standards in March 2023.

Published alongside the 2024 versions are documents summarising the substantive changes since 2023; both can be found below:

Appropriate Filtering

Appropriate Monitoring

Schools and providers can obtain further help and support from the Professionals Online Safety Helpline – helpline@saferinternet.org.uk

Filtering and Monitoring Definitions Public Consultation

The UK Safer Internet Centre’s public consultation on the proposed 2024 Appropriate Filtering and Monitoring definitions attracted five responses, which collectively suggested the following:

  • General Concerns:
    1. Flexibility in managing filtering under specific scenarios like social network access is necessary to accommodate technological and practical limitations.
    2. The distinctions between discrimination and hate speech should be clarified to avoid redundancy and enhance understanding.
  • Technical Challenges:
    1. Concerns were raised about the feasibility of completely prohibiting administrators from disabling filters, suggesting instead mechanisms that alert or log such actions.
    2. The necessity of network-level filtering was emphasized, especially in contexts where device-level systems could be bypassed or disabled.
  • Proposed Modifications:
    1. Adjustments to the deployment descriptions are suggested to encompass a variety of filtering methods including on-device, network-level, and cloud-based systems.
    2. Real-time analysis of all content, including AI-generated and user-generated content, is recommended to ensure comprehensive monitoring and filtering.
  • Policy Integration:
    1. Monitoring systems should be integrated with existing school policies to ensure they are aligned with broader educational and safeguarding strategies.
    2. Suggestions for removing sections believed to be ineffective were made to streamline the approach and focus on impactful measures.

Summary of Responses Regarding the Inability to Disable Illegal Content Filters

A common theme across the responses to the UK Safer Internet Centre’s consultation is the challenge of ensuring that filters for illegal content, specifically child sexual abuse and terrorism, cannot be disabled by anyone at the school, including the filtering system administrator. Here are the key points:

  • Technological Challenges: Several respondents noted that strict filtering can inadvertently impact legitimate educational activities, for example where the use of social networks for teaching conflicts with the HTTPS decryption required for filtering.
  • Administrative Flexibility: There were concerns that rigid filtering without the possibility of administrative override could lead to operational inefficiencies, particularly in scenarios requiring the use of specific network devices or during troubleshooting processes.
  • Legitimate Exceptions: The submissions mentioned situations where schools need to adjust filters to accommodate devices that do not support standard security certificates, or to allow access to broader educational content which might inadvertently be blocked by overly stringent filters.

Response and Justification

  • Legal and Ethical Obligations: UK law, alongside international norms, unequivocally prohibits access to content involving child exploitation and terrorism. These laws reflect societal values that prioritize the safety and welfare of children and national security over operational convenience.
  • Technological Systems: Modern filtering technologies are sophisticated enough to offer granular control, allowing exceptions to be made safely without disabling critical filters entirely. For example, category-based filtering can be applied rather than blanket domain blocks, maintaining access to legitimate educational content without compromising blocks on illegal content (a sketch of this approach follows this list).
  • Risk Management: Any decision to alter filtering criteria must be accompanied by a robust risk assessment, ensuring that any changes do not inadvertently allow access to illegal content. This process should be transparent and involve safeguarding leads within the educational institution.
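The following is a minimal sketch of the category-based approach described above. It assumes a vendor categorisation lookup; the category names, the `categorise` helper and the demo index are illustrative, not any real product’s API.

```python
# Sketch: category-based filtering where illegal-content categories are locked.
# Category names and the categorise() lookup are illustrative assumptions.

# Categories that must always be blocked; no administrator override applies.
LOCKED_CATEGORIES = {"child-sexual-abuse", "terrorism"}

# Adjustable school policy: blocked by default, open to risk-assessed exceptions.
POLICY_BLOCKED = {"social-networking", "gambling", "pornography"}

def categorise(url: str) -> set[str]:
    """Placeholder for a vendor categorisation lookup (hypothetical)."""
    demo_index = {"examplesocial.com": {"social-networking"}}
    host = url.split("/")[2] if "://" in url else url.split("/")[0]
    return demo_index.get(host, set())

def is_blocked(url: str, overrides: set[str]) -> bool:
    cats = categorise(url)
    if cats & LOCKED_CATEGORIES:
        return True  # locked categories ignore all overrides
    # Block only if a policy category applies and has not been excepted.
    return bool((cats & POLICY_BLOCKED) - overrides)

# A teacher-requested exception unblocks one category, not the whole filter.
print(is_blocked("https://examplesocial.com/lesson",
                 overrides={"social-networking"}))  # False
print(is_blocked("https://examplesocial.com/lesson", overrides=set()))  # True
```

The point of the design is that an exception is always scoped to a policy category, so the locked categories remain enforced regardless of how the adjustable policy is configured.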

Alternative Measures

Instead of allowing the disabling of filters for illegal content, schools and colleges could consider the following measures:

  • Layered Filtering Approaches: Use multiple layers of filtering, where critical filters for illegal content are non-negotiable and cannot be disabled, but other layers can be adjusted to meet educational needs.
  • Enhanced Monitoring and Alerts: Implement systems that monitor and immediately alert IT administrators and safeguarding officers if there is an attempt to disable critical filters. This ensures accountability and rapid response to any such actions (a combined sketch of the layered and alerting approaches follows this list).
  • Regular Audits and Compliance Checks: Conduct regular audits of filtering and monitoring systems to confirm that they are functioning as intended and that compliance with legal standards is maintained.
  • Educational Exceptions Protocol: Establish a clear protocol for making exceptions, requiring approval from multiple stakeholders, so that any decision to bypass certain filters is well documented and justified from an educational perspective. Such a protocol would also be helpful when troubleshooting technical issues affecting the filtering system.
  • Advanced Technical Training: Provide advanced training for IT staff on managing complex network environments without compromising on the integrity of critical internet filters.
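As a minimal sketch of how the layered and alerting measures above might fit together: the `FilterLayer` class, the layer names and the `notify_safeguarding_lead` helper are illustrative assumptions, not features of any particular filtering product.

```python
# Sketch: a locked illegal-content layer that refuses to be disabled, with
# adjustable policy layers and an alert raised on any disable attempt.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("filtering")

def notify_safeguarding_lead(who: str, layer: str) -> None:
    """Placeholder for an email/ticket alert to the safeguarding lead."""
    print(f"ALERT to safeguarding lead: {who} attempted to disable '{layer}'")

class FilterLayer:
    def __init__(self, name: str, locked: bool = False):
        self.name = name
        self.locked = locked     # locked layers can never be disabled
        self.enabled = True

    def disable(self, requested_by: str) -> bool:
        if self.locked:
            # Refuse the request, log it, and escalate rather than honour it.
            log.warning("REFUSED: %s tried to disable locked layer %s",
                        requested_by, self.name)
            notify_safeguarding_lead(requested_by, self.name)
            return False
        self.enabled = False
        log.info("%s disabled adjustable layer %s", requested_by, self.name)
        return True

layers = [FilterLayer("illegal-content", locked=True),
          FilterLayer("school-policy")]

layers[1].disable("it-admin")  # allowed: adjustable layer
layers[0].disable("it-admin")  # refused; alert raised and attempt logged
```

Refusing and escalating, rather than silently honouring, a disable request gives safeguarding leads the audit trail the responses asked for while keeping the critical layer intact.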

By focusing on robust, transparent, and accountable processes, schools can ensure that they comply with legal standards while also addressing the practical and educational needs of their environments. These measures uphold the principle that there is no justified scenario for accessing illegal content, aligning with the UK Safer Internet Centre’s stance.
