Introduction
The UK’s Online Safety Act (OSA) became law on 26 October 2023 and places new legal duties and responsibilities on online service providers to prevent, detect and remove illegal content.
Although the Act is aimed primarily at the largest social media and video-sharing services, many organisations, including universities, will be expected to play an active role in preventing online abuse, grooming and exposure to content that is illegal or harmful.
Ofcom is leading work to implement the Act’s provisions and is taking a phased approach to bringing its duties into effect.
[Note that further education institutions are exempt from the provisions of the OSA, although they remain bound by their duties to safeguard and promote the welfare of children. These duties are most recently outlined in the statutory guidance Keeping children safe in education 2024 (publishing.service.gov.uk).]
The mechanics of the Act
The Online Safety Act 2023 (legislation.gov.uk) is a substantial piece of legislation with 241 sections and 286 pages of text.
It applies to “user-to-user” (U2U) services (services that allow users to post content online or to interact with each other), to search services and to providers of pornographic content.
New safety duties
The Act operates by imposing new duties (section 10) to implement systems and processes to reduce illegal activity risks and to take down illegal content when it does appear. Different safety duties apply depending on the category of content in question.
The Act also places new legal duties and responsibilities on online service providers to protect children from some legal but harmful material, such as material that promotes or glorifies eating disorders, self-harm and suicide. It imposes the most significant obligations on services with the most substantial reach (section 95).
- Category 1 is reserved for the highest-risk, highest-reach user-to-user platforms such as Meta, TikTok and X.
- Category 2A covers search services with the highest reach.
- Category 2B includes other platforms with high-risk functionality.
Education
Category 2B is where universities that provide user-to-user (U2U) services, such as chat rooms or bulletin boards on websites and other online environments, are likely to come within the scope of the Act.
In education the online world is a huge part of the lives of students. The Act helpfully defines children as those under the age of 18, which means that many of the online environments that students encounter are likely to be in scope.
To what extent the duties in the Act apply to the internally facing online environments of universities remains to be fully determined, because the Act provides exemptions for services that meet specified conditions and are available only to closed groups.
Risk assessment guidance
Central to meeting the obligations in the Act is the requirement on in-scope service providers to carry out substantial and ongoing risk assessments (section 9). Ofcom has undertaken to publish risk assessment guidance, setting out how all services can meet their obligations under the regime.
Enforcement
- The OSA designates Ofcom as the independent regulator for online safety and gives Ofcom powers to take action against services and businesses that do not comply with their new duties.
- Ofcom is required to develop guidance and codes of practice that will set out how these new duties are to be fulfilled.
- When this guidance and these codes of practice are published the obligations on universities will become clearer.
Next steps
The extent to which a university is within scope of the Act determines what obligations arise.
Steps that can be taken right away include:
- Establish whether the online services and activities that your institution offers fall within the scope of the OSA.
- Decide whether a senior member of staff should be responsible for understanding and overseeing the legislation and for getting your institution ready for the new online safety duties, and the risk assessments that will need to be done in future.
- Carry out a risk assessment to examine whether the institution facilitates the creation or sharing of any content that falls within Ofcom’s categories of illegal harm. See Annex 5: Service Risk Assessment Guidance (ofcom.org.uk).
- If not already in place, establish a mechanism and procedure for reporting and removing illegal content on your websites and platforms, and a complaints procedure that gives adults and children clear, accessible and easy-to-use ways to report problems if harms arise.
The publication of the Ofcom codes of practice and guidance will greatly assist universities and others as they evaluate their response to this substantial and important legislation. The expectation is that these will be available in the latter part of 2024 or early 2025.
What do you think?
To what extent do the duties in the Online Safety Act 2023 apply to the online activities at your university?
The UK’s Online Safety Act is a step change in how illegal and harmful content is regulated in the online world.
Although the Act is aimed primarily at the largest social media and video-sharing services, universities have the opportunity to play a significant role in ensuring that learners encounter a supportive and inclusive learning environment online.
Sources used in the compilation of this information include:
- Online Safety Act: explainer – GOV.UK (www.gov.uk)
- Legal but harmful online content | Children’s experiences | NSPCC Learning
- The UK’s Online Safety Act (taylorwessing.com)