Ofcom - Protecting children from harms online

Sam Preston · 8 May 2024

Yesterday, the UK online safety regulator Ofcom announced new proposals designed to improve online safety for children and young people.

Use of technology and internet access is firmly embedded in children’s lives and in how we deliver services to children. For example, many schools require pupils to have access to a computer or tablet in order to access parts of the curriculum. But whilst online materials present many positive experiential and learning opportunities for children, the online world also carries a serious risk of exposing them to harm.

In order to fulfil their duties under the Online Safety Act, Ofcom have published their proposals to better regulate online services, launching a three-month consultation period.

Under the measures introduced in the Online Safety Act, online services, which include search engines and social media apps, are required to take steps to prevent harmful content being accessed by or served to children. Content deemed harmful includes pornography and material promoting self-harm, eating disorders and suicide. Services are also required to minimise exposure to other harmful content, such as bullying; violent, hateful or abusive material; and the promotion of potentially dangerous challenges.

As detailed in the consultation, Ofcom are proposing a raft of measures, over forty new requirements in total, which will place more stringent obligations on online services. Outlined in their consultation document, the new measures fit broadly into three key areas:

Robust age checks:
The introduction of ‘highly effective age-checks to prevent children from accessing the entire site or app, or age-restricting parts of it for adults-only access’
Safer Algorithms:
Filtering out the most harmful content, down-ranking other harmful content in children’s feeds, and providing a mechanism through which children can give feedback so the algorithm can learn what content they don’t want to see.
Effective moderation:
Content moderation systems and processes that take quick action on harmful content, and ‘safe search’ settings for children which cannot be turned off and must filter out the most harmful content. In addition, online service providers will be required to have clear policies on ‘what kind of content is allowed, how content is prioritised for review, and for content moderation teams to be well-resourced and trained.’

Perhaps most importantly, the proposals require all online services to identify whether children are likely to access their site and, if so, to conduct robust risk assessments identifying the potential harms arising from their services, functionality and algorithms. They will then be required to implement measures to mitigate any identified risks.

So this all sounds very positive; however, it would be foolish not to recognise that meeting the new proposals will be a challenge for online services, particularly those relying on algorithms to push content. Currently, the algorithmic model is built on engagement: as engagement with content increases, the algorithm promotes it by delivering more and more related content. We’ve all experienced this: we search for an item and suddenly all the links, images, adverts and pop-ups appearing on our screens are for similar items, alternatives or related products. Big tech doesn’t do this as a benefit to us; it makes money this way.

So under the new requirements, algorithms will be required not only to recognise the type of user and identify harmful content, but to do the opposite of the current model and actively reduce children’s exposure to that content.
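
To make that shift concrete, here is a minimal, hypothetical sketch in Python of how a feed-ranking function might invert the engagement-only model for child users. The item fields, classifier scores and thresholds are illustrative assumptions on my part, not any platform’s actual system and not anything prescribed by Ofcom.

```python
# Hypothetical illustration only: the fields, thresholds and classifier
# scores are assumptions, not a real platform's ranking system.
from dataclasses import dataclass


@dataclass
class FeedItem:
    title: str
    engagement: float  # likes, shares, watch time, normalised 0..1
    harm_score: float  # output of a content classifier, 0 (benign) to 1 (most harmful)


def rank_feed(items, user_is_child: bool):
    """Engagement-led ranking, with harmful content suppressed for child users."""
    ranked = []
    for item in items:
        score = item.engagement  # the current model: engagement drives promotion
        if user_is_child:
            if item.harm_score >= 0.8:
                continue                      # filter out the most harmful content entirely
            score *= (1.0 - item.harm_score)  # down-rank other harmful content
        ranked.append((score, item))
    return [item for score, item in sorted(ranked, key=lambda pair: pair[0], reverse=True)]


if __name__ == "__main__":
    feed = [
        FeedItem("dangerous challenge clip", engagement=0.9, harm_score=0.85),
        FeedItem("borderline diet content", engagement=0.8, harm_score=0.5),
        FeedItem("science revision video", engagement=0.6, harm_score=0.0),
    ]
    for item in rank_feed(feed, user_is_child=True):
        print(item.title)
```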

This is where the ‘big stick’ of regulation will play its part in fostering engagement with online service companies and, in my opinion, so it should. Whilst Ofcom will be keen to work with these companies to achieve a safer online environment for children, enforcement will also be a key part of its role, and history tells us that balancing the two can be difficult. Take schools, for example. Whilst Local Authorities and Inspectorates produce frameworks and best practice guidance for schools, ultimately their role involves judgement, which has a direct impact on the relationship. I know from personal experience that the role of ‘critical friend’ is a challenging one and very difficult to establish effectively.

For the preventative elements of Ofcom’s proposals to be delivered, the proposed personal liability for those in senior roles at online service companies, together with severe penalties, will be needed to make the industry conform. We cannot allow the online service industry to cling to past rhetoric such as free speech, or the one-size-fits-all excuse previously used to justify end-to-end encryption, which is not covered by the Ofcom proposals. (End-to-end encryption is needed for safe bank transfers and the like, not for online services used by children.)

Whilst there will no doubt be protracted debate on Ofcom’s proposals, we must be clear that we all, including online service companies, have a duty of care to protect children and young people.

Sam Preston

SSS Learning Safeguarding Director