In its first guidance for tech platforms, Ofcom has stressed the importance of combating online grooming. The communications watchdog issued the warning as part of its work directing tech platforms to comply with the Online Safety Act.

The guidance covers how platforms should deal with illegal content, with a particular focus on child abuse online.

Startling statistics from Ofcom indicate that more than one in ten individuals aged 11 to 18 have received explicit or partially explicit images online.

This initial code of practice, unveiled by Ofcom in its capacity as enforcer of the Online Safety Act, focuses on issues such as child sexual abuse material (CSAM), grooming, and fraud.

Ofcom is seeking input from tech platforms on its proposed measures, which include requiring major platforms to change default settings so that children are excluded from suggested-friends lists.

Platforms must also ensure that children’s location information is kept out of their profiles and posts, and must prevent children from receiving messages from people who are not in their contacts list.

Tech platforms must also give their content moderation teams sufficient resources to enforce these rules effectively. In addition, Ofcom will require certain platforms to use hash-matching technology to identify CSAM.

This method involves converting images into numerical “hashes” and comparing them with a database of hashes associated with known CSAM images.

A match indicates the presence of a known CSAM image. Professor Alan Woodward of the University of Surrey notes that this method is already widely used by social media platforms and search engines.
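To illustrate the comparison step, here is a minimal Python sketch. It is a deliberate simplification: it uses an exact cryptographic hash (SHA-256), whereas deployed systems such as Microsoft’s PhotoDNA use perceptual hashes that still match after an image is resized or re-encoded. The function names and the in-memory “database” below are illustrative, not any platform’s actual API.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Convert an image's raw bytes into a numerical 'hash' (hex digest)."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known CSAM images. In practice such
# lists are maintained by bodies like the IWF and NCMEC, and contain
# perceptual hashes rather than cryptographic ones.
known_hashes: set[str] = set()

def register_known_image(image_bytes: bytes) -> None:
    """Add an image's hash to the database of known material."""
    known_hashes.add(image_hash(image_bytes))

def matches_known_image(image_bytes: bytes) -> bool:
    """A match indicates the upload is a known image from the database."""
    return image_hash(image_bytes) in known_hashes

# Demo with stand-in byte strings rather than real image files.
register_known_image(b"known-image-bytes")
print(matches_known_image(b"known-image-bytes"))  # True: hash is in the database
print(matches_known_image(b"different-image"))    # False: no match
```

The key design point is that only hashes are compared, never the images themselves, which is why the technique scales to billions of uploads and why platforms never need to store the illegal material they are screening for.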

However, it’s important to note that this hashing method won’t be applied to private or encrypted messages. Ofcom explicitly states that this guidance does not propose any measures that would compromise encryption.

The Act contains provisions that could, under specific conditions, be used to require private messaging apps such as iMessage, WhatsApp, and Signal to scan messages for CSAM. These provisions have been fiercely debated, since end-to-end encryption is central to these apps’ security and privacy guarantees.

Ofcom clarifies that these powers won’t be subject to consultation until 2024 and are unlikely to be enacted before around 2025.

Whether these powers can be implemented in a way that preserves the privacy of encrypted communication remains uncertain.

In an interview with the BBC, Ofcom’s chief executive, Dame Melanie Dawes, acknowledged the challenges and emphasized the need for tech companies offering encrypted messaging services to explore ways to combat child abuse on their platforms.

Ofcom’s task is formidable, given the extensive scope of its first guidance, which spans over 1,500 pages.

It will potentially affect over 100,000 services, many of which are based outside the UK, and could involve compliance requirements for as many as 20,000 small businesses.

Dame Melanie also acknowledges the challenge of managing public and campaigner expectations.

Whatever it decides, Ofcom recognizes that it may face criticism for being either too strict or not strict enough with tech platforms.

Nonetheless, Ofcom says its role is to take a regulatory approach that is proportionate and grounded in evidence.

Ofcom also clarifies that its role is not to receive reports of harmful content directly; instead, it will ensure that tech firms put in place effective systems through which users can report illegal or harmful content to the platforms themselves.

Last Updated: 09 November 2023