The British media and telecommunications watchdog, Ofcom, released its first version of codes of practice and guidance for internet companies, outlining what they must do to address unlawful harms on their platforms, including fraud, hate crimes, terrorism, and child sexual abuse.
Takeaway Points:
- Ofcom, the British media watchdog, released its first edition of guidelines and codes of practice outlining what tech companies should be doing to address unlawful harms on their platforms.
- The measures constitute the first set of obligations imposed by the regulator under the Online Safety Act, a comprehensive law that mandates digital firms take more action to prevent unlawful content online.
- Although the act was signed into law in October 2023, it was not yet fully operative; Monday’s development effectively brings the safety duties into legal force.
First codes under the Online Safety Act take effect
The measures form the first set of duties imposed by the regulator under the Online Safety Act, a sweeping law requiring tech platforms to do more to combat illegal content online.
The Online Safety Act imposes certain so-called “duties of care” on these tech firms to ensure they take responsibility for harmful content uploaded and spread on their platforms.
Though the act passed into law in October 2023, it was not yet fully in force—but Monday’s development effectively marks the official entry into force of the safety duties.
Ofcom said that tech platforms will have until March 16, 2025, to complete illegal harm risk assessments, effectively giving them three months to bring their platforms into compliance with the rules.
Once that deadline passes, platforms must start implementing measures to prevent illegal harm risks, including better moderation, easier reporting, and built-in safety tests, Ofcom said.
“We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year,” Ofcom Chief Executive Melanie Dawes said in a statement Monday.
Risk of severe fines and service blocks
Under the Online Safety Act, Ofcom can levy fines of as much as 10% of companies’ global annual revenues if they are found in breach of the rules.
For repeated breaches, individual senior managers could face jail time, while in the most serious cases, Ofcom could seek a court order to block access to a service in the U.K. or limit its access to payment providers or advertisers.
Ofcom had come under pressure to beef up the law earlier this year after far-right riots in the U.K. that were fueled in part by disinformation spread on social media.
The duties will cover social media firms, search engines, messaging, gaming, and dating apps, as well as pornography and file-sharing sites, Ofcom said.
Under the first-edition code, reporting and complaint functions must be easier to find and use. For high-risk platforms, firms will be required to use a technology called hash-matching to detect and remove child sexual abuse material (CSAM).
Hash-matching tools convert known images of CSAM from police databases into digital fingerprints, known as “hashes,” which social media sites’ automated filtering systems can compare against uploaded content to recognize and remove matches.
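The core idea can be sketched as follows. This is an illustrative simplification, not Ofcom's or any vendor's actual system: production tools use perceptual hashing (such as Microsoft's PhotoDNA), which tolerates resizing and re-encoding, whereas this sketch uses an exact cryptographic hash, and the database contents here are placeholder values.

```python
import hashlib

# Hypothetical set of hashes of known prohibited images. In reality this would
# be a curated database supplied by law enforcement or child-safety bodies.
# (This demo entry is simply the SHA-256 of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(content: bytes) -> str:
    """Compute a digital fingerprint ("hash") of an upload's raw bytes."""
    return hashlib.sha256(content).hexdigest()

def is_known_match(content: bytes) -> bool:
    """Return True if the upload's hash matches a known database entry,
    signalling that the platform's filter should remove it."""
    return fingerprint(content) in KNOWN_HASHES

print(is_known_match(b"test"))           # True: matches the demo hash above
print(is_known_match(b"benign upload"))  # False: no database match
```

Because only fingerprints are compared, the platform never needs to store or redistribute the original illegal images, which is one reason regulators favor this approach.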
Ofcom stressed that the codes published Monday were only the first set, and that the regulator would look to consult on additional codes in spring 2025, including blocking accounts found to have shared CSAM and enabling the use of AI to tackle illegal harms.
“Ofcom’s illegal content codes are a material step change in online safety, meaning that from March, platforms will have to proactively take down terrorist material, child and intimate image abuse, and a host of other illegal content, bridging the gap between the laws that protect us in the offline and the online world,” British Technology Minister Peter Kyle said in a statement Monday.
“If platforms fail to step up, the regulator has my backing to use its full powers, including issuing fines and asking the courts to block access to sites,” Kyle added.