LONDON — The U.K. officially brought its sweeping online safety law into force on Monday, paving the way for stricter supervision of harmful content online and potentially massive fines for technology giants like Meta, Google and TikTok.
Ofcom, the British media and telecommunications watchdog, published its first-edition codes of practice and guidance for tech firms laying out what they should be doing to tackle illegal harms such as terror, hate, fraud and child sexual abuse on their platforms.
The measures form the first set of duties imposed by the regulator under the Online Safety Act, a sweeping law requiring tech platforms to do more to combat illegal content online.
The Online Safety Act imposes certain so-called "duties of care" on these tech firms to ensure they take responsibility for harmful content uploaded and spread on their platforms.
Though the act passed into law in October 2023, its safety duties had not yet taken effect. Monday's development marks their official entry into force.
Ofcom said that tech platforms will have until March 16, 2025, to complete illegal harms risk assessments, effectively giving them three months to bring their platforms into compliance with the rules.
Once that deadline passes, platforms must start implementing measures to address those risks, including better moderation, easier reporting and built-in safety tests, Ofcom said.
"We'll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year," Ofcom Chief Executive Melanie Dawes said in a statement Monday.
Risk of huge fines, service suspensions
Under the Online Safety Act, Ofcom can levy fines of as much as 10% of companies' global annual revenues if they are found in breach of the rules.
For repeated breaches, individual senior managers could face possible jail time, while in the most serious cases, Ofcom could seek a court order to block access to a service in the U.K. or limit its access to payment providers or advertisers.
Ofcom had been under pressure to beef up the law earlier this year after far-right riots in the U.K. that were instigated in part by disinformation spread on social media.
The duties will cover social media firms, search engines, messaging, gaming and dating apps, as well as pornography and file-sharing sites, Ofcom said.
Under the first-edition codes, reporting and complaint functions must be easier to find and use. High-risk platforms will be required to use a technology called hash-matching to detect and remove child sexual abuse material (CSAM).
Hash-matching tools convert known CSAM images from police databases into unique digital fingerprints known as "hashes," which social media sites' automated filtering systems can compare against uploaded content to recognize and remove matches.
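For readers curious about the mechanics, here is a minimal sketch of the idea in Python. It uses plain SHA-256 exact matching purely for illustration; production systems rely on perceptual hashing tools such as Microsoft's PhotoDNA, which also match resized or re-encoded copies. The database contents, function names and workflow below are hypothetical.

```python
import hashlib

# Hypothetical set of hashes of known illegal images, as might be
# supplied by a police or child-safety database. Placeholder value,
# not a real hash of real material.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint ("hash") of an uploaded file.

    SHA-256 only matches byte-identical files; real deployments use
    perceptual hashes that survive resizing and re-encoding.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def should_block(image_bytes: bytes) -> bool:
    """Return True if the upload's hash matches a known-bad hash."""
    return fingerprint(image_bytes) in KNOWN_HASHES

# Example: screen an upload before it is published.
upload = b"raw image bytes from the upload pipeline"
if should_block(upload):
    print("Match against known-material database: remove and report.")
else:
    print("No match: pass to normal moderation.")
```

The key design point is that platforms never need to store or inspect the illegal images themselves; they only compare fingerprints of incoming uploads against a list of fingerprints supplied by trusted authorities.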
Ofcom stressed that the codes published Monday were only the first set and that the regulator would consult on additional codes in spring 2025, including blocking accounts found to have shared CSAM and enabling the use of AI to tackle illegal harms.
"Ofcom's illegal content codes are a material step change in online safety meaning that from March, platforms will have to proactively take down terrorist material, child and intimate image abuse, and a host of other illegal content, bridging the gap between the laws which protect us in the offline and the online world," British Technology Minister Peter Kyle said in a statement Monday.
"If platforms fail to step up the regulator has my backing to use its full powers, including issuing fines and asking the courts to block access to sites," Kyle added.