As of today, the 17th of March, social media platforms are required by law to put measures in place to protect people in the UK from illegal content and criminal activity online.

Under the new Online Safety Act, the illegal content being targeted includes fraud, terrorism, and child sexual abuse material. Social media platforms will also have to take action against illegal harms such as the encouragement of suicide, extreme pornography and the sale of drugs.

Tech companies had until yesterday to carry out an appropriate harms risk assessment, determining how likely users are to encounter illegal content on their platforms, or how their platforms could be used to commit criminal offences. They must now implement measures to quickly remove illegal material when they become aware of it and to reduce the risk of “priority” criminal content appearing at all.

The Online Safety Act covers more than 100,000 services, from social media sites like Facebook and X to other websites such as Google and OnlyFans.

The UK technology secretary, Peter Kyle, said that the new laws on illegal material were “just the beginning.”

“In recent years, tech companies have treated safety as an afterthought. That changes today.”

After US Vice-President JD Vance claimed that free speech in the UK was “in retreat,” Kyle said on LBC radio that the act had been written into law for the greater good.

“Our online safety standards are not up for negotiation. They are on statute and they will remain,” he said.

How much will companies be fined for breaking these laws?

Companies that fail to comply with the new laws could face fines of up to £18 million or 10% of their worldwide revenue, whichever is greater. For Meta or Google, that could mean billions of pounds. In the most extreme cases, services risk being blocked in the UK.

Who is overseeing the Online Safety Act?

Ofcom is enforcing the act and has published codes of practice for tech platforms to follow to avoid breaking the law. There are 130 “priority offences” that the companies must tackle first.

Some of the key codes include hiding children’s online profiles and locations from unknown users, introducing measures that allow women to block and mute users harassing them, setting up a reporting channel for organisations that deal with online fraud cases, and using “hash matching” technology to identify illegal images and prevent the distribution of terrorist content and revenge porn.
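Hash matching works by comparing a fingerprint of each uploaded file against a database of fingerprints of known illegal images. Below is a minimal, illustrative sketch in Python using an exact cryptographic hash; real deployments typically use perceptual hashing systems such as Microsoft’s PhotoDNA, which also catch near-duplicates, and the hash database here is a hypothetical placeholder.

```python
import hashlib

# Hypothetical database of SHA-256 digests of known illegal images,
# of the kind supplied to platforms by bodies such as the
# Internet Watch Foundation. The value below is a placeholder.
KNOWN_ILLEGAL_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_illegal(upload: bytes) -> bool:
    """True if the upload's hash matches the known-bad database."""
    return sha256_of(upload) in KNOWN_ILLEGAL_HASHES

# Example: screen an upload before publishing it.
upload = b"...uploaded image bytes..."
if is_known_illegal(upload):
    print("Upload blocked and reported.")
else:
    print("Upload allowed.")
```

An exact hash like this only flags byte-identical copies; perceptual hashes are designed so that resized or slightly edited versions of a known image still match, which is why they are preferred for this purpose in practice.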

Dame Melanie Dawes, Ofcom’s CEO, stressed how important the new regulations are to ensuring online safety.

“The safety spotlight is now firmly on tech firms and it’s time for them to act,” Dawes said. “We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.

“Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.”

The consequences for tech companies that don’t comply with the Online Safety Act could be colossal.
