TikTok, Google, Facebook, and Instagram parent Meta are among the corporations targeted by a broad but contentious new internet safety bill passed by British MPs.
The government says the Online Safety Bill, passed this week, will make Britain the safest place in the world to be online. But digital rights groups warn it endangers online privacy and free speech.
The UK's new law adds to European and global efforts to rein in the US-dominated tech industry. Last month, the EU's Digital Services Act took effect with similar measures aimed at cleaning up social media for users in the 27-nation bloc.
A closer look at the British law:
The sprawling bill has been in the works since 2021.
Under the new law, social media platforms must remove illegal content, including child sexual abuse material, hate speech, terrorism-related material, revenge porn and posts promoting self-harm. They must also stop such content from appearing in the first place and give users more controls, including the ability to block anonymous trolls.
The government says the bill takes a "zero tolerance" approach to protecting children by holding platforms accountable for their users' online safety. Platforms must keep children from accessing pornography, bullying, and content that glorifies eating disorders or provides instructions for suicide, even when that material is not illegal.
Porn websites and social media platforms must verify users are 18 or older.
The bill also criminalizes cyberflashing, the sending of unsolicited sexual images.
What if Big Tech doesn’t comply?
The law applies to any online company whose services UK users can access, no matter where it is based. Companies that do not comply face fines of up to 18 million pounds ($22 million) or 10% of annual global revenue.
UK regulators will be able to prosecute and jail tech company executives who fail to respond to requests for information. Executives will also be held criminally liable if their company ignores regulators' warnings about child sexual abuse and exploitation on its platform.
Britain's communications regulator, Ofcom, will enforce the law. The government says implementation will follow a "phased approach" that prioritizes illegal content, but it has offered few specifics about how enforcement will work in practice.
What do critics say?
Digital rights groups fear the bill threatens online liberties.
To ensure their content is not harmful to children, tech companies may have to sanitize their platforms, require users to upload official ID, or deploy privacy-intrusive face scans to estimate users' ages.
The law also sets up a clash between tech companies and the British government over encryption. Regulators will be able to require encrypted messaging services to use "accredited technology" to scan messages for terrorist or child sexual abuse content. Experts warn that doing so would create a backdoor into private communications, making everyone less secure.
Last month, Meta said it would roll out end-to-end encryption for all Messenger chats by the end of the year. The UK government has urged the company to ensure safeguards against child sexual abuse and exploitation are in place first.