The Online Safety Bill is doomed to failure

The draft suffers greatly from mission creep, internal inconsistencies, and a woeful lack of clear definition

At first glance, the Online Safety Bill (OSB) should have an easy ride. It enjoys cross-party support in parliament and is promoted by prominent children's charities, and a recent YouGov poll found that 81% of UK adults want to stop children being harmed by social media, with a senior technology manager appointed and held legally responsible for preventing it.

But even with the best of intentions, the OSB as it currently stands before the House of Lords will almost certainly fall short of its purpose of protecting children from online harm. Instead, it may introduce measures that damage privacy, innovation and freedom of expression.

Since its introduction in 2019, the OSB has crossed the desks of four different prime ministers and four culture secretaries. Inevitably, many hobby horses have been hitched to the draft bill along the way, and it now suffers greatly from mission creep, internal contradictions and a woeful lack of clear definitions.

No one would dispute the bill's purpose, but moderating content on the web is no easy task. Freedom of expression and the right to privacy must be balanced against the potential for harm. It's a precision engineering job, not one for a sledgehammer.

Definitional flaws

The bill is full of ambiguous terms. Astonishingly, there is no clear definition of what constitutes "content that is harmful to children". That will be for government ministers to decide, with a (presumably much-expanded) Ofcom left to police it once the bill becomes law. The lack of oversight is worrying, to say the least.

Ah, but we're going after Silicon Valley! Companies that breach their duty to remove illegal content can be fined up to 10% of their global revenue, a serious sum even for Facebook. And this week, at the insistence of a number of MPs, provisions were introduced to jail tech executives found to have "deliberately" exposed children to harmful content, or to have connived at ignoring regulatory warnings. Scary stuff, but executives who "act in good faith" have nothing to fear, according to current culture secretary Michelle Donelan.

So who gets caught in the crossfire of a battle between grandstanding MPs and tech giants slugging it out in court? Wikipedia and other collaborative ventures.

The OSB applies to all platforms, large and small, rich and poor. There are categories, but the dividing lines are unclear. The EU's Digital Services Act distinguishes between centralised content moderation carried out by employees and the decentralised model used by the likes of Wikipedia. The OSB does not.

Rebecca MacKinnon, the Wikimedia Foundation's vice president of global advocacy, told the BBC that the law would affect not only large companies with professional content moderators, but also public interest websites such as Wikipedia, which is moderated by volunteers.

Small groups and individuals running Minecraft servers, volunteers managing Mastodon instances, hobbyists running video-sharing servers or blockchain nodes, code-sharing sites like GitHub, developers of privacy-enhancing software: the potential list is almost endless, and all of them may fear falling under the law's strictures. The bill would apply to almost any site used by people in the UK, not just Twitter and Facebook.

The OSB does not explain how such operators should protect themselves from users who upload harmful material, nor what compliance is likely to cost. Even before legal fees are considered, what would be pennies to a large platform could make or break the admins of smaller servers.

Mission creep

"Think of the children" has long been a convenient cover for promoting controversial measures, and the OSB is no exception.

It has certainly reinvigorated the government's long-running battle with encryption. MPs claim to be concerned about end-to-end encrypted (E2EE) messaging apps like WhatsApp (which ministers, of course, use to conduct government business off the record; that, apparently, is fine). The OSB has drawn controversy for provisions that would undermine end-to-end encryption in private messaging services.

Can illegal material be shared through E2EE services such as Signal and ProtonMail? Certainly, and it is. But equally, anyone can encrypt material themselves and send it via Gmail or Dropbox if they prefer. Banning or backdooring the technology that keeps citizens' information safe is not the answer, and will inevitably have negative consequences. There is no such thing as a backdoor for good people only, as technologists have been explaining for decades. Moreover, privacy is a basic human right.
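The point that encryption is independent of the transport is easy to demonstrate. The toy sketch below (standard-library Python only; real tools use vetted ciphers such as AES, not a hand-rolled one-time pad) shows that a user can scramble data before handing it to any service, so backdooring one channel achieves little:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the message with a random key of equal length."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR is its own inverse, so decryption is the same operation."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"meet at noon"
ciphertext, key = otp_encrypt(message)
# The ciphertext can now travel over any channel at all, e.g. Gmail,
# Dropbox or a forum post, and is unreadable without the key.
assert otp_decrypt(ciphertext, key) == message
```

Any ban aimed at E2EE apps leaves this route, and countless others, untouched.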

Proton CEO Andy Yen told Wired: "In a democratic society, privacy must be embraced and protected. Even if there are some negative externalities, the no-privacy alternative is worse." Yen regards the OSB and similar laws in Europe as "Trojan horses" to break encryption, and it's easy to see why he thinks so.

The latest draft features even more mission creep. The OSB will now make it illegal for sites to show people crossing the Channel in small boats in a "positive light". What does that have to do with child safety? Nothing. It's yet another hobby horse, this one ridden by Tory MP Natalie Elphicke. Leaving aside the ethics of such a ban, what does "positive light" mean in this context? Presumably, whatever the minister decides it means.

Free speech for me, not for you

Such arbitrary provisions mean governments could be accused of using the OSB to suppress dissenting opinions, or of using measures ostensibly aimed at protecting children as cover for controlling political discourse.

Given the vague definition of harmful content, this is likely to happen anyway. To be on the safe side, sites will simply remove borderline material, quietly exclude it from listings (shadow banning), or apply automatic filters.

"The fear of prison sentences leads to over-moderation, where legitimate content is removed," said Monica Horten, policy manager for freedom of expression at the Open Rights Group.

Interestingly, the press is not covered by the OSB, which raises another question: what about user-generated content, defined as harmful, that appears in the comments under articles, or in material newspapers post to their Twitter and Facebook accounts?

The government removed the earlier "legal but harmful" clause after MPs raised free speech concerns. It will be interesting to see whether the horse trading is over, or whether the same voices will be raised against other provisions still in the bill.

How will the tech giants react?

Let's be frank: Mark Zuckerberg is not going to end up in Wormwood Scrubs. That would require extradition, which the US is highly unlikely to grant. Perhaps the law could require some UK-based underling (Nick Clegg, maybe?) to take the rap if Meta breaks the rules. But the firm's many lawyers would make mincemeat of the "good faith" clause.

When push comes to shove, Meta or Musk may reluctantly decide that the world is a big place, and that a small island of 70 million people isn't worth the trouble of implementing UK-only catch-all age verification and upload filtering. Who knows, perhaps the press will fill with articles about VPNs and how to bypass the blocks.

That may be unlikely, but the bill still risks discouraging investment in the UK tech sector. The threat of governance by ministerial whim, attacks on encryption and possible imprisonment for startup owners all make other pastures look greener.

Ironically, the OSB could end up empowering the tech giants at the expense of smaller players.

Next steps

Anyway, it may never get that far. The bill still faces plenty of back and forth between the Lords and the Commons, and unless it is substantially tightened up, whatever emerges will face numerous legal challenges. If it does pass, handing ministers the power to decide what is harmful, what is "positive" and who is acting in good faith is deeply concerning.

Policing content at global scale is a hard problem. It's understandable that people want to see heavy-handed platforms punished when they do wrong, and of course everyone wants to stop children viewing or sharing harmful material. But this bill is not the way to do it. Sadly, the best option may be to tear it up and start again.
