
Chris Stokel-Walker, Columnist, Pathfounders (Platforms)
Two things can be true at the same time: big tech companies complain too loudly, and for too long, about regulatory interventions they perceive as overly draconian but that most people would consider sensible measures to protect consumers. And sometimes regulation can, in fact, be overly draconian and interventionist.
Which makes the brouhaha over the EU’s planned Chat Control bill all the more confusing. The bill would gut user privacy in the name of making it easier to catch child predators – predators who would, in any case, take steps to sidestep the kind of detection the regulation envisages.
The proposed Chat Control regime would have made it compulsory for messaging platforms like WhatsApp and Signal to scan users’ messages – even those that would ordinarily be end-to-end encrypted – to ensure they didn’t contain any illicit content.
Requiring platform providers to use AI to snoop on and scan the private communications of millions of European users, as the Chat Control bill put before European legislators envisaged, has rightly been called “dystopian” by Patrick Breyer, a former member of the European Parliament for the Pirate Party and a digital rights campaigner.
That’s why it’s good that the conservative Christian Democratic Union and Christian Social Union parliamentary group in the Bundestag came out in opposition to the proposal, saying it was “unwarranted” and “would be like opening all letters as a precautionary measure to see if there is anything illegal in them”.
As a result of the German opposition, the current plans for Chat Control have been shelved. A watered-down Danish compromise also failed to win enough support, meaning that, for now, the planned vote at this week’s meeting of EU interior ministers won’t take place.
It’s a victory against a bad bit of legislation that could have imperilled the future of platforms like Signal in Europe. Meredith Whittaker, president of the Signal Foundation, has said that “we would leave the market” if Chat Control were passed. European users already get some features later than users elsewhere because platforms and services worry about meeting European demands; we don’t need to lose the existing services that have taken a chance on Europe, regulatory burden and all. That Chat Control hasn’t passed, and won’t in its present form, is therefore good news.
Besides putting onerous requirements on platforms that would tank the trust of their users, the Chat Control bill also just wouldn’t work – in much the same way that the highly hyped but ultimately underwhelming age-check requirements of the UK’s Online Safety Act have flopped. “From a technical perspective, client-side scanning is ineffective and extremely easy to evade,” says David Frautschy, senior director for European government and regulatory affairs at the Internet Society.
As Frautschy points out, those sharing child sexual abuse material (CSAM) can simply convert abusive content into a different file format and evade detection. “The result is a dangerous illusion of safety that does little to protect children,” he says.
“What it will reliably do instead is subject ordinary citizens to mass screening of personal communications, establishing an unprecedented system of surveillance over lawful, everyday life,” says Frautschy.
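The evasion Frautschy describes is trivial to demonstrate. As a minimal sketch of my own – assuming, purely for illustration, a scanner that matches exact cryptographic fingerprints of known files, which is only one of the techniques scanning proposals contemplate – re-encoding the very same content produces a completely different fingerprint:

import gzip
import hashlib

# The same underlying content, standing in for any file an offender wants to share.
original = b"stand-in bytes for an illicit file"

# A trivial re-encoding step: compress the file (any change of container or format works).
reencoded = gzip.compress(original)

# An exact-match scanner compares fingerprints against a list of known material.
print(hashlib.sha256(original).hexdigest())   # the fingerprint a blocklist would know
print(hashlib.sha256(reencoded).hexdigest())  # a completely different fingerprint

# The recipient simply reverses the step and recovers identical content.
assert gzip.decompress(reencoded) == original

More sophisticated perceptual hashing and AI classifiers are meant to cope with exactly this, but Frautschy’s point is that they remain easy to sidestep – while the scanning itself sweeps up everyone else’s lawful messages.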
He’s not wrong. The Chat Control chaos highlights a pitfall Europe regularly falls into when it comes to taming big tech: lawmakers start with a laudable aim and end up with the most onerous, technocratic solution.
Tech platforms don’t need our help batting off regulatory requirements they see as damaging to their business. They already employ entire teams of expensive lobbyists to do that – spending €67 million a year, more than big pharma and big energy combined – and apply pressure on journalists to try to sway coverage.
But users do need help – and not just against tech platforms trying to wring every exploitable scrap of information out of us. Sometimes we need protecting from misguided regulation that purports to keep us safe but does nothing of the sort.