The new path to privacy after the failure of European data regulations

The endless cookie settings that pop up for every website feel a bit like the prank compliance of an internet determined not to change. It is very annoying. And it feels a little like the data markets' revenge on the regulators, giving the General Data Protection Regulation (GDPR) a bad name so that it might seem as though political bureaucrats have, once again, clumsily hampered the otherwise smooth progress of innovation.

The truth, however, is that the vision of privacy put forward by the GDPR would spur an era of innovation far more exciting than today's sleaze-tech. As it stands, though, it simply falls short. What is needed is an infrastructural approach with the right incentives. Let me explain.

Granular metadata gathered behind the scenes

As many of us now know, laptops, phones and every device with the prefix "smart" produce an endless stream of data and metadata. So much so that the notion of a sovereign decision over your personal data makes little sense: click "no" to cookies on one site, and an email will have quietly delivered a tracker anyway. Delete Facebook, and your mom will have tagged your face with your full name in an old birthday photo, and so on.

What is different today (and why a CCTV camera is in fact a terrible representation of surveillance) is that even if you choose, and have the skills and know-how, to protect your own privacy, the overall environment of mass metadata collection will still harm you. It is not about your data, which will often be encrypted anyway, but about how collective metadata flows will nonetheless reveal things at a fine-grained level and single you out as a target: a potential customer or a potential suspect, should your patterns of behavior stand out.
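To make that point concrete, here is a minimal Python sketch of what an observer can learn from metadata alone, without decrypting a single message. Every name, timestamp and contact in it is invented for illustration.

from collections import Counter

# Each record is (sender, recipient, unix_timestamp). The encrypted message
# body is omitted because a network observer does not need it: this metadata
# is visible even for end-to-end encrypted traffic.
metadata_log = [
    ("alice", "bob",    1650000000),
    ("alice", "bob",    1650000300),
    ("alice", "clinic", 1650003600),
    ("bob",   "alice",  1650000600),
    ("alice", "clinic", 1650608400),  # exactly one week later: a routine
]

# Who talks to whom, and how often: the social graph falls out directly.
edges = Counter((sender, recipient) for sender, recipient, _ in metadata_log)
print(edges.most_common())

# Regular contact with a sensitive endpoint reveals intimate facts. Here the
# gap between "clinic" contacts is 604,800 seconds, i.e., a weekly pattern.
clinic_times = sorted(t for _, recipient, t in metadata_log if recipient == "clinic")
intervals = [b - a for a, b in zip(clinic_times, clinic_times[1:])]
print("seconds between clinic contacts:", intervals)

No message content appears anywhere in that log, yet it already yields a contact graph and a behavioral pattern, which is exactly the kind of fine-grained exposure described above.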

Related: Data privacy concerns rise and blockchain is the solution

Despite what it might look like, however, everyone actually wants privacy, even governments, businesses and especially military and national security agencies. But they want privacy for themselves, not for others. And that lands them in a conundrum: How can national security agencies, on the one hand, prevent foreign agencies from spying on their populations while simultaneously building backdoors so that they themselves can pry?

Governments and businesses have no incentive to ensure privacy

To put it in language eminently familiar to this readership: the demand is there, but there is, to put it mildly, a problem with incentives. To illustrate just how big that incentive problem currently is, an EY report values the UK health data market alone at $11 billion.

Such reports, while highly speculative about the actual value of the data, nonetheless produce an overwhelming fear of missing out, or FOMO, leading to a self-fulfilling prophecy as everyone rushes for the promised profits. This means that while everyone, from individuals to governments and big tech companies, might want to ensure privacy, they simply do not have strong enough incentives to do so. The FOMO and the temptation to sneak in a backdoor, to make secure systems just a little less secure, are simply too strong. Governments want to know what their people (and others) are talking about, businesses want to know what their customers are thinking, employers want to know what their employees are doing, and parents and teachers want to know what children are up to.

There is a useful concept from the early history of science and technology studies that can help clear up this mess somewhat: affordance theory. The theory analyzes the use of an object in terms of its determined environment, its system, and what it affords people, that is, the kinds of things that become possible, desirable, comfortable and interesting to do through the object or system. Our current environment, to say the least, affords the overwhelming temptation of surveillance to everyone from pet owners and parents to governments.

Related: The data economy is a dystopian nightmare

In an excellent book, software engineer Ellen Ullman describes programming some network software for an office. She vividly recounts the horror when, after the system was installed, the boss excitedly realized that it could also be used to track the keystrokes of his secretary, a person who had worked for him for over a decade. Where before there had been trust and a good working relationship, the new powers inadvertently turned the boss, through this new software, into a monster, scrutinizing the most detailed daily work rhythms of the people around him: the frequency of clicks and the pauses between keystrokes. This mindless surveillance, albeit carried out by algorithms more than by humans, is what usually passes for innovation today.

Privacy as a material and infrastructural fact

So where does this leave us? Not in a place where we can simply apply personal privacy patches to this surveillance environment. Your devices, your friends' habits and your family's activities will still link to and identify you. And the metadata will leak regardless. Instead, privacy has to be secured by default. And we know this will not happen through the goodwill of governments or tech companies alone, because they simply do not have the incentive to do it.

The GDPR, along with its immediate consequences, has fallen short. Privacy should not be merely a right that we desperately try to claw back with every website visit, or that most of us can only dream of exercising through expensive court cases. No, it has to be a material and infrastructural fact. That infrastructure has to be decentralized and global so that it does not fall under the sway of specific national or commercial interests. Moreover, it has to have the right incentives, rewarding those who run and maintain the infrastructure so that protecting privacy is made lucrative and attractive while harming privacy is made unfeasible.

In conclusion, I want to highlight an extremely underrated aspect of privacy, namely its positive potential for innovation. Privacy tends to be understood as a protective measure. But if privacy were instead simply a fact, data-driven innovation would suddenly become far more meaningful to people. It would allow for much broader engagement in shaping the future of everything data-driven, including machine learning and AI. But more on that next time.

The views, thoughts and opinions expressed here are solely those of the author and do not necessarily reflect or represent the views and opinions of Cointelegraph.

Jaya Klara Brekke is the Director of Strategy at Nym, a global decentralized privacy project. She is a researcher at the Weizenbaum Institute, holds a Ph.D. from the Department of Geography at Durham University on the politics of blockchain protocols, and is an occasional expert adviser to the European Commission on distributed ledger technology. She speaks, writes and conducts research on privacy, power and the political economies of decentralized systems.