
Privacy by Design is reformist: But do we need revolution?

If you search for ways to make computing more respectful of consumer privacy, it will not take you long to stumble upon Privacy by Design (PbD).

Privacy by Design is a proactive approach to incorporating core privacy principles (e.g., control over personal data and informed consent over data collection) into the design of new technologies. The origins of PbD can be traced back to the late 1990s, when a cross-disciplinary range of scholars and experts, spanning law, policy, and engineering, argued that technologies (not just policy) could be used to enforce the conceptions of privacy present in regulation, law, and scholarship. Capturing the ethos of this origin, Langheinrich argues in his paper on PbD [1]: “laws can only work together with the social and technological reality, not against them”. PbD can be thought of as a rebuttal to the popular belief that consumer privacy protections must come at the expense of product functionality and business interests; rather, by “baking in” privacy controls and privacy-preserving functionality while a product is being built, one can have both strong privacy protections and a useful, usable product. Consider, for example, principle 4 of the IAPP’s 7 foundational principles for PbD:

Privacy by Design seeks to accommodate all legitimate interests and objectives in a positive-sum “win-win” manner, not through a dated, zero-sum approach, where unnecessary trade-offs are made. Privacy by Design avoids the pretence of false dichotomies, such as privacy vs. security, demonstrating that it is possible, and far more desirable, to have both.

Sound uncontroversial? That’s by design! PbD is a general concept divorced from any specific use-case or context; it is vague enough to be included in regulation without forcing specific technologies or processes into development workflows. In a sense, PbD is a terminology contribution more than it is a concrete guide for improving consumer privacy protections in developing technologies. PbD is easily “grokable” even for the uninitiated. And, at least partially due to this no-effort grokability, PbD has grown in influence and integration in both law and design — it features heavily, for example, in the European Union’s General Data Protection Regulation.

That’s largely a good thing. But there’s a flip side to this ambiguity. Data-aggregating entities can use “PbD” as a veil under which to justify even privacy-exploitative practices. Consider Google’s new Federated Learning of Cohorts (FLoC) approach to behavioral advertising, which is, as the EFF puts it, a terrible idea. FLoC is the result of PbD-style thinking. If we start with the goal of serving users targeted advertisements in a privacy-preserving way, rather than with the goal of ensuring that users feel comfortable with the data that is collected about them and used for targeted advertising, we end up with incremental privacy reforms contingent on not hurting the short-term bottom line. As the EFF writes:

Google’s pitch to privacy advocates is that a world with FLoC (and other elements of the “privacy sandbox”) will be better than the world we have today, where data brokers and ad-tech giants track and profile with impunity. But that framing is based on a false premise that we have to choose between “old tracking” and “new tracking.” It’s not either-or. Instead of re-inventing the tracking wheel, we should imagine a better world without the myriad problems of targeted ads.
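
To make the mechanics concrete, below is a toy sketch of cohort-based targeting in the spirit of FLoC. It is not Google’s implementation (the real system reportedly used a SimHash-style locality-sensitive hash over browsing history, plus server-side clustering and anonymity thresholds); the domain names and the 16-bit cohort space here are illustrative assumptions. What it demonstrates: the browser reveals only a cohort ID, yet that ID is still a summary of your browsing behavior that advertisers can target against.

```python
import hashlib

def cohort_id(visited_domains, bits=16):
    """Toy SimHash: fold a set of visited domains into a short cohort ID.

    Each domain "votes" on each output bit, so browsers with similar
    histories tend to land in the same cohort. An ID shared by many
    users is the privacy claim; the ID encoding behavior is the catch.
    """
    votes = [0] * bits
    for domain in visited_domains:
        h = int(hashlib.sha256(domain.encode()).hexdigest(), 16)
        for i in range(bits):
            votes[i] += 1 if (h >> i) & 1 else -1
    return sum(1 << i for i, v in enumerate(votes) if v > 0)

# The ad network never sees the raw history, only the cohort ID,
# which it can still profile and target against.
print(cohort_id({"news.example", "running-shoes.example", "flights.example"}))
```

Replacing the raw history with a cohort ID narrows what any single ad request reveals, but the design goal is unchanged: sort people by behavior so that they can be targeted.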

I’m picking on Google as an example here because of the timeliness of FLoC, but the problem extends well beyond Google. The broader problem is that PbD, today, is operationalized as a “last mile” approach in which a finalized design concept should be implemented in a manner that maximally protects privacy. But this last-mile, no-compromise-necessary approach is unlikely to result in anything beyond incremental reform because privacy is only “baked in” to implementation processes for products that are built within a surveillance capitalism economy. This is why I’m bearish on purely technical approaches to solving privacy, e.g., differential privacy and federated learning. Differential privacy “improves” privacy only if we assume that the baseline is a world where analysts have carte blanche access to our personal data. Federated learning “improves” privacy only if we assume that the baseline is a world where our personal data must be used to feed the construction of machine learning models optimized to serve us advertisements. There is an ambivalence to the success of these technologies: they improve consumer privacy protections, sure, but in so doing they help ensure the continued existence of the surveillance capitalism processes that got us into this mess.
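
To make the differential privacy point concrete, here is a minimal sketch of the Laplace mechanism, the textbook construction underlying many DP deployments. The record schema, the predicate, and the epsilon value are all made-up examples. Note what the mechanism leaves untouched: the analyst still holds every raw record; DP only perturbs the answers that get released.

```python
import numpy as np

def private_count(records, predicate, epsilon=0.5):
    """Answer a counting query with epsilon-differential privacy.

    Adding or removing one person changes a count by at most 1
    (sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    rng = np.random.default_rng()
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# The "privacy" lives in the released answer; the raw records are
# still sitting in the analyst's database, collected as usual.
records = [{"age": 34, "visited_clinic": True},
           {"age": 51, "visited_clinic": False}]
print(private_count(records, lambda r: r["visited_clinic"]))
```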

In the 1960s, the social philosopher André Gorz drew a distinction between reform and revolution [2]. Reform works within existing structures of power to make progress: raising the minimum wage to $15 an hour, for example. Revolution, in contrast, challenges existing power structures. It works “not in terms of what is possible within the framework of a given system and administration, but in view of what should be made possible in terms of human needs and demands.” Think of the factory workers of the early 20th century who assembled in unions to collectively bargain for what we now consider mainstays of the modern work environment: a minimum wage, a two-day weekend, and an eight-hour work day. PbD is reformist: it allows surveillance institutions to do what they want to do, with a little added thought on how to do it more responsibly. Yet, after years of PbD thinking, we are still at a point where what those institutions are doing feels wrong. So perhaps what we need now is revolution.


Thanks for reading! If you think you or your company could benefit from my expertise, I’d be remiss if I didn’t mention that I am an independent consultant and am accepting new clients. My expertise spans UX, human-centered cybersecurity and privacy, and data science.

If you read this and thought: “whoah, definitely want to be spammed by that guy”, there are three ways to do it:

You can also do none of these things, and we will all be fine.


  1. Langheinrich, Marc. “Privacy by design—principles of privacy-aware ubiquitous systems.” International Conference on Ubiquitous Computing. Springer, Berlin, Heidelberg, 2001.

  2. Gorz, André. “Reform and revolution.” Socialist Register 5 (1968).

This post is licensed under CC BY 4.0 by the author.