
The crypto wars: How much privacy should we give up for security?

In 2015, a mass shooting in San Bernardino, California claimed the lives of 16 people (including the two perpetrators) and injured 22 others. The incident was a terrorist attack, and a tragedy. It also reignited a longstanding debate about the role of security and privacy-enhancing technologies in society: the “going dark problem” — also referred to as “the lawful access challenge” or the “crypto wars”.

The “going dark problem” is the FBI’s name — and apocalyptic framing — for the widespread use of encryption technologies. If impregnable encryption becomes the norm, the argument goes, then the FBI and other intelligence agencies will be unable to do their jobs of uncovering and mitigating national security risks. Bad actors — terrorists, spies — will be able to plan and communicate in secret, and the country will descend into chaos.

When the FBI seized the San Bernardino perpetrators’ phones, they wanted access — terrorists do not act in isolation, and information on the phones could have pointed law enforcement to other potential security risks. But the phones were encrypted and locked, and the only people who knew the PINs were dead. The FBI wanted Apple to circumvent its own security protections to unlock the phones; in effect, the FBI wanted privileged access to these encrypted phones and all others like them.

Setting aside the politicking of the debate in the San Bernardino case — i.e., the fact that the FBI did not actually need Apple’s help to access the shooters’ phones, and wanted Apple to comply only to set a legal precedent for encryption backdoors more generally — was what the FBI demanded reasonable?

Let’s cover this question from both a policy and a technical perspective.

Policy

From a policy perspective, the demand seems reasonable at first blush. Indeed, intelligence agencies have always had privileged access to private information for the purposes of national security. An analogous situation in the physical world might be a search warrant — with a search warrant, law enforcement can access and search private property to, for example, uncover incriminating evidence or other information that can help prevent future threats. If we allow for search warrants in the physical world, shouldn’t we allow for backdoor access to encrypted data in the digital world?

I get the desire to extend what is familiar in the physical world to the digital world. But it is a fallacy. The rules at play in the digital world are different; we must let go of the delusion that the legal and policy constructs that have worked up to this point in history will continue to do so, unadapted, moving forward. Consider the converse of the “going dark problem” — the “too bright problem”, where law enforcement can access any and all personal data and digital information about anyone and everyone. In fact, the too bright problem might be a closer approximation to the state of the world today than the “going dark problem” — see, for example, the Snowden revelations.

An operationalization of the “too bright problem” in the real world might be everyone having a front-door lock to which intelligence agencies and law enforcement have a master key. The government would be able to enter and leave your house at will, at any time, and scrutinize what you own, what you are doing and who you are with. Such a world would not be pleasant to live in, but you would have some assurances. For example: (i) there is not enough manpower in intelligence agencies or law enforcement to visit a given individual’s home very often; (ii) it takes significant time and money for agents to travel from place to place and to meticulously screen each house for “suspicious” activities; (iii) if the government were to exercise this privilege, it would be obvious that they had done so (they would not have to hide what they were doing, so there would be no incentive to keep it secret); (iv) the government would only be able to observe you at a given point in time, short of keeping records of previous visits; and, (v) the built-in weakness of your lock, if compromised (e.g., if someone replicated the key), would be exploitable, but the attacker would have to physically come to your house to exploit it.

Now, consider how different the “too bright problem” would be in the digital world. With a master key to unlock any and all encryption, intelligence agencies would be able to simultaneously and continuously analyze everyone’s digital personal data, both current and past, with a relatively small team of intelligence analysts and engineers, and all with you none the wiser as to what they are doing, when they are doing it, and what they might be learning. In fact, if the Snowden revelations are any indication, intelligence agencies would outsource most of this engineering effort to private companies, who would be forced to comply with government requests for information about individuals. Finally, it is important to note that if this master key were ever compromised (and it invariably would be), our personal data would become “too bright” to a lot of other folks beyond the government — most of whom will not have your best interests at heart. According to one McAfee report, cybercrime causes close to $600 billion in damages annually. Make no mistake: they will dedicate enormous effort to compromising the master key. In short, such a “master key” or “backdoor” would eventually make using the Internet untenable for anything that requires even a modicum of secrecy: be it sharing personal photos of children with loved ones, managing finances remotely, or creating any kind of intellectual property.

Policies to reduce the risks introduced by the “too bright problem” in the digital world would have to look very different from policies that address it in the physical world. The affordances and limitations of cybersecurity and digital privacy technologies are different from those of physical security and privacy technologies. We should not attempt to generalize policy for one from existing policy for the other.

Moreover, we must consider that good policy should promote equity — it should not systemically disadvantage one group of people over another. While “backdoor access” to encryption might help law enforcement access terrorists’ communications, research suggests that excessive government surveillance is more likely to negatively affect minorities and other communities who have historically been excluded from power. Simone Browne’s book “Dark Matters: On the Surveillance of Blackness” 1 provides an excellent case study of this point. And there are plenty of examples of governments using surveillance to persecute minorities in other countries — Uighur people in China, LGBTQ+ activists in Russia.

Note that excess surveillance of minority communities will not necessarily be conscious or intentional — intelligence agencies and law enforcement will simply be “doing their job” in analyzing “suspicious” behavior. But when “normal” behavior is defined by what the majority do, governmental surveillance of “deviant” behavior will naturally disadvantage minorities.

From a policy perspective, what the FBI demands is unreasonable and untenable.

Technical

If you were a large hedge fund that decided to short sell some stock, and I orchestrated a social media campaign to get millions of people to buy that stock in order to cost you billions of dollars, you might demand that I pay you back the billions that you lost. But unless you are willing to wait a few millennia (at my current yearly income), you would not be getting your billions back from me. In other words, there is no point demanding something from me that I cannot deliver.

Irrespective of what policymakers might want, or what is socially necessary, we must live with the realities of what is technically feasible. There is often a gap between what society requires of technology and what technology can deliver. Social requirements are important and should be foregrounded, but we cannot ignore technical feasibility.

A backdoor encryption key that allows law enforcement privileged access without significantly weakening the security of a system against other attackers is not technically feasible — or at least, there are no known solutions. This is not my conclusion — it is the consensus of many of the most respected and influential security engineers and cryptographers. See, for example, the Keys Under Doormats paper from 2015 2. The authors — all titans in the field — conclude: “The complexity of today’s Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws.”

Part of the underlying reason for this impossibility is that complexity is the enemy of security. Solutions that introduce such backdoors into encryption algorithms will inherently be more complex, and that complexity will be exploited by attackers. Make no mistake — it will not only be the government who has backdoor access to your data.
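To make that single-point-of-failure argument concrete, here is a minimal sketch, in Python with the `cryptography` package, of what an escrowed “backdoor” amounts to in an ordinary hybrid-encryption scheme. This is an illustration under my own assumptions, not a description of any real proposal or product: each message is sealed with a fresh session key, and the “exceptional access” requirement boils down to wrapping that session key under one additional, universal escrow key. All names here (`wrap_key`, `escrow`, and so on) are hypothetical.

```python
# Illustrative sketch only: how an escrowed "backdoor" changes hybrid
# encryption. All names (wrap_key, escrow, ...) are hypothetical.
# Requires: pip install cryptography
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Stateless OAEP padding parameters, reused for wrapping and unwrapping.
OAEP = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

def wrap_key(session_key: bytes, public_key) -> bytes:
    """Encrypt (wrap) a per-message session key under an RSA public key."""
    return public_key.encrypt(session_key, OAEP)

# Normal end-to-end encryption: the session key is wrapped only for the recipient.
recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# The mandated escrow key: one key pair for *all* messages, everywhere.
escrow = rsa.generate_private_key(public_exponent=65537, key_size=2048)

session_key = AESGCM.generate_key(bit_length=256)  # fresh key per message
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"meet at noon", None)

wrapped_for_recipient = wrap_key(session_key, recipient.public_key())
wrapped_for_escrow = wrap_key(session_key, escrow.public_key())  # the backdoor

# Whoever holds (or steals) the single escrow private key recovers the
# session key for this message, and for every other escrowed message:
recovered = escrow.decrypt(wrapped_for_escrow, OAEP)
assert AESGCM(recovered).decrypt(nonce, ciphertext, None) == b"meet at noon"
```

Note what the extra `wrap_key` call buys an attacker: they no longer need to break any individual message’s encryption; they need only steal one private key, once. And every additional code path like this one is more surface area for exactly the kind of unanticipated implementation flaws the Keys Under Doormats authors warn about.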

From a technical perspective, what the FBI demands is unreasonable and untenable. We may as well have no security at all — see my analysis of the “too bright problem”, above, for a sense of what that would look like.

Straddling the boundary between individual privacy and national security

Individual privacy rights will always be at odds with national security. The question is: how much individual privacy should we give up to live in a society that is reasonably safe?

One of the most frustrating parts of being an advocate for digital privacy is that the benefits of privacy are abstract. In contrast, the benefits of security are often more concrete and acute. Fourteen innocent people lost their lives in the San Bernardino shooting. There is a reason the FBI pressed Apple in its aftermath despite not needing Apple’s help to access the phone — when faced with acute loss of life, arguing for the abstract benefits of privacy seems selfish.

But we must not underestimate those benefits. Arvind Narayanan argued, in his USENIX Enigma talk from 2018 3, that privacy is essential for democracy. Democracy works only if its citizens are active and responsible participants; only if ideas are generated, exchanged, debated and improved. But ideas that deviate from the status quo require an incubation period; they require non-normative thinking and safe spaces, free from authoritarian oversight. Without privacy, those safe spaces will be eroded — especially as we enter a world increasingly saturated with digital sensors, where the walls literally have ears.

That said, absolute privacy is also not the answer. A key tenet of living in a civilization is adhering to the social contract — that we, as individuals, agree to give up some liberties to reap the benefits of social living. We agree, for example, not to murder, not to steal. In return, we are given a reasonable expectation of safety and property. As a society, we must also come to an agreement on what personal data we are willing to give up and in what contexts. In return, we should be given a reasonable expectation of safety.

A backdoor to give law enforcement the ability to circumvent encryption is not a reasonable ask; it amounts to what I believe is an anti-democratic erosion of privacy.

So what should we do instead?

There may be no perfect solution to the “going dark” problem; as constructed, it may be a winner-take-all affair. We either have strong encryption, or we have no encryption. Weak encryption is, for all intents and purposes, the same as no encryption. And we should be very skeptical of policy debates that are constructed as binary in this way. FBI director James Comey first framed this issue as the “going dark” problem — but as a report from the Harvard Berkman Klein Center suggests, “going dark” may be the wrong metaphor 4. The modern Internet infrastructure is far too complex for end-to-end encryption to become ubiquitous; moreover, end-to-end encryption runs counter to the business interests of surveillance capitalism 5, which is as rampant and unchecked as ever. A better framing for the present state of the world might, in fact, be what Georgia Tech professor Peter Swire calls “the Golden Age of Surveillance” 6. While some communication channels may be closed to intelligence agencies, many others are open and easily exploitable. Perhaps it is not so bad to afford people a few impregnable bubbles of digital solitude.

The “going dark” view is, as Bruce Schneier argues, myopic 7. It focuses on a narrow threat model. And the proposed solution — weakening encryption — would cause far more issues than it solves.


Thanks for reading! If you think you or your company could benefit from my expertise, I’d be remiss if I didn’t alert you to the fact that I am an independent consultant and accepting new clients. My expertise spans UX, human-centered cybersecurity and privacy, and data science.

If you read this and thought: “whoah, definitely want to be spammed by that guy”, there are three ways to do it:

You can also do none of these things, and we will all be fine.


  1. Browne, Simone. Dark matters: On the surveillance of blackness. Duke University Press, 2015. 

  2. Abelson, Harold, et al. “Keys under doormats: mandating insecurity by requiring government access to all data and communications.” Journal of Cybersecurity 1.1 (2015): 69-79. 

  3. Narayanan, Arvind. “The web tracking arms race: Past, present, and future.” USENIX Enigma 2018. https://www.youtube.com/watch?v=UhSya5J_cxw 

  4. Zittrain, Jonathan L., Matthew G. Olsen, David O’Brien, and Bruce Schneier. “Don’t Panic: Making Progress on the ‘Going Dark’ Debate.” Berkman Center Research Publication 2016-1, 2016. 

  5. Zuboff, Shoshana, et al. “Surveillance Capitalism: An Interview with Shoshana Zuboff.” Surveillance & Society 17.1/2 (2019): 257-266. 

  6. Peter Swire, testimony at Senate Judiciary Committee Hearing, “Going Dark: Encryption, Technology, and the Balance Between Public Safety and Privacy,” July 8, 2015. https://www.judiciary.senate.gov/imo/media/doc/07-08-15%20Swire%20Testimony.pdf. 

  7. Schneier, Bruce. “Security or Surveillance?” 2016. 

This post is licensed under CC BY 4.0 by the author.