It’s no secret that many people shirk expert-recommended privacy and security advice. For example, experts commonly recommend enabling two-factor authentication for important accounts (e.g., your email), yet, in 2018, fewer than 10% of Google account holders had enrolled in two-factor authentication [1]. The broader enigma is why, despite decades of dedicated work towards improving the usability of security and privacy controls, this remains the case.
Well, is usability — alone — enough? It certainly helps. It’s a lot easier to browse reddit by typing “reddit.com” into a web browser than it is to, e.g., fire up a terminal and use
curl. The usability of a graphical web browser has clearly contributed to the widespread adoption of the Internet. But imagine if I created an apparatus that lets you hurl spiders at your face at a rapid clip. No matter how usable and accessible I made that apparatus — I’m going to call it the Spidler — I’m willing to guess you would not be keen to use it. Both things might be “usable”, but people only want to use one of them.
So are privacy and security technologies the equivalent of hurling spiders at your face? Not exactly. While it can feel that way sometimes, people do desire security and privacy — all things equal, they would prefer more secure email, or a more privacy-conscious search engine. Many may, in fact, be willing to pay a small premium for improved security and privacy [2]. So, in that sense, better privacy and security controls are akin to the web browser in that greater usability should increase their use. But there’s another key distinction between something like a web browser and something like two-factor authentication. Using a web browser is fun; people sing its praises (it is marketable); it solves a problem people can easily wrap their heads around (it solves a concrete problem); people around you can see you use it (it is observable). And perhaps most importantly, web browsers facilitate tasks that users want and need to do anyway (they are primary concerns).
In contrast, security and privacy tools (like two-factor authentication) are rarely fun, they solve abstract problems (e.g., preventing a “hacker” from compromising your account sometime in the future), they appear to do nothing of consequence if they work well, and their use is largely invisible — so there is no social proof that anyone around you actually cares about security themselves. Moreover, privacy and security — while often desirable — are secondary concerns. Very few people use computing for the purposes of security; rather, they would like to have security because they are using their computing devices for other purposes (e.g., sharing photos, creating intellectual property).
Security is what anthropologist Everett M. Rogers called a “preventive innovation” [3]: something adopted now to reduce the risk of something bad happening later. The classic example is insurance. But people aren’t very good at weighing future harms in decision making: they over-value present benefits and under-estimate future harms [4]. So the marketing tricks that work for ordinary innovations will not necessarily work for preventive innovations.
The upshot of all of this is that a laser-like focus on usability is unlikely to drive up adoption of expert-recommended security and privacy behaviors. Usability is important, yes, but we may have reached a point of diminishing returns.
We must think more creatively. We need to find ways to align good security and privacy behaviors with primary concerns. We need to tap into people’s instincts to care for one another, instead of assuming they care only about themselves. We need to create systems that are delightful, that are fun. We need to demonstrate how stronger security behaviors are not just a burden — they are, in the long run, essential.
Usability is not enough.
Thanks for reading! If you think you or your company could benefit from my expertise, I’d be remiss if I didn’t alert you to the fact that I am an independent consultant and accepting new clients. My expertise spans UX, human-centered cybersecurity and privacy, and data science.
If you read this and thought: “whoah, definitely want to be spammed by that guy”, there are three ways to do it:
- Subscribe to my mailing list (spam frequency: a couple of times a month.)
- Follow me on Twitter (spam frequency: weekly)
- Subscribe to my YouTube channel (spam frequency: ¯\_(ツ)_/¯)
You also can do none of these things, and we will all be fine.
1. Milka, G. The anatomy of account takeover. USENIX ENIGMA (2018).
2. Acquisti, A., Brandimarte, L., and Loewenstein, G. Privacy and human behavior in the age of information. Science 347, 6221 (2015), 509–514.
3. Rogers, E.M. Diffusion of preventive innovations. Addictive Behaviors 27, 6 (2002), 989–993.
4. Acquisti, A. and Grossklags, J. Privacy and rationality in individual decision making. IEEE Security & Privacy 3, 1 (2005), 26–33.