Tangled in the Threads

Jon Udell, October 18, 2000

Bruce Schneier's Secrets and Lies

The math is elegant, but reality is a mess

Schneier pauses to rethink the foundations of his profession.

Everyone who needs to understand or implement cryptographic algorithms reads Bruce Schneier's Applied Cryptography. In that cookbook for cryptographers, it's a matter of faith that deep mathematics, properly understood and cleverly arranged, can make three interrelated guarantees regarding digital communication: that it stays private, that it really comes from whoever claims to have sent it, and that it hasn't been tampered with in transit.

Now, in his new book Secrets and Lies, Schneier questions his own faith:

It's just not true. Cryptography can't do any of that. It's not that cryptography has gotten any weaker since 1994, or that the things I described in that book are no longer true; it's that cryptography doesn't exist in a vacuum.

The error of Applied Cryptography is that I didn't talk at all about the context. I talked about cryptography as if it were The Answer. I was pretty naive.

The result wasn't pretty. Readers believed that cryptography was a kind of magic security dust that they could sprinkle over their software and make it secure. That they could invoke magic spells like "128-bit key" and "public-key infrastructure." A colleague once told me that the world was full of bad security systems designed by people who read Applied Cryptography.

After the publication of Applied Cryptography, Schneier's work as a security consultant led him to an increasing appreciation of the role of human factors. He began saying, over and over, that security is a process, not just a technology or a product. At one point, despairing that mathematically unbreakable security schemes kept failing in the real world, he abandoned the book and rethought his whole approach to security. Cryptography, he realized, is essentially prophylactic. You encrypt a message to guard against attack; if the message is cracked, the game's over. But the real world isn't an all-or-none game. We can't prevent every bad thing from happening, and when bad things do happen, we can't just fold our tents. Prevention strategies are important tools, but they've got to be embedded in an ongoing process of risk analysis, detection, and response.

A week in the life of cyberspace

Secrets and Lies opens with a log of security events culled from various sources during March 2000. You've heard it all before: buffer overflows, e-mail worms, Microsoft Windows vulnerabilities, denial-of-service attacks, CGI exploits, privacy violations, defaced websites, credit-card fraud.

The litany of woe runs for several pages, and then Schneier notes that he stopped keeping track after only a week. There was nothing unusual about that week, either; it was just a normal week in the life of cyberspace. Nor is there any reason to expect this flood of events to diminish in the near future. In fact, the reverse is likely. Growing interconnectedness means growing complexity. There are more moving parts, more points of failure, more ways to screw up.

Bad guys in cyberspace are motivated by the same things that motivate bad guys in the real world: notoriety, money, vengeance, thrills. But in cyberspace, three factors work in favor of the bad guys to make the threats they pose qualitatively different: attacks can be automated, they can be launched from anywhere on the planet, and a successful technique, once devised, spreads instantly to attackers who have no skill of their own.

These factors, working together, assure that we'll see an ongoing, and likely increasing, flood of security events such as those Schneier logged in March.

Black hats vs white hats

It's true that the same factors that help the bad guys also help the good guys. Just as it only takes one smart bad guy to discover and disseminate an exploit, it only takes one smart good guy to discover and disseminate the fix. The global nature of the Internet, and its amazing ability to propagate memes at lightspeed, works both for good and evil.

Sometimes the would-be good guys go too far, though. They don't just counteract exploits, they create and publicize them. The rationale is that flaws exist, vendors aren't necessarily motivated to find and fix them, bad-guy hackers will inevitably find and exploit them, so it's up to good-guy hackers to find and publicize them.

Schneier isn't buying that argument. In particular, he draws a sharp distinction between researching and documenting a flaw, and distributing software that exploits it. People who make use of these software exploits are criminals, says Schneier, and so are the people who write and distribute the exploits. Let's deal with them accordingly.

Privacy, anonymity, authentication

There's a lot of confusion nowadays swirling around the notions of privacy, anonymity, and identification. Sometimes people conflate privacy and anonymity, though of course they're quite different things. I want my medical records to be private, meaning that only those medical personnel I authorize can read them. But it makes no sense to discuss anonymity in this context. I'm never anonymous in my dealings with the medical establishment, or indeed in nearly any other real-world relationship.

There are, to be sure, a few valid reasons to hide identity. In special cases -- abuse victims, whistle blowers -- there is need for what Schneier terms "social anonymity." Sometimes, people really do need to be able to speak and act anonymously. But in supporting such anonymity, the Internet also opens itself to attack. As it happens, true anonymity is as hard to achieve as any other kind of digital security. What abuse victims and whistle blowers can have, and what they really need, is pseudonymity: "Hi, my name's Bob, and I'm an alcoholic." But networks are inherently traceable, and Schneier concludes that "true anonymity is probably not possible on today's Internet." That's probably a good thing.

I have long believed that it's more important to assert our own identities, and authenticate who and what we encounter in cyberspace, than to hide our identities. Schneier thinks so too. I have tended to focus on the "who" aspect -- that is, authenticating who really sent a message, who really presented a string of credit-card numbers to an e-commerce site. Schneier acknowledges this, while also calling attention to the "what" aspect -- assuring, for example, that a fact in a database or a video clip, crucial to some matter of public policy, has not been faked.
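To make those "who" and "what" guarantees concrete, here's a minimal sketch -- mine, not Schneier's, and assuming Python with the third-party cryptography package -- of how a digital signature binds a message to the holder of a key, and breaks the moment the message is altered:

    # Illustrative only: sign a message, then verify both its origin ("who")
    # and its integrity ("what"). Requires the third-party "cryptography" package.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # "Who": only the holder of this private key can produce the signature.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    # "What": the signature covers these exact bytes and nothing else.
    message = b"A database record crucial to some matter of public policy."
    signature = private_key.sign(message)

    # Anyone holding the public key can check origin and integrity together.
    try:
        public_key.verify(signature, message)
        print("intact, and signed by the key holder")
    except InvalidSignature:
        print("forged or altered")

    # Change one byte and verification fails.
    try:
        public_key.verify(signature, message + b"!")
    except InvalidSignature:
        print("tampered copy correctly rejected")

The mathematics behind that check is exactly the kind of thing Applied Cryptography celebrates; the hard part, as the rest of this discussion suggests, is knowing whose public key you're really holding.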

I've written in the past about how authentication on today's Internet is asymmetrical. When I establish an SSL session with an e-commerce server, I'm given some assurance that the server is genuine, but it receives no similar assurance from me. In the long run, says Schneier, such mutual authentication is essential:

Authentication is about the continuity of relationships, knowing who to trust and who not to trust, making sense of a complex world. Even nonhumans need authentication: smells, sound, touch. Arguably, life itself is an authenticating molecular pit of enzymes, antibodies, and so on.

We authenticate one another instinctively, in many ways, all the time, as we interact in the real world. When we extend our relationships into cyberspace, we lose visual, olfactory, tactile, and other cues. In their place, we offer PKI (public-key infrastructure). It's a poor trade-off. For example, as Schneier notes, even the limited one-way server-only authentication that does exist on today's Web is flawed. As often as not, the website named in a server certificate is not the same as the website on which the customer began a transaction. In such cases, theoretically, the customer should check the certificate, and contact the issuer to verify its authenticity. Of course, nobody does. In one of the most damning remarks in the book, he concludes:

I make my purchases because the security comes from credit card rules, not from the SSL. My maximum liability from a stolen card is $50, and I can repudiate a transaction if a fraudulent merchant tries to cheat me. Digital certificates provide no actual security for electronic commerce; it's a complete sham.

Phew! These are strong words indeed. I prefer to think that certificates are not so useless. I sign all my emails, and in doing so I assert a binding between my identity and my email address, as certified by Thawte. It's hardly infallible, but that's vastly more assurance of identity than is conveyed by the average unsigned email message. But Schneier's point is a crucial one. It's not enough to delegate authentication to PKI infrastructure. Ultimately we need to take these matters into our own hands. To do that, we'll need to be able to authenticate one another directly, in a peer-to-peer fashion, using cues that are convenient, natural, and easy to understand.
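To see the asymmetry in practice, here is a minimal sketch -- again mine rather than the book's, using Python's standard ssl module, with a placeholder host name. The client verifies the server's certificate chain and checks that the certificate names the host it meant to reach (the check Schneier notes nobody performs by hand), while nothing authenticates the client in return:

    # Illustrative only: one-way, server-only authentication over TLS.
    import socket
    import ssl

    hostname = "www.example.com"  # placeholder for the site the customer thinks they're visiting
    context = ssl.create_default_context()  # verifies the chain against trusted CAs
    context.check_hostname = True           # rejects certificates issued to some other name

    with socket.create_connection((hostname, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
            # The subject says who the certificate claims to identify;
            # the issuer says which authority vouched for that claim.
            print("subject:", dict(pair[0] for pair in cert["subject"]))
            print("issuer: ", dict(pair[0] for pair in cert["issuer"]))

    # Note the asymmetry: nothing above authenticates the client to the server.
    # Mutual authentication would require the client to load its own certificate
    # (context.load_cert_chain) and the server to demand one.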

From despair to pragmatism

It's a rare book that distills a lifetime of experience. It's a rarer one that chronicles the kind of crisis and transformation that Bruce Schneier has undergone in the last few years. He's emerged with a vital perspective. Cryptography is an amazingly powerful tool, but it's only a tool. We need to use it for all it's worth. But at the same time we have to be clear about its limitations, and locate its use within a real-world context that is scarier and more complicated than we dare imagine. Is there hope? Schneier admits that he abandoned the book for a while because he felt he could offer none. In the end, he concludes:

We're still stuck with an insecure Internet, and insecure password-protected systems. But by the same token, we're still stuck with insecure door locks, assailable financial systems, and an imperfect legal system. None of this has caused the downfall of civilization yet, and it is unlikely to. And neither will our digital security systems, if we refocus on the processes instead of the technologies.

We're going to be dealing with these issues for the rest of our lives, and they're not going to get easier anytime soon. Secrets and Lies doesn't offer many clear-cut answers because, well, there aren't any, and Schneier isn't pulling any punches. What he gives us, instead, is a framework within which to think rationally and productively about digital security. It's a remarkable book. Anyone touched by these issues -- which is to say, almost everyone -- should read it.


Jon Udell (http://udell.roninhouse.com/) was BYTE Magazine's executive editor for new media, the architect of the original www.byte.com, and author of BYTE's Web Project column. He's now an independent Web/Internet consultant, and is the author of Practical Internet Groupware, from O'Reilly and Associates. His recent BYTE.com columns are archived at http://www.byte.com/index/threads

This work is licensed under a Creative Commons License.