The Short Life and Humiliating Death of the Clipper Chip

The year was 1992 and the FBI had what it considered to be a major problem on its hands.

That year, AT&T had announced the launch of something called the TSD-3600 (short for Telephone Security Device), a novel piece of privacy hardware that could encrypt voice calls on standard landline phones. A clunky white device that sorta resembled a Walkman, the TSD digitized calls, then encrypted them with a 56-bit key. It was originally offered to business executives at $1,295 apiece as a secure way to conduct sensitive calls.

For most people, this probably seemed like an exciting innovation. But for the government, it represented a major threat. For years, federal law enforcement had worried that organized crime and terrorist groups would inevitably turn to encrypted communications to conceal their activities from police wiretaps. Now, with the launch of the TSD-3600—a relatively low-cost device for the average mafioso—it seemed as if their worst fears were being realized.

Swiftly, the government developed a solution to its problem. In April of 1993, the Clinton administration announced the launch of the government’s own cryptographic device, dubbed the “Clipper Chip.” The idea behind the Clipper was simple: it was a microchip, manufactured by a government-sanctioned contractor and powered by an encryption algorithm developed by the National Security Agency. The chip, which the government hoped would be integrated into commercial communication devices, included a cryptographic backdoor that, under the right circumstances, would allow the government to decode any communications relayed on the device. “Our policy is designed to provide better encryption to individuals and businesses while insuring that the needs of law enforcement and national security are met,” Vice President Al Gore proffered at the time.

The plan enjoyed some initial success. After lobbying by the government, AT&T revised its designs for the TSD-3600, scrapping the original model and rolling out a new version that integrated the Clipper into its system. The White House hoped it would be the first of many companies to adopt the new technology. The government wasn’t mandating that the private sector use the chip, but it was strongly suggesting it, and officials seemed genuinely hopeful that the public and businesses would go for it.

Suffice it to say, that didn’t happen.

Instead, privacy and civil liberties activists and the software community flipped out. Businesses boycotted the idea. And, within three years of the device’s initial announcement, the Clipper would be officially declared dead in the water. This month marks the 30th anniversary of the Clipper’s ill-fated launch—a commemoration, of sorts, of one of the U.S. government’s most spectacular tech failures. But the legacy of the chip lives on. It helped launch what came to be known as the first “Crypto Wars”—a tumultuous legal and cultural battle that continues to this day.

New Directions in Cryptography

The infamous chip.

The Clipper emerged at a time when technological advancements were forcing the government to confront unsettling new realities. The internet was starting to take off, and cryptosystems were evolving with it. From the ciphers of ancient Rome to the code makers and breakers of WWII, encryption—the process of scrambling readable language into unintelligible signs and symbols—had almost always been the purview of governments, useful as it was for concealing sensitive state secrets. In the modern era, electronic encryption—which used increasingly sophisticated machines to produce ever more powerful ciphers—had been the exclusive province of secretive government agencies like the NSA.

But by the 1990s, the government’s monopoly on this kind of information security had begun to slip somewhat.

A pivotal event in this trend occurred in 1976, when cryptographers Whitfield Diffie and Martin Hellman published “New Directions in Cryptography.” The influential academic paper showed that advances in modern computing were putting powerful electronic ciphers within the civilian world’s grasp, and it introduced the concept of public-key cryptography, or asymmetric encryption—a new breed of cryptosystem that would go on to become the bedrock of much of today’s internet security, eventually helping to power secure web protocols like Transport Layer Security (TLS) and the Secure Shell protocol (SSH).
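To get a feel for the idea Diffie and Hellman set loose, here is a minimal sketch of their key exchange in Python, using deliberately tiny toy numbers (real deployments use moduli of 2,048 bits or more). Two parties each keep a private number, trade only public values, and still arrive at the same shared secret:

```python
# Toy Diffie-Hellman key exchange with deliberately tiny numbers.
# Real systems use primes of 2048+ bits; this only illustrates the idea
# that two parties can agree on a secret over a completely public channel.

p = 23  # small public prime (toy value)
g = 5   # public generator (toy value)

a = 6   # Alice's private exponent, never transmitted
b = 15  # Bob's private exponent, never transmitted

A = pow(g, a, p)  # Alice's public value, sent in the clear
B = pow(g, b, p)  # Bob's public value, sent in the clear

shared_alice = pow(B, a, p)  # Alice combines Bob's public value with her secret
shared_bob = pow(A, b, p)    # Bob combines Alice's public value with his secret

assert shared_alice == shared_bob
print(shared_alice)  # both sides compute 2
```

The shared secret can then be used to key a conventional cipher, which is why the paper put strong encryption within reach of anyone with a computer rather than just governments.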

As a result of the advances of the 1970s, popular interest in cryptography grew over the following decade, as did speculation about its potential applications in business, academia, and wider society. By the early 1990s, the U.S. government was increasingly nervous about encryption’s democratization—so much so that it had begun to develop a workaround.

“The government had been worrying since the mid-1970s about growing civilian interest in cryptography,” Steve Bellovin, a computer science professor at Columbia University, told Gizmodo. Bellovin, along with a veritable army of other computer scientists, entrepreneurs, and activists, helped lobby against the Clipper Chip during the 1990s. “This worried the NSA. They worried about more people starting to use cryptography and making messages unintelligible.”

The solution that the government came up with was the concept of large-scale key recovery systems—or what came to be known as “key escrow.” The strategy involved giving a third party (namely, the government) access to an encryption key in the event that it needed to read the contents of an encrypted conversation or email. In short: it was a backdoor.

The Clipper would serve as the government’s initial prototype for this concept. But how was it supposed to work?

To power the Clipper, the NSA designed a cipher called “Skipjack.” Every device that included the chip would, upon fabrication, be assigned a cryptographic key. A copy of that key was held by the government and, under certain vaguely defined circumstances, could be handed over to an investigative agency so that it could unscramble a particular user’s communications and read, in plaintext, what had been said. But there were a lot of obvious problems with the “key escrow” model. For instance, nobody outside the government quite knew how Skipjack worked, because the government insisted on keeping the algorithm classified. The chip itself, which the government expected businesses to integrate into their products, was also a total mystery, having been designed with a tamper-resistant mechanism to keep anyone from examining its inner workings. The chip was also said to be energy intensive and was considered expensive relative to other available chips.
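To make the escrow idea concrete, here is a toy sketch in Python of how an escrowed, per-device “unit key” exposes traffic to whoever holds it. This is a conceptual model under loose assumptions, not the real protocol: Skipjack and the chip’s access-field format were classified, so a throwaway SHA-256 keystream stands in for the cipher here, and the actual scheme split each unit key between two escrow agencies rather than keeping it in a single database.

```python
# Conceptual sketch of key escrow -- NOT the real Clipper/Skipjack protocol.
# Skipjack was classified, so a SHA-256-based keystream stands in for the
# cipher, purely to show how an escrowed unit key exposes the traffic.
import hashlib
import os


def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data against a SHA-256 counter keystream."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))


# At fabrication: every device gets a unit key, and the escrow agent keeps a copy.
unit_key = os.urandom(32)
escrow_database = {"device-0001": unit_key}  # hypothetical escrow record

# During a call: the device picks a fresh session key, encrypts the audio, and
# transmits the session key wrapped under the unit key (in the real chip, the
# law-enforcement access field played this role).
session_key = os.urandom(32)
ciphertext = keystream_xor(session_key, b"this call is confidential")
access_field = keystream_xor(unit_key, session_key)

# With a court order (in theory): the agency pulls the unit key from escrow,
# unwraps the session key, and reads the call in plaintext.
recovered_unit_key = escrow_database["device-0001"]
recovered_session_key = keystream_xor(recovered_unit_key, access_field)
print(keystream_xor(recovered_session_key, ciphertext))  # b'this call is confidential'
```

The sketch’s only point is that whoever controls the escrow database can read any call, which is precisely the property critics found objectionable.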

In short: it wasn’t exactly what you’d call an ideal situation.

AT&T’s TSD-3600E Telephone Security Device

‘Pretty Much Everybody Hated This Idea’

For many reasons, the Clipper Chip seemed like a stupid idea. For businesses and Silicon Valley, the chip was a danger to their interests—a move that might curb the nascent encrypted communications market and strangle internet security in its cradle. For civil liberties activists, it was an obvious violation of Americans’ right to privacy—an abject example of the worst kind of dystopian government overreach.

But outside of interest groups’ specific gripes, there were other obvious problems. On a purely practical level, the plan seemed to make no sense. After all, the government claimed it wanted to launch the Clipper to help catch criminals. Yet many critics were quick to point out that, since the Clipper was optional and the U.S. had been so vocal about its “backdoor” plans, there was absolutely no reason that criminals (or anybody else, for that matter) would ever buy products that used it. Overseas markets were beginning to roll out their own encrypted communications products, unfettered by invasive backend infrastructure, and some domestic companies would no doubt refuse to go along with the government’s plans; crooks would just use those alternatives. In essence, the entire raison d’être for the Clipper was bullshit.


Despite the myriad ethical and logistical concerns, the death knell for the Clipper was delivered less by activist critics than by the imperfect technology that the government itself had implemented.

That is to say: the chip ended up having a pretty bad bug in it.

In 1994, NSA officials arranged to have a meeting with AT&T—in an apparent bid to sell the floundering project to the one company that had agreed to its unwieldy terms. Matt Blaze, a recent Ph.D. graduate and young computer scientist working for the company’s Bell Labs research group at the time, was given the opportunity to play with some of the government’s technology.

“I had read about Clipper initially in the New York Times,” Blaze told Gizmodo. Right off the bat, he and his colleagues thought the idea seemed ludicrous. “So the government is saying ‘you can’t look at the algorithm and, oh, by the way, if you want to use it, you have to incorporate it into your product with this expensive tamper-resistant chip that’s only going to be available from one trusted vendor’,” he said, scoffing lightly. The whole thing seemed absurd. “Outside of the government, pretty much everybody hated this idea,” he said.

Still, the NSA was interested in showing off some of their hardware to the young programmer. Blaze says that he was invited to the agency’s headquarters at Fort Meade, Maryland, where he soon found himself standing in a SCIF, a sensitive compartmented information facility, talking with several NSA employees about the prototype for their new creation. Astonishingly, the NSA allowed Blaze to take the prototype home with him—seemingly hoping for a positive review. Blaze didn’t end up giving them one.

In 1994, Steve Jackson Games released the Illuminati: New World Order collectible card game, which included a card themed around the Clipper Chip.
Image: Steve Jackson Games

“So I went back home and said, ‘okay, let me play with this.’ I figured I wouldn’t be able to find any bugs or anything.” But, lo and behold, that’s exactly what happened.

Within a short period of time, Blaze discovered a pretty big problem with the chip: a flaw in its design made it possible to disable the key escrow functionality, so that Clipper-ed devices could be used in a rogue fashion for encrypted communications that law enforcement could not crack. In brief: a savvy cybercriminal could quite easily “own” Clipper devices, making them work to their benefit at the expense of the government’s plans. Technically speaking, these issues were fixable. But when the New York Times publicized Blaze’s findings shortly afterward, the idea took hold that other, undiscovered flaws were lurking. It was a sucker punch to an already dubious public’s confidence in the project, amounting to no less than a death blow.
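Blaze’s published analysis traced the problem to the chip’s law-enforcement access field (LEAF), the data blob that carried the escrowed session key: it was protected by only a 16-bit checksum, so a forged field merely had to pass that check for the escrow mechanism to be bypassed. As a rough back-of-the-envelope illustration (the checksum algorithm and target value below are stand-ins, since the real ones were classified), a 16-bit check leaves only 65,536 possibilities to search:

```python
# Why a 16-bit integrity check is trivial to defeat: there are only 2**16 =
# 65,536 possible checksum values, so randomly generated candidates hit any
# given target after ~65,536 tries on average -- a trivially small search.
import os
import zlib  # CRC-32 stands in here for the (classified) real checksum

target_checksum = 0x1234  # hypothetical 16-bit value the receiving chip would accept

attempts = 0
while True:
    attempts += 1
    candidate = os.urandom(16)  # a forged access field carrying no usable key
    if zlib.crc32(candidate) & 0xFFFF == target_checksum:
        break

print(f"Forged a field that passes the 16-bit check after {attempts:,} attempts")
```

A device fed such a forged field would still interoperate, but the field would be useless to investigators, quietly cutting the government out of its own backdoor.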

The Clipper’s Legacy

In the end, Clipper withered and died because nobody—neither the public nor the private sector—bought into the government’s vision. Indeed, the government ended up being the only “customer” to ever buy a device that included the ill-fated chip; during the years when the project was still technically alive, the White House made bulk purchases of the TSD-3600, in the hopes of driving broader interest. The purchased devices presumably moldered in a government basement somewhere while, outside, the project floundered. The chip was officially declared dead in 1996.

The controversy over the Clipper helped lead to a broader conversation about the role encryption should play in the burgeoning internet revolution. Eventually, that conversation led to what has come to be known as the first “Crypto War”—a full-bodied cultural and legislative wrestling match between digital activists and the government over U.S. encryption export controls. Encryption had long been legally classified as a “munition,” owing to its utility for protecting the military’s secrets from prying eyes, and some forms of it had to be redefined before they could be commercialized and digital security could truly bloom. After a lengthy row between cyber activists and the government, the private sector won out: in 1996, the Clinton administration eased export restrictions, allowing certain kinds of encryption to be taken off the government’s “munitions” list and transferred to the Commerce Control List.

But if that first round went to the good guys, the crypto wars never really ended. Nor did the specter of the Clipper Chip—that is, the notion that the government had an ethical obligation to undermine encrypted communications—ever truly fade. Instead, with each new development in encrypted communications, governments have continued to take ill-fated swings at inserting the kind of backdoor that the U.S. originally tried to pitch to the public all those years ago.

“We’ve been going through this for so many years,” said Matthew Green, a professor of cryptography at Johns Hopkins University and a major critic of the government’s efforts to backdoor encrypted communications. “It’s easy to see it all as part of the same big fight.”

After the demise of the Clipper, the next great backdoor campaign took place in the mid-to-late aughts, as secure messaging took off and encrypted messengers—of the kind that would eventually lead to apps like Signal and WhatsApp—began to find their footing. “They [the government] became much more worried [by encrypted messaging]…they began asking for something like the Clipper Chip to be built into software,” said Green, citing a push by the FBI to mandate backdoors in messaging apps that never gained any ground. Meanwhile, when the Snowden leaks broke in 2013, they revealed evidence of a vast NSA surveillance program, known as “Bullrun,” that revolved around attempts to insert or maintain backdoors in widely used encrypted communications products and protocols. That was followed, in 2014, by a renewed push on the part of the FBI and other federal agencies to mandate the cracking of smartphone encryption—a new battle that came to be known colloquially as “Crypto Wars II.” In short: the government has remained dogged in its pursuit of a workaround for secure communications, and there is little sign it will let up in the foreseeable future.

The encryption battle has also gone global. Today, security agencies all over the world are angling for quiet ways to get around digital protections. In 2018, Australia passed a law that allowed the government to weaken strong encryption in the country. More recently, the European Union introduced unprecedented legislation that would effectively nullify the security and privacy that encrypted messaging is supposed to provide. The proposal, known as “Chat Control,” would mandate that technology companies operating in the EU scan their users’ communications in an effort to ferret out child abuse material.


As for the Clipper, those who helped bring it down are still glad they did so.

“It’s good that Clipper was killed—and I’m glad that I helped kill it…but it was sorta killed for the wrong reasons,” Blaze tells me. “The bug I found wasn’t why it was a bad idea. The stuff I found could be fixed…but there were all these other problems—the fact that it involved a secret algorithm…the fact that it included the key escrow mechanism that could be compromised…” In short: it was the paradigm of the chip itself that was the fatal flaw. “There was no version of this that you could build that wouldn’t have had those problems,” Blaze said.
