Without the ability to keep secrets, individuals lose the capacity to distinguish themselves from others, to maintain independent lives, to be complete and autonomous persons. . . . This does not mean that a person actually has to keep secrets to be autonomous, just that she must possess the ability to do so. The ability to keep secrets implies the ability to disclose secrets selectively, and so the capacity for selective disclosure at one's own discretion is important to individual autonomy as well.{1}

Secrecy is a form of power.{2} The ability to protect a secret, to preserve one's privacy, is a form of power.{3} The ability to penetrate secrets, to learn them, to use them, is also a form of power. Secrecy empowers, secrecy protects, secrecy hurts. The ability to learn a person's secrets without her knowledge, to pierce a person's privacy in secret, is a greater power still.
People keep secrets for good reasons and for evil ones. Learning either type of secret gives an intruder power over another. Depending on the people compromised and the secrets learned, this power may be deployed for good (preventing a planned harm) or ill (blackmail, intimidation).
This Article is about the clash between two types of power:
the individual's power to keep a secret from the state and
others, and the state's power to penetrate that secret.{4} It focuses on new
conflicts between the perennial desire of law
enforcement and intelligence agencies to have the capability to
penetrate secrets at will, and private citizens who are acquiring
the ability to frustrate these desires. This is an article about
the Constitution and the arcana of secret-keeping:
cryptography.{5}
This is also a long article. It is long because it addresses three complex issues. First, it outlines some of the promises and dangers of encryption. Second, it analyzes the constitutional implications of a major government proposal premised on the theory that it is reasonable for the government to request (and perhaps some day to require) private persons to communicate in a manner that makes governmental interception practical and preferably easy. Third, it speculates as to how the legal vacuum regarding encryption in cyberspace shortly will be, or should be, filled.
What fills that vacuum will have important consequences. The resolution of the law's encounter with cryptography has implications far beyond whether the government adopts the Clipper Chip or whether a particular cipher may be licensed for export. The resolution of this debate will shape the legal regulation of cyberspace and in so doing shape its social structures and social ethics.
Cryptologists{6} use a few terms that
may not be familiar to lawyers, and it is useful to define them
at the outset of any discussion relating to encryption.
Cryptography is the art of creating and using methods of
disguising messages, using codes, ciphers, and other methods, so
that only certain people can see the real message. Codes and
ciphers are not the same. A code is a system of
communication that relies on a pre-arranged mapping of meanings
such as those found in a code book. A cipher is a method
of encrypting any text regardless of its content.{7} Paul Revere's "[o]ne,
if by land, and two, if by sea" was a code.{8} If the British had landed
by parachute, no
quantity of lanterns would have sufficed to communicate the
message. The modern cryptographic systems discussed in this
Article are all ciphers, although some are also known as
electronic code books.
Those who are supposed to be able to read the message disguised by the code or cipher are called recipients. "The original message is called a plaintext. The disguised message is called a ciphertext. Encryption means any procedure to convert plaintext into ciphertext. Decryption means any procedure to convert ciphertext into plaintext."{9} An algorithm, a more formal name for a cipher, is a mathematical function used to encrypt and decrypt a message. Modern algorithms use a key to encrypt and decrypt messages.{10} A single-key system is one in which both sender and receiver use the same key to encrypt and decrypt messages. Until recently, all ciphers were single-key systems. One of the most important advances in cryptography is the recent invention of public-key systems, which are algorithms that encrypt messages with a key that permits decryption only by a different key.{11} The legal and social implications of this discovery figure prominently in this Article.
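The single-key idea can be made concrete with a deliberately toy cipher. The sketch below uses XOR with a repeating key, which is trivially breakable and serves only to illustrate the plaintext/ciphertext/key vocabulary, not to stand in for any real algorithm discussed in this Article:

```python
# Toy illustration of a single-key system: the SAME key both encrypts
# and decrypts.  XOR is its own inverse, so one function does both jobs.
# (Real single-key ciphers such as DES are vastly more elaborate.)

from itertools import cycle

def xor_cipher(text: bytes, key: bytes) -> bytes:
    """Encrypt or decrypt by XORing the text against a repeating key."""
    return bytes(b ^ k for b, k in zip(text, cycle(key)))

plaintext = b"ONE IF BY LAND"
key = b"secret"

ciphertext = xor_cipher(plaintext, key)   # encryption
recovered = xor_cipher(ciphertext, key)   # decryption with the same key

assert recovered == plaintext
assert ciphertext != plaintext
```

A public-key system differs precisely in that the key used to encrypt cannot perform the decryption; no such toy one-liner exists for it, which is why its invention was so significant.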
Cryptanalysis is the art of breaking the methods of
disguise invented with cryptography. Lawyers will recognize the
cryptographers' terms for cryptanalysts who seek to read messages
intended only for recipients: enemies, opponents,
interlopers, eavesdroppers, and third
parties.{12} In this
Article, however, cryptanalysts who work for U.S. law enforcement
or intelligence organizations such as the FBI or the National
Security Agency (NSA) will be called public servants.
Key escrow refers to the practice of duplicating and
holding the key to a cipher or the means of recreating or
accessing the key to a cipher so that some third party (the
escrow agent) can decrypt messages using that cipher. As used in
the Clipper Chip debates, the term "escrow" is
something of a misnomer because the escrow is primarily for the benefit
of the government rather than the owner of the key.
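The escrow mechanics can be sketched briefly. In the EES design, each chip's unit key was to be split into two components, held by separate escrow agents and recoverable only in combination; the sketch below illustrates that splitting idea with a simple XOR secret-split (an assumption-laden simplification of the actual, far more elaborate, escrow procedures):

```python
# Sketch of two-agent key escrow via XOR secret-splitting: the unit key
# is recoverable only when BOTH escrow agents contribute a component.
# (Illustrative only; the real EES procedures involve much more.)

import secrets

def split_key(unit_key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two components; either one alone reveals nothing."""
    component1 = secrets.token_bytes(len(unit_key))
    component2 = bytes(a ^ b for a, b in zip(unit_key, component1))
    return component1, component2

def recombine(component1: bytes, component2: bytes) -> bytes:
    """Rejoin the two escrowed components to recover the unit key."""
    return bytes(a ^ b for a, b in zip(component1, component2))

unit_key = secrets.token_bytes(10)            # Clipper unit keys were 80 bits
agent1_share, agent2_share = split_key(unit_key)

assert recombine(agent1_share, agent2_share) == unit_key
```

Because one component is chosen at random, a single escrow agent's share is statistically independent of the unit key, which is the property the two-agent design relies on.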
Part I of this Article describes advances
in encryption technology that are increasing personal privacy,
particularly electronic privacy, but reducing the U.S.
government's ability to wiretap telephones, read e-mail
surreptitiously, and decrypt computer disks and other encrypted
information. To ensure the continuation of the wiretapping and
electronic espionage capabilities that it has enjoyed since soon
after the invention of the telegraph and the telephone,{13} the government has devised an Escrowed
Encryption Standard (EES),{14} to be implemented
in the Clipper Chip{15} and other similar
devices.{16} In Clipper and related
products the government
proposes a simple bargain: In exchange for
providing the private sector with an encryption technology
certified as unbreakable for years to come by the NSA,{17} the government plans to
keep a copy of the keys{18}--the codes
belonging to each chip which, the government hopes, will allow it
to retain the ability to intercept messages sent by the chip's
user. The government's proposal includes procedures designed to
reduce the risk that the keys would be released to law
enforcement agencies without legally sufficient justification,
although the likely effectiveness of these
procedures is debatable. Most U.S. residents remain free,
however, to reject the government's offer, use alternatives to
Clipper (so long as the software or hardware remains in the
U.S.),{19} and withhold their keys
from the government.{20} With ever more
secure methods of
encryption becoming easier to use, U.S. residents
can protect their electronic communications and records so well
that they are able to frustrate interception attempts by even the
most sophisticated government agencies.{21}
Part II examines the legal justifications and constitutional implications of the EES proposal. It argues that the EES proposal violates the spirit, although not the letter, of the Administrative Procedure Act and represents an abuse of the technical standard-setting process. The involvement of the NSA may violate the Computer Security Act, but the absence of public information as to its role makes a firm judgment impossible. Part II also discusses Clipper's inherent policy and technical weaknesses and the inconsistencies between the Administration's policy objectives (to the extent they are unclassified) and the Clipper proposal itself. It concludes, however, that a purely voluntary Clipper program violates no statutory or constitutional provisions, and that even if it does, there is no one with standing to challenge such a violation. Part II also concludes that an optional Clipper will probably make only a modest contribution to the government's stated goal of maintaining its wiretap and electronic espionage capability.
Thus, Part III considers the
constitutional implications of the more radical proposal that
some commentators find implicit in the policies animating
Clipper: requiring all users of strong encryption to register
their ciphers' keys with the government. After a whirlwind
survey of evolving conceptions of the constitutional right to
privacy as well as more settled First, Fourth, and Fifth
Amendment doctrines, Part III concludes that
although mandatory key escrow would infringe personal privacy,
reduce associational
freedoms, potentially chill
speech, constitute a potentially unreasonable search, and might
even require a form of self-incrimination, the constitutionality
of mandatory key escrow legislation remains a distressingly close
question under existing doctrines.
Part IV addresses the cryptography controversy as an example of the law's occasionally awkward response to a new technology. The courts, and to a lesser extent the legislative and executive branches, have yet to come to grips with many cryptographic conundrums. As a result, this part of the legal "landscape" remains relatively barren. As more and more settlers arrive in cyberspace, the nature of this new landscape will depend critically on the legal metaphors that the colonists choose to bring with them.
Finally, the Technical Appendix discusses modern cryptographic systems, including the widely used Data Encryption Standard (DES), and how they can (at least theoretically) be broken by attackers armed with large numbers of relatively modest computers. It also provides an introduction to public-key cryptosystems and to digital signatures, which could represent the most important commercial application of modern cryptographic techniques.
I. Modern Cryptography: Private Security, Government
Insecurity
Cryptography contributes to commercial, political, and personal
life in a surprising number of ways. Now that modern
cryptographic techniques have put strong, perhaps uncrackable,
cryptography within the reach of anyone with a computer or even a
telephone, the use of strong cryptography is likely to increase
further. As a result, worried law enforcement and intelligence
agencies have developed the Clipper Chip in order to retain their
capability to eavesdrop on private electronic communications.
A. Who Needs Cryptography?
Many individuals and businesses want or need communications and
data security.{22}
Although these desires clearly have an objective basis in many cases,
some of these desires are undoubtedly symbolic and psychological.
Who other than the recipient, after all, is likely to want to
read most private faxes and e-mail?{23} The subjective nature of a desire for
privacy
makes it no less real or worthy of respect.{24} Encryption can play a critical role in
contributing to this communications and data security.{25}
The government's assurance that a cryptosystem is secure also contributes to this security. Evaluating the strength of a cipher is a black art that requires skills few businesses or individuals possess. The government's endorsement will at least reassure those, such as banks and lawyers, who have a duty to secure their communications and data but lack the technical knowledge to determine what ciphers are reliable.
messages.{27} Banks use encryption to protect ID numbers that customers use at bank automated teller machines (ATMs).{28} In addition, many banks encrypt the customer data on ATM cards in order to protect against forgeries.{29} The banking sector's awareness of its vulnerability to electronic theft of funds has spurred the creation of cryptographic standards for both retail and inter-bank transactions.{30}
As
the economy continues to move away from cash transactions towards
"digital cash," both customers and merchants will need
the authentication provided by unforgeable digital
signatures in order to prevent forgery and transact with
confidence.{31} Forgery
is a perennial problem with electronic mail: copying is easy,
there are no tangible permanent media involved in the
communication, and programmers or system managers can alter e-
mail headers to fake the source of a message. Cryptography can
provide an authenticating function for these electronic
transactions. Cryptographic techniques
can be used to produce a digital signature which, when
properly used, can prove that a cleartext message (such as a buy
or sell order) was really sent by the party from whom the message
appears to originate.{32} In addition, a
digital signature attests to the integrity of the contents of a
message. If the digital signature system is properly
implemented, the signature of every document is uniquely
calculated from the full text of the document, and is uniquely
associated with the sender. There is no way to fake a signature
by copying a signature from one document and attaching it to
another, nor is it possible to alter the signed message in any
way without the recipient immediately detecting the deception.{33} The slightest change in
a signed document will cause the digital signature verification
process to fail. Indeed, a signature verification failure will
be caused by a transmission error affecting a single bit of the
message.{34}
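The all-or-nothing sensitivity described above can be demonstrated in a few lines. The sketch below uses an HMAC from the Python standard library as a stand-in for a true digital signature (a real digital signature uses a public/private key pair rather than a shared secret; the HMAC is chosen only because it exhibits the same property that the signature is computed from the full text of the message):

```python
# Illustrates the integrity property of signatures: the authentication tag
# depends on EVERY bit of the message, so altering a single bit makes
# verification fail.  (HMAC stands in for a public-key signature here.)

import hmac
import hashlib

def sign(message: bytes, key: bytes) -> bytes:
    """Compute an authentication tag over the full message text."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, signature: bytes, key: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(message, key), signature)

key = b"signer's secret"
order = b"BUY 1000 SHARES"
signature = sign(order, key)

assert verify(order, signature, key)

# Flip one bit of the message: verification fails.
tampered = bytes([order[0] ^ 0x01]) + order[1:]
assert not verify(tampered, signature, key)
```

The same failure occurs if the corruption is accidental, which is why, as the text notes, even a one-bit transmission error defeats verification.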
The proposed National
Information Infrastructure, better known as Vice President Al
Gore's information superhighway, envisions
"telebanking" and other electronic transactions.{35} It recognizes, however,
that as these services expand, so too will "public concern
about communications and personal privacy."{36} One important issue
will be the extent to which consumer-oriented digital payment
systems allow for anonymity and privacy; another will be the
extent to which law enforcement and banks will require audit
trails that lead to the consumer.{37}
Business information need not be scientific or technical to be of enormous value. Sensitive market information, such as the amount that a corporation plans to bid at an auction for valuable oil leases or the amount that a construction company plans to offer at tender, is of enormous benefit to a competitor.{40} Knowledge of a company's cost and price structure, market research, strategic plans, and order and customer lists is of obvious benefit to competitors. For an investor, inside information such as planned merger or acquisition activity can also reap huge profits. Encryption helps prevent high-tech eavesdropping, while at the same time discouraging some low-tech theft: a stolen laptop with an encrypted disk represents a loss of hardware, but not of sensitive information.{41}
The increasing importance of intellectual property makes information security especially valuable to industry; the portability of ideas makes it ever-harder to achieve. The increase in mobile communications also plays a role. As workers rely on networks to tele-commute to the office, or use cellular telephones to communicate with colleagues, or download e-mail onto their laptops while away from the office, they expose their information to eavesdroppers.{42}
The risk to U.S.
corporations of both high- and low-tech industrial espionage is
particularly great because they are not just the target of
domestic and foreign competitors, but also of foreign
intelligence agencies.
Indeed, according to the FBI, foreign governments routinely use
their intelligence services to acquire valuable information about
U.S. corporations.{43} As a result,
without some form of communications and data security, sensitive
technical and market information can be intercepted from faxes,
cellular and microwave telephone calls, satellite
communications, and inadequately protected computer systems.{44} Foreign firms may soon
face a similar threat of industrial espionage by U.S.
intelligence agencies searching for new roles, and continued
appropriations, in the post-cold-war era.{45}

conversations may be at risk if the signal travels by
microwave or satellite.{50} Although there are
no cases to date holding that failure to encrypt a cellular
telephone conversation or an electronic mail message, much less a
regular phone call, constitutes professional negligence, the ease
with which these can be overheard or intercepted, combined with
the growing simplicity of encryption software, make it
conceivable that failure to use encryption may be considered a
waiver of privilege at some point in the future (at least for
insecure media such as electronic mail and cellular
telephones).{51}

Lawyers are not the only professionals who receive client confidences. Doctors, therapists, and accountants all receive sensitive information which they then have a duty to keep confidential. These duties can arise in tort or contract, or pursuant to state and federal statutes.{52} Some of these duties are reflected in evidentiary privileges,{53} but a privilege is not required to create the duty.{54}
a digitized photograph, and
any other information (for example, health, immigration status,
or prior convictions).{56} Users (who might
include liquor stores, police, banks, employers, or a national
health insurance trust) would have a reader with the government's
public key on it, which they would use to decrypt the card. So
long as the government was able to keep its private key secret,
the ID card would be unforgeable.
National ID cards raise a host of problems outside the scope of
this Article, many of which could be exacerbated by the use of
cryptography. Chief among these difficulties is the danger that
the government might encrypt additional information on cards that
would be invisible to the holder but might be accessible to law
enforcement, or even some employers. Examples of such secret
information include criminal record, military discharge status,
or health information.{57}

Less ominously,
digital signatures provide a means of
authenticating all electronic data. In a world in which bank,
tax, and medical records, and the contents of the digital library
are all at risk of accidental or malicious alteration,
authentication of data becomes critical. By providing a reliable
guarantee that data with a proper signature is authentic, digital
signatures provide a certain means of detecting changes when
someone tries to rewrite history.
Cryptologists have worked out protocols for untraceable, anonymous, electronic cash ("E$") that also resist illicit duplication. These permit customers to acquire E$ from a digital bank without disclosing their identity to the bank. Using high-level cryptographic techniques, the E$ is unforgeably certified as valid, but can be spent only once.{60}
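The core trick behind such protocols is the blind signature: the bank signs a blinded token without ever seeing the underlying serial number, yet the unblinded signature verifies against the bank's public key. The sketch below shows the idea with textbook RSA and tiny demonstration numbers; real systems use keys hundreds of digits long, plus padding and anti-double-spending machinery omitted here:

```python
# Sketch of a Chaum-style blind signature, the mechanism that lets a
# digital bank certify E$ without learning which coin it is certifying.
# Toy RSA parameters (p=61, q=53) for demonstration only.

n, e, d = 3233, 17, 2753   # public modulus, public exponent, PRIVATE exponent

# Customer: blind the coin's serial number with a random factor r.
serial = 1234
r = 7                                    # blinding factor, gcd(r, n) == 1
blinded = (serial * pow(r, e, n)) % n    # bank sees only this value

# Bank: signs the blinded value -- it cannot tell which coin this is.
blind_sig = pow(blinded, d, n)

# Customer: strip the blinding factor to obtain a valid signature on `serial`.
sig = (blind_sig * pow(r, -1, n)) % n

# Any merchant can verify the coin with the bank's PUBLIC key (n, e).
assert pow(sig, e, n) == serial
```

The unblinding works because blind_sig equals serial^d · r (mod n), so multiplying by r's inverse leaves exactly the bank's signature on the serial number, even though the bank never saw it.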
Unfortunately, although cryptography allows the creation of privacy-enhancing E$ and helps ensure that an Orwellian surveillance state remains in the realm of fiction, its advantages come at a price. The same features that might make uncrackable encryption attractive to groups seeking to change the social order by lawful but unpopular means, and that protect those working towards unpopular causes from retribution, also provide security to lawbreakers. Untraceable E$ may help make untraceable "perfect crimes" possible.{61}
Undoubtedly, criminals and conspirators will find
a use for encryption,{62} but so too will
many others. Not every diarist records crimes in his daybook,
but for many people there will be a certain satisfaction in
knowing that their most private thoughts are safe from anyone's
prying eyes, be they major governments or younger siblings.{63}
Encryption
also protects against the consequences of misdialing a telephone
number and reaching the wrong fax machine, an
increasingly common problem as the number of dedicated fax lines
grows.
d. E-mail
The exponential growth in the Internet's popularity has fueled
the private demand for encryption.{72} Military-grade cryptography, or something
close to it, is easily available free to any user of the Internet
who knows how to download a file.{73}
e. Personal Records
Many people have things they want to hide from their colleagues
or family members. The secret can be as trivial as a planned
surprise party, as personal as a love letter or sexual
orientation, or as unsavory as a planned theft or past misdeed.
It can be a private diary or the plans for a bomb. These records
may be on paper or stored on a computer disk. Some people derive
a sense of security from the knowledge that their communications
and data are safe from unauthorized snooping by their friends,
family, or anonymous computer hackers. Others seek an even
greater sense of security by attempting to encrypt their
communications and records in a manner that cannot be decrypted
even by authorized law enforcement.{74}
7. Dissidents and Others
Most, if not all, of the readers of this Article probably
experience life in the United States as one of political freedom.
For some of these readers, a desire for communications and
electronic records security, particularly security from possible
or suspected government surveillance or intrusion, may appear to
be an excess of libertarian paranoia. The existence of low-water
marks in civil liberties (such as the 1798 Alien and Sedition
Acts,{75} the 1920s'
"Palmer raids,"{76} the Japanese internment during World War
II,{77} and COINTELPRO{78}) may be seen by some readers as well-
documented and anomalous departures from American ideals; other
readers may see them as symptoms of a more general tendency of
those in authority, approaching the "iron law of
oligarchy."{79}
Organized government intrusion into personal communications and data privacy is less visible than an order to round up thousands of civilians. It is also far more frequent. When given the duty and authority to identify threats to national security,{80} public servants have shown a tendency to adopt a "vacuum cleaner[]" approach to private information.{81} Indeed, the Senate committee charged with investigating domestic surveillance noted "the tendency of intelligence activities to expand beyond their initial scope" and stated that government officials "have violated or ignored the law over long periods of time and have advocated and defended their right to break the law."{82}
It is harder to view fears
of government surveillance as aberrational when one learns that
in the 1950s the FBI identified 26,000
"potentially dangerous" persons who should be rounded
up in the event of a "national emergency," and that it
maintained this list for many years.{83} During the 1970s, even sympathizers
dismissed as fantastical the claims by Black Panthers and other
dissident groups that they were being wiretapped and bugged by
the FBI. These allegations proved to be correct.{84} Indeed, the U.S. government has an
unfortunate recent history of intrusion into private matters.
During the 1970s, the FBI kept information in its files covering
the beliefs and activities of more than one in four hundred
Americans;{85} during the 1960s, the U.S. Army
created files on about 100,000 civilians.{86} Between 1953 and 1973, the CIA opened and
photographed almost 250,000 first class letters within the U.S.
from which it compiled a database of almost 1.5 million names.{87} Similarly, the FBI opened tens of
thousands of domestic letters, while the NSA obtained millions of
private telegrams sent from, to, or through the United States.{88}
Although the Constitution guarantees a
high degree of political freedom and autonomy, "[t]he
Government has often undertaken the secret surveillance of
citizens on the basis of their political beliefs, even when those
beliefs posed no threat of violence or illegal acts on behalf of
a hostile foreign power."{89}
Certainly, neither
statutory nor constitutional prohibitions have proved
consistently effective in preventing civil liberties abuses. For
example, U.S. Census data is supposed to be private, and that
privacy is
guaranteed by law. Nevertheless, during World War II the
government used census data to identify and locate 112,000
Americans of Japanese
ancestry who were then transported to internment camps.{90} Similarly, the CIA repeatedly violated the
prohibition on domestic intelligence contained in its charter.{91}
One need not believe that such excesses are routine to sympathize with those who fear that another such excess is foreseeable. Indeed, whether one considers these operations to have been justified, to have resulted from a type of bureaucratic rationality that rewards results regardless of legal niceties,{92} or to have been a form of security paranoia, this history could cause a reasonable person to fear she might someday be swept up in an investigation.{93} The passage of Title III of the Omnibus Crime Control and Safe Streets Act of 1968 (Title III),{94} designed to define standards for the use of wiretaps, appears to have reduced greatly the amount of illegal wiretapping by police. Nonetheless, illegal wiretapping by police has not been completely eliminated.{95}
Not all government intrusion into privacy is centrally organized, but that hardly makes it less intrusive. During the past five years the IRS has caught hundreds of its employees snooping into the tax records "of friends, neighbors, enemies, potential in-laws, stockbrokers, celebrities and former spouses."{96} Authorized users of the FBI's National Crime Information Center have used its databases to check up on friends and neighbors and to check backgrounds for political purposes.{97} It is an article of faith for many Americans that postal workers read the postcards they process, and not without reason, when postal workers are heard to say that they "pass the really good ones around the office."{98}
A reasonable person may also be concerned about surveillance by nongovernmental actors. For instance, political campaigns are notorious for dirty tricks, including the bugging of opponents;{99} the yellow pages in any major city contain numerous advertisements for detective agencies and investigators;{100} and eavesdropping and bugging devices are readily available in stores.{101}
In light of this history of public
and private intrusion into personal privacy and the growing
interconnection of computers and communications envisioned by the
National Information
Infrastructure, it is impossible to dismiss the desire for
personal communications and records security as pure paranoia. It
may, in fact, be very sensible.
B. The U.S. Data Encryption Standard (DES) Is
Increasingly Vulnerable
While the need for communications security grows, the
officially sanctioned tools for providing that security are
beginning to look dated and vulnerable.
In 1977, after several years of acrimonious public
debate among professional cryptologists, the National Bureau of Standards (NBS) selected an
algorithm developed by IBM that the NSA had certified as
"free of any statistical or
mathematical weaknesses."{105} It is
now known as the Data Encryption Standard (DES).{106} DES is a single-key cipher: the sender
and the receiver use the same key to encrypt and decrypt the
message. DES keys are fifty-six bits (about eight ASCII
characters) long.{107} This means that
there are seventy-two quadrillion (actually
72,057,594,037,927,936) different possible keys.{108} DES is approved for use by the government
for its sensitive
information, but
not for classified information.{109}
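The key-length figures quoted in the paragraph above can be checked with two lines of arithmetic:

```python
# The fifty-six-bit key length, checked directly: 2**56 possible keys is
# exactly the "seventy-two quadrillion" figure given in the text.

keyspace = 2 ** 56
assert keyspace == 72_057_594_037_927_936

# 56 bits also equals eight 7-bit ASCII characters, matching the
# "about eight ASCII characters" description of a DES key.
assert 8 * 7 == 56
```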
The
designation of DES as the U.S. standard was controversial,
foreshadowing the current controversy over Clipper. An earlier
version of the IBM project used a key with well over one hundred
bits.{110} The key
shrank to fifty-six bits by the time it became the U.S. standard.
Critics charged that the shortened key was designed to be long
enough to frustrate corporate eavesdroppers, but short enough to
be broken by the NSA.{111} Some critics
also feared there might be a "back door,"{112} an implanted weakness
in a key part of the
encryption algorithm known as S-boxes, that would allow the
agency to use computational shortcuts to break the code.{113}
The problem was exacerbated by the unwillingness of DES's creators to explain why they had chosen the particular, seemingly arbitrary, method of mixing up bits that they had selected. Cryptology is a field for the truly devious, and many cryptologists were concerned that there might be a mathematical vulnerability intentionally inserted by the cryptographers who designed the DES cipher. The search for such back doors in government-sponsored ciphers such as DES has been a popular pastime among suspicious cryptologists since the NBS proposed DES, yet no back door has been reported. Recently, however, academic cryptologists determined that DES's unusual algorithm is peculiarly resistant to a newly discovered mathematical attack called "differential cryptanalysis," a technique which had not been discovered, at least in unclassified form, at the time DES became the U.S. standard. DES's inventors have since stated that they were aware in 1974 of DES's resistance to differential cryptanalysis, but kept quiet to protect national security.{114}
Export of DES is controlled by the State Department as if it were a weapon like a tank or fighter plane.{115} Financial institutions and the foreign offices of U.S.-controlled corporations routinely receive clearance to export DES if they show a need, but the State Department (presumably acting on the advice of the NSA) usually refuses to allow others to export it.
Although U.S. law ordinarily prevents Americans from selling
DES-equipped encryption products to foreigners, DES is found
around the world and freely sold by foreign corporations in many
countries. It may be "the most widely used cryptosystem in
the
world."{116} A full specification
of DES is available in books sold in the United States,{117} the export of which
is not controlled,{118} presumably on
First Amendment grounds.{119}
Given that computer processors become cheaper every day, brute-force searches for DES keys are now well within the reach of relatively affordable, massively parallel machines.{121} A recent paper describes a brute-force attack on DES as "alarmingly economical," estimating that for $1 million one could build an optimized machine, each of whose chips would try fifty million keys per second, that would crack a DES key in an average of 3.5 hours.{122} An investment of $10 million would produce a machine that would be expected to crack a DES key every twenty-one minutes.{123} DES-cracking remains beyond the means of the casual snooper, but is now within the means of many corporations and every government.
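The two cost estimates above are mutually consistent, since the machine's speed scales linearly with its budget; a short calculation also shows the aggregate search rate such a machine must sustain:

```python
# Consistency check on the brute-force estimates quoted in the text:
# a tenfold budget buys tenfold speed, so the $10M machine's average
# time is one tenth of the $1M machine's 3.5 hours.

one_million_avg_hours = 3.5
ten_million_avg_minutes = one_million_avg_hours * 60 / 10
assert ten_million_avg_minutes == 21.0

# Implied aggregate rate of the $1M design: an average-case break
# searches half the 2**56 keyspace in 3.5 hours.
keys_per_second = (2 ** 56 / 2) / (3.5 * 3600)
assert 2.8e12 < keys_per_second < 2.9e12   # roughly 2.86 trillion keys/sec
```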
The security
problem is compounded by the probabilistic nature of a brute-
force key search. The strength of an algorithm is expressed in
the amount of time it would take to be certain of finding
the key by trying every possibility. The expected (average)
amount of time per key is only half that amount. If, however, an
attacker is engaged in a routine program of successively trying
to break keys, and knows how often they are changed, the attacker
will inevitably get lucky. This can be a serious threat in
situations where one piece of luck will garner the attacker a
large return.
Suppose, for example, that a bank which becomes concerned about the vulnerability of its DES keys decides to change the key used for interbank financial transactions every day. Does this give it security? If an attacker has a machine that is certain to break a key in a year, then the attacker has over a 0.01% chance of breaking the new key in an hour, and a 0.27% chance of breaking it in a day.{124} In plain English, the attacker has just better than a one in ten thousand chance of breaking each key in the first hour; she has a chance of about one in 370 of breaking each key before it is changed. The attacker thus can hope for a large electronic funds transfer to her bank account about once a year.{125}
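The lottery arithmetic in the example above checks out. A machine certain to exhaust the keyspace in a year searches a uniform share of it per unit time, so the chance of hitting the key within a window is simply the window's length divided by one year:

```python
# Checking the probabilities in the bank example: a machine certain to
# break a key within a year has a per-window success chance equal to
# (window length) / (one year).

HOURS_PER_YEAR = 365 * 24            # 8760

p_hour = 1 / HOURS_PER_YEAR          # chance of success in the first hour
p_day = 24 / HOURS_PER_YEAR          # chance of success within one day

assert p_hour > 1 / 10_000           # "just better than one in ten thousand"
assert round(p_day * 100, 2) == 0.27 # the 0.27% figure in the text

# With daily key changes, expected lucky breaks per year:
assert abs(365 * p_day - 1.0) < 1e-9 # about one break a year
```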
Worse, the attacker does not need
special computers so long as she has several of them. An
attacker armed with only one 100Mhz Pentium computer would have a
minuscule daily chance of success. If she links a group of 500
Pentium computers on a university network, however, her chance of
cracking DES in a day rises to just above one in 40,000.{126} These are not bad odds for a lottery in which the payoff can be in the millions, and the cost of a ticket (idle
[Page 740]
time on computers in a university network) may be zero to the user.
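The formula is the same in every brute-force scenario: the chance of success is the fraction of the keyspace searched. A sketch of the networked-Pentium calculation; the per-machine trial rate here is an assumed figure chosen for illustration (the article's footnote supplies the actual basis):

```python
KEYSPACE = 2 ** 56  # number of possible DES keys

def p_success(machines: int, keys_per_sec: float, seconds: float) -> float:
    """Chance of finding a uniformly random key by exhaustive trial."""
    return machines * keys_per_sec * seconds / KEYSPACE

# Assumption for illustration: ~42,000 DES trials/sec per 100MHz Pentium.
p_day = p_success(machines=500, keys_per_sec=42_000, seconds=86_400)
print(f"about 1 in {round(1 / p_day):,}")  # roughly one in 40,000
```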
The idea of networks of computers harnessed together to crack a DES key may sound like science fiction, but something similar is already happening. A group of computer scientists and mathematicians recently used the Internet to harness computer time donated by 600 volunteers. Using a total of about 5000 MIPS-years{127} of processing time to make 100 quadrillion calculations over an eight month period, the group solved a problem equal in complexity to breaking a 129-digit RSA key.{128} RSA is a commercial public-key cryptosystem{129} and its keys are not precisely comparable to DES keys, but even so the problem was far harder than breaking DES's 56-bit key.{130}
[Page
741]
compatible with regular DES.{131} The advantage of using triple-DES rather
than a single 56-bit encryption is that messages remain more
compatible with existing equipment; the disadvantages are a loss
in speed, a need to revise existing software and hardware,
inelegance, and some lingering uncertainty as to its safety.{132} NIST has been silent on the security (or
lack
thereof) of triple-DES. The NSA has not disclosed whether it
considers triple-DES insecure, too secure, or neither.{133} It may be that the
NSA has been silent on triple-DES in the hopes that it will be
elbowed out of the market by "escrowed" encryption
products such as Clipper. Triple-DES is probably very hard to
break; breaking through Clipper's protections will involve no
(computational) effort for authorized persons because the
government will keep a copy of the keys.{134} [Page
742]
A second solution, applicable only to time-
sensitive information, is to change DES keys very frequently. If
a new DES key is used for every message, by the time the attacker
figures out the old key, it is too late. Of course, this
solution does not work for things that need to be kept secret for
long periods of time. It also requires that parties to
communication have some way to agree on a continuing supply of
new keys which, by definition, they cannot do on the insecure
channel which requires the encryption in the first place.{135}
A third solution is to abandon DES,
in whole or in part, and try something new. The U.S. government
has selected a replacement for DES that involves escrowed
encryption using a new algorithm called SKIPJACK. The government
has indicated that it hopes U.S. users of cryptography will adopt
this option.
C. The Escrowed Encryption Standard (EES)
The industrialized world is in the opening stages of an
"ongoing telecommunications revolution with still undefined
potential to affect the way we communicate and develop our
intellectual resources."{136} These
changes can be liberating, and they can
be painful; some have distributional consequences affecting
relative power as well as access to information.
The increases in personal privacy and communications security
promised by cryptography come at the expense of those who benefit
from insecure communications. If every telephone call is
routinely encrypted, domestic law enforcement agencies, such as
the FBI and local police forces, will find wiretapping harder or
even
impossible. If information on computers is routinely encrypted
police may find evidence inaccessible or incomprehensible. When
sophisticated encryption technologies are used abroad,
intelligence agencies such as the NSA, which routinely seek to
penetrate the communications of foreign powers, find their
missions complicated. To the extent [Page 743]
that American citizens are
better off because wiretaps help catch and convict criminals, and
to the extent that communications
intelligence protects the national interest from foreign threats,
developments that impede legitimate wiretaps may make us all
worse off.
The fear of losing electronic surveillance capabilities because
of advances in encryption technology has produced a three-pronged
reaction from the law enforcement and intelligence communities.
First, their spokespersons have begun a public relations
offensive designed to explain why these capabilities matter.{137} Second, they have
sought legislation requiring that telephone networks and other
similar communications channels be designed in a manner that
facilitates wiretapping.{138} Third, they
have designed and supported EES,
best known in its most famous implementation, the Clipper Chip,
which enables the government to keep a copy of the key needed to
decrypt all communications using EES. These activities share the
premise that it is reasonable for the government to request, and
in some cases require, that private persons communicate in a
manner that makes interception by the government at least
practical and preferably easy.
The Administration{140} makes two types
of arguments in favor of EES.
In its hard sell, the Administration, primarily through the [Page 744]
FBI, paints a lurid
picture of law enforcement stripped of an essential crime-detection and evidentiary tool, wiretapping, while pornographers,
drug dealers, terrorists, and child molesters conspire via
unbreakable ciphers, storing their records and child pornography
in computers that become virtual cryptographic fortresses.
Meanwhile, the
intelligence agencies, primarily the NSA, quietly murmur that
existing policies have proved ineffective in preventing the
increasing use of unescrowed encryption, and suggest that their
proposals should be adopted to prevent developments that might
(or might not, they won't say) undermine the nation's
communications intelligence capabilities.
In its soft sell, the government argues that if the NSA has
designed a cryptographic system that it is willing to certify as
secure and make available to the American public, the government
has an obligation to take steps to prevent that cipher from being
used against it by criminals and foreign governments. In fact,
the current national standard cipher, DES, is strong enough that
the U.S. government has sought to prevent its export and may
indeed regret having let the algorithm become publicly
available.{141} EES, the argument
goes, just maintains the status quo. Even if everyone used a
Clipper-equipped telephone, telephone conversations would be no
less secure against legitimate government wiretapping than they
are today, while being more secure against illicit
eavesdropping.{142}
a. Domestic Law Enforcement
According to FBI Director Louis Freeh, electronic intelligence,
especially wiretapping, is crucial to effective law enforcement:
if the FBI and local police were to lose the ability to tap
telephones because of the widespread use of strong cryptography,
the "country [would] be unable to protect itself against
terrorism, violent crime, foreign threats, drug trafficking,
espionage, kidnapping, and other crimes."{143}
From the statistics available, it is
difficult to determine how [Page
745]
much difference wiretaps actually make.{144} The FBI estimates that wiretaps play a
role in an average of 2200 convictions per year,{145} but it is unclear how
many of these convictions could have been obtained without
wiretaps. Despite an almost 50% increase since 1983, court-
ordered wiretaps are still relatively rare: only 919 were
authorized in 1992 for all federal, state, and local police
forces.{146} Of these, only 141
wiretap orders covered electronic devices such as faxes, digital
display pagers, voice pagers, cellular phones, or electronic
mail. In 1993, the 976 active court-ordered wiretaps allowed
police to hear approximately 1.7 million conversations involving
nearly 94,000 persons. The listeners described about 20% of the
conversations as incriminating.{147} The law
enforcement community suggests that wiretaps make the biggest
difference in the largest cases because wiretaps have been used
to gather evidence in 90% of the terrorism cases brought to
trial.{148} The average cost of a
wiretap was $57,256 in
1993,{149} so it may be
that the biggest cases are the only ones in which the expense of
monitoring a telephone line seems justified.{150}
Statistics aside, it seems only
logical that the spread of strong, user-friendly cryptography
would increase the risk that evil people will be able to
frustrate law enforcement attempts to crack their computers or
bug their telephones. Whether the risk has yet [Page 746]
manifested itself is less
clear. For all its predictions of disaster in the making,
"the FBI has not been able to point to a single instance to
date [(September 1994)] where encryption has hampered [its]
investigation of a case."{151}
Nevertheless, the fear that rogue cryptography might allow
"terrorists, drug dealers, and other criminals"{152} to evade law enforcement seems to supply a
large part of the motivation for the Administration's support for
EES. One can only sympathize with officials who were, no doubt,
asked whether they wished to go down in history as the
individuals responsible for letting loose a technology that might
someday hamper the investigation of a terrorist threat to a large
population center.{153} Faced with the
FBI's
Manichaean vision of, on the one hand, a world of rampant
cryptography in which the bad guys remain impregnable behind
cryptological walls and, on the other hand, an ambitious plan to
return to the status quo ante in which the police remain able to
intercept and understand most if not all electronic
communication, it is not surprising that the Clinton
Administration opted for what must have appeared to be the safer
course.
[Page 747]
b.
Intelligence-Gathering
The communications intelligence capabilities of the United
States are a subject "characterized by secrecy even greater
than that surrounding nuclear weapons."{154} Unclassified discussion of the effect of
strong private cryptography on the capabilities of intelligence
agencies quickly becomes conjecture. We do know, however, that
two of the most important functions of the NSA are to acquire and
decrypt foreign communications, and to conduct traffic analysis
of foreign and international communications.
The two functions are related, but different. Acquisition and decryption of foreign communications are the stuff of headlines: listening to the Soviet President's telephone calls made from his limousine or breaking German codes during World War II. Traffic analysis is more subtle, but no less important. It is the study of the sources and recipients of messages, including messages that the eavesdropper cannot understand. In wartime, traffic analysis allows intelligence agencies to deduce lines of command. Changes in the volume and direction of traffic can signal the imminence of operations.{155}
Widespread
foreign access to even medium-grade cryptography makes it more
difficult for U.S. communications intelligence to select the
messages that are worth decrypting, or even worth reading.{156} Worse, it makes
traffic analysis much more difficult. So long as most electronic
communications are unencrypted, intelligence agencies are able to
sort messages in real time, and identify those of interest, or
those which warrant further attention.{157} [Page
748]
Furthermore, if most traffic is plaintext, then
ciphertext cries out for attention: here is someone with something to hide. Even
if the message cannot be decrypted quickly, the source can be
flagged for traffic analysis, which enables the intelligence
agency to build up a picture of the persons with whom the source
communicates. If everyone is using strong cryptography, then the
most secret messages no longer stand out.
c. Failure of Laws Designed to Prevent the Spread of
Strong Cryptography
The United States has several long-standing laws and policies
designed to prevent strong cryptography from spreading abroad,
and even from being widely used at home. Although these may have
served to slow the spread of strong cryptography, ultimately they
have failed to stop it. The following is only a brief summary of
two exemplary policies and their effects.{158}
i. Export Control: The ITAR
U.S. export control is designed to prevent foreigners from
acquiring cryptographic systems that are strong enough to create
a serious barrier to traffic analysis, or that are difficult to
crack.{159} Two sets of
regulations govern the export of encryption software: the Export
Administration Regulations (EAR) govern "dual use"
technologies{160} and the
International Traffic in Arms Regulations (ITAR) apply to items
that the government considers inherently military in nature.{161} The EAR are generally less demanding, but
the ITAR take precedence.{162} Under the ITAR
regime,
[Page 749]
applications to export cryptographic software as strong
as (or stronger than) DES are routinely denied.{163} Only strong products that lack [Page 750]
the capability of being
adapted for encryption, or which are designed for specific
banking applications, receive official export clearance.{164}
The ITAR have failed to prevent the spread of strong cryptography. The ITAR prohibit the export of cryptographic software;{165} nevertheless, software created in the United States routinely and quickly finds its way abroad. For example, when version 2.6 of PGP, a popular military-grade cryptography program, was released in the United States by graduate students at MIT as freeware,{166} a researcher at the Virus Test Center at the University of Hamburg, in Germany, received a copy within days from an anonymous remailer.{167} He then placed it on his internationally known Internet distribution site.{168} As would-be sellers of cryptographic products have frequently testified to Congress, the major effect of the ITAR is to prevent U.S. companies from competing with those foreign companies that sell sophisticated cryptographic software abroad.{169}
[Page 751]
Meanwhile,
enforcement of the ITAR has produced absurd results. The State
Department has refused to license the export of a floppy disk
containing the exact text of several cryptographic routines
identical to those previously published in book form.{170} The refusal was all
the more bizarre because the book itself was approved for
export.{171} The only reasons
given by the State Department for its refusal were that
"[e]ach source code listing has been partitioned into its
own file and has the capability of being compiled into an
executable subroutine,"{172} and that
the source code is "of such a strategic level as to
warrant" continued control.{173} The
State Department also concluded that the
"public domain" exception to the ITAR{174} did not apply and most bizarrely of all
that its decision was consistent with the First Amendment.{175}
ii. "Classified at Birth"
The Inventions Secrecy Act{176} gives
the Commissioner of Patents the authority to issue patent secrecy
orders. Even if the government has no ownership interest in the
invention, the orders block the issuance of a patent and place
the application under seal. If the Nuclear Regulatory Commission
or the Department of Defense states that publicizing the
invention would be detrimental to the national security, the
patent will be withheld "for such period as the national
interest requires."{177} Willful
disclosure of an invention covered by
a secrecy order is a criminal offense.{178} [Page
752]
While the application of the Inventions Secrecy Act to privately
created cryptographic devices has sometimes occasioned
publicity,{179} most devices covered
by secrecy orders are invented at government expense.{180}
The existence of a number of high- level cryptographic algorithms in public circulation, some patented,{181} some not, suggests that the Inventions Secrecy Act has been far from successful at preventing the spread of strong cryptography.{182}
"Here, here, here be my keys; ascend my chambers;
search, seek, find out."{183}

The Escrowed Encryption Standard is designed to provide users with communications that are secure against decryption by all third parties except authorized agents of the U.S. government. Before a Clipper Chip is installed in a telephone,{184} the government will permanently inscribe it with a unique serial number and a unique encryption key. The government will keep both of these numbers on file. In order to reduce the danger that the file might be stolen or otherwise compromised, the chip's unique encryption key will be split into two pieces, each held by a different "escrow agent." The escrow agents will be required to guard the segments and release them only to persons who can demonstrate they will be used for authorized intercepts. Reuniting the pieces of a chip's unique key gives the government the capability to decrypt any Clipper conversations.
[Page 753]
a. A Tale of Three Keys
From the user's point of view, the Clipper Chip is a black box:
pick up your Clipper-equipped telephone, dial another
Clipperphone, push a red button to initiate the security feature,
wait a few seconds for the two chips to synchronize, read off the
character string displayed on the telephone to the other party to
confirm the security of the conversation,{185} and start the conversation.{186} The conversation is
scrambled with a classified algorithm called SKIPJACK, which took
the NSA ten years to develop, and which the government certifies
as secure for the foreseeable future.{187} What [Page
754]
happens during those few seconds before the
conversation begins, and why, are the essence of EES and the
source of controversy.
From the government's point of view, EES relies on three keys: the session key,{188} the chip key, and the family key. The session key is what SKIPJACK uses to encrypt and decrypt the conversation. Every conversation has a new session key, and any third party seeking to eavesdrop on the conversation would need to have the session key to decrypt the conversation. Oddly, the Clipper Chip does not select the session key; indeed, the Clipper Chips do not care how the telephones do this.
Suppose Alice wants to have a
secure conversation with Bob. Alice calls Bob, then pushes the
red button. At this point, the two Clipperphones have to agree
on a session key according to a method selected by the
manufacturer. The maker of the Clipperphone is free to use as
secure a method as she likes. The two
Clipperphones might, for example, use a supersecure method of
agreeing on the session key which is so safe that two strangers
who have never met before can agree on a session key in public
while being overheard, and yet anyone who overhears what they say
will still be unable to work out what the key is.{189} Assume that Alice
and Bob use telephones that have this supersecure selection
method built in. Once the two telephones agree on the session
key, each phone feeds the key to its Clipper Chip.{190} As soon as the Clipper Chips are [Page 755]
told the session key, they
begin the Clipper telephone session. The first step in a Clipper
telephone session is to undermine the eavesdropper-proof creation
of the session key by transmitting the session key in encrypted
form for the benefit of any public servants who may be
listening.
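The article does not name the "supersecure method," but the classic public key-agreement scheme fitting this description is Diffie-Hellman. A toy sketch with small numbers (real systems use far larger primes; this shows only the structure):

```python
# Public parameters -- an eavesdropper is assumed to know both.
P = 2_147_483_647   # prime modulus (2^31 - 1)
G = 16_807          # generator (7^5, a primitive root mod P)

# Each party picks a private number that is never transmitted.
alice_secret = 271_828
bob_secret = 314_159

# Only these values cross the insecure line.
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# Both sides arrive at the same session key; the eavesdropper, knowing
# only the public values, faces a discrete-logarithm problem.
alice_key = pow(bob_public, alice_secret, P)
bob_key = pow(alice_public, bob_secret, P)
assert alice_key == bob_key
```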
At the start of every Clipper session, a Clipper Chip sends a
stream of data called a Law Enforcement Access Field (LEAF).{191} Unless Bob's Clipper
Chip receives a valid LEAF from Alice's chip, Bob's chip will not
talk with it.{192} As can be seen
from the Figure on page 756, the LEAF is built in layers. At the
center lies the session key. The chip encrypts the session key
with the unique chip key. It then appends the sending chip's
serial number and a checksum, then reencrypts the data with the
family key, which is a master key held by the government.{193}
[Page 756]
In short, eavesdroppers seeking access to the session key must
use two keys to decrypt the LEAF: the family key (which is
common to all chips) and the chip key (which is different for
every chip).
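The layering just described can be sketched structurally. In this illustration, XOR against a repeating pad stands in for the classified SKIPJACK cipher, and the checksum rule is invented for illustration; only the field widths follow the 128-bit LEAF layout (80-bit encrypted session key, 32-bit serial number, 16-bit checksum):

```python
def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for a cipher: XOR data against a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def make_leaf(session_key: bytes, chip_key: bytes,
              chip_serial: bytes, family_key: bytes) -> bytes:
    inner = xor_bytes(session_key, chip_key)             # 80-bit inner layer
    checksum = (sum(inner) & 0xFFFF).to_bytes(2, "big")  # toy 16-bit checksum
    body = chip_serial + inner + checksum                # 32 + 80 + 16 bits
    return xor_bytes(body, family_key)                   # outer family-key layer

leaf = make_leaf(session_key=bytes(10),          # 80 bits (toy value)
                 chip_key=b"0123456789",         # 80 bits, unique per chip
                 chip_serial=b"\x00\x00\x00\x01",
                 family_key=b"abcdefghij")       # common to all chips
assert len(leaf) == 16                           # a 128-bit LEAF
```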
Assuming that the family key will be in fairly wide
circulation,{194} the security of
the
Clipper Chip stands or falls on the security of the master list
of chip keys. This list, or the two lists of key segments, would
be of enormous value to any attacker, such as a foreign
government bent on industrial espionage. The way in which the
keys are created, and the method by which they are held and
released, are critical elements of the user's security.
When a public servant engaged in a lawful wiretap first comes
across a Clipper session, she records it, including the LEAF.
The public servant must now acquire the family key if she does
not already possess it. According to NIST, the family keys will
not be transmitted to law enforcement personnel, but will instead
be stored [Page 757]
in
special circuit boards capable of being installed in ordinary
PCs.{195} Once decrypted with
the family key, the LEAF reveals the serial number of the Clipper
Chip and also reveals the encrypted session key. The public
servant must then contact the two escrow agencies, giving them
the chip's serial number and a legally valid reason for the
wiretap, usually in the form of a warrant from a state court, a
federal court, or the special Foreign Intelligence Surveillance
Act (FISA) court.{196} The requestor
must "certify that [the] necessary legal
authorization for interception has been obtained to conduct
electronic surveillance regarding these communications."{197} How this certification operates when the
legal basis [Page 758]
is "exigent
circumstances" (which is determined by the same officer who
would be
requesting the key segment), is not explained,{198} perhaps because
warrantless wiretaps based on exigent circumstances are
relatively rare.{199} There
remains some doubt as to how the NSA and other agencies in the
national security community will obtain keys. It is notable that
in a recent meeting involving the FBI, the NSA, and AT&T's
Bell Labs, "the NSA did not answer a question as to whether
the national security community would obtain keys from the same
escrow mechanism for their (legally authorized) intelligence
gathering or whether some other mechanism would exist for them to
get the keys."{200}
The escrow
agents have no duty to make any independent inquiries as to the
adequacy of the certification before releasing the key
segments.{201} Once
satisfied that the wiretap request appears legitimate (in that it
comes from someone authorized to make a request and contains her
certification that adequate legal authority exists), the escrow
agents are required to disclose the key segments for the key for
which the serial number was submitted. The public servant
requesting the key fragments puts them together and uses [Page 759]
the reconstituted chip key
to decrypt the session key. Armed with the decrypted session
key, the public servant can at last decrypt the conversation.
Because the presence of the Clipper Chip has no effect on the
applicable constitutional and statutory rules, the public servant
remains obligated to minimize the intrusion.{202}
In summary, a public servant might decrypt an EES message as follows:
Public servant
(1) intercepts the message, including the LEAF (128-bit LEAF encrypted with the family key);
(2) decrypts the LEAF with the family key (32-bit chip ID, 80-bit session key encrypted with chip key, 16-bit checksum);
(3) contacts her escrow agents, reports the chip ID, and avers existence of the legal authority for the wiretap;
(4) receives two 80-bit key segments;
(5) XORs{203} the key segments to produce an 80-bit chip key;
(6) decrypts the encrypted session key with the chip key;
(7) decrypts the entire message with her decrypted session key.
[Page 760]
they will be taken to a
secure, compartmented information facility,{205} which is the vault-like room that the
government uses when handling classified documents. Each of the
escrow
agents will provide a list of random numbers which, when
combined, will provide the numbers from which the keys will be
generated.{206}
After the keys are generated, the escrow agents will be given a disk containing lists of chip serial numbers and an associated 80-bit number which represents half the information needed to recreate a chip's key. Both key segments must be combined to retrieve the chip key, and neither segment alone provides the holder with any information as to the chip key's contents.{207}
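The splitting scheme just described is a standard two-way XOR secret split, which can be sketched in a few lines (an illustration of the technique, not the escrow agents' actual procedure):

```python
import secrets

chip_key = secrets.token_bytes(10)   # an 80-bit chip key (toy value)

# Agent 1's segment is pure randomness; agent 2's is the key XORed with
# that randomness. Either segment alone is statistically uniform noise.
segment_1 = secrets.token_bytes(10)
segment_2 = bytes(k ^ s for k, s in zip(chip_key, segment_1))

# Only the two segments together recover the chip key.
recovered = bytes(a ^ b for a, b in zip(segment_1, segment_2))
assert recovered == chip_key
```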
Although the escrow agents do not
check the bona fides of any requests for key fragments, they do
require a substantial amount of paperwork before releasing a key.
The escrow agents are also required to keep detailed records of
key segment requests and releases. The existence of this paper
trail should provide a significant disincentive to rogue
wiretapping requests by agents in the field. Similarly, NIST has
announced an elaborate system of safeguards to protect each
Clipper Chip's unique key. The scheme [Page 761]
involves complex rationing
of information and mutual monitoring by the escrow agents from
the moment the Clipper Chip is created. Further security attends
the inscription of the key upon a Clipper Chip, its subsequent
division into two key segments, and ultimate
safeguarding by the two escrow agents.{208}
The security precautions introduced
by NIST in late 1994 are complex. To the nonspecialist they
appear sufficient to prevent security breaches at the time the
keys are "burned in" and to prevent surreptitious
copying or theft of the key list from the escrow agents. But no
amount of technical ingenuity will suffice to protect the key
fragments from a change in the legal rules governing the escrow
agents. Thus, even if the technical procedures are sound, the
President could direct the Attorney General to change her rules
regarding the escrow procedures. Because these rules were issued
without notice or comment, affect no private rights, and (like
all procedural rules) can therefore be amended or rescinded at
any time without public notice, there is no legal obstacle to a
secret amendment or supplement to the existing rules permitting
or requiring that the keys be released to
whomever, or according to whatever, the President directs.
Because the President's order would be lawful, none of the
security precautions outlined by NIST would protect the users of
the EES system from disclosure of the key segments by the escrow
agents. Nothing in the EES proposal explicitly states that the
NSA will not keep a set of keys; indeed, the only way to acquire
a set of EES-compliant chips is to have the device that
incorporates them tested and approved by the NSA. Similarly,
although the specifications for the decrypt processor call for it
to delete keys when a warrant expires and to automatically send a
confirmation message to the key escrow agents, the interim model
(there is only one) in use by law enforcement organizations
relies on manual deletion.{209}
[Page 762]
c. Limited Recourse for Improper Key
Disclosure
The escrow system lacks legal guarantees for the people whose
keys are generated by the government and held by the escrow
agents.
Indeed, the Attorney General's escrow procedures state that they
"do not create, and are not intended to create, any
substantive rights for individuals intercepted through electronic
surveillance."{210} In short, the
government disclaims in advance any reliance interest that a user
of an EES-equipped device might have in the
government's promise to keep the key secret.{211} A victim of an
illegal wiretap would have a cause of action under Title III
against the wiretapper,{212} but, it appears,
no remedy against the escrow
agents, even if the escrow agents acted negligently or failed to
follow their own procedures.{213} The
Attorney General's
[Page 763]
procedures themselves are merely directives. They are
not even legislative rules, which might be subject to notice and
comment restrictions before being rescinded. A future
administration could, if it wanted, secretly{214} instruct the escrow
agents to deliver copies of the keys to an intelligence or law
enforcement agency, or even White House "plumbers,"
thereby violating no law or regulation (the plumbers, though,
would violate Title III when they used the information).{215} Because the chip-
unique keys were voluntarily disclosed to the government, the
chip's owner might lack a "legitimate" (that is,
enforceable) expectation of privacy in the information.{216}
If the intercepted communication were an e-mail or a file transfer, rather than a telephone call, the chip owner subject to an illegal or inadvertent disclosure by the escrow agents may be in a particularly weak position if the information ever makes its way to court: many Title III protections granted to voice communications do not apply to transfers of digitized data.{217}
Shortly before the 103d Congress
adjourned, Congressman George Brown introduced the Encryption
Standards and Procedures Act of 1994,{218} which would
have waived the sovereign immunity of the United States for
"willful" but unauthorized disclosures of key fragments
by its officials and excluded liability in all other
circumstances.{219} In the absence
of similar legislation, however, there [Page 764]
may currently be no
monetary remedy even for a "willful" disclosure.
II. The Escrowed Encryption Proposal--Legal, Policy and
Technical Problems
The Clinton Administration introduced EES through a procedural
back door that relies on market power to prevent a substantial
increase in the communications privacy of Americans, an outcome
not authorized by any statute. EES used a standard-setting
procedure but failed to set an intelligible standard. The
procedure violates the spirit, although not the letter, of the
Administrative Procedure Act (APA).
The Administration is spending large sums of money on a
controversial project in the absence of congressional
authorization.
This policy cuts out the legislature, and indeed the public, from
the decision to proceed with EES.{220} Only Congress can intervene, because, as
things currently stand, no one has standing to sue. The
Administration's use of a standard-setting procedure to make
substantive policy sets an alarming precedent of rule making with
highly attenuated accountability.
A. EES: The Un-Rule Rule
[Page
765]
Federal Information Processing Standards (FIPS) are
standards and guidelines intended to improve the federal
government's use and management of computers and information
technology, and to standardize procurement of those goods.{222} FIPS are
also used to announce national norms in areas of changing
technology where NIST believes industry would benefit from the
existence of a standard. Officially, the only bodies required to
conform to FIPS are agencies within the federal government (and
in some cases government contractors), although in practice they
are often adopted as de facto national standards by industry and
the public.{223} The
private sector finds FIPS attractive because they allow [Page 766]
conformity with, and sales
to, the government, and because the standards themselves often
have technical merit, or at least reflect a technical consensus
of the many public and private interests that NIST routinely
consults before it promulgates a FIPS.{224} EES is FIPS 185.{225}
One of the more serious complaints about FIPS 185 is that it fails to set a standard. One member of the NIST Computer Privacy and Security Advisory Board went so far as to submit a comment calling the FIPS "content-free."{226} Most FIPS describe a conforming device or procedure in sufficient detail for the reader to understand what it is; FIPS 185 does not. Instead, it states, "Implementations which are tested and validated by NIST will be considered as complying with this standard."{227} FIPS 185 requires the use of the SKIPJACK encryption algorithm and a LEAF creation method.{228} But the standard does not define those terms because the specifications for both are classified. Instead, FIPS 185 unhelpfully notes:
Organizations holding an appropriate security clearance and entering into a Memorandum of Agreement with the National Security Agency regarding implementation of the standard will be provided access to the classified specifications. Inquiries may be made regarding the Technical Reports and this program to Director, National Security Agency, Fort George G. Meade . . . .{229}
[Page 767]
Nor does the
standard explain what sorts of devices it covers. It merely
states that "[v]arious devices implementing this standard
are anticipated.
The implementation may vary with the application. The specific
electric, physical and logical interface will vary with the
implementation."{230} Admittedly, FIPS
185 at least has the good grace to acknowledge that it is
"not an interoperability standard. It does not provide
sufficient information to design and implement a security device
or equipment. Other specifications and standards will be
required to assure interoperability of EES devices in various
applications."{231} In sum, FIPS
185 says something to this effect: "Various electronic
devices will contain classified components that will provide
escrowed encryption using a classified algorithm. If you ask
nicely, we may let you use one in your design, and we will tell
you whether we approve of your device and whether we will let you
produce it." This is a strange sort of standard.
2. An End-Run Around Accountability
Such an unorthodox standard is the result of an even more
unorthodox procedure. FIPS 185 is not just a standardless
standard; it is an un-rule rule which seeks to coerce the public
by wielding federal market power to generate a de facto standard
without providing any real administrative accountability.
Despite conforming to the notice and comment procedure of § 553
of the APA,{232} and being duly
published in the Federal Register,{233} FIPS 185 is not a legislative rule because
it does not seek, at least on its face, to bind the public.{234} Nor, despite being on its face an [Page 768]
announcement, is FIPS 185 a
nonlegislative rule as the term is usually understood.{235} Familiar types of nonlegislative rules
include interpretative rules, statements of policy and
"publication rulemaking." FIPS 185 fits into none of
these categories.{236} Interpretative
rules set forth an agency's understanding of a statutory
provision, a judicial or administrative decision, or another
rule,{237} and FIPS 185 clearly does not
provide any of these. Nor is FIPS 185 an example of what Peter
Strauss has called "publication
rulemaking"{238} in which agency
staff, acting pursuant to APA [Page
769]
§ 552(a)(1)-(2), publish technical
guidelines, staff manuals, or standards (such as IRS Revenue
Rulings) that inform the public of the agency's likely position
in future enforcement, application-and-approval, or
benefit/reimbursement cases.{239} Nor is
FIPS
185 a statement of policy.{240} Nothing
within the four corners of FIPS 185 establishes or explicates a
policy, unless giving federal agencies the option to purchase
certain devices constitutes a policy.{241}
On its face, FIPS 185 is a minor
internal housekeeping
regulation. Whether anyone, inside or outside of the government,
chooses to comply with it is entirely up to her, although FIPS
185 states that use of EES by nonfederal government organizations
"is encouraged."{242} In
form, EES is a description of something, as well as a grant of
permission for agencies to use that something instead of other
things they are currently using. Yet despite explicitly
disclaiming any intention of legally binding the public, FIPS 185
is part of a strategy to coerce the public by use of the
government's market power to create a de facto national standard.
At the same time that the Department of Commerce promulgated EES,
the Department of Justice announced that it was buying 9000
Clipper-equipped telephones, using money from its Asset
Forfeiture Super Surplus Fund,{243} a fund
comprised of profits from RICO, [Page 770]
drug, and other asset
forfeitures.{244} Expenditures
from the Asset Forfeiture Super Surplus Fund require no
congressional appropriations. The effect is to cut Congress out
of the decision-making process on an issue which may eventually
affect the privacy rights of most Americans. One need not be an
opponent of EES to believe that a decision with significant
potential effects on communication privacy should have been left
to the legislature.
The Department of Defense, too, is considering buying millions of EES-compliant devices,{245} although this purchase may require congressional approval. The government's market power as a bulk purchaser suggests that, all other things being equal, producer economies of scale will allow EES-compliant devices to be the lowest-cost hardware-based civilian cryptography products available. In addition, EES products will have the significant advantage of being able to communicate with the government's telephones, something that any competing technology will lack.{246}
The Clinton Administration also
announced that it will exempt EES products from the export ban in
the ITAR.{247} If the ITAR [Page 771]
are revised in this manner,
EES products will become the only U.S.-made exportable products
offering strong encryption, disadvantaging U.S.-based competitors
further.{248} These
efforts have already had an effect: the day that the
Administration announced its plans for Clipper, AT&T
announced that its
new secure telephone, the 3600, would not use a DES device as
originally announced, but would use Clipper instead.{249}
The current Administration makes no
secret of its hope that the combination of federal standard-
setting, federal purchasing power, and fine-tuning of export
control will allow it to impose a de facto standard on the
public, even though there is no statutory authority for the
standard, and even though Congress has never appropriated a penny
to support the standard. In so doing, NIST has pioneered a new
type of un-rule. It is a rule that the Administration indeed
hopes and intends to have a "practical binding
effect,"{250} but not because
the rule announces to the public how the agency will act in the
future, nor because the agency intends to act in compliance with
the rule, nor because the rule describes safe harbors for
compliance [Page 772]
with existing rules.{251} Rather, by issuing
the rule (if a rule it be), the agency hopes to set in motion a
train of events that will coerce the public's compliance.
NIST's use of a FIPS in this manner is an interesting reversal of the usual circumstance of a nonlegislative rule that an agency intends to be binding.{252} In the ordinary situation, an agency has chosen not to use the notice and comment procedure that characterizes informal rule making under APA § 553, and has simply issued the rule, perhaps labeling it "interpretative" or "policy guidance." A party seeking to challenge the rule attempts to demonstrate that the rule is actually legislative and thus invalid without notice and comment. The aggrieved party argues that it was entitled to be consulted on the rule and that the agency may not deprive the party of its right to make comments. Once the comments are duly docketed, the agency has a duty to take them seriously and may not reject them without giving nonarbitrary reasons.{253} In the classic case, the agency responds by denying the substantive import of its rule and arguing that, because the rule breaks no new ground, notice and comment are not necessary.
With FIPS 185, NIST has turned this process on its head. A
proposed version of FIPS 185 was published in the Federal
Register, and NIST solicited comments.{254} It received hundreds.{255} NIST
accepted a few, but rejected many others on the disingenuous
grounds that because the standard was entirely voluntary, it
could cause no harm.{256} NIST thus
invoked the formally voluntary [Page 773]
nature of the FIPS as
justification for dismissing the concerns of commentators who saw
FIPS 185 for what it was, and what NIST itself surely understood
it to be: an attempt to coerce the public through market means.
NIST simply failed to address the merits of many important
complaints, including those challenging the security, necessity,
or wisdom of its proposal, with the result of significantly
devaluing the opportunity to comment.{257} Yet, unlike most agencies that fail to
address the merits of comments received on a proposed rule, NIST
likely has little to fear from judicial review of its decision
because there appears to be no one with standing to challenge its
actions.
Even a competing product manufacturer would be unlikely to have
standing to protest a procurement order for products conforming
to FIPS 185.{258} As a plaintiff,
such a competitor might be able to argue that had it not been for
the permission to purchase the items granted in FIPS 185, the
procuring agency might have purchased the plaintiff's devices
instead. Such a claim would, however, be risky at best. The
plaintiff would have to mount a convincing case regarding
causation, somehow demonstrating that but for FIPS 185, the
plaintiff's products would have conformed with the agency's
requirements;{259} the plaintiff
would also need to [Page
774]
show that the agency would have been unable to
obtain a waiver from the preexisting requirement that it use a
DES product to protect sensitive information.{260} Without an extraordinarily good factual
basis, this barrier is probably insurmountable, leaving the
would-be plaintiff without the direct personal stake in the case
necessary for standing.
One other possible strategy for the plaintiff would be to claim "reputational" injury to its product or firm on the grounds that the FIPS would cause customers other than the government to reject its nonconforming products. Those employing this strategy could then try to invoke Meese v. Keene{261} to overturn the no-standing-to-challenge-a-FIPS rule of Control Data Corp. v. Baldridge.{262}
Otherwise, it is very difficult to imagine who might have standing to sue to overturn FIPS 185. A party seeking relief would have to argue that the FIPS was not as harmless as NIST claimed, and that the replies to comments were therefore defective. Just as NIST was able to ignore critical comments on its draft FIPS by saying that the standard was optional and hence harmless,{263} so too could it argue that because the standard is nonbinding, no one has a legal right to demand that a court review it.{264}
Should the Administration's attempt
to combine technical
standard-setting authority with market power succeed, however, [Page 775]
many parties will be
justly aggrieved. Makers of competing products will lose market
share, and perhaps may be driven out of their market altogether.
Individuals who might have preferred non-escrowed encryption, if
it could be obtained at or near the same price as an EES device,
may find that option closed to them. Such a policy will
establish a new and undesirable process by which the government
will likely be able to avoid the APA in a small, but significant,
class of cases.{265} Current law
does not recognize any of these injuries, save perhaps the claim
of lost market share, as legally cognizable.{266} A major decision as
to the degree of privacy to be afforded to U.S. citizens will
have been made without effective congressional or popular
participation.
Placing all FIPS, or all standard-setting relating to high technology, under the APA would be one way of ensuring that the executive branch can never again use standard-setting to manipulate the market for high technology items, at least not without judicial review for reasonableness. Although this change would vaccinate against the disease, it would also have undesirable side-effects. Neither nonbinding national technical standards nor the government's internal procurement standards should be litigated.{267} If a manufacturer is dissatisfied because a national or procurement standard more closely conforms to a competitor's product than its own, the proper place to fight that battle is the marketplace, not a court. EES is a special case because the technology at issue has social implications far beyond the ordinary FIPS, and because the government is seeking to use its purchasing power to coerce the market to achieve an end other than reliability, ease of use, or technical excellence. It would be a pity if prevention of such special cases were to force so disruptive a change on a system which ordinarily seems to work reasonably well.{268}
[Page
776]
Trying to find an avenue for judicial review of a
coercive but formally voluntary FIPS is probably more trouble
than it is worth.{269} The greatest
procedural problem with FIPS 185 is not the absence of judicial
review but the attempt to evade congressional participation in a
decision that may have major social consequences for many years.
The solution to this problem is logically, if not politically,
simple. If the executive branch did not have funds available
with which to purchase thousands of EES-equipped devices, it
would have to go to Congress for the money. Congress could then
debate the issue and, regardless of what it decided, the process
would conform with the values of openness, explanation, and
representative democracy which the un-rule rule undermines. To
prevent further abuses of the FIPS procedure, either the Justice
Department's Asset Forfeiture Fund should be returned to
the Treasury, or its terms should be narrowed to make it clear
that its proceeds cannot be used to attempt to influence product
markets.{270}
3. Did NIST's Cooperation with the NSA over FIPS 185 Violate
the Computer Security Act of 1987?
NIST's relationship with the NSA is poorly documented.{271} Clipper's critics
argue that NIST's adoption of EES in FIPS 185 violated either the
letter or the spirit of the Computer Security Act [Page 777]
of 1987{272} (Act), because, even
though the Act was designed to ensure civilian control of
computer security issues, NIST effectively and illegally ceded
its powers to the NSA.{273} NIST and
the NSA have refused to make public any information regarding
their discussions that would show whether NIST complied with the
Act. Consequently, it is currently impossible to make an
informed judgment as to NIST's compliance with the Act.{274} All that can be said
pending litigation is that NIST has not proved that it complied
with the Act.{275}
The claim
that NIST violated the Act draws much of its force from the
legislative history of the Act and from NIST's subsequent close
relationship with the NSA, which arguably violates the spirit of
the Act.{276} In 1984
President Ronald Reagan issued National Security Decision
Directive (NSDD) 145, which put in motion a train of events
leading to the Act. NSDD 145 granted the NSA sweeping powers to
make policy and develop standards for the
"safeguarding" of both
classified and unclassified information in civilian agencies and
in the private sector.{277} This transfer to
the NSA of authority [Page 778]
over civilian and
especially private information was the precise evil that the Act
was designed to cure.{278} The
legislative history states that Congress believed that the NSA's
"natural tendency to restrict and even deny access to
information" disqualified it from that role,{279} and Congress therefore rejected the NSA's
suggestion, made in testimony to a House committee, that the Act
should formally place the NSA in charge of all government
computer security.{280}
Nevertheless, the Act does not require a watertight separation between NIST and the NSA. Instead, the Act directs NIST to "draw[] on the technical advice and assistance" of the NSA "where appropriate."{281} NIST is also directed to "coordinate closely" with several other agencies, including the NSA, to avoid duplication of effort{282} and to use the NSA's computer security guidelines to the extent that NIST, not the NSA, determines they should apply.{283}
Soon after the Act became law, NIST
and the NSA signed a Memorandum of Understanding (MOU) setting
out a detailed regime of cooperation regarding computer and
telecommunications security issues.{284} With one
exception, the MOU appears to be designed to create interagency
consultation and to prevent duplication of effort, as required by
the Act. That exception, though, is not trivial: NIST agrees to
submit "all matters" regarding "techniques to be
developed for use in protecting sensitive information" in
its purview to review by a Technical Working Group comprised of
equal numbers of NSA and NIST staff in order "to ensure
they are consistent with the national security of the United
States."{285} If the two
agencies
are unable to agree, then either agency can refer the matter to
both the Secretary of Commerce and [Page 779]
the Secretary of Defense,
from where it may go to either the National Security Council or
the President for an ultimate decision. Meanwhile, "[n]o
action shall be taken on such an issue until it is
resolved."{286}
It is clear that NIST and the NSA have had extensive contacts regarding EES.{287} Whether these contacts, and in particular the actions of the Technical Working Group, amount to a violation of the Act depends on whether EES was referred to the Technical Working Group, and on how the NIST-NSA relationship worked. The Act clearly requires NIST to make its own decisions;{288} there is no statutory authority for NIST to let the NSA make decisions for it. Just as clearly, the Act requires NIST to consult with the NSA, although it directs NIST to decide when consultation is appropriate.{289}
There is no reason, with or without
the Act or the MOU, that NIST could not allow itself to be
persuaded by the NSA, so long as NIST were to keep the ultimate
power of decision.{290} The MOU [Page 780]
between the NSA and NIST
does, however, suggest two scenarios that would violate the Act.
If the working group deadlocked on some issue, or took votes in
which the two NIST members were outvoted four-to-two (or
three-to-two), and if NIST changed its policies as a result of either
of these votes,{291} then NIST would
no longer be in the position of allowing itself to be persuaded
by the NSA. Instead, the NSA would be dictating to NIST.
This would violate the Act. As the decision to proceed with EES
clearly comes from the highest levels of the U.S. government,{292} in the absence of
firm information one cannot reject the deadlock scenario out of
hand. There is, however, some reason to doubt it.
The deadlock scenario was anticipated in a 1989 codicil to the
MOU.{293} After members
and staff of the House Committee on Government Operations
expressed concern about the apparent grant to the NSA of an
effective veto over NIST's decisions, NIST and the NSA explained
that although the Technical Working Group had broad jurisdiction
as a discussion forum, the appeals process described in the MOU
applied only to "proposed research and development projects
in new
areas."{294} This codicil,
signed by representatives of both agencies with the express
intent of binding their successors, distinguishes between
"promulgation of standards and guidelines" by NIST,
which are not subject to appeal,{295} and [Page 781]
the "early stage
in the standards research and development process--usually years
before a standard is
promulgated,"{296} from which
appeals are permitted.
Neither NIST nor the NSA has made public statements as to the involvement of the Technical Working Group in the decision to promulgate FIPS 185. Whether the agreement required NIST to refer EES to the Technical Working Group before issuing FIPS 185 is unclear. But it appears that under the distinction set out in the 1989 codicil to the MOU, FIPS 185 would have been within the jurisdiction of the Technical Working Group, but outside the appeals procedure. Thus, if the 1989 codicil controlled, the deadlock scenario could only have applied if NIST preferred an alternative to EES but was persuaded to use EES against its better judgment. Alternatively, because SKIPJACK was developed by the NSA, it is entirely possible that the entire EES proposal originated in the NSA, and that by the time the NSA disclosed SKIPJACK to NIST, the NSA had decided that neither SKIPJACK nor EES was a "proposed research and development project[] in [a] new area[]" under the terms of the codicil.{297} Both NIST and the NSA assert that the appeals procedure has never been used.{298} The agencies contend that the lack of appeals is evidence of the success of their cooperation.{299}
Whatever the facts, NIST owes the public, and Congress, a clearer explanation of its relationship with the intelligence community. Congress is entitled to an explicit reassurance that NIST remains in complete control of security for civilian federal computer systems as required by the Act. The House and Senate committees with oversight over NIST should force it to provide these assurances. If NIST is unable to do so because it has allowed its judgment to be suppressed by the NSA's veto, then Congress will need to revise the Computer Security Act to create stronger incentives for NIST to preserve its jurisdiction--perhaps even instituting penalties for noncompliance.{300}
Ideally, the escrow
agents would be as incorruptible as possible, possessed of a
clear charter setting out their positive and negative duties,
insulated from pressure from the law enforcement and intelligence
communities, and outfitted with secure facilities to store the
list of key fragments (which may, if EES catches on, become one
of the most valuable items of information held by the U.S.
government). [Page 783] They
must also be trusted by the public, or the public will not
participate in the EES scheme. With the exception of the secure
facilities, the list of necessary attributes describes a body
resembling the federal judiciary. Not surprisingly, some noted
cryptologists have suggested that the judiciary hold the keys.{305} No doubt the judiciary could acquire the
technical competence and equipment required to generate and
secure the keys.
Whether judges could constitutionally hold one or more key
fragments is a close question.{306} It is
clear that Congress could not hold the
keys, nor could any congressional agent.{307} Holding keys is an executive function.
It would involve judges in the law enforcement process at a time
when there is no case or controversy and, as regards the large
majority of the keys, no prospect of one. Because holding keys
is an executive function, the judiciary (or an agency such as the
Administrative Office of the U.S. Courts, which is responsible
only to judges) can constitutionally hold the keys only if the
function is "incidental" to its Article III
functions.{308} If the task is more
than "incidental," then the principle of separation of
powers requires that it be undertaken by the executive branch or
by private citizens.{309} The court taking
[Page 784]
custody of the
keys would be in a position reminiscent of
Hayburn's Case,{310} which has long
stood for the proposition that
neither the legislative nor executive branches may assign duties
to the judiciary "but such as are properly judicial, and to
be performed in a judicial manner."{311} Unlike Hayburn's Case, however,
the judges would not be asked to decide anything until the
government was granted a search warrant. The court would
presumably disclose the key fragment(s) along with the ex parte
order granting the warrant.
Judges already do a number of things that come close to holding
a key fragment, but each is distinguishable. Courts and their
adjuncts have for many years exercised a wide variety of
ancillary powers such as rule making, and the appointment and
supervision of court personnel, which are "reasonably
ancillary to the primary, dispute-deciding function of the
courts."{312} Courts have also
supervised grand juries for many years.{313} More recently, Congress has given the
judges
and courts additional responsibilities, including membership on
the Sentencing Commission,{314} and the
selection and supervision of independent counsel.{315} Indeed, the granting of warrants (and the
record-keeping which follows) are ex parte proceedings, clearly
within the Article III jurisdiction of the courts. Taking
custody of a key in advance of any adversary or even any ex parte
proceeding, with the knowledge that most keys will never be
subject to such a proceeding, goes beyond any of these
precedents. Perhaps the closest analogy is the court's marshal
who is instructed to keep order even though there is no reason to
believe [Page 785]
that any
particular person will seek to disrupt the court's functioning.
Even the marshals are an imperfect parallel, however, because
their activities impinge only on persons who come into contact
with the court or with court personnel; holding key fragments
could affect the privacy of many who have no other contact with
the judicial system.
Whether the functions of protecting keys from disclosure and disclosing keys to facilitate wiretaps are sufficiently ancillary to the judicial function of issuing wiretap orders and warrants as to be constitutional is ultimately a matter of taste. The existence of the FISA court,{316} whose sole jurisdiction is to receive and rule on petitions for foreign-intelligence-related surveillance, adds some support to the argument that holding a key fragment would be incidental to Article III functions, because the act of holding the keys is only a little more ancillary to traditional judicial functions than are the FISA court's actions.{317}
As a quick fix, the Secretary of
Commerce and the Secretary of the Treasury should each
immediately issue separate regulations, published in the
Federal Register, defining the role of the escrow agents
in their respective agencies and making clear that the escrow
agents have a legal duty to protect the keys from all release
except as specified in the rules. In the longer term, Congress
should pass legislation vesting the escrow function in
independent agencies specifically created for that purpose.{318} Although opinions
differ as to the degree of tenure in office that the Constitution
allows Congress to confer on the heads of independent agencies,{319} there [Page
786]
is no debate that independent agency status
represents an attempt to shield a function from political
manipulation, and that the officers of an independent agency have
at least political insulation from dismissal by a President who
finds them insubordinate. Alternate structures, in which EES-
product users can choose to lodge their keys with any one of a
number of private escrow agents, might provide even greater
security to users, but at the price of some additional
complexity. One can imagine a system in which private escrow
agents would apply to the Attorney General for certification as
suitably secure and perhaps post bond to ensure that they would
deliver up keys when legally ordered to do so. Although this
system might satisfy both the user's desire for security and the
government's desire for certain access, it introduces practical
problems. The government will still need to keep a master list
of chip serial numbers in order to know which escrow agent has
the key. Furthermore, a private escrow agent would have to
charge a fee, to be paid either by the chip user or the taxpayer.
There is also no particular reason to believe private escrow
agents would be less corruptible than the Justice Department,
although if key fragments were distributed among many different
escrow agents, the harm caused by compromise of any given
database would be lessened.{320}
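Public accounts of EES describe the chip unique key as reconstructable only by combining the fragments held by the separate escrow agents, so that no single agent's database, if compromised, reveals anything about any key. The precise escrow mechanism is classified; the Python sketch below is purely illustrative of that splitting property (the function names are invented for illustration, and the 80-bit key length reflects published descriptions of SKIPJACK):

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(unit_key: bytes, n_agents: int = 2) -> list[bytes]:
    # The first n-1 fragments are uniformly random; the last is chosen
    # so that the XOR of all fragments equals the unit key. Any subset
    # short of the full set is statistically independent of the key.
    fragments = [secrets.token_bytes(len(unit_key)) for _ in range(n_agents - 1)]
    fragments.append(reduce(xor_bytes, fragments, unit_key))
    return fragments

def recover_key(fragments: list[bytes]) -> bytes:
    # Only the complete set of escrowed fragments reconstructs the key.
    return reduce(xor_bytes, fragments)

key = secrets.token_bytes(10)          # 80 bits, as in public SKIPJACK accounts
parts = split_key(key, n_agents=2)     # one fragment per escrow agent
assert recover_key(parts) == key
```

On this model, distributing fragments among more agents changes nothing about reconstruction but means an intruder must compromise every agent's database before any key is exposed, which is the point made in the text.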
B. Unresolved Issues
In testimony to the haste with which the Administration
launched the EES program, important implementation issues remain
unresolved. [Page 787]
The
proposed Encryption Standards and Procedures Act would have
authorized the President to release keys to foreign governments
when she "determines that such access and use is in the [Page 788]
national security and
foreign policy interests of the United States."{325} Nothing in the draft
legislation would have required that the owner of the chip ever
be notified that her security has been permanently compromised.
It is interesting to speculate whether a company that suffered a
loss due to the release of commercially sensitive information in
this manner would have a takings or a tort claim against the
United States.
2. Clipper Abroad?
Unlike other modern encryption products, Clipper-equipped
products will be exportable. Presumably, U.S. businesses using
Clipper at home will welcome the opportunity to use the same
products in their foreign subsidiaries. Whether other foreigners
would wish to buy a product that comes with a guarantee that the
U.S. government can listen in seems more doubtful.
There are two strategies, however, that the Administration might use to boost foreign sales. The first would be to share the family key with foreign governments and perhaps also allow those governments to be the escrow holders for certain chips. The alternative would be to manufacture some chips with a different family key, perhaps even a different family key for each foreign market. The alternative family key could be disclosed to the foreign government without compromising the security of the U.S. chips, but two chips with different family keys would not be able to communicate in secure mode because they would not recognize each other's LEAFs as valid.
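Because SKIPJACK and the exact LEAF format are classified, the interoperability point can only be modeled in the abstract. The toy Python sketch below is entirely hypothetical (a hash-based stream cipher stands in for the classified algorithm, and the field layout is invented), but it illustrates why a chip programmed with one family key would reject a LEAF produced under another: the checksum embedded in the LEAF fails to verify when decrypted with the wrong family key.

```python
import hmac, hashlib, secrets

def _keystream(key: bytes, n: int) -> bytes:
    # Toy stand-in for the classified cipher: a SHA-256 counter stream.
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def make_leaf(family_key: bytes, unit_id: bytes, wrapped_key: bytes) -> bytes:
    # LEAF body: chip identifier plus the escrow-wrapped session key,
    # sealed with a short checksum and encrypted under the family key.
    body = unit_id + wrapped_key
    tag = hmac.new(family_key, body, hashlib.sha256).digest()[:4]
    plain = body + tag
    return bytes(a ^ b for a, b in zip(plain, _keystream(family_key, len(plain))))

def leaf_is_valid(family_key: bytes, leaf: bytes) -> bool:
    # A receiving chip decrypts with its own family key and verifies
    # the checksum; a foreign family key yields garbage and a bad tag.
    plain = bytes(a ^ b for a, b in zip(leaf, _keystream(family_key, len(leaf))))
    body, tag = plain[:-4], plain[-4:]
    return hmac.compare_digest(tag, hmac.new(family_key, body, hashlib.sha256).digest()[:4])

fk_us = secrets.token_bytes(16)       # hypothetical domestic family key
fk_export = secrets.token_bytes(16)   # hypothetical per-market family key
leaf = make_leaf(fk_us, b"SERIAL01", secrets.token_bytes(10))
assert leaf_is_valid(fk_us, leaf)
assert not leaf_is_valid(fk_export, leaf)
```

On this model, two populations of chips keyed with different family keys simply cannot validate each other's LEAFs, which is why the second strategy described above would fragment secure communication along national lines.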
The globalization of commerce means that sensitive commercial
(and, increasingly, personal) communications cross national
borders. Even if EES becomes the de facto U.S. standard, it is
unlikely to meet with wide acceptance abroad as long as the
family key and the chip unique keys are held by the U.S.
government. Why, after all, should non-U.S. buyers acquire a
product designed to make eavesdropping
by the U.S. government relatively easy?{326} Whether [Page 789]
non-U.S. buyers choose a similar
product with a different family key or a different system
entirely, the result will be to make secure communications
between a U.S. party and a non-U.S. party more difficult. If, as
the FBI
suggests, the U.S. has the most to lose from industrial
espionage,{327} EES may
hurt U.S. business more than it hurts anyone else.
A LEAF followed by a wire communication presents a complicated
problem under the Electronic Communications Privacy Act of [Page 790]
1986 (ECPA).{330} The sensible argument that the LEAF is
an integral part of the conversation, and
thus really within the umbrella of the wire communication that
follows, hits a snag due to the ECPA's definition of the
"contents" of a wire communication. Where formerly
Title III had defined the contents of a wire communication as
including any information "concerning the identity of the
parties to such communication,"{331} the ECPA deleted the quoted words, leaving
the
contents of a wire communication defined as only the
"substance, purport, or meaning" of the
communication.{332} Fitting a LEAF
within that definition requires a stretch. The LEAF itself
contains none of the "substance, purport, or meaning"
of the encrypted conversation--just information about the
identity of the chip needed to acquire those things.
If a LEAF were found to be an electronic noncommunication
legally severable from the wire communication that follows it,
the LEAF would enjoy a lower level of statutory protection than
if the LEAF were treated as part of the content of the wire
communication: (1) Law enforcement officials would not need a
warrant to intercept and record a LEAF, but only the more routine
judicial orders required for pen registers;{333} (2) under the ECPA, any Assistant U.S.
Attorney would be allowed to seek a court order to intercept a
LEAF, not just the specially designated high-ranking members of
the Justice Department who have authority to seek a wiretap
warrant;{334} and (3) the statutory
exclusionary rule applicable to wire
communications would not apply.{335} Without
the[Page 791]
statutory exclusionary rule, the victim of an illegal
interception of a LEAF would have a civil remedy (and the
interceptor would face possible criminal prosecution), but no
right to suppress evidence would exist unless the Fourth
Amendment's exclusionary rule applied.{336} If a LEAF is severable in this manner,
it is not as clear as it should be that the LEAF would enjoy any
protection under the Fourth Amendment. Because decrypting the
LEAF with the family key involves listening to at least a few
seconds of the conversation, the act of intercepting and
decrypting the LEAF is a wiretap of an electronic communication
even if the information thus gathered (the identity of the other
chip) is no greater than could be had with a trap and trace or a
pen register. Traffic analysis using pen registers (which record
the numbers called by a telephone) and trap and trace devices
(which record numbers calling the telephone) does not implicate
the Fourth Amendment.{337} Under Title III,
however, both methods require a court order, although an actual
warrant is not
required.{338} Despite
being a wiretap, the interception of a LEAF might not violate the
Fourth Amendment if the telephone user has no reasonable
expectation of privacy for the LEAF.
An EES chip user should have a reasonable expectation of privacy, as the term is used in Fourth Amendment cases,{339} in her LEAF, but the question is not as free from doubt as it should be. The difficulty arises because the user is aware that the government has the information needed to decrypt the LEAF. Although the government has promised to use that information only in specific circumstances, it is just a promise, and as the government cannot be estopped, it is usually free to renege, although in some circumstances this action might amount to a denial of due process.
[Page 792]
A reasonable
expectation of privacy requires both a subjective expectation of
privacy and an "objective" recognition that the
expectation is reasonable.{340} A
Supreme Court that can hold that one has no reasonable
expectation of privacy in the telephone numbers one dials,{341} or in the checks one
allows to be cleared by one's bank,{342} because the information has been disclosed
to
others, is capable of holding that emitting a LEAF with knowledge
that the government can decrypt it puts the LEAF in the same
position as the telephone number dialed.{343}
A LEAF on its own is not worthless,
although it is worth less than a session key. A large-scale
eavesdropper armed with the family key could collect LEAFs.
Because each LEAF contains the chip serial identifier, it allows
a large-scale eavesdropper to conduct traffic analysis{344} without having to gain access to a
telecommunication provider's equipment to set up thousands of
trap and traces or pen registers. If satellite or microwave
telephone signals are being monitored, the LEAF-monitoring method
of traffic analysis is undetectable.{345} Furthermore, if one is trying to collect
all
the calls from a particular machine in an attempt to decrypt
them, decrypting the LEAF allows one to know which calls to
record and file for future reference. Of course, if the
eavesdropper has a warrant, in most cases all of this and more is
easily obtained from the telephone service provider.{346} It would be monstrous, though, to have a
rule that said the government could acquire the LEAF for traffic
analysis after falsely promising the American people that EES [Page 793]
would be secure. A
court construing both the objective and subjective prongs of the
reasonable expectation of privacy test would have a moral
obligation to take this into consideration.
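The traffic-analysis attack just described can be sketched in a few lines of Python. The 128-bit LEAF layout used below (32-bit chip identifier, 80-bit wrapped session key, 16-bit checksum) follows the published descriptions of EES, though the exact field order here is illustrative; the sketch assumes the eavesdropper has already decrypted each LEAF with the family key.

```python
# Illustrative sketch (not a real implementation) of LEAF-based traffic
# analysis: an eavesdropper with the family key reads the chip serial
# identifier out of each LEAF and tallies who is calling, with no need
# for escrow agents, warrants, or access to telephone company equipment.
from collections import Counter

def parse_leaf(leaf: int):
    """Split a family-key-decrypted 128-bit LEAF into its three fields."""
    checksum = leaf & 0xFFFF                          # low 16 bits
    wrapped_session_key = (leaf >> 16) & ((1 << 80) - 1)  # middle 80 bits
    chip_id = leaf >> 96                              # high 32 bits
    return chip_id, wrapped_session_key, checksum

def traffic_analysis(decrypted_leafs):
    """Count intercepted calls per chip ID, busiest chips first."""
    tally = Counter(parse_leaf(leaf)[0] for leaf in decrypted_leafs)
    return tally.most_common()
```

The point of the sketch is how little the family key holder needs: the chip identifier alone, harvested passively from satellite or microwave links, reproduces what would otherwise require thousands of court-ordered pen registers.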
C. Voluntary EES Is Constitutional
Even if EES is unreasonable either on general principles or as
the term is used in the context of the APA, it is still not
unconstitutional. The Constitution allows many unreasonable
things,{347} and actions
that might violate the APA if made by rules within its purview
are not necessarily unconstitutional if achieved by other means.
So long as it remains purely voluntary, EES creates no
fundamental constitutional problems.
EES involves five distinct government actions. First, the government launched the program by making the classified SKIPJACK algorithm available to a manufacturer of EES-compliant products. Second, the government announced FIPS 185.{348} Third, it is purchasing large numbers of EES-compliant products for its own use. Fourth, it is encouraging others to use EES products. Fifth, it is setting up the two escrow agents who will hold the keys. As a group, these five actions amount to attempting to create a voluntary national key escrow system. Individually and collectively these activities are constitutional.
The NSA controls access to the SKIPJACK algorithm and the
details of the LEAF.{349} To date it has
made the design of the chips available to one
manufacturer, Mykotronx, Inc.{350} FIPS
185 indicates that only organizations already holding security
clearances need apply for access to the classified specifications
for SKIPJACK. A party lacking such a clearance might have a
legitimate grievance if she were unable to obtain such clearance
for the purpose of [Page 794]
manufacturing EES-compliant
microcircuitry.{351} Indeed, if
potential competitors to the NSA's chosen manufacturer were
denied access to the information they needed to compete with
Mykotronx, they could plausibly allege an equal protection
violation or a violation of procedural due process. The
government has no obligation, however, to make the algorithm
available to anyone who asks.{352}
The government is free to purchase goods and services to meet its needs.{353} Choosing to purchase EES-compliant devices does not, in itself, create any constitutional issues. Such purchases are constitutional even if they work as an indirect subsidy to producers who are able to lower their unit costs. The government could constitutionally provide direct subsidies if Congress chose to do so.{354} Nor is the denial of market share to non-EES products unconstitutional, even if it has the effect of raising their costs.
The government's cheerleading for EES is also constitutionally permissible. So long as no one is threatened with sanctions for failing to adhere to EES, the government is entitled to make its case to the nation for why we would all benefit if we accepted a limit on our privacy.{355}
[Page 795]
The government has the
authority to act as an escrow agent,{356} although there is some question from where
the
money to pay for the escrow agents would come. Preliminary
estimates put the cost of the escrow agents' activities at $16
million per year.{357} These expenses
may require a separate appropriation by Congress, although both
NIST and the Justice Department have funds which arguably might
be tapped for this purpose.{358}
Nor
is the program as a whole unconstitutional. Even if EES becomes
widespread, everyone in the U.S. remains free to use any
alternative, subject only to restrictions on his or her ability
to export the cryptosystem to foreign correspondents.{359} It remains feasible
and legal to preencrypt a message with an ordinary, non-escrowed
cipher, feed it to an EES-compliant device, and make even EES
communications potentially unintelligible to eavesdroppers armed
with the chip unique key.{360} Indeed, the
very ease with which EES [Page
796]
can be circumvented raises the possibility that
the government might some day require key escrow as the price of
using strong cryptography.
D. Voluntary EES Is Unlikely to Displace Un-Escrowed
Cryptography
As we have seen, the Administration's stated motives for EES
are not entirely consistent. The government's "hard
sell" depicts non-EES encryption as a threat that needs to
be avoided.{361} By contrast, the
"soft sell" treats EES as part of a package deal that
the government offers to those who desire government-certified
encryption.{362} EES is
officially voluntary, yet has been introduced in a manner which
the government hopes will induce, even coerce, the public to
choose an EES system over any alternative.{363} In the Administration's view, it is
unreasonable to object to a plan that protects users from
communications interception by everyone except the government.
At worst, the Administration argues, under EES the user bears no
greater risk of government interception (authorized or not) than
do unencrypted callers.{364} Supporters also
point to the need to help law
enforcement in the fight against dangers such as terrorism.{365}
Perhaps the most often repeated
objection to EES is that because people remain free to use
alternatives, EES can never achieve its stated objective of
maintaining law enforcement access to private encrypted
communications. Clipper's critics suggest that it can catch only
stupid criminals. The government has had three
responses to this argument. The least subtle response has been
that [Page 797]
criminals are
often dumber than one thinks.{366} A more
subtle response is that Clipper may at
least postpone the perhaps inevitable adoption of an alternative
cryptosystem that the government cannot easily decrypt.{367} The most subtle
response notes that a secure communication requires compatible
equipment on both ends of the line.{368} If Clipper becomes the de facto standard,
the
existence of a few other devices on the margin will have a
negligible effect on the government's ability to monitor
electronic communication when it feels required to do so.
The government's policy centers on its hope that EES will
become the market standard. Yet EES will not likely triumph in
the marketplace, even with the advantage of massive government
orders, because many people find something deeply distasteful
about being asked to buy a product that comes ready-made to be
wiretapped, even if the wiretapping is designed to be conducted
only in limited circumstances by duly authorized bodies. In
light of likely technical developments, a "threat
assessment" of the government's potential surveillance
capabilities makes the thought of wiretap-ready communications
even more disturbing. This is especially true considering the
history of government abuse of civil rights and the possibility,
however remote, that government policy might change even as
escrowed chip keys remain fixed. In any case, for e-mail,
alternatives to EES already exist which are cheaper, more
flexible, and appear to offer more complete
privacy.{369} Non-EES [Page 798]
voice products are also
becoming available.{370}
1. Why EES Worries People
In addition to the fundamental objection that the government
should not expect Americans to facilitate the decryption of their
private communications, opponents of EES have raised numerous
technical and practical objections to the plan. Critics of EES
take what appears to the government to be an absolutist stand,
refusing to trust anyone with the key needed to decrypt their
communications.{371} To these
critics, the government's protestation that EES adds nothing to
current authority because federal law enforcement agencies need
the same court order to obtain a wiretap on an EES-equipped phone
as on an ordinary telephone, makes no impression. The critics
believe either that current rules provide insufficient privacy or
that the government cannot be trusted to follow the rules.
a. Preserving the Status Quo Prevents a Return to the
Status Quo Ante
The status quo that EES seeks to preserve was not always the
status quo. At the time Americans adopted the Bill of Rights,
private communications were far more secure than they are today.
Before the invention of the telephone, the radio, and the long-
distance microphone, one could have a secure conversation by
going for a quiet walk in an open field. Correspondents could
encrypt letters in ciphers that no government could break.{372} Modern[Page
799]
communications have
expanded the circle of people to whom we speak, but this fact
alone does not mean that communications should necessarily be
more vulnerable. Only recently, it was difficult for the
government to trace incoming calls, even pursuant to a court
order, because the telephone company used slow mechanical tracing
devices. Having overcome that problem, the FBI now seeks
legislation to keep it from becoming difficult again.{373} Nor does the possibility that more
criminals
will avoid detection if the privacy available to individuals were
to be increased necessarily mean that choosing to increase
privacy is unwise. The Bill of Rights already includes many
provisions that prefer to provide protections to all citizens at
the cost of providing benefits to the guilty.{374} What this means is that some value
judgments
must be made, and that someone will have to make them.
Where once people only had to worry about eavesdroppers they could see, today an eavesdropper could be anywhere that a telephone signal happens to reach. Modern encryption seems [Page 800]poised to re-create the functional equivalent of the privacy available in the late 1790s and to apply it to devices like telephones and modems, which are increasingly replacing face-to-face contact and letter writing.{375} EES would prevent this return to the status quo ante, at least when the government is the eavesdropper.
Widespread adoption of Clipper and massive wiretapping ability
would make traffic analysis more feasible for a hypothetical
government oblivious to the need to obtain warrants. If Clipper
is widely used, communications encrypted by other means signal
that the user may have something to hide. Indeed, for this
reason some privacy advocates encourage the routine use of strong
cryptography in all communications in order to provide a cloaking
effect for all personal communications. If everyone makes a
habit of using strong cryptography, the presence of an encrypted
message will never be probative of a guilty conscience or a need
for secrecy.{376}
b. EES Does Not Preserve the Status Quo
EES is designed to be inflexible, and this inflexibility will
impose costs on some users. Each chip's unique key is
permanently branded onto it. If for some reason that key should
be
compromised, the user has no choice but to throw away the chip
and buy a new one. This inflexibility is designed to make it
impossible for users to select keys that are not held by the
government.{377} Under Title III, the
government must notify persons who were the subject of an
authorized wiretap.{378} This duty is
unaffected by EES, but [Page
801]
the consequences change. Previously there was
little a citizen needed to do after receiving notice that her
phone had been tapped, but now she must consider whether the
disclosure to law enforcement officials of the chip unique key in
her
telephone means that she should replace it, at a cost,{379} or whether she should
trust government assurances that all records of the key kept
outside the escrow agents have been destroyed.{380}
Two telephones communicating via Clipper Chips use the same session key; thus, when Alice and Bob are talking, a public servant with a warrant for Alice's telephone does not need to know Bob's chip key to decrypt the conversation. Knowing Alice's chip key will suffice because Alice's LEAF will provide all the information needed. Except for the fact that he is overheard talking to Alice, Bob's security is unaffected by a wiretap of Alice's line.
But if Alice and Bob are using e-mail to communicate and Capstone Chips{381} to do their encryption, both Bob and the public servant are in a different position. Capstone is designed to allow Alice and Bob to use public key encryption for their session keys.{382} Bob's Fortezza card knows Alice's public key, but not her private key or her chip key, so the only LEAF it is able to generate is one that relies on Bob's own chip key. This creates a lot of work for a public servant tapping Alice's line. Every time Alice receives e-mail from a new correspondent, the public servant must decrypt its LEAF with the family key and then go to the escrow agents and request the chip unique key for the new person. If Alice communicates with many people who use Fortezza cards, the public servant may wind up holding a large, and rather valuable, collection of chip keys.
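A toy model may make this bookkeeping concrete. Unlike the Clipper telephone case, where Alice's chip key alone suffices, each Capstone correspondent's LEAF is keyed to that sender's own chip, forcing one escrow request per new sender. Everything below is hypothetical scaffolding (the real key unwrapping uses SKIPJACK, not the XOR stand-in used here):

```python
# Toy model of the escrow bookkeeping a tap on Alice's Capstone e-mail
# would require. Names and mechanics are hypothetical illustrations.

def unwrap(wrapped_key: int, chip_key: int) -> int:
    """Stand-in for decrypting the session key with the chip unique key."""
    return wrapped_key ^ chip_key

class WiretapSession:
    def __init__(self, escrow_agents):
        self.escrow_agents = escrow_agents  # callable: chip_id -> chip unique key
        self.chip_keys = {}                 # this collection only ever grows

    def session_key_for(self, sender_chip_id: int, wrapped_key: int) -> int:
        if sender_chip_id not in self.chip_keys:
            # First message from this correspondent: a fresh trip to the
            # escrow agents is required before anything can be read.
            self.chip_keys[sender_chip_id] = self.escrow_agents(sender_chip_id)
        return unwrap(wrapped_key, self.chip_keys[sender_chip_id])
```

The `chip_keys` dictionary is the nub of the problem: by the end of the tap, the public servant holds a valuable and ever-growing collection of other people's chip unique keys.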
Because the wiretap order mentions only Alice, the
court that issued the order has discretion to decide whether each
of the people whose session keys were disclosed should be
notified of that [Page
802]
fact.{383} Although nothing
in
Title III or the Attorney General's rules requires it, Bob
deserves to be told.
Bob's Fortezza card will provide his digital signature as well
as encryption for his e-mail. Disclosure of the digital
signature key to anyone who might even be tempted to sell or make
use of it would represent an enormous risk to Bob. Anyone
holding Bob's key to his digital signature could masquerade as
him and authenticate any transaction or correspondence (for
example, in a raid on Bob's electronic bank account) with a
digital signature that Bob would be powerless to disavow.
Fortunately, current plans for Fortezza call for separate keys
for message encryption and for digital
signatures.{384} Furthermore, although Bob is
powerless to change the chip unique key used
to encode his e-mail's LEAF, Fortezza will allow him to change
the key to his digital signature. Thus, Bob's ability to
uniquely identify himself remains secure.
c. The Status Quo May Not Be Stable
The biggest divide between the two sides to the EES debate
concerns what they consider relevant. The Clinton
Administration, as one would expect, operates on the assumption
that government officials can be trusted to act legally.{385} The government therefore measures the
social
consequences of its proposals by the effect on the government's
lawful powers and the citizen's lawful rights. Critics of EES,
however, tend to discount this approach. Instead, they undertake
a threat analysis of the EES proposal.{386} It may seem a little silly to conduct a
threat analysis of a cryptographic proposal by a government that
has the raw physical power to do far worse things than spying on
its citizens, but in fact threat assessment enjoys a grand
tradition. The Framers of the Constitution did [Page 803]
not assume that "men
were Angels."{387} They conducted a
kind of threat analysis of government and decided that it could
only be trusted if centralized power were divided in a manner
that set interest against interest so as to protect the
governed.{388} The
impulse to rely as much as possible on structures that force
proper behavior by government officials, and as little as
possible on simple trust, is as old as the nation.{389}
Some of these threats to the status quo are political. For example, one glaring risk in the current EES proposal is that the escrow procedures exist entirely within the purview of the Attorney General, and could be changed at any time without any warning.{390}
Some threats
consist of individual or official malefaction. In this age of
spy scandals, it is always possible that the escrow agents,
through negligence or corruption, may allow someone to acquire
the full list of key segments.{391} The
method by which keys are generated for the EES chips may lend
itself to subversion of the escrow scheme from the moment the
keys are generated. Although hedged with elaborate safeguards,
all keys are generated by a single computer in a secure facility
closed to public inspection. Because users are not in a position
to monitor the key-generation procedure, they must trust that the
published safeguards are being observed. Even if the risk of
surreptitious subversion of the generation process were small,
the risk to communications security would be greater than if the
keys had never been escrowed.[Page 804]
Some threats to the status quo are mathematical. Critics argue that a classified algorithm such as SKIPJACK--one that has not been exposed to merciless attack by academic cryptologists--is less likely to be secure than one subject to full peer review and thus might contain an intentional, or even unintentional, "back door" that would make it vulnerable to sophisticated mathematical attack.{392} The government's response is that SKIPJACK's security is certified by the NSA{393} and by independent outside experts.{394} The government classified SKIPJACK not out of fear that publicity might expose the algorithm to attack, but to prevent users from enjoying the fruits of its research and development while at the same time avoiding participation in its key escrow system. The Administration argues that SKIPJACK is so strong that, were people able to use it without escrowing their keys, they would undermine the goal of easy government access to encrypted messages that EES is designed to achieve.{395} Some critics remain unsatisfied by this explanation. They argue that because EES is voluntary, the government should not attempt to require compliance with the escrow procedure as a condition of using SKIPJACK.{396} The Administration's response is, in effect, that if users wish to use a government-certified algorithm, they should be prepared to take the bitter with the sweet.
Some threats, perhaps the most realistic, are technological. Changes in technology are likely to make electronic eavesdropping easier, more effective, and cheaper for the government.{397} All other things being equal, a rational government would react to these changes by increasing the use of electronic eavesdropping. As government eavesdropping becomes more affordable, the reasonable citizen's desire for countermeasures ought to become greater as well.
[Page 805]
The technological
threat appears more ominous if one tries to forecast what the
government may be able to do a decade from now. Currently, all
the wiretapping technology in the world is useless if there is no
one to listen to the conversations. The physical and economic
limit of what is currently achievable is demonstrated by the East
German Ministry for State Security, the Staatssicherheit or
Stasi, which at its peak was probably the most sophisticated and
far-reaching internal surveillance organization ever created.
Out of a population of 17 million, the Stasi had 34,000 officers,
including 2100 agents reading mail and 6000 operatives listening
to private telephone conversations, plus 150,000 active informers
and up to 2 million part-time informers.{398} Together they produced dossiers on more
than
one out of three East Germans, amounting to one billion pages of
files.{399} There are
fifty-nine times more telephones in the United States than there
were in East Germany and about fifteen times as many people.{400} The people (and machines) in the United
States make about 3.5 trillion calls per
year.{401} Even if
every telephone service provider in the United States were to
record every conversation in the country, the government could
not make use of the tapes because it lacks the human resources
necessary to listen to them. Even if political constraints could
not prevent the growth of an American Stasi, the financial
constraints are currently insurmountable.{402}
The cost may soon shrink
dramatically. EES, the Digital
Telephony initiative,{403} and advances in
computer power, combined with
the increasing links among federal databases{404} and [Page
806]
advances in voice recognition protocols, suggest
that soon the physical constraints on
widespread, government-sponsored eavesdropping may disappear.
Voice recognition already allows computers to pick out a
particular speaker's voice from the babble of communications;{405} combined with the
power to search for particular words in all messages, this
advance in technology will provide a powerful surveillance tool
to any government willing to use it. Computers can monitor
communications twenty-four hours per day, and they do not collect
overtime. In the absence of physical and economic constraints,
the only
constrictions on omnipresent automated telephone monitoring will
be legal and political.{406}
2. Spoofing EES: The LEAF-Blower
EES suffered a glancing blow when a researcher at AT&T
discovered that it could be "spoofed," albeit with some
effort.{407} The
protocol that produces the spoofs quickly became popularly known
as the "LEAF-blower."{408} The
process is too slow to be of[Page
807]
much practical value in Clipper-telephone
communications, but might be applied by patient e-mail users of
Capstone.{409}
Recall that an EES-compliant device will only decrypt a message that comes headed by what appears to be a valid LEAF. A "spoof" replaces the real LEAF with a simulacrum, which appears valid to the decrypting chip, and even an eavesdropper armed with the family key, but is in fact meaningless. Because the actual session key is negotiated before the LEAF is generated, the absence of the true session key in the LEAF does not affect communications so long as the LEAF passes the validity check. Because the decrypting chip checks the LEAF against a 16-bit checksum,{410} which uses the actual session key as one of its inputs, a spoof requires more than just copying a LEAF off a previous transmission. A spoof is computationally complex because the spoofer must use trial and error to generate a LEAF with a phony session key whose checksum equals that of the real session key. Each time the LEAF-blower is used, an average of 32,768 LEAFs must be tried before one works. Tests at AT&T on a prototype Capstone-based PCMCIA card showed that, on average, more than forty minutes would be needed to produce a valid-looking spoof.{411}
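The trial-and-error search can be illustrated with a toy checksum. The real 16-bit checksum algorithm is classified, so `zlib.crc32` truncated to 16 bits stands in for it here; the spoofer simply keeps generating candidate phony session-key fields until one happens to yield the checksum the receiving chip expects.

```python
# Toy illustration of the "LEAF-blower" search. The checksum function is
# a stand-in (CRC-32 truncated to 16 bits), not the classified original,
# and the field sizes (32-bit chip ID, 80-bit session-key field) follow
# the published descriptions of EES.
import zlib

def checksum16(data: bytes) -> int:
    """Hypothetical 16-bit checksum standing in for the classified one."""
    return zlib.crc32(data) & 0xFFFF

def blow_leaf(real_session_key: bytes, chip_id: bytes):
    """Search for a phony 80-bit key field whose checksum matches the
    real one, so the resulting counterfeit LEAF passes the validity check."""
    target = checksum16(chip_id + real_session_key)
    trials = 0
    counter = 0
    while True:
        trials += 1
        phony = counter.to_bytes(10, "big")   # candidate 80-bit key field
        if checksum16(chip_id + phony) == target:
            return phony, trials              # a valid-looking spoof
        counter += 1
```

On a general-purpose computer this toy search over a 16-bit space finishes almost instantly; the forty-odd minutes reported in the AT&T tests reflect the fact that, in the real attack, each candidate LEAF had to be produced by the rate-limited EES hardware itself.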
A LEAF-blower allows a "rogue" EES device to communicate with all other EES devices, without the recipient even knowing that the sender has spoofed the chip. Because it can take up to forty-two minutes to counterfeit the LEAF, however, the technique is likely to remain of interest only to very patient people. Interestingly, NIST claims it was always aware that a LEAF-blower device could be constructed. It found the risk acceptable, however, because the technique was too slow to be of practical value.{412} Furthermore, because the chip serial number contains a field identifying the manufacturer as well as the chip, anyone who decrypts a rogue LEAF with the family key will be able to recognize a bogus chip serial number without having to consult the escrow agents.{413}
[Page 808]
Thus, the way to feign
compliance with EES remains preencrypting the message with some
other system before using the EES device. Preencryption is
undetectable with the family key alone, but is discernable only
after the escrow agents have released the chip unique key.
Preencryption is relatively easy for e-mail, but it is difficult
to achieve for real-time voice communication. As a result, an
eavesdropper armed with the family key should be in a good
position to
monitor compliance with EES even if she cannot decrypt the
conversation.{414}
E. What Happens If EES Fails?
The large number of government orders and the attraction of
SKIPJACK for those who need the security of a government-
certified cryptosystem means that EES is unlikely to disappear,
especially in its incarnation as the Fortezza PCMCIA card.{415} It has, however,
engendered enough opposition to put its future in doubt.{416} The existence of
other well-regarded ciphers such as triple-DES{417} and IDEA,{418} combined
with public
distaste for wiretap-ready telephones, the many unanswered
questions about the proposal, the cost premium for a hardware (as
opposed to a software) cryptosystem, the inflexibility of EES,
and the lack of interoperability with foreign cryptosystems will
likely combine to render EES if not stillborn, then at least
stunted.
It seems reasonable, therefore, to speculate as to
how the government will react if EES fails to become the
standard. Assuming the government does not come up with a wholly
new system to replace EES, two options exist:{419} (1) do nothing; or (2) [Page 809]
forbid the use of
unescrowed cryptography. The former option is implicit in the
"soft sell" policy that describes EES as the price the
private sector must pay for using SKIPJACK. If the private
sector refuses EES, it forgoes SKIPJACK. That is its privilege,
and no further government action would be needed.
The latter of the two approaches is implicit in the "hard sell" for EES. If widespread unregistered encryption can be used by "drug dealers, terrorists, and other criminals," to quote the White House,{420} then the country cannot afford to do nothing. But with unregistered cryptography already widely available, the only option may be a "Digital Volstead Act."{421}
The Clinton Administration considered
banning unescrowed
encryption,{422} but then
concluded that it would "not propose new legislation to
limit use of encryption technology."{423} A future administration might, however,
reverse this decision, particularly if an investigation into A
high-profile crime, such as the terrorist bombing of a major
building or the management of a child pornography ring, was found
to have been seriously hampered by the use of advanced
cryptography. The current Administration has carefully left that
option
open for its successors, noting that by forgoing a ban on
unescrowed encryption it is not "saying that [Page 810]
`every American, as a
matter of right, is entitled to an unbreakable commercial
encryption product.'"{424}
The government is clearly willing to require that communications be made wiretap-ready, at least when it knows that its dictates can be enforced.{425} It is also "apparent that the law enforcement community is still looking for a way to meet its surveillance needs in the age of digital communications."{426} If EES fails, the law enforcement and intelligence communities, at least, will seek to preserve their capabilities. Legislation requiring that all strong cryptographic programs use key escrow may be the only remaining solution. As FBI Director Freeh commented, "If five years from now . . . what we are hearing is all encrypted" material that the FBI is unable to decipher, then the policy of relying on voluntary compliance with EES will have to change.{427} "The objective is for us to get those conversations whether they are . . . ones and zeros [or] wherever they are, whatever they are, I need them."{428} As a result, Part III examines the legal problems that would flow from hypothetical legislation making key escrow mandatory.
[Page 811]
session keys or use a LEAF-
equivalent so that the government could determine the session key
without informing the parties to the communication that an
investigation is in progress. With a mandatory key escrow statute of this type, the government would be asking all citizens to surrender their collective right to technical countermeasures to the "progress of science in furnishing the Government with means of espionage."{430} Mandatory key escrow could use a hardwired chip key like Clipper, or it could be implemented through software designed to resist tampering by the user.{431} Would such a statute be constitutional?
This Part provides a whirlwind survey of relevant First,
Fourth, and Fifth Amendment doctrines, as well as evolving
conceptions of the constitutional right to privacy. The focus is
analytic and predictive, rather than prescriptive. This Part
attempts to sketch how courts, given the current state of the
law, would be likely to rule on the constitutionality of a
mandatory key escrow statute. It suggests that mandatory key
escrow would reduce associational freedoms, chill speech, and
constitute an intrusive search. The statute also might require a
form of self-incrimination and would infringe personal privacy
rights. Under existing doctrines, however, the analysis of the
constitutionality of mandatory key escrow legislation would turn
on the court's balancing of the potential costs to personal
privacy against the perceived gains for law enforcement and
national security. On balance, private, noncommercial users of
encryption probably have a Fourth Amendment right to resist
mandatory key escrow and might have a First Amendment claim as
well. Whether commercial users or corporations would have such
[Page 812]
rights under
current doctrines is less clear. Even the vitality of the rights
of private noncommercial users appears to be a distressingly
close question given the current state of civil rights doctrine
and the great importance courts accord to law enforcement and
national security. A more holistic, less myopic view of the
issue, as well as most recommendations, is deferred until Part
IV.
The volume of relevant constitutional doctrine imposes a
greater and more harmful constraint on this discussion than the
need to summarize ruthlessly and put off (most) prescriptions
until Part IV. Even though constitutional cases establishing a
right to some form of privacy recognize that the right is
grounded in the First, Fourth, and Fifth Amendments,{432} the four areas remain doctrinally
distinct. Reflecting this separation for ease of exposition
buys breadth of survey at the price of synergy and synthesis. It is
important to remember that this is an area in which the whole is,
or at least should be, greater than the sum of its clause-bound
parts.
A. First Amendment Issues
The First Amendment states that "Congress shall make no
law . . . abridging the freedom of speech, or of the press; or
the right of the people peaceably to assemble."{433} Scholars debate
whether the First Amendment is a means or an end, and, if a
means, then to what end.{434} Whether
understood as protecting self-realization as an end in itself or
political expression as a means of preserving the political
process, conventional First Amendment doctrine offers numerous
obstacles to mandatory key escrow. None, however, is
insurmountable.
Mandatory key escrow affects public debate in three ways.
First, mandatory key escrow forces users of cryptography to
disclose [Page 813]
something
they would prefer to keep secret, which amounts to compelled
speech. Second, it chills speech by persons who seek to remain
either secure or anonymous when speaking, whether for fear of
retribution or other reasons. Third, it chills the associational
freedom of persons who wish to band together but do not wish to
call attention to the fact of their association or to their
participation in a known
association.
1. Compelled Speech
Mandatory disclosure of keys can be viewed as compelled speech,
akin to laws requiring disclosure of financial records by
charities and of market-sensitive information by publicly traded
companies.{435} The Supreme Court
treats compelled disclosure of noncommercial information as akin
to a content-based restriction on speech, demanding the strictest
scrutiny.{436} To pass
this test, a regulation must be motivated by a compelling state
interest, avoid undue burdens, and be narrowly tailored.{437} Thus, in Wooley
v. Maynard{438} the Supreme Court
struck down a New Hampshire law requiring automobiles to display
license plates bearing the state motto "Live Free or
Die."{439} The statute was held
unconstitutional because the state required citizens to use their
private property as mobile billboards for the state's message,
even though the state, by allowing cars to carry disclaimers too,
compelled no affirmation of belief.{440}
[Page
814]
Mandatory key escrow differs from the issues in
the leading mandatory disclosure cases{441} because the
disclosure is not public. Instead, the government says it will
keep the chip key secret and will decrypt the LEAF only for good
cause. The Supreme Court has stated that mandatory disclosure
laws will be sustained only if there is "a `relevant
correlation' or `substantial relation' between the governmental
interest and the information required to be disclosed."{442} If the state interest in telling donors
how charities use their contributions is sufficient to justify a
mandatory disclosure statute,{443} then the
state
interest in crime fighting and national security should be
sufficiently compelling too.{444} Because
the government keeps the key in escrow, the rule is more narrowly
tailored than a public disclosure rule.{445} The critical
question therefore is whether the burdens--forcing the user to
utter a LEAF or the equivalent and introducing doubt as to the
security of what might otherwise be a completely secure system--
are worth the gain to national security and law enforcement.
This is a value judgment, one that cannot be settled easily by
doctrinal argument, yet one that the courts would have to make to
resolve the issue.{446} As with [Page 815]
many value judgments,
reasonable people may differ on the outcome; the less speculative
the claim that harms will flow from allowing promiscuous
unescrowed encryption (that is, the more terrorists who have
managed to blow things up because they used secure telephones),
the more likely the courts would find that the measure passed
strict scrutiny insofar as it compels speech.{447}
2. Chilling Effect on Speech
"Few thoughts are more threatening to people who value autonomy than the thought of being constantly watched. . . ."{448}

Because mandatory key escrow applies to all who use strong encryption, regardless of what they say, it can be considered a content-neutral regulation of speech and association.{449} As such, it is subject to an intermediate level of scrutiny involving a balancing of interests. Because mandatory key escrow directly regulates a mode of speech, the review will be more searching than it would be if the statute had only an incidental effect on speech.{450}
[Page
816]
In practice, the Supreme Court balances the
following factors: (1) the extent to which speech is likely to
be chilled; (2) the degree to which the prohibition falls
unevenly on a particular group as opposed to society at large;
and (3) the availability of alternate channels of
communication.{451} It seems evident
that speech will be chilled, although exactly how much is
uncertain.{452} To the extent that the
prohibition falls unevenly on society, it will tend to affect
those with access to computers and scrambler telephones. This is
not the group whose speech the Court
traditionally takes the most care to protect, because wealthy and
well-educated people have the greatest access to alternative
channels of communication.{453} The
critical issue is likely to be whether mandatory key [Page 817]
escrow "`unduly
constrict[s] the opportunities for free
expression.'"{454} Because a
mandatory key escrow scheme promises to release keys only with
just cause, the Court would likely find the constricting effect
to be relatively minor. Ultimately, however, the standard
collapses into a balancing test in which distinguishing
"due" from "undue" content-neutral
restrictions requires highly contextual judgments.{455}
3. Anonymity and the Freedom of Association
"[L]et's hold more chat.
In private then.
I am best pleased with that."{456}

Anonymity is "essential for the survival of [some] dissident movements."{457} Identification requirements "extend beyond restrictions on time and place--they chill discussion itself."{458} They also can infringe the right of assembly.{459} Cryptography
[Page 818]
allows unprecedented
anonymity both to groups who communicate in complete secrecy and
to individuals who, by sending electronic mail through
anonymizing remailers, can hide all traces of their identity when
they send mail to other persons.{460}
Combined with the
ability to broadcast messages widely using services such as the
Internet, anonymous e-mail may become the modern equivalent of
the anonymous handbill. Unlike the anonymous handbill, the
anonymous remailer can allow two-way communication in which
neither party can determine the identity of the other party.{461} By encrypting their
return addresses using a public key belonging to the remailer,
all parties can carry on a conversation without revealing their
identities. If the parties use a series of secure remailers as
intermediaries, and if they encrypt the text of their messages,
no one will be able to connect the parties to the communication.
Cryptography thus enhances communicative privacy and
anonymity.

Key escrow threatens this anonymity in two ways. First, and of greater significance, it makes it possible for eavesdroppers armed with the escrowed key to identify the ultimate source and actual content of encrypted e-mail messages being sent out to anonymous remailers. Second, key escrow makes it possible for eavesdroppers armed with the escrowed key to identify the person to whom the target of a wiretap is speaking; without the key, the only information gleaned would be call set-up information, which merely identifies the telephone on the other end of the conversation.
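The chained-remailer arrangement described above can be sketched as layered encryption: the sender wraps the message once per hop, and each remailer strips a single layer, learning only the next destination. The remailer names, keys, and toy cipher here are illustrative assumptions, not any real remailer protocol (which would use public-key rather than shared symmetric encryption).

```python
import hashlib

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream cipher keyed via SHA-256 (illustration only, not secure).
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Hypothetical remailers and their keys (symmetric here for simplicity).
KEYS = {"remailer1": b"k1", "remailer2": b"k2"}

def wrap(message: bytes, path: list, recipient: str):
    # Build the onion: innermost layer for the last hop, outermost for the
    # first. Each layer names the next destination, then the inner packet.
    packet, dest = message, recipient
    for hop in reversed(path):
        packet = xor_cipher(KEYS[hop], dest.encode() + b"|" + packet)
        dest = hop
    return dest, packet  # sender transmits the packet to the first hop

def remailer_forward(name: str, packet: bytes):
    # A remailer strips exactly one layer: it learns the next hop, no more.
    layer = xor_cipher(KEYS[name], packet)
    dest, inner = layer.split(b"|", 1)
    return dest.decode(), inner

hop, pkt = wrap(b"meet at noon", ["remailer1", "remailer2"], "alice")
hop, pkt = remailer_forward(hop, pkt)   # remailer1 sees only "remailer2"
hop, pkt = remailer_forward(hop, pkt)   # remailer2 sees only "alice"
print(hop, pkt)  # alice b'meet at noon'
```

No single remailer can connect sender to recipient, which is why escrowed access to the sender's own key, rather than to any intermediary, is what unravels the anonymity.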
In the last thirty years, the Supreme Court
has struck down several statutes requiring public disclosure of
the names of members of dissident groups,{462} stating that "[i]nviolability of
privacy
in [Page 819]
group
association may in many circumstances be indispensable to
preservation of freedom of association."{463} Nevertheless, the right to privacy in
one's political associations and beliefs can be overcome by a
compelling state interest.{464} Thus,
the Court held that associational freedoms do not trump the
application of statutes forbidding discrimination in places of
public accommodation. In so doing, however, the Court reiterated
that "the Constitution protects against unjustified
government interference with an individual's choice to enter into
and maintain certain intimate or private relationships."{465} As the Court stated
in Board of Directors of Rotary International v. Rotary Club
of Duarte,{466} two key
issues affecting the degree of constitutional protection to be
afforded to an association are the degree of intimacy and whether
the relationship is conducted "in an atmosphere of
privacy" or one where the group seeks to "keep their
`windows and doors open to the whole world.'"{467} Impediments to the
right to choose one's associates, including (presumably)
publicity, can violate the First Amendment.{468}
[Page
820]
A requirement that group members communicate in a
fashion that is accessible to lawful government wiretaps is both
less and more intrusive than a requirement that groups publish
their membership lists. It is less intrusive because no actual
intrusion occurs until and unless a warrant is granted allowing
the government to eavesdrop on communications. It is more
intrusive because, once the intrusion occurs, specific facts
about individuals will be disclosed in addition to the fact of
membership in the group. Thus, while a national security/law
enforcement justification for a narrowly tailored limit on
associational privacy is likely to be at least as compelling as
the state's legitimate desire to root out invidious
discrimination, the countervailing interests are arguably greater
also.
Groups seeking to change the social order in ways likely to be resented by police and others in positions of power will have reason to fear that state actors will find ways to access their keys. Indeed, in Buckley v. Valeo{469} and again in Brown v. Socialist Workers '74 Campaign Committee{470} the Supreme Court recognized that minor political parties may be able to show a "reasonable probability" that disclosure of membership information will subject those identified to "threats, harassment, and reprisals"--including harassment from the government.{471} Ultimately, therefore, the courts again will be left with an essentially nonlegal value judgment: whether the interests supporting mandatory key escrow are sufficiently great to justify the increased risk of harassment to political dissidents.
A challenge to mandatory key escrow as an infringement on the freedom of association would increase its chances of success if the challengers could demonstrate that mandatory key escrow closes off a channel of anonymous communication that has no true alternative.{472} Indeed, no substitute exists for the anonymous remailer: unlike anonymous leaflets, no one can see an e-mail being created, and thanks to the anonymous remailer, no one can see it being distributed, either.
[Page 821]
On
October 12, 1994, the Supreme Court heard arguments in McIntyre
v. Ohio Elections Commission.{473} Like Talley, the McIntyre
case concerns the validity of a state statute that imposes a flat
ban on distribution of anonymous political campaign leaflets.
The decision in
McIntyre may have a very significant impact on the law
surveyed in this subsection.{474}
4. The Parallel to Antimask Laws
The simmering debate over antimask laws prefigures the debate
over mandatory key escrow, and demonstrates how close a question
mandatory key escrow could present.{475} Mandatory key escrow would make it an
offense
to communicate in a manner that shields the identity of the
speaker from the government. Similarly, strict liability
antimask statutes prohibit mask-wearing on public property,
except on designated holidays such as Halloween.{476}
In states with strict liability
antimask statutes, demonstrations and all travel by masked
persons are illegal. Investigators of racially motivated crimes
credit antimask laws for preventing those responsible from
traveling in disguise. The prohibition on masked rallies also
makes it easier for police to make arrests, after the fact if
necessary, when demonstrations become violent.{477} Antimask [Page
822]
laws have been justified as a means of helping to
prevent violence, but this justification has met with a mixed
reception by courts and commentators.{478} The Supreme Court of
Georgia accepted that the state interest in preventing crimes of
violence and intimidation associated with mask-wearing was
sufficiently compelling to justify an incidental infringement on
First Amendment rights.{479} On this
reasoning, mandatory key escrow would
probably pass constitutional muster also. Not everyone agrees,
however, that First Amendment guarantees can be compromised
merely by reference to the history of violence associated with
mask-wearers. Some courts and commentators believe that the
First Amendment requires that there be specific evidence that a
particular masked person or demonstration presents a threat of
violence before an antimask statute can be applied without
violating the Constitution.{480}
Perhaps inhibited by the irony of having to rely on NAACP v.
Alabama ex rel. Patterson,{481} few Ku Klux Klan challenges to antimask
laws have been predicated on the right to associational freedom
of mask-wearing travellers and demonstrators. As a result, only
one state supreme court and one federal court have ruled on an
associational freedom challenge to an antimask law, and they
disagreed.{482} The constitutionality of
antimask laws remains [Page 823]
largely unsettled,
suggesting that the First Amendment aspects of mandatory key
escrow would present an equally close and disputed question.
B. Fourth Amendment Issues
The Fourth Amendment guarantees "[t]he right of the people
to be secure in their persons, houses, papers, and effects,
against unreasonable searches and seizures." It also states
that "no Warrants shall issue but upon probable cause . . .
particularly describing the place to be searched, and the persons
or things to be seized."{483}
Americans already acquiesce to substantial invasions of privacy
by government fiat, without a warrant. We disclose personal
details of our lives on tax returns. We consent to having our
belongings x-rayed, opened, and searched as our persons are
scanned for metal (sometimes followed by a pat-down) as a
condition of being allowed to board an airplane or enter some
public buildings. The law says the government may paw through a
citizen's garbage without a warrant,{484} and that she lacks a reasonable
expectation of privacy in relation to telephone numbers dialed.{485} The police may fly over her house in a
helicopter at four hundred feet{486} and use
special cameras to photograph everything below.{487} The government may [Page 824]
use satellites to spy in
her windows;{488} it may use heat-
detection gear to monitor heat
emanations from her home;{489} it may use dogs
to sniff her luggage and her person.{490} Once the
government has arranged for an informant to plant a beeper on a
citizen, the government may use the signal to track the citizen's
movements.{491} When
national security is at risk, many procedural [Page 825]
protections that are
required in the ordinary course of an investigation are
suspended. For example, the government may, for reasons of
national security, break into some premises without a warrant to
plant a bug, whereas the same action in an ordinary criminal
investigation would require a warrant.{492} National security wiretap requests go to
a secret court that meets in camera and never issues opinions.{493}
On the other hand, mandatory key
escrow differs from each of these examples in significant ways,
especially as it affects private, noncommercial use. Absent
exigent circumstances such as fires, hot pursuit, or the like,
the Supreme Court has yet to approve a [Page 826]
warrantless intrusion into
a home occupied by an ordinary taxpayer, much less one who has
made efforts to shield herself from detection.{494} Except for consent
to x-rays and searches at airports and public buildings, none of
the examples above require the target of the probe to take any
action to aid the prober, much less to ensure that the probe is
successful; and this exception does not reach into the home.
In principle, warrants are required for all domestic security wiretaps.{495} The next subsections describe how the Fourth Amendment also prohibits warrantless mandatory key escrow for private, noncommercial uses of encryption.{496} Commercial and corporate uses, however, present a more difficult question. These uses may not be entitled to Fourth Amendment protection against mandatory key escrow.
[Page
827]
The legislature's power to criminalize conduct and the executive's
power to enforce the criminal laws of the United States stem from
the grants of power in Articles I and II of the Constitution,
such as the Commerce Clause{498} and the
Necessary and Proper Clause.{499} Those
powers are, in
turn, limited by the Bill of Rights, of which the Fourth
Amendment is a part.
The absence in the Fourth Amendment of an affirmative grant of
power to make effective searches, however, does not determine
whether the affirmative grants in Articles I and II give the
government the power to subject communications to nonconsensual
searches. It simply means that from a Fourth Amendment
perspective, mandatory key escrow poses strictly traditional
problems: Is
mandatory key escrow, which takes place without a warrant, a
search and seizure?{500} If so, is it a
reasonable warrantless search or seizure, or should a warrant be
required?
[Page 828]
Intrusions by the government into matters for which
individuals have a (subjectively and objectively) reasonable
expectation of privacy ordinarily require a search warrant.{502} Not every acquisition of information by the government from sources reasonably expected to be private is necessarily a search. For example, the Supreme Court has held that unintrusive means of piercing personal privacy, such as overflights{503} or the use of dogs to sniff for contraband,{504} are not searches for Fourth Amendment purposes. Although wiretapping is also unintrusive, there has been no question since Olmstead v. United States{505} was overturned{506} that wiretapping constitutes a Fourth Amendment search or seizure.
Not every search affecting matters reasonably expected to be
private requires a warrant. Examples of legitimate warrantless
searches include "regulatory searches,"{507} searches incident to
valid arrests,{508} searches
conducted under exigent circumstances (such as the likely
destruction of evidence),{509} and border
searches.{510} [Page 829] Absent a specific national security
rationale directly related to the conversation, the speaker, or
exigent circumstances, however, a warrant is required for a
wiretap both under the Fourth Amendment and under Title III.{511}
A key is not itself a conversation, however, but the means to decrypt one. Nevertheless, there should be no doubt that absent government action to force disclosure, a properly guarded key to a cryptographic system would be an item of information for which the user would have both a subjectively and objectively reasonable expectation of privacy.{512} Indeed, the entire point of having a cryptographic system is to increase or create privacy. This is especially true in a public-key cryptographic system, in which the private key is never disclosed.{513} A requirement that keys (or the means to decrypt them) be turned over to the government is thus clearly a search or seizure for Fourth Amendment purposes.
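The observation that the private key is never disclosed can be made concrete with textbook RSA, using deliberately tiny parameters chosen purely for illustration: the public pair (n, e) is handed to the world, while the private exponent d never needs to leave its holder, so the government could obtain it only by compelled disclosure or escrow.

```python
# Textbook RSA with deliberately tiny parameters (illustration only;
# real keys use primes hundreds of digits long).
p, q = 61, 53
n = p * q                   # public modulus: 3233
e = 17                      # public exponent, shared with the world
phi = (p - 1) * (q - 1)     # 3120
d = pow(e, -1, phi)         # private exponent: 2753, never disclosed

message = 65
ciphertext = pow(message, e, n)    # anyone can encrypt with (n, e) alone
decrypted = pow(ciphertext, d, n)  # only the holder of d can decrypt
print(decrypted)  # 65
```

Because encryption requires only the public values, normal use of the system never exposes d; that is what makes an escrow mandate an affirmative disclosure rather than an interception of something already in transit.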
The Fourth Amendment regulates both the issuance of search warrants and the conduct of a valid search. Key escrow seeks to preserve the government's ability to carry out a valid search by taking action in advance of any warrant.
One can imagine many actions that the government might take to preserve its ability to conduct valid searches. It might, for example, require all citizens to live in glass houses by prohibiting the use of any other building material. More reasonably, the government might prevent banks from renting out safe deposit boxes that would destroy the contents unless opened with the right key. Or, the government might prohibit the construction of homes with armored walls. Each of these hypothetical rules might raise constitutional problems, but none of them is in itself a search.
[Page 830]
In contrast, a
key is information that the government forces the user to
disclose. This distinguishes key escrow from other, closely
related,
hypothetical situations in which the government might make
preemptive rules designed to make the execution of a valid search
warrant easier. The dissimilarity is, however, nothing more than
a weak and unreliable distinction between requiring an act and
proscribing alternatives to that act. The question then becomes
whether this search or seizure falls into any of the classes of
exceptions to the warrant requirement.
3. Mandatory Key Escrow as a "Regulatory
Search"
Only the regulatory search exception to the warrant and
particularity requirements of the Fourth Amendment seems at all
likely to apply to mandatory key escrow, but this single
exception is enough. The requirement that all users of strong
cryptography escrow their chip keys or other means to decrypt
their session keys closely resembles routinized searches, such as
employee drug testing, for which the Supreme Court no longer
requires a warrant. Unlike traditional law enforcement searches,
which are designed to find evidence of a crime, regulatory
searches are "aimed at deterrence of wrongdoing through fear
of detection."{514} Like the
warrantless, wide-ranging, regulatory searches approved by the
Supreme Court, the government's acquisition of keys will not
provide evidence of anything criminal. Rather, by requiring the
disclosure of keys, the government seeks to remove the shield of
strong cryptography from what it believes would otherwise be
socially undesirable uses.
The leading regulatory search case is National Treasury
Employees Union v. Von Raab,{515} in which
the Supreme Court endorsed a [Page
831]
Customs Service program of mandatory employee drug
testing.{516} The Court stated that
"neither a warrant nor probable cause, nor, indeed, any
measure of individualized suspicion, is an indispensable
component of reasonableness in every circumstance."{517} Instead, "where a Fourth Amendment
intrusion serves special governmental needs, beyond the normal
need for law
enforcement," one should "balance the individual's
privacy expectations against the Government's interests to
determine whether it is impractical to require a warrant or some
level of individualized suspicion in the particular
context."{518}
It is difficult to imagine a case in which the government would find it easier to plead "special needs," such as the need to prevent the development of "hidden conditions" and the impracticality of warrants for every key,{519} than in its attempt to compile a database of chip keys or session keys.{520} Mandatory key escrow fits several of the criteria enunciated in Von Raab. In particular, mandatory key escrow is not designed to produce evidence for criminal prosecutions (wiretaps do that, but they require warrants or other authorization), but rather to deter crimes that might otherwise be furthered by the use of encryption.{521} The key's owner knows that the key is being escrowed. In addition, if encryption becomes widespread, a more particularized approach would be difficult if not impossible.{522} Finally, because the government only plans to use the key segments for legitimate searches, it can argue that the cost to personal privacy is low.{523}
[Page
832]
On the other hand, although the courts have
allowed warrantless regulatory searches in the workplace, at
airports, in prisons, at the border, and in schools, none of the
leading regulatory search cases has involved a search that
entered into the home, unless the home was the scene of a fire or
was occupied by a parolee, probationer, or welfare recipient.{524} Indeed,
in Camara v. Municipal Court{525} the Supreme Court refused to eliminate the
warrant requirement for routine searches that penetrated
residential property hunting for violations of the city's housing
code.{526} The Court
characterized the housing inspectors' intrusions into the home as
too "significant" to be allowed without a warrant{527}--although the same
Court then went on to balance the interests at stake and
concluded that warrants could be issued with a lesser showing of
need than that traditionally required for probable cause.{528}
Mandatory key escrow would affect many different types of users, including both business and personal users who send messages both commercial and political. The regulatory search precedents, particularly Von Raab, suggest that Congress might be able to require mandatory key escrow for businesses and other commercial users without implicating the Fourth Amendment as it is currently understood. The broad sweep of the special needs justification, however, is not easily confined to the market sector of society, and there is nothing in the logic of Von Raab that requires it remain there.
Although the Court decided Wyman v. James{529} after Camara,
to date the Court has not extended the special needs
justification of Von Raab into the home. This suggests
that private, noncommercial [Page
833]
users of encryption might not fall within any of
the currently specified special needs categories of the
regulatory search exception to the Fourth Amendment. As
currently understood, therefore, the Fourth Amendment probably
prohibits warrantless mandatory key escrow, at least for private,
noncommercial users of encryption.{530}
C. Fifth Amendment Issues
The Fifth Amendment guarantees that "[n]o person . . .
shall be compelled in any criminal case to be a witness against
himself."{531} The
"`historic function'" of this part of the Fifth
Amendment is to protect a "`natural individual from
compulsory
incrimination through his own testimony or personal
records.'"{532} Currently, there
is a tension in the Supreme Court's treatment of the reach of the
Fifth Amendment. On the one hand, the Court interprets the right
narrowly, to apply to criminal defendants only.{533}
[Although the Court] has often stated [that the Fifth Amendment] protect[s] personal privacy, . . . [the] Court has never suggested that every invasion of privacy violates the privilege. . . . [T]he Court has never on any ground, personal privacy included, applied the Fifth Amendment to prevent the otherwise proper acquisition or use of evidence which, in the Court's view, did not involve compelled testimonial self-incrimination of some sort.{534}

On the other hand, the Court has never questioned the special nature of some private noncommercial personal papers, such as diaries, and has held that these retain their Fifth as well as Fourth Amendment protection.{535} With one exception, neither the Fourth nor the Fifth Amendment has ever been understood to allow the government to require
[Page 834]
civilians, in peacetime, to
structure their lives to make hypothetical future searches by law
enforcement easy. That exception, the required records doctrine,
is inapposite to mandatory key escrow.{536} Instead, the Fifth Amendment is
potentially relevant to mandatory key escrow in two ways. The
required disclosure of the chip key resembles the required
disclosure of a private paper, which may have some Fifth
Amendment protection, and the forced utterance of a LEAF may be
the type of incriminating testimony proscribed by the Fifth
Amendment.
[Page
835]
compulsory production and authentication."{541} Nevertheless, the rule found
"abhorrent" in 1886 is now practically the law.{542} The Supreme Court has eliminated most
Fifth Amendment protections from incriminating documentary
evidence sought by compulsion. First, the Supreme Court narrowed
the privilege so that it applies only if the act of producing
papers or records, by itself, has a self-incriminatory
communicative or testimonial aspect. If the act of handing over
the papers is noncommunicative--that is, if it neither reveals
the existence of the document nor authenticates it--then the
Fifth Amendment ordinarily does not apply.{543} Second, only natural
persons can find shelter under the Fifth Amendment, and only for
papers they both own and control. Thus, corporations can never
claim the privilege, and neither can natural persons with regard
to corporate records, even if they created and now control those
records.{544} Third,
once papers are handed to [Page
836]
another, the legitimate expectation of privacy
needed to maintain a claim under either the Fourth or Fifth
Amendments disappears.{545} Fourth, records
required to be kept for legal
or regulatory purposes are outside the privilege.{546} Fifth, and only
tangentially related to documents, the Supreme Court has held
that persons can be forced to perform nontestimonial acts such as
giving handwriting samples.{547} Sixth,
aliens outside the sovereign territory
of the United States do not ordinarily enjoy Fifth Amendment
rights.{548}
The Court's narrowing
interpretations notwithstanding,
Boyd is not completely toothless. Boyd has a
residual vitality for nonbusiness, nonfinancial, private papers
and documents that are kept in the home, if only because the
Supreme Court has yet to compel production of such a document.{549}
2. Is a Chip Key or a Session Key
"Incriminating"?
The hornbook rule is that testimony must be incriminating when
uttered in order to be entitled to protection under the Fifth
Amendment. The testimony must relate to past conduct and, if it
does not directly incriminate ("Yes, I did it") must at
least create a "substantial" and "real"
hazard of prosecution for the Fifth Amendment to apply.{550}
The Fifth Amendment does not protect
testimony that might become incriminating through future conduct.
In United States v.[Page
837]
Freed,{551} the
Supreme Court
upheld a National Firearms Act registration requirement against a
Fifth Amendment claim that the disclosed information might be
used against the defendant if he committed an offense with a
firearm in the future.{552}
Forced disclosure of a chip key and a session key fits uneasily into this framework. The forced disclosure of a chip key{553} before the chip has ever been used to communicate cannot be incriminating because nothing has happened yet. Thus, mandatory key escrow itself fits squarely within the Freed rationale. In contrast, the LEAF raises a more delicate problem. Because the LEAF precedes the actual conversation, forced utterance of a LEAF could be said to fall within the Freed rationale also. But this is really too facile to be credible. The encrypted session key within the LEAF is unique, and it is directly tied to the conversation that follows it. In any case, whether the LEAF is part of the conversation or not, it is an utterance that creates a "substantial" and "real" hazard of prosecution if the conversation that follows is an incriminating one, and a public servant happens to be listening.{554} On the other hand, the Supreme Court has emphasized that nontestimonial compelled disclosures are not privileged,{555} and the LEAF itself is not testimonial, save insofar as it ties a particular conversation to a particular pair of chips.
In
summary, the Fifth Amendment may not protect disclosure of a chip
key against mandatory key escrow, but it protects individuals
against the routine warrantless use of that key to decrypt
the LEAF and, especially, to decrypt an incriminating
communication. Because the stated purpose of escrowed encryption
is to
allow the government to retain the abilities it currently has,
and the government accepts that a warrant is required to conduct
a wiretap, the Fifth Amendment imposes no significant restriction
on a[Page 838]
mandatory key
escrow proposal of the type hypothesized.
D. Privacy Issues
The constitutional right to privacy derives from the First,
Third, Fourth, Fifth, Ninth, and Fourteenth Amendments, although
it exceeds the sum of its parts.{556} The
right to privacy has at least three components: (1) a right to
be left alone; (2) a right to
autonomous
choice regarding intimate matters; and (3) a right to autonomous
choice regarding other personal matters.{557} There is no question that mandatory key
escrow would infringe on each of these component rights. The
question, already partly canvassed above,{558} is whether the courts
would consider the intrusions reasonably related to a
sufficiently compelling state interest to justify the intrusion.
As might be expected, the limitations on mandatory key escrow
arising from the right to privacy conform closely to those
derived from the First, Fourth, and Fifth Amendments from which
the privacy right partly emanates. Privacy jurisprudence is in
some turmoil, however, and it is possible that privacy will prove
to be the most fertile area for legal adaptation to the new
challenges posed by increasing state surveillance power and
compensating private responses such as cryptography.
[Page 839]
The third component, autonomous choice regarding other
personal matters, encompasses decisions such as
the choice of friends, political party, vocation, and other
allegiances.{559} Disputes
concerning
this category, such as alleged infringements of associational
freedom, tend to be adjudicated directly under the rubric of one
or more amendments in the Bill of Rights rather than by appeal to
privacy principles. These aspects of privacy law were canvassed
above{560} and will not
be repeated here.
Mandatory key escrow appears
comparable in its intrusive effects to the regulatory scheme
upheld in Whalen, so long as the courts hold the
government to its promise that keys will remain secret and will
be released only pursuant to a warrant or to a very limited
number of other lawful orders. Without that proviso, mandatory
key escrow would verge upon unjustified data collection.{568} The
warning in Whalen that the Court is "not unaware of
the threat to privacy implicit in the accumulation of vast
amounts of personal information in computerized data banks or
other massive government files"{569} suggests, however, that informational
privacy
rights may grow in response to new technological threats to
privacy.
[Page 841]
The Supreme Court has described
certain decisions about intimate association and family- and
sex-related decisions as falling within a special privacy zone for
"marriage,
procreation, contraception, family relationships, and child
rearing and education."{570} The
contours of this zone have always been fuzzy, in part because of
long-standing decisions forbidding certain religious minority
marriage practices{571} that would
logically
appear to belong within the zone of privacy described by cases
such as Griswold v. Connecticut, Eisenstadt v.
Baird, Paul v. Davis, and Roe v. Wade.{572} The fuzziness
currently is at an all-time high because of the Court's decision
in Bowers v. Hardwick{573} and the
continuing controversy concerning the
right[Page 842]
to abortion.{574}
As applied, this second strand of privacy jurisprudence is primarily directed at the preservation of personal autonomy,{575} and especially that autonomy relating to the sexual and reproductive practices and values of the traditional family and the "traditional unmarried couple."{576} Secrecy has a role to play here too, because sometimes secrecy is a prerequisite to the exercise of autonomy, even (or especially) within the family.{577}
Furthermore, electronic
communications will increasingly become a critical part of
intimate association. In a world in which the commuter marriage
is increasingly common, electronic communications such as the
telephone, fax, and especially e-mail (which is cheaper and less
intrusive than a telephone, more private than a fax, and often
instantaneous) are increasingly becoming the glue that holds
marriages and other intimate relationships together.{578} The current rule,
which provides much greater privacy protection to the bedroom
than to the intimate, transcontinental, interspousal e-mail,{579}
may soon need revision.{580} Such a revision[Page 843]
should begin by reaffirming what remains of
Boyd, particularly as it applies to personal papers such
as diaries.{581}
E. What Next?
On balance, as the law stands today, private, noncommercial
users of encryption probably have a Fourth Amendment right to
resist mandatory key escrow. Whether commercial users or
corporations would have such a right under current doctrines is
less clear. Even the existence of the right for private,
noncommercial users appears to be a distressingly close question
given the current state of civil rights doctrine and the great
importance that the courts give to law enforcement and national
security. The law in this area has undergone great change in the
past two decades, and there is no reason to believe that the
evolution has stopped.
The collapse of the distinction
between home and office, fueled in part by the growth of
telecommuting, will place a further strain on existing rules that
attempt to distinguish between private, noncommercial activities
whose classical locus is the home, and less private, more
commercial activities whose traditional location was the office.
If the courts further erode the remnant of the zone of privacy
that still surrounds the home, the growth in freedom to work at
home will have come at a high price.
IV. Ideas Are Weapons
The Bill of Rights is predicated on assumptions about
technological limits that may soon be falsified. For example,
the First
Amendment is premised on a theory of speech that never imagined
that scientific speech might become inherently dangerous.{582} Some have argued
that the publication of advanced cryptographic algorithms
constitutes inherently dangerous speech. Although I [Page 844]
think these arguments are
misguided because the devices described by the speech cannot
directly hurt anyone, it is certainly possible to imagine that
one day someone will create a device that is easy to build, runs
on house current, and is incredibly destructive. Publication of
the plans for such a device might be dangerous indeed, and
arguably outside the First Amendment, at least as originally
conceived. In light of technical changes, "the Justices are
now faced . . . with the difficult task of deciding just how high
a price our constitutional commitment to open, meaningful
discussion requires us to pay in terms of . . . competing
concerns."{583}
The Fourth Amendment also has implicit assumptions about the limits of technology that will soon break down, if they have not already done so.{584} The Fourth Amendment was written, debated, ratified, administered, and interpreted for its first two hundred years on the basis of the assumption that there are physical limits to the government's ability to make "reasonable" searches. Although this presumption has begun to fray in the face of technical augmentation of the public servant's ability to peer into private property,{585} the basic assumption that the police cannot be everywhere at once remains unshaken.{586} This assumption may have to change, and soon. If it does, our thinking about the Fourth Amendment will need to change with it.{587}
[Page
845]
When the Bill of Rights was adopted in 1791,
speech took place between persons who saw each other eye-to-eye,
or who wrote letters by hand, or printed pamphlets, books, and
newspapers using hand-set type. Today those forms of
communication have been supplemented, and often
supplanted, by electronic communications including telephones,
desktop publishing, and e-mail; additionally, letters are often
typed on computers before being mailed. These new media--
particularly the telephone, but increasingly e-mail--are the
bases of modern social and commercial relations. They play
significant roles in political and cultural organizations as
well.{588} Broadcasting via radio and
television (including cable television) is now the dominant mode
of mass communication, having long supplanted the written word.
But new forms of electronic communication such as the electronic
bulletin board, the Internet USENET newsgroup, and the Internet
mailing list promise to revive the importance of the written
word.
Under current law, a person communicating via new media is less
able to ensure her privacy than were speakers in the late
eighteenth century. If Thomas Jefferson wanted to speak
privately to John Adams, they could go for a walk in an open
field where they could see any potential eavesdroppers from a
mile away.{589} Letters could be
encoded with the then-unbreakable Vigenère cipher, although this
would have been a slow process and was thus rarely used.{590} In contrast, modern
eavesdroppers, particularly[Page
846]
wiretappers, are invisible. Strong cryptography
offers the prospect of restoring to anyone with a computer or a
scrambler telephone the privacy enjoyed by Jefferson and Adams,
albeit with a larger number of correspondents spread out over
greater distances.
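The Vigenère cipher mentioned above is simple enough to sketch: each letter is shifted by an amount drawn from a repeating keyword, a scheme that defeated frequency analysis until Kasiski's attack in the nineteenth century. A minimal illustration (the function name is mine):

```python
from itertools import cycle

def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    """Shift each letter of `text` by the value of the corresponding
    (repeating) key letter; non-letters pass through unchanged and
    do not advance the key."""
    sign = -1 if decrypt else 1
    shifts = cycle(ord(k) - ord('a') for k in key.lower())
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + sign * next(shifts)) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)
```

For example, `vigenere("attack at dawn", "lemon")` yields `"lxfopv ef rnhr"`, and decrypting with the same keyword recovers the plaintext, which is why both correspondents had to share the keyword in advance.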
As the preceding sections have shown, however, it is far from
obvious from the Court's recent decisions that the Constitution
will be read to block a future government's attempt to dampen the
privacy-enhancing effects of the cryptographic revolution. Part
III argued that the constitutionality of hypothetical mandatory
key escrow legislation would most often turn on a balancing test
in which the Supreme Court would seek to weigh security claims
against privacy interests.{591}
Regardless of how the Court decides to strike the balance, it
will involve a process requiring decisions not compelled by any
precedent. As is often the case when the law encounters new
technology, the decisional law is indeterminate or at the very
least distinguishable. In order to predict where the law
relating to cryptography may be going and to suggest feasible
alternatives, one needs to understand the concerns that are
likely to influence the balance between claims asserting security
and privacy
interests. The remainder of this final Part looks at how this
balancing of incommensurables might work.
A. Caught Between Archetypes
The protections we find in the Constitution turn in part on the
horror stories and heroic legends we tell ourselves;
constitutional law is indeed "haunted by archetypes."{592} Examples of archetypes in constitutional
law include Chief Justice Marshall, President Lincoln, the
Lochner Court, President Nixon, and the phenomenon of
McCarthyism.{593} The key escrow
debates have the misfortune[Page
847]
to be situated at the intersection of entrenched
but conflicting social/political archetypes: the totalitarian
state and the conspirator.
1. Big Brother
There can be no doubt that the power of the national government
has grown with every response to a national challenge, be it the
Civil War, subsequent wars, the depression, or the moral,
political, and practical demands of the modern welfare state.
The original American constitution, the Articles of
Confederation, collapsed because it was too decentralized. The
replacement, our current Constitution, was bitterly opposed by a
substantial minority who believed it to be an overcorrection
towards
centralized rule.{594} Since the
adoption of the Constitution, or at least since the installation
of Chief Justice Marshall, the tendency has been to ratchet power
towards the center, away from states. This
progression is exemplified by the evolution of the Commerce
Clause.
Gradually, the federal power to regulate under the Commerce
Clause has come to encompass almost all economic transactions, no
matter how local.{595} The
evolution of the formal relationship between the central
government and the individual is, however, more complex. The
size and complexity of the national government's economic, legal,
and regulatory powers have grown greatly, but so too (if perhaps
not as quickly) have the formal legal rights of the paradigmatic
citizen.{596}
Despite these
developments, no sophisticated constitutional analysis is
necessary to recognize that the government still lacks the
authority to require that every television set sold in the United
States carry with it a microphone for police to eavesdrop on[Page 848]
conversations in the
home.{597} The
deficiencies in this proposal would not be cured by hedging the
microphone's remote controls with the most elaborate safeguards
and requiring careful judicial review before activating any of
the devices. The very idea is, fortunately, laughable, so much
so that it is hard even to imagine any government official
proposing it.{598}
Similarly, it is hard to imagine that a statute designed to ease surreptitious entry by the police, requiring every citizen to give the government a set of keys to her house, would survive even cursory judicial review.{599} These hypothetical statutes require so little analysis because they immediately evoke George Orwell's 1984.{600} Big Brother and his thought police provide a vivid archetype of state overintrusion into private life.{601}
Vivid as the Orwellian archetype may be, it is far from all-
powerful, as demonstrated by the large number of dissenting
opinions vainly invoking it.{602} It may
be that mandatory key[Page
849]
escrow will produce the same public outcry as the
wiretap-ready TV set{603} or a telephone
that can be turned into a bugging device even when on the hook.{604} It should. And if
it does, half of this Article is an exercise in pedantic
overcaution, and the fact that the state of the law produces a
need for such caution is a deeply damning indictment of the
Burger/Rehnquist Court's privacy jurisprudence. For if it is
obvious from the Zeitgeist that mandatory key escrow is
impossible, yet far from obvious from the relevant cases, then
the cases are revealed as being significantly more state-oriented
and illiberal than what "society is prepared to recognize as
`reasonable.'"{605}
Even
overcaution has its place. The spirit of the times is always
subject to change. As a result, we have a human court to
interpret the Constitution in light of changing conditions.{606} Interpreters of the
Constitution are unable to be completely insensitive to the felt
necessities of the times, whether consciously or not. The
decision about the constitutionality of mandatory key escrow
would be no exception to this rule. The same dynamic that
produced the facts of Korematsu v. United States and the
decision itself might arise again,{607} especially if those necessities are felt
strongly by officials gripped by a sense of insecurity or
danger.{608} "Collectively
we[Page 850]
face no greater
challenge than maintaining
sensible perspectives on national security issues," but we
sometimes fail the test.{609} One need not
believe that the executive is inevitably repressive or that the
judiciary is inevitably supine to recognize that both of these
tendencies can be found in our nation's history, and that we
therefore cannot rule out their recurrence with total confidence.
2. The Conspirator
Cryptography is the conspirator's ideal tool. This fact, more
than anything else, will provide the emotional (if not the legal)
justification for any attempt to control the use of unescrowed
cryptography. Anxiety that people will be able to do terrible
things without being detected until it is too late has been a
recurring worry since the Puritans colonized North America.{610}
Given the public opposition to
Clipper,{611} the government is
unlikely to propose mandatory key escrow without some triggering
event. In the wake of a great crime, perhaps by terrorists or
drug cartels--the detection of which could plausibly have been
frustrated by encryption--that which today looks clearly
unconstitutional might unfortunately appear more palatable.{612} Suppose, for example,[Page 851]
that Senator McCarthy
had been able to demonstrate in 1952 that Communists were using
strong encryption to protect their secrets. It is not hard to
believe that some form of key escrow or an outright ban on
cryptography would have been proposed, if not adopted. Big
Brother may yet look more comforting than the reigning criminal
archetypes.
a. Panics Over Plotters
The fear that two or more persons may unite in common cause to
carry out an unlawful or reprehensible plot against a social
community and its privileged ideals is an archetypical American
concern.{613} The real
or imagined conspirators have varied, being alternately foreign,
domestic, or an unholy alliance of the two.{614} The fear of conspirators is not a recent
phenomenon dating back to the McCarthy era, but rather is rooted
in the Puritan jeremiad about the dangers of succumbing to
Satanic, Catholic, or nonconformist [Page
852]
conspiracies.
"In a nation in which every man is supposed to be on the
make, there is an overriding fear of being taken in."{615} Fear of conspiracy
is hardly unique to America,{616} although
"Americans have been curiously obsessed with the contingency
of their experiment with
freedom."{617} The fear of
conspiracy in America is a subject so vast that no small sketch
can do it justice.
The seventeenth-century Puritan colonists of New England worried about conspiracies among Indians, Quakers, witches, and the (Catholic) French, all of whom threatened the new Israel, and in so doing also threatened to prevent the fulfillment of the millennial vision that informed those communities' founding.{618} When their relations with England soured, the colonists blamed imperialistic conspiracies they believed were aiming to destroy their liberties.{619}
In the eyes of the
victors, independence only made the new nation a more attractive
target to its enemies. George Washington's instantly canonical
Farewell Address sounded the theme, warning of the danger posed
to the American nation by secret enemies. To President
Washington, it was "easy to foresee that from different
causes and from different quarters much pains [would] be taken,
many artifices employed, to weaken" the American commitment
to its new nation. Commitment to the American
experiment was precarious, vulnerable, and thus would be
"the point in your political fortress against which the
batteries of internal and external enemies will be most
constantly and
actively" directing their energies, "though often
covertly and insidiously."{620} The
diversity of the American people, with widely varying ancestries,
differing [Page 853]
sects and
creeds, and different experiences
depending on class and region, served to make the nation even
more vulnerable to "faction" and conspiracy.{621}
Indeed, New England was soon gripped by a panic that the (mythical) Bavarian Illuminati, a secret, French-controlled, atheistic, antidemocratic cabal, was conspiring to undermine American liberty and religion. Despite the absence of any extant Bavarian Illuminati, many believed that the United States was in grave danger and "the vast majority of New England Federalists accepted the conspiracy charges as entirely plausible, if not completely proven."{622} The evils of the Terror in post-Revolutionary France only served to confirm the awesome power of conspirators, for only a vast conspiracy could explain the otherwise bewildering series of events.{623} The image of the Terror, and the Masonic or Jacobin conspirators behind it, was one of the themes sounded to justify the Alien and Sedition Acts.{624}
A related, if less sanguinary, imagery dominated the attack on the Second Bank of the United States, an institution that Jacksonian Democrats viewed as a secretive "hydra of corruption" designed to favor the rich against the majority of the country.{625} Here, perhaps for the first time, the conspiracy was understood to be home-grown, if dominated by "aristocratic" elements fundamentally alien in spirit to the American popular democracy desired by the Jacksonians.
In the decades before the Civil War, as the threat of foreign
military invasion receded, Americans were gripped by the threat
of vast shadowy conspiracies in which various groups, including
[Page 854]
Freemasons,
Mormons, or Catholics, plotted secretly to subvert the American
(Protestant) way of life.{626} The American
commitment to openness meant that any organization that was not
itself open, that held secret meetings or, worse, had secret
rituals, was not just suspect, but threatening.{627} The Civil War itself
came to be portrayed as an apocalyptic battle between the forces
of freedom and a conspiratorial "Slave Power."{628}
Post-Civil War industrialization
provided the environment for a redefinition and bifurcation of
the nature of the conspiracy, one which persisted until the First
World War. On the one hand, there was the conspiracy of labor
organizations, be it the Industrial Workers of the World, the
Socialist International, or the beginning of the American union
movement. An illustrative example was the reaction to the
Haymarket Riot of 1886, in which Chicago workers battled police,
and which culminated in the explosion of a dynamite bomb that was
widely ascribed to anarchists. Convictions were returned by a
jury made up of persons who admitted prejudice against the
defendants, and whom the judge charged to convict if "there
was a conspiracy to overthrow the existing order of society"
and that the defendants and the bomb-thrower, whoever he might
be, were parties to it.{629} Others, notably
the Progressives, focused on
what they perceived as a conspiracy among corporations, trusts,
and a small cabal built on wealth and privilege{630} designed, they
thought, to wrest control of the economy and thus undermine the
democratic process.{631} The entry of the
United States into the[Page 855]
First World War shortly
after President Wilson's promise to stay out of it, combined with
Communism's triumph in Russia, unleashed an orgy of
antisubversive activity, although the distinction from simple
nativism was not always clear.{632}
Late-twentieth-century anti-Communist antisubversion is a topic
all of its own, spanning McCarthyism,{633} loyalty oaths,{634} and a large number of important civil
liberties decisions, notably those alternately expanding and
contracting the protections of the First Amendment.{635} The low water mark,
achieved in Dennis v. United States,{636} amounted to the
criminalization of "any radical political doctrine."{637} Recently, in a climate of lessened
insecurity, the courts have been considerably more willing to
allow a wide range of radical speech.{638} The important points
for present purposes are[Page
856]
that, although one could debate what constitutes
the norm, decisions significantly curtailing civil liberties are
far from unique and that these decisions are often driven by a
fear of criminal or subversive plotters, a fear with deep roots
in our culture. Given an appropriate trigger, it is entirely
possible that a similar dynamic could apply to the regulation of
cryptography.
b. Modern Incarnations: The Drug Kingpin and the
Terrorist
Today's most significant criminal archetypes are the drug
kingpin and the terrorist. Perhaps because criminal, anarchist,
and terrorist archetypes arise from events rather than
literature, there is no single figure or movement with the
evocative power of Big Brother, now that the international
communist conspiracy has left the anxiety closet and joined the
Wobblies and the Jacobins in the care of the History Department.
Jack the Ripper, the
prototypical serial killer, may be the closest thing to a
threatening criminal archetype that has lasted more than two
generations, although Al Capone, Charles Manson, and the urban
gang member all have achieved some near-mythic status as well.
What criminal archetypes lack in longevity, they make up in menace. At least until his capture, the terrorist archetype for many years was Carlos the Jackal.{639} Carlos's capture, coming as it does on the heels of the collapse of any credible international communist movement, may create a small window of opportunity for privacy activists to seek legislative or judicial ratification for their argument that Big Brother is a greater menace. This window may not be open for long.{640}
[Page 857]
The "War on
Drugs" proclaimed by President Ronald Reagan in 1982 and
enthusiastically endorsed by Presidents Bush and Clinton may be
our "longest war."{641}
"Refusing to shoulder" the "unbearable
burden" of communities "devastated" by drugs,
"Americans have given the government a mandate to eliminate
the scourge before its effects become
irrevocable."{642} Champions of the
War on Drugs claim "almost universal
acceptance [of the notion] that the drug problem is `the worst
disease that plagues our nation today.'"{643} Notably, "[b]y
the mid 1980s . . . anti-drug sentiment encompassing even casual
use became a national cause."{644}
Likewise, fear of the drug kingpin quickly reached Congress, which reacted first by passing the continuing criminal enterprise statute, better known as the drug kingpin statute, imposing long mandatory minimum sentences.{645} When that was not enough, Congress passed a statute permitting the death penalty in drug cases connected to a killing.{646} Along the way, Congress made attempting to be a drug kingpin and conspiring with a drug kingpin punishable by the same penalty as the offense itself.{647}
With Congress and the
Executive in such agreement, it is no surprise that the War on
Drugs is a major social phenomenon, having criminalized behavior
engaged in by somewhere between fourteen and twenty-three million
Americans per year.{648} Experts[Page 858]
recently began to study the
war's longer-term effects, including the incarceration of
hundreds of thousands of persons, clogged courts, corrupt police
officers and corrupt judges, and a disproportionate impact on
African-American males (and, through them, on their families).{649} The War on Drugs has achieved these and
other
dubious results through the expedient of demonizing the drug
user, the drug "pusher," and especially the drug
"kingpin."{650}
The fear of drug trafficking puts great pressure on the police. The mutually consensual nature of the drug crime requires a special type of police work because "law enforcement lacks its staunchest ally, the victim,"{651} who might report the crime and testify against the perpetrator. As a result, the police are driven to use surveillance, wiretapping, and informants because they have little else upon which to rely.{652} In 1993, about two-thirds of the court-ordered wiretaps were for drug-related offenses.{653} A similar dynamic has stimulated the expansion of surveillance into the workplace.{654}
The attempt to control
drug trafficking also puts pressures on the courts that go beyond
increasing caseloads. The War on Drugs creates pressure to
support efforts to eradicate the drug
"plague." The results of such pressure include Supreme
Court descriptions of drug traffickers as "sophisticated
criminal syndicates" that create[Page 859]
an unprecedented obstacle
for law enforcement.{655} In addition, the
antidrug crusade has had a major impact on the Fourth Amendment:
since President Reagan declared the War on Drugs, the government
has prevailed in nearly every search and seizure case before the
Supreme Court.{656}
As James
Boyd White points out in his analysis of Chief Justice Taft's
majority opinion in Olmstead v. United States,{657} the
"organization, scale, enterprise, and success" of a criminal
enterprise can become the occasion for a morality tale in which
the government represents good struggling against the forces of
evil.{658} In the context of
actions that can be painted as enormous threats, constitutional
texts like the Fourth Amendment can be made to seem irrelevant.{659} The implication for
mandatory key escrow is obvious: the more closely the purposes
of a statute are aligned with control of drugs or terrorism, and
especially the more closely the facts of a test case conform to
the apprehension of an archetypical evildoer, the less chance it
will be declared unconstitutional.{660}
B. Mediating the Clash: A Metaphoric Menu
Social and political archetypes influence constitutional
decision-making, but the decisions themselves are couched in
more traditionally legal terms. These terms have a power of
their own, which accounts for some of the law's partial autonomy
as a discipline. Thus, for example, the homeless defendant in
Connecticut v. Mooney{661} sought
to exclude evidence seized from his habitual abode on public land
under a bridge abutment on the ground that[Page 860]
the place had become his
"home."{662} "Home"
is a powerful metaphor; it is also a legal category that implies
certain outcomes and
forecloses others.
It is old news that common-law legal reasoning is both
analogical and taxonomical,{663} and that
metaphor is a powerful tool for both.{664} Nevertheless, the
observation that "[t]he power of a metaphor is that it
colors and controls our subsequent thinking about its
subject"{665} is particularly
relevant and powerful when the law encounters a new technology.{666} The law's first
reaction to a[Page 861]
new
technology is to reach for analogies and to explain why the new
technology can be treated identically to an earlier technology.
Railroads, for example, could be slotted into the existing legal
categories created to deal with highways, collisions, and freight
tariffs.{667} In
contrast, airplanes--a technological advance on the same order as
the railroad--required a significant change in the law because to
be useful the airplane must fly over land, a classical trespass,
without a right of way.{668}
In the case of cryptography, as with other new technologies, the dominant mode of judicial and perhaps political judgment is likely to be classification by analogy and argument by metaphor. Given that in recent years Big Brother seems to have inspired less fear in any branch of government than has Big Drugs, the selection of the right metaphor is critical.
Four metaphors seem particularly likely to appeal to the
courts, which can be summarized under the rubrics of
"car," "language," "house," and
"safe." These four metaphors reflect two fundamentally
different
characterizations of cryptography. Both "car" and
"language" characterize cryptography as part of the
means used to transmit the message. In this view, an encrypted
message is simply another communication, one which can best be
understood as a special case of the general rules regulating
communications. In contrast, "house" and
"safe" treat
cryptography as something that happens before the message leaves
the sender. Both "house" and "safe" suggest
that the proper approach is to start with the sender's decision
to encipher the message in order to exclude unwanted recipients,
and then explore the implications of this choice for the
government's ability to monitor communications. The
differences[Page 862]
among these metaphors go
beyond putting a positive or negative gloss on encryption; they
amount to different definitions of the nature of the thing
itself. Interestingly, both the general metaphor of
communication and the metaphors of exclusion are sufficiently
indeterminate to permit the introduction of more specific, and
contradictory, metaphors that support both sides in the mandatory
EES debate.
1. Focus on Communication
"Communication" as we tend to understand it is itself
a metaphor. English speakers tend to speak as if they share a
particular mental image of how words work. As Michael Reddy has
persuasively demonstrated, the English language defaults to a
metaphor of communication as a conduit for human thoughts and
feelings.{669} In this
cognitive shorthand, the speaker begins with a meaning that she
"puts into words," which are then "gotten
across" to the auditor who then "unpacks,"
"gets," or "absorbs" the speaker's meaning.{670} To put it another
way, the speaker/author "encodes" meanings into words
that "convey" meanings that are then
"decoded" by the recipient.{671}
The ubiquity of the conduit metaphor
suggests that encryption, if understood as a communications
conduit, stands the best chance of being accepted by the courts,
Congress, and the public. The same ubiquity, however, also means
that different analogic
embodiments of the same general metaphor can lead to
diametrically opposite views of the constitutionality of a
hypothetical ban on unescrowed cryptography. Indeed, if the
encrypted message is seen[Page
863]
as a mobile message unit--a "car" on the
information superhighway--then mandatory EES appears far less
troubling than if the encrypted messages are analogized to
language itself.
a. "Car"--How Messages Travel
One could say that the escrowing of a key is akin to having
one's picture taken when applying for a license to drive on the
information superhighway. Or, perhaps, the chip's unique key
could be
seen as something like a license plate. If the reigning metaphor
for use of electronic communications is that of a car on the
road, it is but a small step to fixed or random checkpoints, and
other minor electronic detentions.
The LEAF feature in Clipper and Capstone makes checkpoints for compliance with mandatory key escrow particularly easy to implement, in a way that other forms of escrowed encryption might not. Telephone and e-mail messages can be intercepted and copied at random, or according to some pattern, then decrypted with the family key. If there is no LEAF at all, the message is clearly in violation of the mandatory escrow rule. If there is a LEAF, although the text of the message remains inaccessible without recourse to the escrow agents,{672} public servants can check whether the message has been encrypted with an EES-compliant device because the chip serial number is supposed to contain a recognizable string identifying the manufacturer.{673} If this string is instead random, law enforcement knows it has encountered a spoofed LEAF.{674} The beauty of this system from a constitutional perspective is that the intrusion on privacy rights is relatively small. Law enforcement does not need to decrypt actual messages without a search warrant.
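The checkpoint logic described above can be sketched in code. The following Python toy model is purely illustrative: the field layout, the manufacturer prefixes, and the XOR stand-in for family-key decryption are all invented here, since the real LEAF format and the Skipjack family key are classified.

```python
from dataclasses import dataclass
from typing import Optional

FAMILY_KEY = 0x5A  # invented stand-in for the shared "family key"
KNOWN_PREFIXES = ("MFG1", "MFG2")  # hypothetical manufacturer registry

def family_xor(blob: bytes) -> bytes:
    # XOR stands in for encryption/decryption under the family key;
    # it is its own inverse, which keeps the sketch short.
    return bytes(b ^ FAMILY_KEY for b in blob)

@dataclass
class LEAF:
    ciphertext: bytes  # chip serial + wrapped session key, under family key

def make_leaf(chip_serial: str, wrapped_session_key: bytes) -> LEAF:
    return LEAF(family_xor(chip_serial.encode() + b"|" + wrapped_session_key))

def checkpoint(leaf: Optional[LEAF]) -> str:
    # Step 1: no LEAF at all means facial noncompliance.
    if leaf is None:
        return "violation: no LEAF"
    # Step 2: decrypt with the family key and inspect only the serial
    # number; the message text itself still requires the escrow agents.
    serial = family_xor(leaf.ciphertext).split(b"|")[0].decode(errors="replace")
    if serial.startswith(KNOWN_PREFIXES):
        return "compliant: recognized manufacturer"
    return "violation: spoofed LEAF (unrecognized serial)"
```

The point the sketch illustrates is the constitutional one made in the text: the check reads only the serial-number field, so the intrusion is on the order of a license-plate inspection rather than a search of the message's contents.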
The car metaphor also provides law enforcement with a
solution to a timing problem. The information highway patrol may
not want to wait to obtain a warrant to find out whether a
message's cryptography is registered. Just as police who stop
cars on the highway have authority to conduct a search without a
warrant,{675}[Page
864]
the same might be done on the information highway,
and for similar reasons. Waiting for a warrant takes too long--
the car is gone; the message is gone.{676}
The car metaphor leads naturally, at
least from the law
enforcement perspective, to random traffic stops. If the analogy
is to
vehicular checkpoints, the examination of the LEAF can easily be
characterized as a minimally intrusive investigation in a way
that it cannot be if the focus is on the act of encryption as
something occurring in the home or office. In the case of real
cars, the Supreme Court applies a reasonableness analysis to
vehicle stops, weighing the gravity of the public concern, the
degree to which the seizure advances that concern, and the
interference with the individual's liberty. In United States
v. Martinez-
Fuerte,{677} the
Court held that limited seizures of vehicles for questioning of
the occupants could be constitutional even absent a
particularized suspicion due to the government's compelling
interest in reducing the influx of illegal aliens, the Border
Patrol's inability to do so at the border, and the minimal harm
to the traveller.{678} In Michigan
Department of State Police v. Sitz,{679} the Court found that even if only 1.5% of
the
drivers stopped were drinking, the governmental interest was
great enough to justify the intrusion of a highway checkpoint.{680} Sitz used a three-prong test{681} requiring (1) a[Page 865]
weighing of the gravity of
the public concerns served by the seizure, (2) some empirical
evidence that the program was
effective,{682} and (3) a weighing of the
severity of the interference with individual liberty.
{683} The car metaphor will no
doubt have appeal beyond law
enforcement if only because it is already commonplace to talk
about the Internet as a highway, complete with "on-
ramps," "fast lanes," and "maps."
Perhaps it is time to abandon these convenient phrases. If the
car metaphor prevails, there will be far fewer constitutional
rights in cyberspace than if any other metaphor comes to
dominate.{684}
b. "Language"
A cipher resembles a foreign language. Indeed, during World
War II, the U.S. Navy used speakers of obscure Native American
languages to exchange radio messages that could not be understood
by the Japanese.{685} A federal law
requiring that English be the sole mode of
communication for telephones or e-mails would be unconstitutional
on First Amendment grounds and would violate the equal protection
guarantees of the Fifth and Fourteenth Amendments.{686} If one accepts the
analogy, it follows that no [Page
866]
cryptosystem may be outlawed. Nor, continuing the
analogy, can the government require that users provide it with a
translation of their messages. Not only would this have Fifth
Amendment implications,{687} but it would
chill free speech.{688}
Although a
cipher resembles a foreign language, it is not[Page 867]
identical. No one can
speak in DES without mechanical aids, and no one can understand a
DES-encrypted message if they do not understand the language of
the plaintext. Cryptologist Dorothy Denning argues that these
differences are so great that in some important sense encrypted
speech "is not speech."{689} As Denning notes, languages have semantic
blocks such as words, phrases, sentences, or ideograms that
"carry" meaning and that can be manipulated, but
ciphertext has no such blocks.{690} Also,
Denning argues, all languages share the
property that thoughts, emotions, beliefs, requests, offers, and
concepts can be expressed without knowledge of any other
language.{691} In
contrast, ciphertext not only needs to be decrypted to be
understood, but the recipient must understand the language of the
plaintext in order to comprehend the message.{692}
Existing First Amendment
jurisprudence provides guidance that helps determine whether
these differences between enciphered communications and ordinary
language speech should be considered legally significant.{693} The First Amendment protects
communicative acts, not specific modes of communication.{694} The First[Page
868]
Amendment protects many communicative acts that do
not use words, including photographs, pictures, nude dances,{695} and silent protest.{696} Although they do not use words, these
protected communications do have messages. They have semantic
units and referents of the type described by Roland Barthes and
others.{697} Indeed,
the Supreme Court recently reaffirmed the importance of the
connection between an unspoken assertion of the speaker's
identity and the communicative content of the message. In
City of Ladue v. Gilleo,{698} the
Court held that an ordinance prohibiting the display of signs in
the front yard of a house violated the resident's right to free
speech as "the identity of the speaker is an important
component of many attempts to persuade."{699} Similarly, there is
no reason to believe that the First Amendment is concerned with
the degree to which a communicative system depends upon another
or stands alone.{700} If the First
Amendment protects works of art so obscure that they can only be
understood by their creator,{701} it can
equally well be applied to protect encrypted speech. Thus, from
a First Amendment standpoint, the differences between ciphertext
and ordinary language identified by Denning are irrelevant to[Page
869]
whether a communication is protected speech.
From the viewpoint of the language metaphor, the most troubling difference between ciphertext and classic protected speech is that, in the case of ciphertext, a mechanical aid is required to both create and comprehend the message. This difficulty should not be exaggerated--metaphors, after all, are invoked when things are not identical, not when they are precisely the same. It is unlikely that the mechanical aid would appear as troublesome if, instead of a cryptographic system, the device in question were a prosthetic device, such as a voice synthesizer or an eye-movement-operated computer.{702} In the case of these prosthetic devices, one would not expect to find arguments that the communication emanating from the device, or the signals used to operate the device, were any less entitled to First Amendment protection than ordinary speech. Again, the ciphertext example is not identical because the parties to the communication presumably have alternate forms of communication available to them, but this just underlines the fact that it is not the mechanization of the communication that ought to be the stumbling block. As new technologies such as voice recognition become commonplace, one would not expect to see arguments that the speech is somehow less protected while in the binary form that intermediates between sound waves and text, even if the author could have used a pencil instead of speaking to a computer. In any event, because a work of art requiring computer animation or computer-aided virtual reality would depend critically on the assistance of a machine, it would clearly be entitled to the same First Amendment protections as a painting.
Encrypted speech is not exactly an ordinary
language, but it is similar to what we ordinarily mean by
language. Moreover, most of the differences are not the ones
that usually matter in the First Amendment context. The most
significant difference between encrypted speech and ordinary
speech is the role of a machine. Indeed, the encrypted
communication much resembles the telephone call, as voice is
translated into a ciphertext (an electrical or fiber-optical
signal) that is transmitted to the recipient who then[Page 870]
decrypts it (plays it on a
speaker). Telephone calls are subject to wiretaps not because a
machine is involved in the communication, but rather because once
public servants have obtained the appropriate warrant, the
signals are in the same position as unmechanized speech inside
the home.
Rejection of the language metaphor might lead to undesirable
consequences. If the government were able to require that users
of strong cryptography ensure the government's ability to decrypt
their messages, it might be only a small step to imposing limits
on the use of languages other than English.{703} Would the last two speakers of a dying
language be required to provide translations of their
conversations if the government charged that they were conspiring
in it? Such a rule would be as difficult to apply as it would be
to distinguish among impenetrable slang or strange
accents, a language, and a code.
2. Focus on Exclusion
Just because somebody wishes to hide something does not mean
that the Constitution necessarily protects it. If desire
sufficed to produce exclusive control over information,
successful prosecutions would be rare. Conversely, just because
the government would find it convenient to know something about
an individual does not mean that the individual has any duty to
make the information easily accessible.{704} In
mediating between these extremes, the Supreme Court has given at
least lip service to the subjective and objective reasonableness
of the individual's desire to block the state's access to
information.
Reasonableness of expectations is a particularly manipulable
socially constructed term because the courts' decisions are an
important determinant of what is reasonably expected. If,
despite this, one grants that the idea of reasonable expectations
has some exogenous content, then the courts' willingness to
protect strong cryptography against government control is likely
to be influenced by the extent to which judges find something
familiar and
reasonable in the act of encrypting a message. Cryptography that
feels like a PIN used to access cash machines will be
treated differently from cryptography that feels like the tool of
drug dealers and terrorists.{705}[Page 871]
a. "Safe"
A cipher is armor around a communication much like a safe is
armor around a possession. A person who puts something in a safe
to which they have the only key or combination surely has both a
subjective and an objectively reasonable expectation of privacy
regarding the contents.
Simply putting something into a safe does not, however, ensure
that it is beyond the law's reach. It is settled law that a
criminal defendant can be forced to surrender the physical key to
a physical safe, so long as the act of production is not
testimonial.{706} Presumably a
similar rule compelling production would apply to a criminal
defendant who has written down the combination to a safe on a
piece of paper. There appears to be no authority on whether a
criminal defendant can be compelled to disclose the combination
to a safe that the defendant has prudently refrained from
committing to writing, although in Fisher v. United States,{707} the Supreme Court
hinted that compelling the disclosure of documents similar to a
safe's combination might raise Fifth Amendment problems.{708} Perhaps the combination lock problem does
not arise because the police are able to get the information from
the manufacturer or are simply able to cut into the safe. These
options do not exist when the safe is replaced by the right
algorithm. Although brute-force decryption is a theoretical
possibility,{709} neither safe
cracking, nor number crunching, nor an appeal to the manufacturer
is a practical option when the armor is an advanced cipher. The
recently released Federal Guidelines for Searching and Seizing[Page 872]
Computers{710} suggest that "[i]n some cases, it
might be appropriate to compel a third party who may know the
password (or even the suspect) to disclose it by subpoena
(with limited immunity, if appropriate)."{711}
Even if ciphertext is analogous to a document in an uncrackable safe whose combination has never been written down, there are important differences between a paper in a vault and an encrypted e-mail. A safe is a container into which people put things and take them out again, preserving the contents over time by keeping unauthorized persons from having access to them.{712} Ordinarily, a safe stays put. E-mails usually move around.
Current law on moving containers is not very friendly towards privacy. The Supreme Court has stated that "some containers (for example, a kit of burglar tools or a gun case) by their very nature cannot support any reasonable expectation of privacy because their contents can be inferred from their outward appearance."{713} That, at least, can never be the case with an encrypted message, because the external appearance of ciphertext gives no clue as to its content.
The moving safe begins to look a little like luggage.{714} Intuitively, the
privacy interest in a safe seems greater than the privacy
interest in luggage, which is perhaps fortunate because the
privacy interest in luggage has been shrinking towards the
vanishing point.{715} Ordinary luggage
can
be searched without a warrant upon reasonable suspicion.{716} There appear to be
no recent[Page 873]
reported
cases of police forcing a combination lock on luggage without
either consent,{717} a warrant, or an
alert from a drug-sniffing dog.
The privacy interest in locked luggage is insufficient to protect the owner against brief detentions of her property, in order to permit a dog sniff.{718} According to the Supreme Court, the sniff is not a Fourth Amendment "search" because the suitcase remains closed during the test, the dog discloses only the presence or absence of narcotics and cannot reveal the contents of the suitcase, and the "canine sniff" is the least intrusive method of ascertaining the presence of narcotics in the baggage.{719}
A sniff has some similarities to the investigation of a LEAF--low intrusion, bag/message remains closed/encrypted during investigation, and the investigation discloses nothing else about the contents of the bag/message. Like a dog alert, detection of an invalid LEAF could tell police that the message has been encrypted with unescrowed cryptography. Unlike dog alerts, LEAF-sniffing will prove generally unreliable because preencrypting a message with another cipher will hide the contents while presenting a valid LEAF for the world to see.{720}
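The preencryption loophole just described can be made concrete with a toy model. Everything here is invented for illustration: XOR streams stand in for both the private pre-cipher and the escrowed outer layer, and the keys and serial format are arbitrary.

```python
def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy stream cipher: XOR with a repeating key (insecure; demo only).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

PRIVATE_KEY = b"unescrowed"  # known only to sender and recipient
ESCROWED_KEY = b"skipjack"   # recoverable by police via the escrow agents

def send(plaintext: bytes):
    inner = xor_stream(plaintext, PRIVATE_KEY)  # preencryption
    outer = xor_stream(inner, ESCROWED_KEY)     # compliant outer layer
    return outer, "MFG1-0042"                   # valid-looking LEAF serial

def leaf_sniff(serial: str) -> bool:
    # The checkpoint sees a well-formed, compliant LEAF.
    return serial.startswith("MFG1")

def wiretap(outer: bytes) -> bytes:
    # Even with the escrowed key, police recover only the inner ciphertext.
    return xor_stream(outer, ESCROWED_KEY)
```

Run through the sketch: the LEAF check passes, yet the wiretap yields only the inner ciphertext unless the private key is also known--the sense in which a valid LEAF can be presented to the world while the contents remain hidden.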
Overall, the safe analogy is
appealing. Unfortunately, it either[Page 874]
maps the problem onto an
equally unsettled area of law or collapses back to another form
of the conduit metaphor.{721} It also has
the moderate defect of being vulnerable to technical changes,
although for the foreseeable future high-level encryption will
remain far, far easier than brute-force decryption. It may be,
however, that despite the potential for instability after the
next technological or
cryptographic revolution, the absence of law relating to
combination
locks without written combinations (the case most analogous to a
strong cipher with a secret passphrase) creates an opportunity to
make new law unencumbered by the baggage of the Supreme Court's
luggage precedents. There is no obvious reason why a person's
privacy interest in the contents of a safe, or a ciphertext,
should decrease sharply because the object is in transit, and it
would not be difficult to have the law reflect that reasonable
expectation.
b. "House"--Where Messages Come
From
Just as the car is a place where constitutional protections are
near their weakest, the house is where they approach their
strongest. The difference between the house and car metaphors is
perhaps best illustrated by California v. Carney,{722} in which the Court
had to decide whether a mobile home was a house or a car. If it
were a house, then it could only be searched with a warrant; if a
car, then no warrant was needed. The Court held that a moving RV
is a car for Fourth Amendment purposes, but left open the case of
a mobile home that is up on blocks.{723}
The Supreme Court's first encounter
with wiretapping produced a five-to-four decision holding that a
wiretap was neither a search nor a seizure because it took place
outside the home and did not interfere with the transmission of
the message. "The
reasonable view," Chief Justice Taft announced, "is
that one who installs in his house a telephone instrument with
connecting wires intends to project his voice to those quite
outside"; having left the sanctity of the home, those
messages and those wires "are not within the[Page 875]
protection of the Fourth
Amendment."{724} The majority's
first encounter with the law enforcement aspects of the telephone
treated it as a familiar thing: an instrument to send a message
out into the world, to meet whatever fate might befall it once it
was outside the constitutionally protected zone of the home--a
protection that relied in large part on the
homeowner's property interest in the residence.{725} Justice Brandeis's
dissent relied instead on Boyd's holding that the Fourth
Amendment prevents the government from forcing a person to
produce an incriminating document.
Katz abandoned the idea that the test for Fourth
Amendment protection rested on location. "[T]he Fourth
Amendment," Justice Stewart wrote, "protects people,
not places,"{726} and it thus
protected
a conversation originating in an enclosed public telephone booth.
Or rather, as Justice Harlan put it in his concurrence, the
Fourth Amendment protects a person who has "exhibited an
actual (subjective) expectation of privacy" and the
expectation is "one that society is prepared to recognize as
`reasonable.'"{727} Justice Stewart
cautioned that "[t]o read the Constitution more narrowly is
to ignore the vital role that the public telephone has come to
play in private communication."{728} Dissenting in
Katz, Justice Black accused the majority of choosing
"to rely on their limited understanding of modern scientific
subjects in order to fit the Constitution to the times and give
its language a meaning that it will not tolerate."{729} His complaint was
that Justice Harlan's concurrence, and by implication the
majority's opinion, argued that it was "`bad
physics'"{730} to maintain the
rule
originating in Olmstead v. United States that electronic
eavesdropping was not a search.{731} Justice
Black believed that the Fourth Amendment's protection of the
"right of the people to be secure in
their persons, houses, papers, and effects, against unreasonable
searches and seizures" connoted the idea of "tangible
things with size, form, and weight, things[Page 876]
capable of being searched,
seized, or both" and that an overheard conversation, even on
a wire, was none of these.{732}
In fact, it was Justice Black's physics that were faulty. Electrons have size, form, and mass, as do digitized and encrypted messages. Yet despite its questionable physics, Justice Black's legal conclusion appears to be gaining support: since Katz, the Fourth Amendment and its emanations have been read more and more narrowly. The Court has examined expectations of privacy that often seem greater, and more objectively reasonable, than those of a telephoner in a public phone booth, but has nonetheless found those expectations--when held by guilty parties--to be unreasonable.{733} Similarly, Boyd has been whittled away to the point that what vitality it retains is limited to personal, noncommercial papers, and even that is now in doubt. The rationale for Boyd's original holding has been effectively abandoned.{734}
In the place of
Katz and Boyd, the Supreme Court has substituted an
anemic form of the property theory of the Fourth Amendment that
animated the majority in Olmstead and Justice Black's
dissent in Katz, a theory that in its new form rarely
seems to extend outside the curtilage of the home.{735} Although not stated
as a general principle, as a practical matter the Katz
test has come to depend on the objective reasonableness of an
expectation of privacy,{736} and the Court has
routinely turned to legal or
administrative sources to define the parameters of
reasonableness{737}--[Page 877]
except when it ignores
them.{738} Thus, for
example, once trash is placed on the curb to be picked up, the
property interest in it is gone, and the trash is up for grabs.{739} If it is lawful to
fly over a property, it is objectively unreasonable for the owner
to expect that the property was safe from aerial inspection
regardless of the frequency of such flights or community
standards of
reasonableness, whatever those may be.{740} But despite the trespass by the observer,
there is no reasonable expectation of privacy against ambulatory
police intrusion into an "open" field, though
surrounded by woods, behind a locked gate with a "No
Trespassing" sign, because the field is open to view.{741}
Katz still applies to its
facts, which involved a wiretap of a telephone call from an
enclosed area. And the home clearly retains a special status,
because in United States v. Karo{742} warrantless monitoring of a beeper placed
by a government informant became unconstitutional at the point
where it "reveal[ed] a critical fact
about the interior of the premises that the Government . . .
could not have otherwise obtained without a warrant."{743} But given the
Court's record with reasonable expectations, the reason that
Katz is still good law seems to have more to do with the
existence of Title III than any constitutional principle. It has
come to the point where the citizen's privacy interests might be
better protected if the[Page
878]
discredited property-based theory of Olmstead were revived
instead of reviled.{744} Coincidentally,
the development of mass-market strong cryptography means that the
property theory, combined with a computer or a scrambler
telephone, produces a level of constitutionally protected
communications privacy comparable to the original Katz
standard.{745} Today,
with the right cryptography, it no longer matters as much if
Katz is narrowed or overturned, at least as far as
communications privacy is concerned.{746} Now there is something that the citizen
can do, inside the home, before a message is sent out on its
vulnerable journey throughout the world.
The value of the legal protections that can pertain to the home if based on a property theory like that in Olmstead should not be overstated. So long as we hold to current conceptions of the home, as a physical structure with walls and sometimes a curtilage too, the interconnections between homes will continue to be classified as "outside" the "house." As a result, regulations, including a ban on the use of unescrowed strong cryptography in communications that leave the house, remain a real possibility.
The "house" metaphor may provide some protection
against the complete reversal of Boyd, depending on
whether a court could compel the production of a key that had not
been committed to paper. If the court were unwilling to do this,
say on Fifth Amend[Page
879]
ment grounds, strong cryptography would provide a
nearly unbreakable means of protecting one's private papers
stored in the home computer.{747}
The decision to classify the cryptographic key as akin to A private paper located in the home also may have interesting legal consequences as the idea of the home evolves. Assumptions about space, about what constitutes the "inside" of a home or an office, may need to be reexamined:
[I]n the era where people work for "virtual corporations" and conduct personal and political lives in "cyberspace," the distinction between communication of information and storage of information is increasingly vague. The organization in which one works may constitute a single virtual space, but be physically dispersed. So, the papers and files of the organization or individual may be moved within the organization by means of telecommunications technology. Instantaneous access to encryption keys, without prior notice to the communicating parties, may well constitute a secret search, if the target is a virtual corporation or an individual whose "papers" are physically dispersed.{748}

In this vision, the cryptographic key becomes the thing after which it is named and is transformed from a metaphor into the actual key to a virtual--or is that actual?--electronic home or office.{749}
A court, or any other interested party, that might be called
upon[Page 880]
to select among
competing metaphors will naturally be concerned about where they
might lead. Widespread cryptography may have social implications
that are difficult to predict. Strong privacy may not be to
everyone's taste. Secret identities will protect the anonymous
hate mailer, the drug dealer, the purchaser of seedy movies, the
congenitally shy, and the electronic political pamphleteer with
equal efficacy. The implications of anonymous transactions for
taxes, product liability, and copyright, to name only a few,
remain to be worked out.
Sissela Bok hypothesizes a society in which "everyone can keep secrets impenetrable at will. All can conceal innocuous as well as lethal plans, the noblest as well as the most shameful acts, and hatreds and conspiracies as much as generosity and self-sacrifice. Faces reveal nothing out of turn; secret codes remain unbroken."{750} Although Bok recognizes that some version of such a society "might develop precisely in response to the felt threat from increased [information] transparency,"{751} she views such a society as clearly undesirable because "[i]t would force us to disregard the legitimate claims of those persons who might be injured, betrayed, or ignored as the result of secrets inappropriately kept."{752} Strong protection of cryptography may lead exactly there.
However,
the absence of the refuge of digital anonymity may be worse. As
identifying data on each of us becomes more voluminous and more
easily accessible to government and to private parties,{753} our lives are changed, and not necessarily
for the better. Indeed, although law enforcement agencies
believe they benefit greatly from their electronic eavesdropping
capabilities, it is unclear whether society as a whole enjoys a
net benefit when one considers both past abuses and the
possibilities for future abuses. To foreclose an option that
would give the lie to Justice Douglas's dystopian warning that
"[w]e are rapidly entering the age of no privacy, where
everyone is open to surveillance at all times; where there are no
secrets from the government"{754} would be a communal confession[Page 881]
of lack of trust in our
fellows and in ourselves. It is chilling to think we are fully
capable of making this confession, and that we may even deserve
it.
In making legal judgments about the Constitution and cryptography, one should keep in mind what is possible and what is not. This means that one should consider both the extent to which long held ideas about the meaning of the Bill of Rights need to be rethought in light of technological changes{755} and that some prohibitions simply are unenforceable. Just as the ITAR have failed to stop the spread of strong cryptography abroad (and the development of indigenous substitutes abroad), so too would any attempt to ban unescrowed cryptography be doomed to failure.{756}
Privacy, constitutional law, and law
enforcement are not games. It is unsettling to think that one's
rights may turn on the extent to which people are able to find a
technological means to defeat what would otherwise be legitimate
government action. The good news is that technological change
can provide an opportunity to rethink, and perhaps increase the
coherence of, some constitutional doctrines. When technology
changes social realities,
interpretations [Page 882]
of
the Constitution should recognize the change to the maximum
extent the text permits. Thus, although we might be better off
today with the Olmstead standard than with what the
Supreme Court has turned Katz into, we should not lose sight of
the fact that Olmstead was wrongly reasoned. Chief
Justice Taft's formalistic unwillingness to recognize that a
telephone call was different from a message on a billboard or a
shouted conversation on a busy street is not an attitude that
deserves emulation.
The bad news is that sometimes a
technological development is unstoppable. It may be that the
Administration intends Clipper only as a delaying action.{757} It may be that a future administration,
acting for what seem to be good reasons at the time, will attempt
a form of cryptological Prohibition. If so, it will fail as
Prohibition failed, as the War on Drugs is failing, and as the
ITAR are failing with respect to cryptography.{758}
The executive branch's primary concern has been to accommodate
the interests of banks and others who require strong
cryptography, while also preserving to the greatest extent
possible law
enforcement and intelligence capabilities. Noncommercial social
implications of cryptography have received relatively little
attention.
The private sector's motives are more difficult to summarize,
but[Page 883]
there has
clearly been a demand for cryptographic products, and this demand
is expected to grow rapidly.{760}
The executive branch's desire to maintain its ability to eavesdrop on electronic communications at will has driven it to abuse the technical standard-setting process. By manipulating the FIPS procedure, the Clinton Administration has achieved its initial objective of promulgating a standard that is insulated from any meaningful public comment and immune from judicial review. The Administration thus hopes to create a de facto rule where it lacks the statutory authority to create a rule de jure. Despite the seemingly underhanded aspects of the executive branch's behavior, there is no clear evidence that it has failed to comply with existing laws or the Constitution. There is, however, room for doubt, as some of the critical information regarding whether NIST retains its statutorily mandated independent judgment is classified. Congress would be well advised to reassure itself and the public that NIST has complied with the Computer Security Act's requirement that it not delegate decision-making to the NSA. If the NSA is calling the shots, firm persuasion, and perhaps corrective legislation, will be required.
The Administration hopes to coerce acceptance of an escrowed encryption product through its vast purchasing power, but whether this hope can be realized remains unclear. If this attempt fails, the next step may be to seek legislation requiring users of strong cryptography to allow the government some form of access to their cryptographic keys. If, despite what currently seems to be a prevailing opposition to even voluntary key escrow, such a bill were nonetheless to become law, mandatory key escrow would create serious constitutional problems that the courts would have to resolve.
Under current law, the judicial reaction to a hypothetical mandatory key escrow statute would be limited primarily to a balancing test analysis, although private noncommercial users would have a particularly strong Fourth Amendment argument on their side, and a good First Amendment argument as well. Recent history suggests, however, that the government's interest in national security or law enforcement often outweighs the citizen's right to privacy.
By their nature, balancing tests almost demand that courts
give[Page 884]
some play to
the judge's hopes and, especially, fears. A mandatory key escrow
statute would evoke two conflicting sets of fears, one over
control and the other over lawlessness, symbolized by the
archetypes of Big Brother and the criminal cabal. In the end,
the conflict may be decided by the way the courts characterize
cryptography. Just as the
cryptographic "key" is a metaphor, so too may the
choice among possible metaphors determine how much constitutional
protection an encrypted message gets. If the courts treat a
ciphertext as if it had been written in a foreign language, it
will trigger a First Amendment analysis that will result in
giving cryptography more protection than if the courts focus on
the place where the message is encrypted. If encryption is
considered no more than the outer envelope in a message
transmission system--essentially a "car" on the
information superhighway--it is likely to receive the lowest
level of protection.
Encryption has much to offer the commercial, professional, and personal users of telephones, computers, and computer networks. As these and other uses grow, they will breed conflict, some of which will inevitably be brought to the courts. The legal ecology of cyberspace is currently underpopulated, but not for long. Clipper and Capstone are only the first of many attempts by the government, and no doubt others, to protect the status quo from changes that upset long-established power relationships. The choices made in the next few years will shape the evolution of electronic communication, and society in general, for decades to come. It would be sad if cyberspace became the first place that the government required civilians, in peacetime, to structure their private communications to make hypothetical future eavesdropping by law enforcement easier.
[Page 885]
There are two kinds of cryptography in this world: cryptography that will stop your kid sister from reading your files, and cryptography that will stop major governments from reading your files.{761}
Cryptography makes it possible to talk privately even if someone is listening in on your telephone line:{762} you can have a secure communication over an insecure channel. Cryptography also allows you to keep electronic records in a form that is easily accessible to you but inaccessible to snoops, whether siblings or governments. Ironclad protection, however, requires effort.
Part of the art of cryptography consists of choosing an
appropriate level of security because, for long texts, the
highest-level
encryption can be slow even with a computer.{763} If the purpose of
the encryption is to keep someone from peeking at the answer to a
riddle, printing the answer upside down may be enough. If the
purpose is to keep salacious material out of the hands of the
lower classes, then Latin was for many years the traditional
cipher.{764} If the encryption is
used, however, to authenticate wire transfers of funds or to send
messages to submarines in wartime, a high-level method is
required.
In each case, the determination is driven by considerations of
the time and effort it takes the sender and receiver to use the
system, the resources that a third-party would[Page 886]
likely be willing to devote
to cracking the system, and the costs of guessing wrong.
The strength of a cryptographic system is usually measured by the effort an enemy who knows the algorithm would need to crack it. This means, absent the discovery of a latent vulnerability in the cipher that allows the attacker to take a mathematical short-cut,{765} a "brute-force" attack in which computers try every possible key.{766} Surprisingly, modern cryptography does not require that the algorithm be kept secret. Indeed, the rule of thumb is that ciphers that depend on keeping the algorithm secret are unreliable.{767} In modern cryptography, only the key, and not the method of encryption, needs to remain secret to preserve the security of the message.{768}
The
optimal cryptographic system would be easy to use and impossible
to crack. The real world imposes tradeoffs between ease of use
and vulnerability to attack,{769} although
computers now put very powerful cryptography within the reach of
anyone who has access to a personal computer (PC). Cryptography
can be used to defend against various forms of attack including
decryption by third parties, modification of messages, and
fabrication of authentic-[Page
887]
looking but spurious
messages. A strong cryptographic system protects messages that
may be intercepted by an enemy, and also authenticates messages
received.
A. Brute-Force Cryptanalysis
There are three fundamental ways for a third party to crack a
cipher where the algorithm is known but the key is not. First,
an enemy can simply steal the key or suborn a key-holder.
Second, if the enemy knows the algorithm but not the key, the
enemy can try to analyze the cipher, hoping to find a weakness in
the algorithm. Some very trusted ciphers have fallen to
mathematical analysis, at which point decryption becomes easy.{770} Third, and of particular relevance to
modern
cryptography, the attacker can mount a "brute-force"
attack using computers, perhaps with large numbers of specially
optimized chips running in parallel, to try every possible key
until the message is decrypted. Finding the sender's key gives
the attacker more than the text of a single message; it gives
access to all future messages that use the same key. If the
sender uses a digital signature, the attacker can forge the
sender's signature as well.
If the only (known) way to decrypt a ciphertext is to try every
possible key, then a longer key makes for a stronger cipher
because a longer key has more possible values. An eight-bit {771} key has 2^8 (256)
possible values. A computer would have to try all 256 possible
values to be certain of finding the key, although the average
number of possible values the computer would have to test before
encountering the answer will be only 128. Similarly, if the key
is 128 bits long, which is the equivalent of the maximum amount
of information in a sixteen-character message on a personal
computer,{772} a[Page 888]
brute-force attack would
require that 2^128 keys be tested to be certain of finding the
key.
To put this number in perspective, a computer processing a
million keys per second would require about 10^25 years to
complete the task, an amount of time 10^15 times greater than the
estimated age of the universe.{773}
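The keyspace arithmetic above can be checked directly. The following Python sketch (not part of the original text) uses the article's assumed rate of one million key trials per second:

```python
# Keyspace sizes and worst-case brute-force search times, using the
# article's assumption of one million key trials per second.

def keyspace(bits):
    """Number of possible keys for a key of the given bit length."""
    return 2 ** bits

def years_to_exhaust(bits, keys_per_second=1_000_000):
    """Worst-case time to try every key, in years."""
    seconds = keyspace(bits) / keys_per_second
    return seconds / (60 * 60 * 24 * 365)

# An eight-bit key: 256 possible values, 128 expected trials on average.
assert keyspace(8) == 256
assert keyspace(8) // 2 == 128

# A 128-bit key at a million keys per second: on the order of 10^25 years.
print(f"{years_to_exhaust(128):.2e} years")  # roughly 1e25 years
```

The 10^25-year figure in the text falls out of the last line: 2^128 keys divided by a million trials per second is about 10^32 seconds.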
Although 10^15 times the age of the universe makes for an impressive statistic, it is misleading. Chips capable of trying 256 million keys per second are foreseeable within the decade, and several of these can be harnessed to work together. Trying out keys is often ideally suited to parallel processing. Parallel processing allows a problem to be split up between many computer chips running simultaneously, trying many millions, even billions, of keys per second. The chips need not be part of a single computer: the key-cracking problem can be distributed among large numbers of workstations on a network, with each workstation being only slightly more powerful than today's desktop PCs. Indeed, the problem can be parceled out to several different networks, each communicating with one another only infrequently. A distributed processor of this nature, or a single optimized parallel processor, can try vastly more than a million keys per second, making large keys susceptible to being broken in a reasonable period of time.{774} Parallel processors already on the drawing board might make it possible to break even a 512-bit key at a cost which, although out of reach of the average citizen, would be well within the means of the poorest government.{775} The cryptographer's solution, of course, is to use a longer key.
Exactly how large a key would be vulnerable to an economical
brute-force attack is a matter of debate. Advances in computer
power continue to make longer and longer keys vulnerable, but the
same advances make it easier and cheaper to encrypt and decrypt
with longer keys. If one assumes the existence of economically[Page 889]
rational, albeit
immoral, attackers seeking to maximize the return on an
investment in the computing power needed in order to mount a
sustained attack, then high-value secrets are more at risk than
low-value ones. They therefore deserve longer keys. No
consensus exists as to how much security is enough for high-value
secrets, however, and in any case that point is probably well
past the security provided by DES.
DES's inventors
originally planned to use a 128-bit key, which would have
provided 40 sextillion (40,000,000,000,000,000,000,000) times
more security, but the NSA persuaded them that 56 bits
sufficed.{776} A 56-bit
key provides a keyspace of 2^56, or 72 quadrillion possible
keys.{777} Although this is a very
large number, even upon DES's adoption as the U.S. standard in
1977, critics predicted that an optimized computer costing $20
million to build could break an average of two 56-bit DES keys a
day. Depreciated over the machine's first five years of service,
this would have worked out to only $5000 per solution.{778} This was and is a sum well within the
reach of many governments, and the price would be much lower
today.{779} DES's inventors estimated
that the cost of building such a computer would be closer to $200
million, or $50,000 per key.{780} If DES
had used the 128-bit key originally contemplated, breaking each
key would have cost an average of $200 septillion in the 1970s,
even using Diffie and Hellman's lowest-cost assumptions. Perhaps
as a result of its shorter key, DES is not considered
sufficiently secure to protect classified data.{781} Despite suggestions
from the NSA that DES might not deserve recertification after
1988,{782} NIST recently
recertified DES as suitable for commercial purposes for five more
years.{783}
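The cost estimate cited above can be reproduced with back-of-the-envelope arithmetic. This Python sketch is illustrative only; the machine cost and throughput are the critics' 1977 assumptions as reported in the text:

```python
# Check of the critics' cost estimate: a $20 million machine breaking
# two 56-bit DES keys per day, depreciated over five years of service.

DES_KEYSPACE = 2 ** 56
assert DES_KEYSPACE == 72_057_594_037_927_936  # ~72 quadrillion keys

machine_cost = 20_000_000   # dollars (assumed build cost)
keys_per_day = 2
service_years = 5

keys_broken = keys_per_day * 365 * service_years
cost_per_key = machine_cost / keys_broken
print(f"${cost_per_key:,.0f} per key")  # about $5,500, rounded in the text to $5,000
```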
[Page 890]
B. Public-Key Cryptography
Before the invention of public-key cryptography in 1976,{784} a sender and receiver
who wanted to use a cipher had to agree on a key in order to
communicate securely. This method was very burdensome to both
parties. First, sender and receiver needed a secure means to
transmit the key itself. For example, if you were trying to send
a message to your agent behind enemy lines, and you were afraid
that the bad guys had cracked your cipher, it was too late to
send a coded message saying "the new key is
X."{785} Second, even if
the
key was transmitted securely (for example, by handing it to the
agent before she left for her mission), the security of a single-
key cipher evaporated as soon as the key was compromised. If you
wrote the secret key down, someone could have found it; if you
did not write it down, either it must have been short or you
needed a phenomenal memory. Third, the ever-present danger of
key compromise cast a doubt over the authenticity of every
message. A third party who had the key could use it to alter
messages, or to send fake messages purporting to be from any of
the parties who legitimately held the key.
Public-key cryptography solves all of these problems. As a
result, public-key cryptography has been described as a
"revolutionary technology," which will make routine
communication encryption ubiquitous.{786} In a public-key system, each user creates
a public key, which is published, and a private key, which is
secret.{787} [Page 891]
Messages encrypted with one
key can be decrypted only with the other key, and vice-versa.
Thus, if Alice wants to send a secure e-mail message to Bob,
and they both use compatible public-key cryptographic software,
Alice and Bob can exchange public keys on an insecure line.
Alice would have to input her plaintext and Bob's public key.
The program outputs the ciphertext, which is a stream of
characters that looks like garbage to anyone who should happen to
see it. When Bob receives the ciphertext, he inputs both it and
his private key to his program, which reveals Alice's plaintext.
One of the wonderful properties of public-key encryption is
that, so far as we know,{788} a third party's
possession of Alice's public key and a complete description of
the encryption algorithm puts that third party no closer to
deducing Alice's private key or reading her messages than if the
third party had the ciphertext alone. Thus, it is easy to
establish a secure line of communication with anyone who is
capable of implementing the algorithm. (In practice, this is
anyone with a compatible decryption program or other device.)
Sender and receiver no longer need a secure way to agree on a
shared key. If Alice wishes to communicate with Bob, a new
recipient with whom she has never communicated before, Alice and
Bob can exchange the plaintext of their public keys or look each
other up in a freely accessible directory of public keys. Then,
Alice and Bob can each encrypt their outgoing messages with the
other's public key and decrypt their received messages with their
own secret private key.{789}
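The asymmetry described above, in which a message locked with one key can be opened only with its mate, can be sketched with textbook RSA and deliberately tiny numbers. The values below are standard classroom examples, not a real key, and the sketch shows only the mechanics:

```python
# Toy illustration of public-key encryption using textbook RSA with
# deliberately tiny numbers. Real keys are hundreds of digits long.

p, q = 61, 53                  # two small primes (kept secret)
n = p * q                      # 3233, part of Bob's public key
e = 17                         # public exponent: Bob's public key is (n, e)
d = 2753                       # private exponent: Bob's private key
assert (e * d) % ((p - 1) * (q - 1)) == 1  # e and d are inverses mod phi(n)

# Alice encrypts a small message with Bob's PUBLIC key...
plaintext = 65
ciphertext = pow(plaintext, e, n)

# ...and only Bob's PRIVATE key recovers it.
assert pow(ciphertext, d, n) == plaintext
```

Knowing (n, e) and the algorithm, an eavesdropper is no closer to the plaintext; with numbers this small the private key could of course be found by factoring n, which is exactly why real keys are enormously larger.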
One
drawback, however, is that public-key encryption and [Page 892]
decryption is much slower
than commonly used single-key systems such as DES.{790} Thus, although
public-key encryption is ideal for short messages, it is less
than ideal for longer ones, and is particularly unsuitable for
high-speed real-time applications like fast data transfer or
telephone conversations. The speed problem can be overcome,
however, by using a hybrid system. In a hybrid system, two
parties who wish to communicate securely over an insecure medium
use a public-key system to agree on a session key which
then becomes the one-time key for a faster, conventional,
relatively secure, single-key cipher such as DES.{791} Each time the parties initiate a new
conversation, they generate a new session key, which, though
lasting for the entire conversation, is never repeated.
If Alice and Bob are going to change their session key every
time they talk so as to maximize the security of their single-key
cipher, they need a secure means of agreeing on a session key
each time they want to communicate. Ideally, the method would
allow them to choose the session key in public, or on an insecure
line, without fear of eavesdroppers. Public-key cryptography
allows Alice and Bob to achieve this feat in either of two ways.
Using the first method, Alice generates the session key, encrypts
it with Bob's public key, and sends it to him. Bob decrypts the
message with his private key, inputs the session key to his
single-key software or telephone, and then the data exchange or
conversation begins.{792} Alternatively,
the parties can use Diffie-Hellman Key Exchange, in which Alice
and Bob publicly send each other numbers from which they, and
only they, can jointly calculate a session key. In Diffie-[Page 893]
Hellman Key Exchange,
Alice and Bob agree on a number, b, which does not have to
be secret, to use as the basis for their calculations. They also
agree, again without any attempt at secrecy, on a large prime
number which will serve as their modulus, m. (Arithmetic
modulo m keeps only the remainder after division by m,
much as a twelve-hour clock keeps only the remainder after
division by twelve. Alice and Bob will do all of their
calculations modulo m, where m is a large prime
number.) Alice then selects a (secret) large random
number A; Bob meanwhile selects a (secret) large random
number B. Alice sends Bob a number she calculates as
b^A (mod m). Bob sends Alice a number
he calculates as b^B (mod m). Both Alice and Bob
then compute b^AB (mod m). This becomes their secret
session key.
Diffie-Hellman works because it is computationally easy to calculate powers of numbers modulo m, but very, very difficult to reverse the process, that is, to recover the secret exponent A from b^A (mod m), the so-called discrete logarithm problem. Yet this is what an eavesdropper would have to do to compute b^AB (mod m) without knowing either A or B. Thus, if m is about 1000 bits long, Alice and Bob can do their calculations in seconds. At the current state of the art, however, the logarithmic computation would take a powerful computer a quintillion (a billion billion) years.{793}
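The exchange can be sketched in a few lines of Python. The numbers below are toy-sized purely so the arithmetic is visible; in practice m is a prime on the order of 1000 bits:

```python
# A minimal Diffie-Hellman Key Exchange with toy numbers.

m = 23                      # public prime modulus (toy-sized)
b = 5                       # public base
A = 6                       # Alice's secret random number
B = 15                      # Bob's secret random number

alice_sends = pow(b, A, m)  # b^A (mod m), sent in the clear
bob_sends = pow(b, B, m)    # b^B (mod m), sent in the clear

# Each side raises what it received to its own secret exponent,
# so both arrive at b^AB (mod m) without ever transmitting it.
alice_key = pow(bob_sends, A, m)
bob_key = pow(alice_sends, B, m)
assert alice_key == bob_key == pow(b, A * B, m)
print(alice_key)  # 2: the shared session key
```

An eavesdropper sees m, b, and the two transmitted numbers, but recovering A or B from them is the discrete logarithm problem.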
Diffie-Hellman Key Exchange works for real-time communication because the parties can generate a session key at the start of the exchange, but it imposes potentially long delays on e-mail because the parties must exchange several messages to generate a session key before the parties can have a secure conversation. On the other hand, e-mail can be encrypted and decrypted at leisure, so the entire message can use public-key encryption rather than just the session key. As a result, all that Bob needs in order to send Alice a secure e-mail is a reliable way of getting Alice's public key.
Key servers provide a simple way of making public keys
generally available. Essentially, a key server is a computer
with a white pages approach to public key management. Bob enters
Alice's name and the key server replies with Alice's public key--
if she has registered it. Key servers, whether run by the U.S.
Post Office or [Page
894]
others, are likely to be an essential element of
the National Information
Infrastructure.
Key servers generally work on one of two principles: the certification authority or the web of trust. Under the certification authority paradigm, some central body authenticates the identity of the registrant when the key is first deposited. For example, the U.S. Post Office has proposed that it act as a certifying authority. Alice could then identify herself to the Post Office by providing identification similar to that currently required to get a passport. The Post Office would then add her key to its server, and/or provide Alice with a copy of her public key signed with the Post Office's private key.{794}
By contrast, there is no central
authority for web-of-trust systems: Alice can upload a key to
the key server at any time. In order to demonstrate that the key
purporting to be
"Alice's" is hers, Alice must then find other persons
to "sign" her key by uploading authentications signed
with their private keys. Typically this is done by meeting face-
to-face and exchanging public keys and showing identification.
If Alice has her key signed by Carol, whom Bob knows or trusts,
Bob can safely assume that the signature purporting to be from
"Alice" is not in fact an impostor's. Suppose,
however, that Alice and Bob do not have any friends in common,
but that Bob's friend Carol has signed Ted's key, and Ted has
signed Alice's key. From Bob's point of view this is not as good
as if Carol, whom he knows, has signed Alice's key, but it is
considerably better than nothing. Bob needs to decide how many
intermediaries he is willing to accept before he considers a
public key
unreliable. The increase in the length of the chain of
authentication can be offset by finding multiple routes to Alice.
For example, Bob may still feel reasonably secure if he can
establish three relatively long but independent chains of
authentication.{795} This web-of-
trust approach is the foundation of the PGP encryption system.{796} [Page
895]
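Bob's judgment about chains of signatures can be modeled as a shortest-path search over the graph of who has signed whose key. The Python sketch below uses a hypothetical signature graph; the names and the three-link tolerance are illustrative only:

```python
# Sketch of the web-of-trust judgment: Bob trusts a key if a chain of
# signatures connects him to it within some maximum length he chooses.
from collections import deque

# signatures[x] = set of people whose keys x has signed (hypothetical)
signatures = {
    "Bob": {"Carol"},
    "Carol": {"Ted"},
    "Ted": {"Alice"},
}

def chain_length(start, target, signatures):
    """Length of the shortest signature chain from start to target, or None."""
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        person, hops = queue.popleft()
        if person == target:
            return hops
        for signed in signatures.get(person, ()):
            if signed not in seen:
                seen.add(signed)
                queue.append((signed, hops + 1))
    return None

# Bob -> Carol -> Ted -> Alice: a chain of three signatures.
assert chain_length("Bob", "Alice", signatures) == 3
MAX_CHAIN = 3  # Bob's own tolerance: a policy choice, not a system constant
print(chain_length("Bob", "Alice", signatures) <= MAX_CHAIN)  # True
```

A fuller model would also count independent chains, since, as noted above, several long but disjoint paths to Alice may reassure Bob as much as one short path.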
Public-key systems also allow users to append a so-called digital signature to an unencrypted message. A digital signature uniquely identifies the sender and connects the sender to the message. Because the signature uses the plaintext as an input to the encryption algorithm, if the message is altered in even the slightest way, the signature will not decrypt properly, showing that the message was altered in transit or that the signature was forged by copying it from a different message.{798} A digital signature copied from one message has an infinitesimal chance of successfully authenticating any other message.{799}
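The sign-with-the-private-key, verify-with-the-public-key property can be sketched with textbook RSA and deliberately tiny numbers. The values below are classroom examples, not a real key, and the "message" is a bare number standing in for a hashed plaintext:

```python
# Toy digital signature using textbook RSA with tiny, insecure numbers.
# Signing uses the PRIVATE key; verification uses the PUBLIC key.

n, e, d = 3233, 17, 2753   # toy public modulus/exponent and private exponent

msg = 65                   # a message encoded as a number smaller than n
sig = pow(msg, d, n)       # sign: exponentiate with the private key

# Anyone holding the public key (n, e) can verify the signature...
assert pow(sig, e, n) == msg

# ...and the same signature fails against even a slightly altered message.
altered = 66
assert pow(sig, e, n) != altered
```

Because the signature is a function of the message itself, copying it onto a different message, as the text observes, has essentially no chance of verifying.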
The importance of a Digital Signature
Standard (DSS), the prerequisite to a system of electronic
commerce, has not been lost on the federal government.
Nevertheless, for many years NIST was unable or unwilling to
promote a standard. Although NIST had begun working on a DSS in
the early 1980s, its progress was slowed by the close relation
between digital signature and cryptographic systems:
an algorithm that produces a secure digital signature can[Page 896]
also be used to produce a
secure cipher. The U.S. government, however, wished to find a
DSS which, although secure enough to be useful, nonetheless could
not be used as an encryption system that the government might be
powerless to break.{800} In August 1991,
NIST announced its selection of the Digital Signature Algorithm
(DSA) which, having been patented by an NSA employee, would be
available royalty-free to all users.{801} NIST then encountered patent
difficulties. These were exacerbated when a U.S. corporation
acquired the rights to the main patent allegedly infringed by the
DSA. In June 1993, NIST announced that it would give the
corporation an exclusive license to the DSA; the U.S. government
would have free use but everyone else would have to pay.
Reaction from industry, which was already gravitating towards a
competing corporation's product and was poised to accept NIST's
standard only if it were royalty-free, was very negative.{802} In May 1994, NIST
announced that it would not give an exclusive license to anyone,
and that the DSA would be available royalty-free after all.{803} The alleged patent
infringement was dealt with by the statement that NIST "has
concluded there are no valid claims."{804} The DSS is now
scheduled to be incorporated into Fortezza PCMCIA chips.
Meanwhile, academic cryptologists, ever vigilant for intentional
weaknesses in government-sponsored cryptography, found an
interesting property in the DSS proposed by NIST and approved by
[Page 897]
the NSA. Small "subliminal"
messages can be inserted into the digital signature without the
signer's knowledge.{805} This property
"allows an unscrupulous implementer of DSS to leak a [small]
piece of the [user's] private key with each
signature."{806} One commentator
noted that the DSA "provides the most
hospitable setting for subliminal communications discovered to
date."{807} The
NSA has not let on whether it knew about this feature when it
agreed to the DSS.{808}
The proposed Capstone Chip combines the encryption functions of Clipper with the DSA.{809} It will be available in a PCMCIA card called Fortezza. Thus, if users wish to encrypt computer data transmissions with a government-approved algorithm, they will have to take the DSA as part of the package.
Although digital signatures use strong cryptographic algorithms, they raise fewer, and different, policy issues from the Clipper Chip. The U.S. government is not concerned about the world-wide use of authentication software because affixing a signature to a plaintext message does not make that message any harder to read.{810} The real power struggle centers on whether the government will retain the capability to read private electronic messages and listen to private telephone conversations at will.