Unless otherwise specified, this Article reflects legal and technical developments occurring on or before January 1, 1995.
1. Kim L. Scheppele, Legal Secrets 302 (1988) (footnote omitted).
2. "Secrecy" refers to the intentional concealment of information so as to prevent others from "possessing it, making use of it, or revealing it" to third parties. Sissela Bok, Secrets: On the Ethics of Concealment and Revelation 6 (1982). It also refers to "the methods used to conceal [information], such as codes or disguises." Id.
3. Privacy is "that portion of human experience for which secrecy is regarded as most indispensable." Id. at 7. Secrecy and privacy are not identical, however. See id. at 10. Privacy is "the condition of being protected from unwanted access by others--either physical access, personal information, or attention. Claims to privacy are claims to control access to what one takes . . . to be one's personal domain." Id. at 10-11.
4. In this sense, "the right to privacy has everything to do with delineating the legitimate limits of governmental power." Jed Rubenfeld, The Right of Privacy, 102 Harv. L. Rev. 737, 737 (1989). Of course, true privacy also requires delineating the limits of the power of private parties, including detectives, credit bureaus, and others.
5. Cryptography cuts across the law in many interesting ways. Most of the statutory issues, however, are outside the scope of this Article. In particular, this Article does not discuss cryptography as it relates to intellectual property law.
6. Cryptology is the study of cryptography and cryptanalysis. See David Kahn, The Codebreakers at xvi (1967).
7. See id. at xiii-xvi; see also Horst Feistel, Cryptography and Computer Privacy, Sci. Am., May 1973, at 15, 15 (drawing a distinction between codes and ciphers).
8. Henry W. Longfellow, The Landlord's Tale: Paul Revere's Ride, in 4 The Poetical Works of Henry Wadsworth Longfellow 25, 25 (1966). For an example of a literary cipher, see Edgar A. Poe, The Gold-Bug, in The Complete Tales and Poems of Edgar Allan Poe 42, 62-67 (1938). See also Terence Whalen, The Code for Gold: Edgar Allan Poe and Cryptography, 46 Representations 35 (1994).
9. Eric Bach et al., Cryptography FAQ (03/10: Basic Cryptology) § 3 (Oct. 31, 1994), available online URL ftp://rtfm.mit.edu/pub/usenet/news.answers/cryptography-faq/part03. A message that has never been disguised is called a cleartext. See Kahn, supra note 6, at xvi.
10. The number of possible values of a key is called the keyspace.
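To make the notion concrete, the short Python sketch below (an illustration added here, not drawn from the sources cited) computes the keyspace for several key lengths; the 56-bit figure corresponds to DES's effective key length, and 128 bits to the length IBM reportedly first proposed, see infra note 110.

    # Illustrative only: keyspace = 2^(key length in bits).
    # 56 bits is DES's effective key length; 128 bits is the length IBM
    # reportedly proposed before the key was shortened (see note 110).
    for bits in (40, 56, 112, 128):
        print(f"{bits:3d}-bit key: 2^{bits} = {2**bits:.3e} possible keys")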
11. See infra Technical Appendix, part B (describing public-key cryptography).
12. See Bruce Schneier, Applied Cryptography 4 (1994) (defining cryptanalytic terms).
13. On the early use of telegraphic cryptography in military combat, see Kahn, supra note 6, at 190-91.
14. See Approval of Federal Information Processing Standards Publication 185, Escrowed Encryption Standard (EES), 59 Fed. Reg. 5997, 5998 (1994) [hereinafter FIPS 185] ("Key escrow technology was developed to address the concern that widespread use of encryption makes lawfully authorized electronic surveillance difficult."). For a discussion of Federal Information Processing Standards (FIPS), see infra notes 222-25 and accompanying text.
15. Although the chip is universally known as "Clipper," the government has alternately adopted and abandoned the name. See U.S. Gen. Accounting Office, Communications Privacy: Federal Policy and Actions 6 n.6 (1993) [hereinafter GAO Communications Privacy] (explaining that the name was used, then dropped). The official use of the name "Clipper" recently has been revived. See Susan Landau et al., Association for Computing Machinery, Inc., Codes, Keys and Conflicts: Issues in U.S. Crypto Policy 52 n.1 (1994) [hereinafter ACM Report] (stating that Intergraph Corp., which had trademarked the name for one of its microprocessors, "graciously ceded" the rights to the name).
16. The technical name for the Clipper-compliant family of devices is the Escrowed Encryption Standard (EES). For the nonclassified specifications for these devices, see FIPS 185, supra note 14, at 6004-05. The Clipper Chip itself is designed for use in secure telephones; its cousin, the Capstone Chip, will be used for electronic mail, digital signatures, see infra Technical Appendix, part C, public key exchange, see infra Technical Appendix, part B, and random number generation. For a brief introduction to the Capstone Chip and its technical specifications, see generally National Inst. of Standards and Technology, Capstone Chip Technology (Apr. 30, 1993), in Building in Big Brother, supra note * (manuscript at 147) [hereinafter Capstone Chip Technology]. A PCMCIA card (Type 1) using Capstone will likely be purchased in bulk by the Pentagon. See infra text accompanying note 245. The PCMCIA card was formerly known as a "Tessera" card, but the National Institute of Standards and Technology (NIST) has now changed the name to the "Fortezza" card because a private company had previously trademarked the name "Tessera." See Interview with Gary Latham, Mantech Strategic Associates, Ltd., in Miami, Fla. (Sept. 30, 1994) (Mr. Latham is a consultant employed by NIST); see also Matt Blaze, Protocol Failure in the Escrowed Encryption Standard, in Building in Big Brother, supra note * (manuscript at 131, 145) (noting that "Tessera" is a trademark of Tessera, Inc., which has no connection with the EES project).
The entire EES project has been plagued by problems with intellectual property law. Not only did the names originally selected for the EES chips conflict with existing trademarks, but the algorithm for the escrow concept itself was the subject of an infringement claim by MIT professor Silvio Micali. Professor Micali claimed he had patented the escrow concept. After initially denying there was infringement, NIST agreed to settle Professor Micali's claim by purchasing a nonexclusive license for all EES systems "developed for authorized government law enforcement purposes" whether inside or outside the government. U.S. Dep't of Commerce, Patent Agreement Removes Perceived Barrier to Telecommunications Security System (July 11, 1994) (press release); see also Ellen Messmer, NIST Acknowledges Patent Infringement, Network World, July 25, 1994, at 20 (noting that the exact terms of the NIST settlement agreement were not being revealed).
17. Established in 1952 by presidential directive, the NSA is the U.S. government's chief signals intelligence and cryptological department. See Kahn, supra note 6, at 675-84 (outlining the development of the NSA from 1952 through 1966); Jeffrey Richelson, The U.S. Intelligence Community 15-20 (1985) (describing the bureaucratic structure of the NSA). See generally James Bamford, The Puzzle Palace: A Report on America's Most Secret Agency (1982) (tracing the development of the NSA between 1952 and 1982).
18. Vice President Gore has suggested that the proposal might be modified in the future to allow some companies to use certified private escrow agents rather than depositing their keys directly with the government. See Letter from Vice President Al Gore to Congresswoman Maria Cantwell (July 20, 1994), available online URL ftp://ftp.eff.org/pub/EFF/Policy/Crypto/Clipper/gore_clipper_retreat_cantwell_072094.letter [hereinafter Gore-Cantwell Letter]. But see Statement of Patrick Leahy on Vice President Gore's Clipper Chip Letter (July 21, 1994), available online URL ftp://ftp.eff.org/pub/EFF/Policy/Crypto/Clipper/gore_clipper_retreat_leahy.statement (stating that the Gore letter "represents no change in policy").
NIST is currently exploring alternatives to the existing EES proposal that would rely more heavily on third-party escrow agents. See Interview with Gary Latham, supra note 16.
19. See infra part I.C.1.c.i (discussing the International Traffic in Arms Regulations (ITAR), which restrict the export of cryptographic software).
20. The government can require that federal agencies and government contractors use Clipper. Indeed, the government has announced that the Attorney General will purchase "several thousand" Clipper-equipped telephones. See Office of the Press Secretary, The White House, Statement by the Press Secretary 2 (Apr. 16, 1993), in Office of the Press Secretary, The White House, Government-Developed "Key Escrow" Chip Information Packet (Apr. 16, 1993) (information packet accompanying press release) [hereinafter "Key Escrow" Information Packet].
If Clipper becomes the exclusive encryption protocol used by the U.S. government, then anyone who wishes to communicate with the government concerning nonclassified but sensitive information will have to use Clipper.
21. Without access to relevant classified information, it is impossible to know whether the NSA or other government agencies might have discovered a means of breaking even the most sophisticated publicly available ciphers. Considering the intense secrecy that would surround such a cryptanalytic capability, however, one can safely act as if it does not exist. Even if the government had the capability to break supposedly unbreakable cryptography, such cryptanalysis would be a vital national secret--so vital that the government would never use that capability in a manner that would risk revealing its existence before the middle of the next large war.
22. Cryptography remains of paramount importance in guarding military and other national-security-related secrets during both peacetime and wartime. These uses of cryptography are outside the scope of this Article, although it bears mentioning that to date the government remains by far the largest producer and consumer of cryptography in this country. See ACM Report, supra note 15, at 12 (noting that the private market for cryptography remains a niche market in which a handful of companies gross only a few tens of millions of dollars annually).
23. But see infra part I.C.1.b (discussing the NSA's traffic analysis); infra note 405 and accompanying text (discussing the use of voice recognition as a surveillance tool).
24. See, e.g., Scheppele, supra note 1, at 302 (commenting on the importance of secrecy at the individual level).
25. Of course, cryptography will not protect against all breaches of security. There are many other ways to lose one's privacy, including trusting the wrong people, leaving things in plain sight, and, of course, simple stupidity. Electronic listening devices also make people vulnerable. See, e.g., Kent Greenfield, Comment, Cameras in Teddy Bears: Electronic Visual Surveillance and the Fourth Amendment, 58 U. Chi. L. Rev. 1045, 1047-48 (1991) (describing miniature video cameras and other surveillance technologies); see also High-Tech Tools for Police Will "See Through" Clothes, Int'l Herald Trib., Dec. 19, 1994, at 4 (reporting that police may soon carry electromagnetic wave imagers that detect guns concealed under clothing).
26. See Gilles Garon & Richard Outerbridge, DES Watch: An Examination of the Sufficiency of the Data Encryption Standard for Financial Institution Information Security in the 1990s, Cryptologia, July 1991, at 177, 177 (stating that since its adoption in 1977, DES "has become the most widely used cryptographic system in the world"); Lance J. Hoffman et al., Cryptography Policy, Comm. ACM, Sept. 1994, at 109, 111 (noting that the Clearing House Interbank payment system currently moves an average of $1 trillion each day via wire and satellite). Both systems use the U.S. Data Encryption Standard (DES), which arguably has reached or will soon reach the end of its useful life for high-value security. See infra part I.B (noting that existing methods of encryption are beginning to look dated and vulnerable).
27. See Gerald Murphy, U.S. Dep't of the Treasury, Directive: Electronic Funds and Securities Transfer Policy--Message Authentication and Enhanced Security, No. 16-02, § 3 (Dec. 21, 1992).
28. See American Nat'l Standards Committee on Financial Services, X9 Secretariat, American Bankers Ass'n, American National Standard for Personal Identification Number (PIN) Management and Security 9 (1982) (describing proper ATM encryption standards); see also Beth E. Secaur, Note, Automated Teller Machines Under the New York Banking Law: Do They Serve Community Credit Needs?, 37 Syracuse L. Rev. 117, 120-23 (1986) (discussing the technology and development of ATMs).
29. See E-mail from Ross Anderson, University of Cambridge Computer Laboratory, to Michael Froomkin (Feb. 14, 1994) (on file with author) (discussing the technology and development of ATMs).
30. See Schneier, supra note 12, at 221 (citing cryptographic standards for bank transactions). Banks rely primarily on the U.S. Data Encryption Standard (DES). See infra part I.B.1 (discussing how DES became the standard and why that standard is becoming increasingly vulnerable). Nevertheless, consider this disturbing boast: "'Give me $1 billion and 20 people and I'll shut America down. I'll shut down the Federal Reserve, all the ATMs; I'll desynchronize every computer in the country.'" Technology as Weaponry, Info. Wk., Jan. 10, 1994, at 48, 50 (quoting futurist Alvin Toffler's recollection of an unidentified intelligence official's statement).
31. See Digital Privacy and Security Working Group, Electronic Frontier Found., Privacy, Security, and the National Information Infrastructure 2 (1993) ("Without strong cryptography, no one will have the confidence to use networks to conduct business, to engage in commercial transactions electronically, or to transmit sensitive personal information."); Hoffman et al., supra note 26, at 111 ("One of the consequences of an increasingly electronics-oriented economy will be the need to provide some amount of anonymity and privacy for users of such a digital cash system in order to ensure that electronic money remains anonymous and untraceable . . . ."). For a discussion of digital signatures, see infra Technical Appendix, part C.
32. See infra text preceding note 798 (noting that digital signatures uniquely identify the sender and connect the sender to the message).
33. A properly generated digital signature copied from one message has only an infinitesimal chance of successfully authenticating any other message. See infra note 799 and accompanying text.
34. See infra text accompanying note 798.
35. See Inquiry on Privacy Issues Relating to Private Sector Use of Telecommunications-Related Personal Information, 59 Fed. Reg. 6842, 6842 [hereinafter Inquiry on Privacy Issues] ("As the [National Information Infrastructure] develops, Americans will be able to access numerous commercial, scientific, and business data bases . . . [and] engage in retail, banking and other commercial transactions . . . all from the comfort of their homes."); see also Microsoft and Visa to Provide Secure Transaction Technology for Electronic Commerce, PR Newswire, Nov. 8, 1994, available in WESTLAW, PRNews-C database (announcing plans to provide secure electronic bankcard transactions across global public networks using RSA encryption).
36. Inquiry on Privacy Issues, supra note 35; cf. Jeffrey Rothfeder, Privacy for Sale: How Computerization Has Made Everyone's Private Life an Open Secret 28 (1992) ("As the population grows more computer literate and databanks become more prevalent and sophisticated, long-distance, invisible assaults on privacy will occur more frequently.").
37. See infra part I.A.5 (discussing use of cryptography by criminals).
38. See Economics and Statistics Admin. & Bureau of the Census, U.S. Dep't of Commerce, Statistical Abstract of the United States 1993, at 596 [hereinafter 1993 U.S. Statistical Abstract]. This sum includes all research and development conducted outside the government, regardless of whether funded by industry or government.
39. See GAO Communications Privacy, supra note 15, at 12.
40. See, e.g., ACM Report, supra note 15, at 1 (describing electronic industrial espionage against an Alaskan oil company and by British Airways against Virgin Atlantic Airlines).
41. See James Daly, Laptop Thefts Spur Security Efforts, Computerworld, Oct. 12, 1992, at 1, 12 (discussing theft of laptops to obtain corporate plans, and the ways devised by firms to deny access to information on laptops).
42. See Key Escrow: Its Impact and Alternatives 6 (May 3, 1994) (testimony of Dr. Whitfield Diffie, Distinguished Engineer, Sun Microsystems, Inc., before the Subcommittee on Technology and Law of the Senate Judiciary Committee) (on file with author) (discussing factors making security more essential and more difficult to achieve).
43. According to FBI Director Louis Freeh, the governments of at least 20 nations are "actively engaged in economic espionage." Louis J. Freeh, Address at the Executives' Club of Chicago 8 (Feb. 17, 1994) (transcript available at the FBI) [hereinafter Freeh Speech]; see also ACM Report, supra note 15, at 1 (describing Soviet electronic surveillance of the IBM corporation in the 1970s); id. at 24 (describing the U.S. as the "greatest potential prey" of communications intelligence); David Silverberg, Spy Charges Fray Ties Between U.S., France; French Officials Refute Espionage Accusations, Def. News, May 3, 1993, available in LEXIS, News Library, Curnws File (describing U.S. accusations of industrial espionage by the French government allegedly aimed at 49 U.S. manufacturing companies, 26 financial institutions, and various U.S. government laboratories).
44. See Freeh Speech, supra note 43, at 11 (urging private businesses to be vigilant in protecting valuable information such as research and development results, marketing plans, and corporate negotiating positions).
45. CIA Director James Woolsey described economic and industrial espionage by the CIA as "the hottest current topic in intelligence." Ross Thomas, Industrial Espionage: The CIA's New Frontier, L.A. Times, July 18, 1993, at M2. The suggestion that the CIA diversify into industrial espionage received some support. See, e.g., Gerard P. Burke, Economic Espionage: Government Help Is Needed, Gov't Executive, Nov. 1992, at 56 (noting that the U.S. government should attend to the "intelligence being directed against American business . . . without pause for philosophical agonizing"). It was also criticized because it was unclear how the CIA proposed to define a "foreign" corporation and how it proposed to decide which "domestic" corporations would enjoy the spoils. See William T. Warner, Economic Espionage: A Bad Idea, Nat'l L.J., Apr. 12, 1993, at 13. Later in 1993, Director Woolsey stated that for "ethical and legal reasons" the CIA had decided not to embark on the project. Tim Kennedy, Men and Matters, Moneyclips, Dec. 12, 1993, available in LEXIS, News Library, Moclip File (quoting Director Woolsey's statement on Larry King Live). But see Robert Dreyfuss, Company Spies: The CIA Has Opened a Global Pandora's Box by Spying on Foreign Competitors of American Companies, Mother Jones, May- June 1994, at 16, 16 (suggesting that the "CIA has already begun a clandestine effort to help the American auto industry").
46. A 1993 survey by the American Bar Association Legal Technology Resource Center found that almost 75% of attorneys have a computer assigned to them. See Betty Cline, Protecting Electronic Confidences, Legal Times, June 20, 1994, at S30, S30; see also David P. Vandagriff, Opening the Computer Door, A.B.A. J., Aug. 1994, at 92, 92 (describing a law firm which relies on e-mail to communicate with and attract clients). This does not, of course, prove that all lawyers use their computers.
47. Communications between a client and her attorney, made in confidence by the client when seeking legal advice, are privileged, unless this protection is waived by the client or her representative. This privilege can be waived by unintentional disclosure. See 81 Am. Jur. 2d Witnesses § 379 (1992) ("The presence of a third person indicates a lack of intention that the communications . . . are meant to be confidential."). But see 2 B.E. Witkin, California Evidence § 1074, at 1019 (3d ed. 1986) (discussing California Evidence Code § 954, which permits the holder of the privilege to prevent disclosure of privileged communications, and extends the privilege to communications which are overheard by an "eavesdropper, finder or interceptor").
48. See Jeffrey I. Schiller, Secure Distributed Computing, Sci. Am., Nov. 1994, at 72, 72 (suggesting an increasing frequency of "passive attacks"--eavesdropping--on the Internet). For an assessment of the security of commercial services, such as CompuServe, as a medium for attorney-client confidences, see Ronald Abramson, Protecting Privilege in E-mail Systems, Legal Times, Aug. 15, 1994, at 29.
Although the Internet grew out of the Defense Department network, it is now insecure. As a result, the U.S. intelligence community has created the "Intelink," a secure alternative network for communications too important to be entrusted to the Internet. See William F. Powers, Cloak and Dagger Internet Lets Spies Whisper in Binary Code, Wash. Post, Dec. 28, 1994, at A4 (noting that the intelligence community created the "Intelink" because the "very public, very uncontrollable global mesh of computer networks [(the Internet)] was too risky a place to do business").
49. See Peter H. Lewis, Computer Snoopers Imperil Pentagon Files, Experts Say, N.Y. Times, July 21, 1994, at A1, B10 (reporting that there "'are probably no secure systems on the Internet'" (quoting Peter G. Neumann, principal scientist at SRI International, a think tank formerly known as the Stanford Research Institute)); see also Terri A. Cutrera, Comment, The Constitution in Cyberspace: The Fundamental Rights of Computer Users, 60 UMKC L. Rev. 139, 140-42 (1991) (surveying "hackers' skirmishes with the law").
50. [T]here has been a migration of communications from more secure media such as wirelines or physical shipment to microwave and satellite channels; this migration has far outstripped the application of any protective measures. Consequently, communications intelligence is so valuable that protecting its flow . . . is an important objective of U.S. national security policy.
ACM Report, supra note 15, at 24.
51. See, e.g., Cline, supra note 46, at S30 (describing the risks and ethical concerns of the increased use of technology in the legal field as well as the possible ways to protect confidential information).
52. See Vincent M. Brannigan & Ruth E. Dayhoff, Medical Informatics: The Revolution in Law, Technology, and Medicine, 7 J. Legal Med. 1, 48-50 (1986) (noting that there are several different approaches in the law to protect patient privacy, including tort litigation and violation of state or federal privacy acts).
53. See, e.g., 81 Am. Jur. 2d Witnesses § 448 (1992) (describing physicians' evidentiary privileges).
54. The recent theft of a laptop computer from the hypnotherapist who treated the Princess of Wales illustrates the dangers to doctors and their patients. After the theft, the British press began an orgy of speculation about the revelations that might emerge, see Edward Pilkington, Theft from Princess's Bulimia Therapist Raises New Privacy Fears, Guardian, Aug. 1, 1994, at 3, although none did.
55. See generally infra Technical Appendix, part C (discussing digital signatures).
56. See Gustavus J. Simmons, Subliminal Communication Is Easy Using the DSA, in Advances in Cryptology-EUROCRYPT '93: Workshop on the Theory and Application of Cryptographic Techniques 218, 219 (Tor Helleseth ed., 1994) (describing the various types of information that can and may be digitized onto an ID card).
57. In the 1970s the Pentagon admitted that the Army was stamping discharge papers with 530 different "SPN" code numbers that gave savvy employers derogatory information about servicemen, including some with honorable discharges. The codes did not appear on discharge papers issued to servicemen but were available to employers who asked for more detailed records. Classifications included "drug abuse," "disloyal or subversive security program," "homosexual tendency," "unsuitability--apathy, defective attitudes and inability to expend effort constructively," and "unsuitability--enuresis [bed wetting]." See Dana A. Schmidt, Pentagon Using Drug-Abuse Code, N.Y. Times, Mar. 1, 1972, at 11. Receipt of antiwar literature sufficed to be classified as disloyal or subversive. See Peter Kihss, Use of Personal- Characterization Coding on Military Discharges Is Assailed, N.Y. Times, Sept. 30, 1973, at 46. In response to public pressure, the Pentagon abandoned the program and reissued discharge papers without the codes. See Pentagon Abolishes Code on Discharges of Military Misfits, N.Y. Times, Mar. 23, 1974, at 64; Uncoded Discharge Papers Are Offered to Veterans, N.Y. Times, April 28, 1974, at 33.
58. U.S. paper money is not completely anonymous, however, because each (authentic) bill carries a unique serial number and bills can be marked to facilitate tracking.
59. For example, when my spouse and I purchase surprise gifts for each other, we tend to pay in cash because we have joint checking and credit card accounts.
60. See David Chaum, Achieving Electronic Privacy, Sci. Am., Aug. 1992, at 96, 96-97 (discussing electronic cash). See generally infra Technical Appendix, part C (describing digital signatures).
61. A "perfect crime" works as follows: The criminal commits an act of extortion, for example, blackmail or kidnapping, which does not require face-to-face contact with the victim to make the demand for money. Instead of demanding small unmarked bills, the extortionist demands that the victim publish the digital signatures of a large quantity of E$ in a newspaper. Because the "payoff" occurs via publication in a newspaper, there is no danger of being captured while attempting to pick up a ransom. And because the E$ is untraceable, the extortionist is able to spend it without fear of marked bills, recorded serial numbers, or other forms of detection.
Currently, this strategy would require a sophisticated criminal, because the extortion demand would have to include the result of computations based on large random numbers, but not the random numbers themselves. These computational results would be used by the digital bank as inputs for its production of the verified E$ and would not only ensure the untraceability of the E$ but also prevent anyone but the criminal--who is the only one who knows the large random numbers--from using the E$ whose digital signatures are published in the newspaper. See Sebastiaan von Solms & David Naccache, On Blind Signatures and Perfect Crimes, 11 Computers & Security 581, 582-83 (1992) (describing the mathematical steps that must be followed in order to effectuate a "perfect crime"). If, however, digital money becomes commonplace, all the necessary functions will be built into easily available software. This may not be too far away. See Peter H. Lewis, Attention Shoppers: Internet Is Open, N.Y. Times, Aug. 12, 1994, at D1 (describing the purchase of a compact disc via the Internet by using digital signatures and high-grade cryptography to encrypt a credit card number).
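The blinding step at the heart of the scheme described in note 61 can be sketched with textbook RSA. The Python fragment below is an added illustration with toy numbers and hypothetical variable names; it shows a Chaum-style blind signature rather than the precise protocol given by von Solms and Naccache, but it makes clear why only the party who knows the blinding factor r can recover, and thus spend, the signed E$.

    import math, random

    # Toy RSA key for the "digital bank" (illustrative numbers only).
    p, q = 61, 53
    n = p * q                    # modulus
    phi = (p - 1) * (q - 1)
    e = 17                       # public exponent
    d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)

    m = 1234                     # the value the extortionist wants signed as E$

    # Blinding: pick a secret factor r and disclose only m * r^e mod n.
    while True:
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:
            break
    blinded = (m * pow(r, e, n)) % n

    # The bank signs the blinded value without ever learning m.
    blind_sig = pow(blinded, d, n)

    # Only the holder of r can strip the blinding factor and obtain a
    # valid, spendable signature on m.
    sig = (blind_sig * pow(r, -1, n)) % n
    assert pow(sig, e, n) == m
    print("unblinded signature verifies:", pow(sig, e, n) == m)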
62. See, e.g., Dan Lehrer, Clipper Chips and Cypherpunks, 259 Nation 376, 376 (1994) (describing William Steen's use of PGP to encrypt what sheriff's deputies claimed was potential evidence of traffic in child pornography). Similarly, the defendant in Commonwealth v. Copenhefer would have benefitted from encryption. See 587 A.2d 1353, 1355-56 (Pa. 1991) (holding that an additional warrant was not required to retrieve incriminating data "deleted," but still recoverable, from the defendant's computer's hard disk).
63. See infra text accompanying note 761.
64. See Hoffman et al., supra note 26, at 111.
65. See 18 U.S.C. § 2511 (1988) (providing that "any person who . . . intentionally intercepts . . . any wire, oral, or electronic communication . . . shall be fined . . . or imprisoned not more than five years, or both").
66. See John Markoff, Electronics Plan Aims to Balance Government Access with Privacy, N.Y. Times, Apr. 16, 1993, at A1, A18 ("[C]ellular phone calls can be monitored by anyone with an inexpensive scanner.").
67. See ACM Report, supra note 15, at 10 (referring to "the government's STU-III secure telephone system, which is inaccessible to the general public").
68. See supra note 50.
69. See Hoffman et al., supra note 26, at 111.
70. New AT&T Security Device Targets Spying by Fax, PR Newswire, June 13, 1994, available in LEXIS, News Library, Curnws File.
71. See Microsoft At Work Fax Software Debuts in Windows for Workgroups 3.11, Business Wire, Oct. 5, 1993, available in LEXIS, News Library, Curnws File.
72. See Hoffman et al., supra note 26, at 111.
73. See Schneier, supra note 12, at 437. The most popular program is Phil Zimmermann's Pretty Good™ Privacy (PGP™), currently in MIT freeware version 2.6.2 for noncommercial use only, and for commercial use in PGP Viacrypt version 2.7. See Philip Zimmermann, PGP™ User's Guide Volume I: Essential Topics (Oct. 11, 1994), available online URL ftp://net-dist.mit.edu/pub/PGP [hereinafter PGP™ User's Guide]. PGP is available to U.S. and Canadian residents for file transfer protocol (FTP) from rtfm.mit.edu. FTP is the file transfer program by which documents and program files can be retrieved from any computer on the Internet, where they have been placed for public copying.
In order to comply with U.S. export restrictions, however, the Massachusetts Institute of Technology requires that would-be downloaders read the warnings located in a file whose name changes every 30 minutes before obtaining the short-lived access code needed to download the program. Foreign residents, or those with less patience, can download the file by connecting to an English server: ftp://ftp.ox.ac.uk/pub/crypto/pgp, or a German server: ftp://ftp.informatik.uni-hamburg.de:/pub/virus/crypt/pgp, and then selecting the appropriate sub-directory for the operating system and PGP version of their choice. For an excellent introduction to PGP, see Simson Garfinkel, PGP: Pretty Good Privacy (forthcoming Jan. 1995).
74. See infra part I.C.1.a (discussing law enforcement's view of the importance of electronic intelligence gathering).
75. The Alien and Sedition Act made it a crime to publish "false, scandalous and malicious writing" against the United States government, Congress, or the President, with intent to excite "hatred" against them. 1 Stat. 596, 596 (1798). The Act, supported primarily by the Federalist Party, did not make it a crime to excite hatred against the Vice President or publish falsehoods about him because the Vice President at the time was Thomas Jefferson, who was not a Federalist. See generally James M. Smith, Freedom's Fetters: The Alien and Sedition Laws and American Civil Liberties (1956) (discussing the background, enforcement, and implications of the Alien and Sedition laws).
76. See William Preston, Jr., Aliens and Dissenters: Federal Suppression of Radicals 1903-1933, at 208-37 (1963) (describing the secret, mass roundups in the 1920s of some 10,000 immigrants and others active in the labor movement, the Socialist Party, the Communist Party, and other dissident groups; their interrogation without access to counsel or bail; the illegal seizure of their records; the attempts to extort confessions from them; and the FBI investigations of government officials who sought to ensure due process for these arrestees).
77. See Exec. Order No. 9066, 7 Fed. Reg. 1407 (1942) (authorizing internment camps); see also Korematsu v. United States, 323 U.S. 214, 217-18 (1944) (rejecting several constitutional challenges to the internment of U.S. citizens of Japanese descent pursuant to the Act); Hirabayashi v. United States, 320 U.S. 81, 100-01 (1943) (rejecting an equal protection challenge to a curfew order pursuant to the Act); Act of Mar. 21, 1942, Pub. L. No. 77-503, 56 Stat. 173 (criminalizing the refusal to comply with internment orders of a military commander). See generally Eugene V. Rostow, The Japanese American Cases--A Disaster, 54 Yale L.J. 489 (1945) (describing and criticizing the treatment of Japanese aliens and U.S. citizens of Japanese origin during World War II).
78. See Frank J. Donner, The Age of Surveillance: The Aims and Methods of America's Political Intelligence System 20 (1980) (arguing that COINTELPRO, an FBI counterintelligence program, is a form of punishment directed at individuals or readily identifiable groups for past actions without trial, and is thus an attainder).
79. Robert Michels, Political Parties 15 (Eden Paul & Cedar Paul trans., 1962) (arguing that "oligarchy . . . is an intrinsic part of bureaucracy or large-scale organization").
80. National security is defined as the "national defense and foreign relations" of the United States. Exec. Order No. 12,356, 47 Fed. Reg. 14,874 (1982), reprinted in 50 U.S.C. § 401 (1988).
81. S. Rep. No. 755, 94th Cong., 2d Sess., pt. 2, at 4 (1976) [hereinafter Church Committee Report].
83. Id. at 7, 54-57. By 1958, the FBI had whittled down the list to only 12,870 names, but the FBI placed the names it removed from the round-up list on its "Communist Index" (renamed the "Reserve Index" in 1960) for "priority consideration" for "action" after the first group had been detained. Id. at 55-56.
By 1972, the FBI had access to the fingerprints of more than 85 million U.S. residents. See Morris D. Forkosch, Freedom of Information in the United States, 20 DePaul L. Rev. 1, 97 n.347 (1971).
84. See Sanford J. Ungar, FBI 137 (1975) (describing wiretapping of the Black Panthers).
85. See Church Committee Report, supra note 81, at 6 (noting that the FBI had 500,000 domestic intelligence files, many with more than one name included).
90. See David Burnham, The Rise of the Computer State 20, 23-25 (1983). Although the Internal Revenue Code, 26 U.S.C. § 6103 (1988 & Supp. V 1993), provides for the confidentiality of tax returns, one commentator has described this restriction as "quite permeable." Steven A. Bercu, Toward Universal Surveillance in an Information Age Economy: Can We Handle Treasury's New Police Technology?, 34 Jurimetrics J. 383, 429 (1994).
91. See Church Committee Report, supra note 81, at 56-59 (discussing domestic CIA activities).
92. For example, a former FBI officer stated, "We never gave any thought to [whether proposed actions were legal] because we were just naturally pragmatists. The one thing we were concerned about was this: will this course of action work, will it get us what we want . . . ." Id. at 968 (quoting testimony of former FBI Assistant Director for Domestic Intelligence William Sullivan).
93. Imagine that a new client comes to consult you and says that she is about to form a new political action group. This group will organize demonstrations and will encourage a general strike to support an unpopular political or social opinion. The group intends to consult you frequently so as to stay within the bounds of the law, but it intends to use every politically and socially disruptive tactic that the law allows in order to gain the maximum audience for its platform. Your client also believes that at times some group members may conclude that extreme cases of injustice require peaceful civil disobedience. On the way out, your new client asks if you think the group should worry about having its telephones tapped by the police. What do you say?
94. Pub. L. No. 90-351, tit. III, § 802, 82 Stat. 197, 211-25, reprinted in 1968 U.S.C.C.A.N. 237, 253 (current version at 18 U.S.C. §§ 2510-2521 (1988 & Supp. V 1993) (Electronic Communications Privacy Act of 1986)) [hereinafter Title III].
95. The volumes of successful suppression orders demonstrate that the police wiretap more than the courts believe is justified; most suppression orders, however, involve cases where a court issued some process, if only on the basis of an inadequate foundation. Exposure of illegal, bad-faith, and warrantless wiretaps seems to have become rare, although clearly not completely a thing of the past. See, e.g., Eric Schmitt, Suffolk Officers Testify They Wiretapped Illegally, N.Y. Times, Jan. 14, 1988, at B3 (reporting narcotics officers' testimony that they conducted illegal wiretaps with the approval of their supervisor and the chief of the District Attorney's narcotics bureau).
96. Robert D. Hershey, Jr., I.R.S. Staff Is Cited in Snoopings, N.Y. Times, July 19, 1994, at D1.
97. See Office of Technology Assessment, Congress of the United States, Information Security and Privacy in Network Environments 2-3 (1994) (OTA-TCT-606) [hereinafter OTA Information Security].
98. Elaine Viets, And the Winner Is: The Trashiest Ever, St. Louis Post-Dispatch, May 29, 1990, at 3D; see also Mark Miller, Vermont Writer Wins PEN Award, Wash. Times, Apr. 21, 1993, at E4, available in LEXIS, News Library, Curnws File (quoting ex-postal worker E. Annie Proulx as saying, "I worked in the post office, and I know perfectly well that everyone loves to read postcards. . . . Not only do [the postal workers] read them, but if they get a particularly juicy one, they show it to their co-workers" (alteration in original)).
99. See Sam J. Ervin, Jr., The Whole Truth: The Watergate Conspiracy 133-37 (1980) (discussing Nixon's electoral dirty tricks); Jonathan Alter & Richard Sandza, When Tricks Turn Dirty, Newsweek, July 18, 1983, at 18, 18 (reporting that Jimmy Carter agreed not to bug television networks at the 1976 Democratic Convention only after a media advisor raised the specter of Watergate). See generally Bruce L. Felknor, Dirty Politics (1966) (discussing the history of dirty tricks in American politics).
100. See, e.g., Bell of Pa., Consumer Yellow Pages: Philadelphia 276-77 (1994).
101. See Tom Seibel, The Spy Shop: Now Anyone Can Play James Bond, Chicago Sun-Times, May 16, 1993, at 4, available in LEXIS, News Library, Majpap File.
102. See Cryptographic Algorithms for Protection of Computer Data During Transmission and Dormant Storage, 38 Fed. Reg. 12,763, 12,763 (1973) ("The increasing volume, value and confidentiality of these records regularly transmitted and stored by commercial and government agencies has led to heightened recognition and concern over their exposure to unauthorized access and use. . . . The need for protection is then apparent and urgent."); Encryption Algorithms for Computer Data Protection, 39 Fed. Reg. 30,961, 30,961 (1974) ("Because of the significant value or sensitivity of communicated and stored data, the need for adequate protection of this data from theft and misuse has become a national issue.").
103. See Encryption Algorithm for Computer Data Protection, 40 Fed. Reg. 12,134, 12,134 (1975) ("In order to insure compatibility of secure data, it is necessary to establish a data encryption standard and develop guidelines for its implementation and use.").
104. The algorithm the NBS ultimately selected did not meet all these criteria because other agencies considered it too strong to export. See infra part I.C.1.c.i (describing U.S. export control of encryption software under the International Traffic in Arms Regulations (ITAR)).
105. Bamford, supra note 17, at 347; see U.S. Senate Select Comm. on Intelligence, Unclassified Summary: Involvement of the NSA in the Development of the Data Encryption Standard, reprinted in IEEE Comm., Nov. 1978, at 53, 53-55 (discussing the debate over the NSA involvement in the development of the algorithm).
106. DES, issued as FIPS 46 in January 1977, was reviewed, slightly revised, reaffirmed for federal government use in 1983 and 1987, and reissued as FIPS 46-1 in January 1988; on September 11, 1992, NIST announced a third review of FIPS 46-1, DES, and reaffirmed it for another five years as FIPS 46-2. See Revision of Federal Information Processing Standard (FIPS) 46-1 Data Encryption Standard (DES), 58 Fed. Reg. 69,347, 69,347-48 (1993) [hereinafter FIPS 46-2].
DES is identical to the ANSI standard Data Encryption Algorithm (DEA) defined in ANSI X3.92-1981. See Eric Bach et al., Cryptography FAQ (05/10: Product Ciphers Cryptology) § 5 (Aug. 30, 1993), available online URL ftp://rtfm.mit.edu/pub/usenet/news.answers/cryptography-faq/part05.
107. See FIPS 46-2, supra note 106, at 69,348.
108. See Garon & Outerbridge, supra note 26, at 179.
109. See FIPS 46-2, supra note 106, at 69,348.
110. See Bamford, supra note 17, at 346 (stating that the key was originally 128 bits long); Schneier, supra note 12, at 221 (stating that the key was originally 112 bits).
111. See Bamford, supra note 17, at 348 (noting that if the IBM 128-bit key had been used, "[a]s opposed to the moderate $5000 price tag, each solution would have cost an unimaginable $200 septillion, or $200,000,000,000,000,000,000,000,000"); infra text accompanying notes 776-80.
112. "Back doors" are sometimes inaccurately called "trap doors," although technically a "trap door" function is one which is computationally easy in comparison to its inverse. For example, multiplying large prime numbers is much, much easier than factoring a very large number whose only factors are two large primes. See Brian Hayes, The Magic Words Are Squeamish Ossifrage, 82 Am. Scientist 312, 312 (1994); see also 19 Authors, Essay, The Law of Prime Numbers, 68 N.Y.U. L. Rev. 185, 188 n.14 (1993) (I had to cite it).
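The asymmetry described in note 112 can be seen even at toy scale. The Python sketch below is an added illustration; the primes and the naive trial-division routine are arbitrary choices. Multiplying two primes is effectively instantaneous, while recovering them by trial division is measurably slower, and with primes hundreds of digits long the gap becomes computationally prohibitive.

    import time

    p, q = 1_000_003, 1_000_033          # two small primes (illustrative)

    start = time.perf_counter()
    n = p * q                             # the "easy" direction
    multiply_time = time.perf_counter() - start

    def trial_factor(n):
        # Naive trial division: the "hard" direction, even at toy sizes.
        f = 3
        while f * f <= n:
            if n % f == 0:
                return f, n // f
            f += 2
        return None

    start = time.perf_counter()
    factors = trial_factor(n)
    factor_time = time.perf_counter() - start

    print(f"multiply: {multiply_time:.2e}s  factor: {factor_time:.2e}s  -> {factors}")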
113. See Bamford, supra note 17, at 347. For a full description of the "S-box" technique, complete with mind-numbing diagrams, see Schneier, supra note 12, at 224-41, or Encryption Algorithm for Computer Data Protection, supra note 103, at 12,134.
114. See Schneier, supra note 12, at 240.
115. Exports are controlled pursuant to the ITAR, 22 C.F.R. §§ 120-130 (1994). See infra part I.C.1.c.i.
116. ACM Report, supra note 15, at 5.
117. See, e.g., Schneier, supra note 12, at 224-41.
118. See Letter from William B. Robinson, Director, Office of Defense Trade Controls, U.S. Dep't of State, to Phil Karn (Mar. 2, 1994), available online URL ftp://ftp.eff.org/pub/EFF/policy/crypto/ITAR_export/Karn_Schneier_export_case/book_1st. response (noting that a book is "not subject to the licensing jurisdiction of the Department of State since the item is in the public domain").
119. In contrast to DES, the government has classified the SKIPJACK encryption algorithm used by the Clipper family of chips. This should prevent the SKIPJACK algorithm from being exported. See infra note 187 and accompanying text (discussing the SKIPJACK algorithm).
120. See FIPS 46-2, supra note 106, at 69,347.
121. See generally Garon & Outerbridge, supra note 26, at 177-82 (arguing that DES is becoming increasingly vulnerable to attack as the cost of breaking it decreases exponentially).
122. Michael J. Wiener, Efficient DES Key Search § 10 (Aug. 20, 1993) (unpublished manuscript, on file with author). The attack requires that the attacker have access to the plaintext as well as the ciphertext of a single message. Cryptologists call this a "known plaintext" approach. Such attacks are easier than one might suppose, because one does not need a long, known plaintext and it is often possible to infer something about the contents of the message one seeks to cryptanalyze.
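A "known plaintext" keysearch of the kind Wiener describes can be illustrated with the following Python sketch. The 16-bit XOR "cipher," the sample message, and the key are hypothetical stand-ins for DES; the point is only that one matching plaintext-ciphertext pair lets an attacker test every candidate key until the observed ciphertext is reproduced.

    def toy_encrypt(plaintext: bytes, key: int) -> bytes:
        # Deliberately weak 16-bit XOR cipher standing in for DES.
        key_bytes = key.to_bytes(2, "big")
        return bytes(b ^ key_bytes[i % 2] for i, b in enumerate(plaintext))

    known_plaintext = b"WIRE $1,000,000 TO ACCT 0042"
    secret_key = 0xBEEF                              # unknown to the attacker
    observed_ciphertext = toy_encrypt(known_plaintext, secret_key)

    # Brute force: try all 2^16 keys against the known pair.
    for candidate in range(2 ** 16):
        if toy_encrypt(known_plaintext, candidate) == observed_ciphertext:
            print(f"recovered key: {candidate:#06x}")
            break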
124. See Garon & Outerbridge, supra note 26, at 181.
125. In fact, there is more to a successful electronic funds transfer attack than breaking the code. Banking protocols contain other safeguards designed to thwart a "known plaintext" attack, making the calculations in the text more of a theoretical possibility than a likelihood. Letter from Dorothy Denning, Professor and Chair, Computer Sciences Department, Georgetown University, to Michael Froomkin 2 (Sept. 17, 1994) (on file with author).
126. Garon and Outerbridge estimate a one in 200,000 chance for 512 linked machines running under 20 MIPS each, at which speed they are capable of 15,000 DES operations per second. See Garon & Outerbridge, supra note 26, at 187. Pentiums should be at least seven times as fast. See John Blackford, The Promise of the P6, Computer Shopper, Aug. 1994, at 146 (noting that a 100Mhz Pentium is rated at 150 MIPS and that successor chips will be twice as fast). The estimate is probably low. Phil Karn recently reported using an optimized DES code on a 50 Mhz 486, and achieving more than 38,000 DES encryptions per second. See Posting from Phil Karn to Cypherpunks Mailing List (Aug. 6, 1994) (on file with author). Pentiums operating at 100 Mhz would probably run more than 2.5 times faster than this.
127. A MIPS-year is the computer power of a computer capable of executing one million instructions per second operating for a year. Five thousand MIPS-years is approximately equal to the power of 33 100Mhz Pentiums running for a year. See Blackford, supra note 126, at 146.
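The arithmetic behind the estimate in note 127, using only figures already cited in these notes (a 100Mhz Pentium rated at roughly 150 MIPS, see note 126), can be checked in a few lines of Python:

    pentium_mips = 150                    # 100Mhz Pentium rating cited in note 126
    workload_mips_years = 5000            # the RSA-129 effort described in note 128
    print(workload_mips_years / pentium_mips, "Pentium-years")   # about 33.3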
128. See Hayes, supra note 112, at 312; Gina Kolata, 100 Quadrillion Calculations Later, Eureka!, N.Y. Times, Apr. 27, 1994, at A13. The problem, known as RSA-129, was first publicized in the Mathematical Games column of Scientific American in 1977. The challenge was to factor 114,381,625,757,888,867,669,235,779,976,146,612,010,218,296,721,242,362,562,561,842,935,706,935,245,733,897,830,597,123,563,958,705,058,989,075,147,599,290,026,879,543,541. See Martin Gardner, Mathematical Games: A New Kind of Cipher That Would Take Millions of Years to Break, Sci. Am., Aug. 1977, at 120, 123. The problem remained unsolved for 17 years. The answer, 5000 MIPS-years later, was the factors 3,490,529,510,847,650,949,147,849,619,903,898,133,417,764,638,493,387,843,990,820,577 and 32,769,132,993,266,709,549,961,988,190,834,461,413,177,642,967,992,942,539,798,288,533. See Hayes, supra note 112, at 313. The message encoded with the 129-digit key said: "The magic words are squeamish ossifrage." See id. at 312.
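Although finding the factors of RSA-129 took roughly 5000 MIPS-years, checking them is trivial; the following Python fragment (an added illustration) multiplies the two published factors and confirms that the product is the challenge number.

    n = int(
        "114381625757888867669235779976146612010218296721242362562561842935"
        "706935245733897830597123563958705058989075147599290026879543541"
    )
    p = int("3490529510847650949147849619903898133417764638493387843990820577")
    q = int("32769132993266709549961988190834461413177642967992942539798288533")
    assert p * q == n                      # verification takes a fraction of a second
    print("RSA-129 factors check out:", p * q == n)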
129. For a definition of RSA, see Paul Fahn, RSA Laboratories, Answers to Frequently Asked Questions About Today's Cryptography § 2.1 (Sept. 20, 1993), available online URL ftp://rsa.com/pub/faq/Part1; ftp://rsa.com/pub/faq/Part2; ftp://rsa.com/pub/faq/Part3 [hereinafter RSA Cryptography Today FAQ].
130. Brute-force attacks on RSA keys can use shortcuts, because RSA keys are cracked by factoring large numbers that are the product of two primes. Shortcuts exist for this problem so that not every possible number needs to be tested. "A rule of thumb suggests that adding 10 decimal digits to the length of a number [used as an RSA key] makes it from five to 10 times as hard to factor." Hayes, supra note 112, at 316. Adding the same number of decimal digits to DES would result in a larger increase in the computational complexity of a brute-force keysearch.
131. See Wiener, supra note 122, § 9 (providing a schematic for triple-DES encryption).
132. Some versions of triple-DES use a different key for each of the three stages, while others reuse the first key for the final round. The two-key version is estimated to be 10^13 times as resistant to a brute-force attack as single DES; the three-key version is estimated to be even more secure. See id.
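The encrypt-decrypt-encrypt (EDE) composition used by triple-DES can be sketched as follows. The Python fragment below is illustrative only and assumes the third-party PyCryptodome package for the single-DES primitive; the key and block values are arbitrary, and setting the third key equal to the first gives the two-key variant discussed in note 132.

    from Crypto.Cipher import DES          # PyCryptodome (assumed available)

    k1, k2, k3 = b"8bytekey", b"2ndkey!!", b"thirdkey"   # use k3 = k1 for two-key triple-DES
    block = b"ATTACK!!"                                   # DES operates on 8-byte blocks

    def ede_encrypt(block, k1, k2, k3):
        step1 = DES.new(k1, DES.MODE_ECB).encrypt(block)
        step2 = DES.new(k2, DES.MODE_ECB).decrypt(step1)  # middle stage is a decryption
        return DES.new(k3, DES.MODE_ECB).encrypt(step2)

    def ede_decrypt(block, k1, k2, k3):
        step1 = DES.new(k3, DES.MODE_ECB).decrypt(block)
        step2 = DES.new(k2, DES.MODE_ECB).encrypt(step1)
        return DES.new(k1, DES.MODE_ECB).decrypt(step2)

    ciphertext = ede_encrypt(block, k1, k2, k3)
    assert ede_decrypt(ciphertext, k1, k2, k3) == block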
133. The alternative explanations are legion. The NSA might not want to certify a cipher it knew to be insecure, although it also might not wish to let it be known that a cipher considered secure by others was in fact vulnerable. The most likely explanation, however, is that triple-DES is so secure as to be computationally infeasible to break.
The hypothesis that the NSA opposes triple-DES because it is too hard to break gains support from the NSA's lobbying campaign to discourage the X9 secretariat of the American Bankers Association from undertaking a standards development process that might lead to the adoption of triple-DES as an approved option for domestic and international financial transactions. The NSA, which is a member of the X9 group, gave several reasons for its opposition, including:
* The government's commitment to EES and DES is inconsistent with this objective.
* Triple-DES is not exportable.
* "[F]urther proliferation of triple-DES is counter to national security and economic concerns."
NSA Reasons for Negative Vote (Oct. 18, 1994) (circulated with X9 letter ballot) (copy on file with author). Ultimately, after a reballoting of its executive committee, the X9 membership decided to undertake the standards development process for triple-DES by forwarding the issue to its X9F Subcommittee on Data and Information Security. See Letter from Cindy Fuller, Associate Director X9 Secretariat, to Michael Froomkin (Jan. 31, 1995) (on file with author).
134. Indeed, NIST has stated that no publicly available cipher was suitable because such a cipher could be used without escrow. See National Institute of Standards and Technology, Key Escrow Initiative Questions and Answers 3 (July 29, 1993) (on file with author) [hereinafter Key Escrow Initiative Q&A]. NIST selected SKIPJACK in order to "assure[] no one can use the algorithm in non-escrowed systems." Id. The idea is to maximize the possibility that no one will be able to use SKIPJACK with a key shielded from the government.
Because the SKIPJACK algorithm at the heart of the Clipper Chip is classified, it has not had the benefit of almost 25 years of determined attack by academic cryptologists. Even though the NSA spent 10 years developing SKIPJACK, see id. at 2-3, this lack of publicity leaves open the possibility that it has an intentional or unintentional back door, something that seems very unlikely with DES.
135. Chaining, in which each secure message contains the key to the next message, will work only where the same two parties send each other a continuous stream of messages and are certain that the delays between messages will be short. If there are any significant delays, the stream becomes vulnerable to probabilistic attack. See supra text accompanying notes 124-25.
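The chaining arrangement described in note 135 can be sketched in a few lines of Python. The fragment below is an added illustration that assumes the third-party "cryptography" package, whose Fernet construction stands in for whatever cipher the parties actually use; each encrypted message carries the key needed to read the next one.

    from cryptography.fernet import Fernet

    def chain_send(messages, first_key):
        # Each packet contains the key for the next message, then the message itself.
        keys = [first_key] + [Fernet.generate_key() for _ in messages[1:]]
        packets = []
        for i, msg in enumerate(messages):
            next_key = keys[i + 1] if i + 1 < len(keys) else b"END"
            packets.append(Fernet(keys[i]).encrypt(next_key + b"|" + msg))
        return packets

    def chain_receive(packets, first_key):
        key, plaintexts = first_key, []
        for packet in packets:
            next_key, msg = Fernet(key).decrypt(packet).split(b"|", 1)
            plaintexts.append(msg)
            key = next_key
        return plaintexts

    first_key = Fernet.generate_key()
    messages = [b"message one", b"message two", b"message three"]
    assert chain_receive(chain_send(messages, first_key), first_key) == messages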
136. Turner Broadcasting Sys., Inc. v. FCC, 114 S. Ct. 2445, 2451 (1994).
137. See, e.g., Freeh Speech, supra note 43.
138. On October 25, 1994, President Clinton signed the digital telephony bill. See Communications Assistance for Law Enforcement Act, Pub. L. No. 103-414, 108 Stat. 4279 (1994). Both houses of Congress passed the bill shortly before the end of the legislative session. See 140 Cong. Rec. S14666 (daily ed. Oct. 7, 1994) (reporting the Senate's passage of the bill by voice vote); 140 Cong. Rec. H10917 (daily ed. Oct. 5, 1994) (reporting the House's passage of the bill by two-thirds vote); see also Sabra Chartrand, Clinton Gets a Wiretapping Bill That Covers New Technologies, N.Y. Times, Oct. 9, 1994, at A27.
On the digital telephony bill, see generally Jaleen Nelson, Comment, Sledge Hammers and Scalpels: The FBI Digital Wiretap Bill and Its Effect on Free Flow of Information and Privacy, 41 UCLA L. Rev. 1139 (1994) (arguing that legislation requiring communication through only certain media and also requiring communication providers to affirmatively assist the government in wiretapping is inconsistent with rights of free flow of information and privacy in U.S. and international law).
139. William Shakespeare, The Third Part of King Henry VI act 4, sc. 7, l. 37 (Andrew S. Cairncross ed., Methuen & Co. Ltd. 1964).
140. It is clear that plans for EES have been gestating for years in the national security bureaucracy. Because, however, the Clinton Administration has adopted them wholeheartedly, this Article refers to the plan as the Administration's proposal.
141. "With hindsight, the intelligence community might consider the public disclosure of the DES algorithm to have been a serious error and one that should not be repeated." ACM Report, supra note 15, at 25.
142. See Stewart A. Baker, Don't Worry, Be Happy: Why Clipper Is Good for You, Wired, June 1994, at 100, 100 (debunking seven "myths" about key escrow encryption).
143. Freeh Speech, supra note 43, at 13; see also OTA Information Security, supra note 97, at 9-10 (noting increasingly frequent portrayals of cryptography as a threat to domestic security and public safety).
144. Title III authorizes the use of wiretaps. See 18 U.S.C. § 2516 (1988 & Supp. V 1993). Orders for pen registers, which record the telephone numbers called but not the content of the conversation, are much more frequent than wiretap orders. Also, not every wiretap order necessarily falls under Title III.
145. See Communications and Computer Surveillance, Privacy and Security: Hearing Before the Subcomm. on Technology, Environment and Aviation of the House Comm. on Science, Space, and Technology, 103d Cong., 2d Sess. 10 (1994) (statement of James Kallstrom, Special Agent in Charge, Federal Bureau of Investigation) [hereinafter Kallstrom Statement] (indicating that, in the 10-year period ending in 1992, more than 22,000 convictions have resulted from court-authorized surveillances); Administrative Office of the U.S. Courts, 1993 Report on Applications for Orders Authorizing or Approving the Interception of Wire, Oral, or Electronic Communications 6 (1993) [hereinafter Wiretap Report] (stating that in 1993 "a total of 2,428 persons [were] arrested as a result of electronic surveillance activity; of those arrested, 413 were convicted").
146. See Kallstrom Statement, supra note 145, at 3; Wiretap Report, supra note 145, at 4 (showing fluctuations between 600 and 1000 court- ordered wiretaps per year between 1983 and 1993).
147. See Wiretap Report, supra note 145, at 5.
148. See Hoffman et al., supra note 26, at 115.
149. See Wiretap Report, supra note 145, at 5.
150. For an interesting analysis of the costs and benefits of wiretapping, which concludes that the digital telephony bill is not cost-effective, see Robin Hanson, Can Wiretaps Remain Cost-Effective?, Comm. ACM, Dec. 1994, at 13.
151. Hoffman et al., supra note 26, at 115. Similarly, the FBI warns that the conversion of telephone networks to digital and fiber-optic systems threatens to "make it impossible for the FBI to carry out court-approved surveillance in life-and-death cases." Freeh Speech, supra note 43, at 12. According to FBI Director Louis Freeh, "Development of technology is moving so rapidly that several hundred court-authorized surveillances already have been prevented by new technological impediments associated with advanced communications equipment." Id. at 13.
The Electronic Privacy Information Center (EPIC) recently filed suit under the Freedom of Information Act (FOIA) to force the FBI to release internal studies which formed the basis for testimony to Congress that new technologies were already having a harmful effect on law enforcement's wiretapping capabilities. See Electronic Privacy Information Center, Press Release, Group Seeks Release of FBI Wiretap Data, Calls Proposed Surveillance Legislation Unnecessary (Aug. 9, 1994). The FBI is resisting the suit. See Telephone Interview with David Sobel, Legal Counsel, Electronic Privacy Information Center (Nov. 29, 1994).
152. Office of the Press Secretary, The White House, supra note 20, at 1.
153. Even an enthusiastic defender of an absolutist vision of the First Amendment conceded that an "absolute" right against being tortured might nonetheless find room for an exception in the case of "the man who knew where the [atom] bomb [was ticking, but] sat grinning and silent in a chair" far from the place he had planted it. Charles L. Black, Jr., Mr. Justice Black, The Supreme Court, and the Bill of Rights, Harper's, Feb. 1961, at 63, reprinted in The Occasions of Justice: Essays Mostly on Law 89, 99 (1963). Explaining this position in a Constitutional Law class I attended at Yale in 1984, Professor Black stated that he believed torture morally justified in this extreme and hypothetical case. Once the torturer extracted the information required, Black continued, he should at once resign to await trial, pardon, and/or a decoration, as the case might be.
154. ACM Report, supra note 15, at 23.
155. See, e.g., Kahn, supra note 6, at 8 (discussing the U.S. Navy's use of traffic analysis to determine the movements of Japanese forces in World War II).
156. See Bamford, supra note 17, at 359 (quoting then-NSA Director Bobby Ray Inman as warning: "Application of the genius of the American scholarly community to cryptographic and cryptanalytic problems, and widespread dissemination of resulting discoveries, carry the clear risk that some of the NSA's cryptanalytic successes will be duplicated, with a consequent improvement of cryptography by foreign targets.").
157. See ACM Report, supra note 15, at 25.
The goals of U.S. export control policy in the area of cryptography are (i) to limit foreign availability of cryptographic systems of strategic capability, namely, those capable of resisting concerted cryptanalytic attack; (ii) to limit foreign availability of cryptographic systems of sufficient strength to present a serious barrier to traffic selection or the development of standards that interfere with traffic selection by making the messages in broad classes of traffic (fax, for example) difficult to distinguish . . . .
Id.
158. For a general survey of high-technology export controls, see Peter Swan, A Road Map to Understanding Export Controls: National Security in a Changing Global Environment, 30 Am. Bus. L.J. 607 (1992).
159. See ACM Report, supra note 15, at 25.
160. 15 C.F.R. §§ 768-99 (1994). The EAR are administered by the Bureau of Export Administration in the Department of Commerce. The statutory authority for the EAR, the Export Administration Act of 1979, 50 U.S.C. app. §§ 2401-2420 (1988 & Supp. IV 1992), lapsed on August 20, 1994. See 50 U.S.C.A. app. § 2419 (West Supp. 1994). President Clinton issued an executive order requiring that the EAR be kept in force to "the extent permitted by law" under the International Emergency Economic Powers Act (IEEPA), 50 U.S.C. §§ 1701-1706 (1988 & Supp. IV 1992). See Exec. Order No. 12,924, 59 Fed. Reg. 43,437 (1994).
161. See 22 C.F.R. § 121.1 (XIII)(b)(1) (1994). The ITAR are administered by the Office of Defense Trade Controls in the Department of State. If the State Department chooses, it can transfer jurisdiction of an export application to the Commerce Department. The statutory authority for the ITAR is the Arms Export Control Act, codified as amended at 22 U.S.C. § 2778 (1988 & Supp. IV 1992).
162. See Evan R. Berlack & Cecil Hunt, Overview of U.S. Export Controls, in Coping with U.S. Export Controls 1994, at 11, 26 (PLI Com. Law & Practice Course Handbook Series No. A-705, 1994) (arguing that "under-staffing, technically complex applications, [and] many layers of review within DOD . . . [as well] as between DOD and State" characterize the permit application process under ITAR); Ira S. Rubinstein, Export Controls on Encryption Software, in Coping with U.S. Export Controls 1994, supra, at 177, 194 (stating that, under the ITAR, the State Department's Office of Defense Trade Controls (DTC) has primary jurisdiction over cryptographic software). The export of articles or services on the U.S. Munitions List is regulated by the DTC under the ITAR. See 22 C.F.R. § 120.5 (1994). The DTC settles disputes regarding whether an item is on the U.S. Munitions List according to the commodity jurisdiction procedure, which determines whether the ITAR or the EAR will apply. See 22 C.F.R. § 120.4 (1994).
Whether the ITAR unconstitutionally restrict free speech is outside the scope of this Article. For a discussion of this topic, see Constitutionality of the Proposed Revision of the International Traffic in Arms Regulations, 5 Op. Off. Legal Counsel 202, 213-14 (1981) (finding that the ITAR have constitutional and unconstitutional applications, and that they should be narrowed so that they are less likely to apply to certain protected speech); Christine Alexander, Preserving High Technology Secrets: National Security Controls on University Research and Teaching, 15 Law & Pol'y Int'l Bus. 173, 203 (1983) (noting that export controls on technology raise constitutional issues because technical expression may be considered speech and because such controls infringe upon the right of academic freedom); Mary M. Cheh, Government Control of Private Ideas--Striking a Balance Between Scientific Freedom and National Security, 23 Jurimetrics J. 1, 22 (1982) (arguing that cryptographic information is protected by the First Amendment); James R. Ferguson, Scientific Inquiry and the First Amendment, 64 Cornell L. Rev. 639, 654-56 (1979) (arguing that scientific inquiry merits some degree of protection by the First Amendment); Harold P. Green, Constitutional Implications of Federal Restrictions on Scientific Research and Communication, 60 UMKC L. Rev. 619, 643 (1992) (suggesting that the national security basis of governmental restrictions on scientific freedom may be weak because it is impossible to hold national security secrets effectively for even short periods of time); Ruth Greenstein, National Security Controls on Scientific Information, 23 Jurimetrics J. 50, 76-83 (1982) (arguing that noncommercial scientific communication should receive full First Amendment protection and noting that the ITAR may be unconstitutional unless interpreted narrowly because they may lack ties to compelling government interests); David A. Wilson, National Security Control of Technological Information, 25 Jurimetrics J. 109, 128-29 (1985) (suggesting that research contracts be used as devices for implementing the ITAR to avoid the constitutional difficulties associated with requiring licenses); Roger Funk, Comment, National Security Controls on the Dissemination of Privately Generated Scientific Information, 30 UCLA L. Rev. 405, 441-44 (1982) (arguing that current export control laws are overbroad in their restriction of speech and therefore employ excessive means to protect national security); Kenneth J. Pierce, Note, Public Cryptography, Arms Export Controls, and the First Amendment: A Need for Legislation, 17 Cornell Int'l L.J. 197, 213-19 (1984) (arguing that the ITAR's licensing requirement is an unconstitutional prior restraint of protected speech); Allen M. Shinn, Jr., Note, The First Amendment and the Export Laws: Free Speech on Scientific and Technical Matters, 58 Geo. Wash. L. Rev. 368, 397-400 (1990) (arguing that the regulation of scientific expression that would be unconstitutional if imposed directly would also be unconstitutional if imposed by contract).
163. See supra part I.B.1 (discussing how DES became a standard in the United States).
164. See OTA Information Security, supra note 97, at 154; see also GAO Communications Privacy, supra note 15, at 6-7, 24-28.
165. See 22 C.F.R. § 121.1(XIII)(b) (1994) (designating cryptographic software as auxiliary military equipment). The ITAR state: "Software includes but is not limited to the system functional design, logic flow, algorithms, application programs, operating systems and support software for design, implementation, test, operation, diagnosis and repair." 22 C.F.R. § 121.8(f) (1994).
166. Freeware is software which is provided for distribution at no charge by the author, although the author typically retains the intellectual property rights. See supra note 73 (discussing the accessibility of Phil Zimmermann's PGP).
167. See E-mail from Vesselin Bontchev, Research Associate, Virus Test Center, University of Hamburg, to Michael Froomkin (July 22, 1994) (on file with author) (stating that Bontchev received a copy of PGP 2.6 from a chain of anonymous remailers and put it on the university's FTP site). According to an e-mail sent to the author by a person requesting anonymity, someone also uploaded a copy to a popular Linux archive site (sunsite.unc.edu). Overnight, therefore, the many sites around the world that mirror sunsite faithfully and unwittingly distributed PGP 2.6 worldwide.
168. For the curious, the URL is ftp://ftp.informatik.uni-hamburg.de/pub/virus/crypt/pgp/2.6mit. Other copies quickly appeared at Internet distribution sites in England, Italy, and other European countries.
169. See, e.g., Privacy Issues in the Telecommunications Industry: Hearings on the Administration's "Clipper Chip" Key Escrow Encryption Program Before the Subcomm. on Technology and the Law of the Senate Comm. on the Judiciary, 103d Cong., 2d Sess. (May 3, 1994) [hereinafter Clipper Chip Hearings], available in WESTLAW, USTestimony Database, 1994 WL 231119, at *13-26 (testimony of Stephen T. Walker, President, Trusted Information Systems, Inc.) (summarizing studies showing how export controls harm U.S. business); see also Charles L. Evans, Comment, U.S. Export Control of Encryption Software: Efforts to Protect National Security Threaten the U.S. Software Industry's Ability to Compete in Foreign Markets, 19 N.C. J. Int'l L. & Com. Reg. 469, 481-82 (1994) (indicating that, because most countries do not have export controls on encryption software, U.S. software developers are concerned about losing their foreign market shares).
170. See Letter from Martha Harris, Deputy Assistant Secretary for Export Controls, U.S. Dep't of State, to Philip R. Karn, Jr. (Oct. 7, 1994) (ODTC Case CJ 081-94) (on file with author). The source code was published in Applied Cryptography. See, e.g., Schneier, supra note 12, at 519-32 (listing the source code for the IDEA cipher).
171. See Letter from William B. Robinson to Phil Karn, supra note 118.
172. Letter from William B. Robinson, Director, Office of Defense Trade Controls, U.S. Dep't of State, to Philip R. Karn, Jr. (May 11, 1994), available online URL ftp://ftp.eff.org/pub/EFF/Policy/Crypto/ITAR_export/Karn_Schneier_export_case/floppy_2nd.response.
173. Letter from Martha Harris to Philip R. Karn, Jr., supra note 170.
174. See 22 C.F.R. § 120.10(5) (1994); see also Rubinstein, supra note 162, § 3 (describing the limited reach of the public domain exception).
175. See Letter from Martha Harris to Philip R. Karn, Jr., supra note 170.
176. 35 U.S.C. §§ 181-188 (1988 & Supp. V 1993) (codified as "Secrecy of Certain Inventions and Filing Applications in Foreign Country").
179. See Bamford, supra note 17, at 354-58 (describing the national publicity surrounding the NSA's issuance of secrecy orders to two inventors of cryptographic devices).
180. See id. at 355-56 ("Of the three hundred or so secrecy orders issued each year, all but a very few are either on inventions the government has originated itself and already classified, or on inventions somehow connected with the government.").
181. RSA is an example of a high-level cryptographic algorithm in circulation which has been patented. The issue of the validity of algorithmic patents is outside the scope of this Article.
182. Cf. Ferguson, supra note 162, at 659-61 (examining how the First Amendment limits effective regulation of scientific research and publication).
183. William Shakespeare, The Merry Wives of Windsor act 3, sc. 3, ll. 149-51 (H.J. Oliver ed., Methuen & Co. Ltd. 1971).
184. The text uses the Clipper Chip, which is designed for use in telephones, as an example. Similar procedures apply to the Capstone Chip and Fortezza PCMCIA card, although those also include circuitry supporting the Digital Signature Standard and data exchange such as electronic mail.
185. This prevents a "man-in-the-middle" attack by which the eavesdropper, armed with two Clipper telephones, intercepts the signal, decrypts it, records it, and then re-encrypts it. In the event of such an attack, the two users will have different session keys (one with each of the attacker's phones), and will thus see different character strings appear on their readouts. See OTA Information Security, supra note 97, at 65 (Box 2-7).
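The detection mechanism can be illustrated with a short sketch in Python. The sketch is purely illustrative: the actual derivation of the Clipper readout string is not public, and the hash function used here is merely a convenient stand-in.

```python
import hashlib

def readout(session_key: bytes) -> str:
    # Derive a short, human-comparable string from the session key.
    # (Stand-in only; the real Clipper readout derivation is not public.)
    return hashlib.sha256(session_key).hexdigest()[:4].upper()

# Direct call: both parties hold the same session key, so the strings
# they read to each other over the line will match.
shared_key = b"\x01" * 10
assert readout(shared_key) == readout(shared_key)

# Man-in-the-middle: the attacker negotiates a different session key
# with each victim, so the two victims see different strings and can
# detect the interception by comparing readouts aloud.
key_with_alice = b"\x02" * 10
key_with_bob = b"\x03" * 10
print(readout(key_with_alice), readout(key_with_bob))  # almost surely differ
```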
186. Well, that is the theory anyway. One preliminary report suggests that the AT&T 3600c, a $1300 Clipper-equipped telephone, is difficult to use:
The hackers who bought the things had quite a hard time getting them to work at all. There were troubles getting it set up so that it would attempt to go into secure mode, and trouble getting it to do so reliably once a pair of phones that worked were found. . . To make the unit go into secure mode, one person pushes a red button. . . . Then the modems do their thing, making modem noises for about 20 seconds (your time may vary; AT&T manual said 10 seconds.) Once connected, the sound is very weak. We in the conference had trouble hearing when the earpiece was right next to a microphone. There was also a roughly quarter second delay (presumably this is for A/D conversion + encryption) in talking. This is a longish delay, roughly equal to an overseas satellite conversation.
Posting from Adam Shostack to Cypherpunks Mailing List (Aug. 15, 1994) (on file with author) (describing a demonstration of the AT&T 3600c at the 1994 HOPE (Hackers on Planet Earth) Conference).
187. SKIPJACK is a single-key cipher similar to DES but with an 80-bit key. In a single-key cipher the sender and the receiver use the same key to encrypt and decrypt the message. Originally, the NSA intended SKIPJACK for government communications systems. See Key Escrow Initiative Q&A, supra note 134, at 2. The current estimate is that SKIPJACK should remain secure against brute-force attacks, despite continual increases in computing power, for at least 30 years. See Ernest F. Brickell, et al., SKIPJACK Review Interim Report: The SKIPJACK Algorithm 1 (July 28, 1993), available online URL http://www.quadralay.com/www/Crypt/Clipper/skipjack-review.html [hereinafter SKIPJACK Interim Report] ("[T]here is no significant risk that SKIPJACK will be broken by exhaustive search in the next 30-40 years.").
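The arithmetic behind that estimate is easy to reproduce. The following sketch assumes a generous and entirely hypothetical trial rate of one trillion keys per second; the rate is an assumption for illustration, not a figure drawn from the SKIPJACK review.

```python
# Rough brute-force estimate for an 80-bit single-key cipher.
# The trial rate below is an assumption for illustration only.
keyspace = 2 ** 80                    # number of possible 80-bit keys
trials_per_second = 1e12              # assumed: one trillion keys per second
seconds_per_year = 60 * 60 * 24 * 365

years_worst_case = keyspace / trials_per_second / seconds_per_year
years_average = years_worst_case / 2  # on average, half the keys are tried

print(f"Worst case: {years_worst_case:.0f} years")   # roughly 38,000 years
print(f"Average:    {years_average:.0f} years")
```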
Not only is the SKIPJACK algorithm used by the Clipper Chip classified, but it is burned into the chip in a fashion "which is highly resistant to reverse engineering (destructive or non-destructive) to obtain or modify the cryptographic algorithm." FIPS 185, supra note 14, at 6004.
NIST's description of its "state of the art" anti-reverse engineering design mentions three techniques: the use of nonmetallic links to hold the "write once" information programmed on to the chip which it claims "cannot be investigated externally"; the addition of "ghost logic" which is additional random or intentional circuits with no purpose other than to confuse analysis; and "false heat dissipation methods." National Inst. of Standards and Technology, Reverse Engineering Protection, 3 FIPS 185 Docket at tab 3. NIST also noted that reverse engineering tends to destroy the device being examined, and that because every chip will have differences, these differences also provide a layer of protection. See id.
Nevertheless, one commentator on the proposed FIPS concluded that, based on his review of the literature and his experience in teaching a class in reverse engineering at MIT, "[p]hysics tells us that we can find out what is in these chips. Others WILL perform this analysis. The only question is when. I believe it is 3-9 months." Comments of Thomas F. Knight, Jr., 2 FIPS 185 Docket.
188. A session key is the sequence of bits allowing decryption that will be used for only a single communication, one e-mail, or one telephone call. See infra text accompanying notes 790-91.
189. One such supersecure method is the Diffie-Hellman Key Exchange. See infra text following note 792.
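For readers unfamiliar with the protocol, a toy version of the Diffie-Hellman exchange may help; the numbers here are deliberately tiny, whereas real implementations use primes hundreds of digits long.

```python
# Toy Diffie-Hellman key exchange with deliberately small parameters.
# Real systems use a very large prime p; these values are illustrative only.
p, g = 23, 5                 # public prime modulus and generator

a = 6                        # Alice's secret exponent (never transmitted)
b = 15                       # Bob's secret exponent (never transmitted)

A = pow(g, a, p)             # Alice sends g^a mod p
B = pow(g, b, p)             # Bob sends g^b mod p

# Each side combines the other's public value with its own secret;
# both arrive at the same shared session key without ever sending it.
assert pow(B, a, p) == pow(A, b, p)
print("Shared secret:", pow(B, a, p))
```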
190. Both Clipper Chips in a telephone conversation use the same session key to encrypt and decrypt their messages. The original Clipper proposal envisaged two keys, one for each chip in a telephone conversation, but NIST revised the standard to require only a single key. The NIST revision came in response to a comment it received which noted that a two-key system would limit law enforcement executing a wiretap on Alice to her side of the conversation unless it obtained a warrant for every Clipper Chip that communicated with Alice's telephone. See FIPS 185, supra note 14, at 6001. Capstone will generate its own keys. See Mykotronx, Inc., Capstone MYK-80: A New Breakthrough in Encryption Technology 1 (1993) (sales literature, on file with author) (stating that the MYK-80 features "Message Encryption Key generation").
191. The chips also send each other an "initialization vector" of unspecified length, which NIST defines as a "mode and application dependent vector of bytes used to initialize, synchronize and verify the encryption, decryption and key escrow functions." FIPS 185, supra note 14, at 6004.
192. In a two-way real-time communication, the two chips send each other a LEAF, and each chip performs a check to ensure that the other LEAF is valid. In one-way communications, like e-mail, there is only one chip sending one LEAF, which will later be checked by the receiving chip. The difference is significant. If only one chip sends a LEAF, then a wiretapper will need to obtain the chip unique key for every chip which calls the line being tapped, potentially resulting in the compromise of a large number of chips. By contrast, if the LEAFs go both ways, the wiretapper is indifferent as to who started the conversation because she always has one LEAF she can decipher.
193. The exact makeup of the LEAF is classified. See FIPS 185, supra note 14, at 6004. It is known, however, that it consists of the 80-bit session key which has been encrypted with the unit key, a 32-bit serial number unique to each Clipper Chip, and a 16-bit checksum. See National Inst. of Standards and Technology, Technical Fact Sheet on Blaze Report and Key Escrow Encryption 1-2 (June 2, 1994). A checksum is a "computer technique for ensuring the integrity of an identifying number." John M. Carroll, Computer Security 334 (1977).
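Because the exact makeup of the LEAF is classified, any code illustration is necessarily conjectural. The sketch below merely shows how fields of the publicly described sizes (an 80-bit session key encrypted under the unit key, a 32-bit serial number, and a 16-bit checksum) could be packed together; the encryption and checksum functions are placeholders, not the classified operations themselves.

```python
import hashlib

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Placeholder for encrypting the session key under the unit key.
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

def checksum16(data: bytes) -> bytes:
    # Placeholder 16-bit integrity check.
    return hashlib.sha256(data).digest()[:2]

session_key = b"\xaa" * 10                 # 80-bit session key
unit_key = b"\x5c" * 10                    # 80-bit chip unique key
serial = (12345678).to_bytes(4, "big")     # 32-bit chip serial number

wrapped = xor_bytes(session_key, unit_key)
leaf = wrapped + serial + checksum16(wrapped + serial)
print(len(leaf) * 8, "bits")               # 80 + 32 + 16 = 128 bits
```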
194. Supporters of the Clipper Chip challenge this assumption. Because the family key will be in circulation only by means of a special circuit board which will be inserted into a personal computer operated by law enforcement agents, supporters of the Clipper Chip argue that its distribution will be relatively limited. See, e.g., Dorothy E. Denning & Miles Smid, Key Escrowing Today, IEEE Comm., Sept. 1994, at 58, 58 (emphasizing that the family key is secret and only accessible to authorized government officials).
195. See Clipper Chip Hearings, supra note 169, available in Westlaw, USTestimony Database, 1994 WL 231122, at *4 (statement of Jo Ann Harris, Assistant Attorney General, Criminal Division, U.S. Department of Justice) (describing the decrypt processor). As this Article went to press, the law enforcement community had access to one of the two existing decrypt processors, although Clipper-equipped telephones are currently being shipped to government purchasers. See Telephone Interview with Miles Smid, Security and Technology Group Manager, National Institute of Standards and Technology (Feb. 9, 1994).
196. See U.S. Dep't of Justice, Authorization Procedures for Release of Encryption Key Components in Conjunction with Intercepts Pursuant to Title III (Feb. 4, 1994) [hereinafter Title III Authorization Procedures] (establishing procedures by which escrow agents could release keys in response to requests pursuant to Title III), in Office of the Press Secretary, The White House, Key Escrow Encryption: Announcements- February 4, 1994 (Feb. 15, 1994) (information packet accompanying press release) (on file with author) [hereinafter Key Escrow Announcements]; U.S. Dep't of Justice, Authorization Procedures for Release of Encryption Key Components in Conjunction with Intercepts Pursuant to FISA (Feb. 4, 1994) [hereinafter FISA Authorization Procedures] (same, pursuant to the Foreign Intelligence Surveillance Act (FISA), 50 U.S.C. §§ 1801-1811 (1988)), in Key Escrow Announcements, supra; U.S. Dep't of Justice, Authorization Procedures for Release of Encryption Key Components in Conjunction with Intercepts Pursuant to State Statutes (Feb. 4, 1994) [hereinafter State Authorization Procedures] (same, pursuant to state statutes or Title III), in Key Escrow Announcements, supra.
The Attorney General's procedures for release of key escrow components require that the request for key components include the agency and individual conducting the wiretap, as well as the termination date of the period for which the intercept will be authorized. See U.S. Dep't of Justice, Attorney General Makes Key Escrow Encryption Announcements 2 (Feb. 4, 1994) [hereinafter Attorney General's Key Escrow Announcements], in Key Escrow Announcements, supra.
U.S. foreign intelligence agencies have the authority to listen in on all forms of electronic communication, including telephones, without seeking a warrant if the communications are between foreign powers or are signals (other than spoken communication) from a foreign country, embassy, or consulate to another foreign party. See Foreign Intelligence Surveillance Act, 50 U.S.C. §§ 1801-1811 (1988); see also Americo R. Cinquegrana, The Walls (and Wires) Have Ears: The Background and First Ten Years of the Foreign Intelligence Surveillance Act of 1978, 137 U. Pa. L. Rev. 793, 811-13 (1989) (describing FISA procedures).
197. State Authorization Procedures, supra note 196, at 1; Title III Authorization Procedures, supra note 196, at 1; FISA Authorization Procedures, supra note 196, at 1.
198. The Attorney General's procedures require that requests for key segments be made by the principal prosecuting attorney of a state or political subdivision, or by the responsible person in an agency. See State Authorization Procedures, supra note 196, at 2. This requirement overlaps with the authority for emergency wiretaps in Title III, 18 U.S.C. § 2518(7) (1988).
199. See Clifford S. Fishman, Wiretapping and Eavesdropping § 30 (1978) ("Law enforcement officials have been reluctant to use [the emergency eavesdropping] authorization for fear it is unconstitutional."); 1 Wayne R. LaFave & Jerold H. Israel, Criminal Procedure § 4.2(g) (1984) ("There has been virtually no use of this emergency power . . . ."). But see Clifford S. Fishman, Wiretapping and Eavesdropping §§ 30-30f (Supp. 1993) [hereinafter Fishman Supplement] (describing various procedures for emergency wiretaps which have been used in life-threatening situations).
200. Posting from Matt Blaze to Cypherpunks Mailing List (Feb. 2, 1994) (on file with author) (reporting surprising frankness on the part of NSA spokespersons at a meeting discussing key escrow).
201. The Department of Justice, however, is required to ascertain, after the fact, that the legal authorization existed for Title III wiretaps and FISA wiretaps. See Title III Authorization Procedures, supra note 196, at 2 (stating that the "Department of Justice shall" ascertain the existence of authorizations for electronic surveillance); FISA Authorization Procedures, supra note 196, at 2 (same). Strangely, the Justice Department has no such obligation when the key segment is requested by a state or local police force. See State Authorization Procedures, supra note 196, at 2 (stating that the "Department of Justice may" inquire into the authorization for electronic surveillance).
202. See 18 U.S.C. §§ 2510-2521 (1988 & Supp. V 1993).
203. XOR is a binary operation by which two binary numbers are compared a bit at a time. If both bits have the same value then XOR returns zero; if the two bits differ, XOR returns one. Both segments are thus equally necessary to retrieve the key, and neither segment alone provides the holder with any more information about the value of the key than would be possessed by a person who held no segments at all.
An example, using a hypothetical 1-bit key divided into two 1-bit segments, A and B, may make this clearer. Even if you know that segment A is 1, you still have no more information about the key's value than does anyone else. If segment B is 0, the key is 1 (because 1 XOR 0 = 1); but if segment B is 1, then the key is 0 (because 1 XOR 1 = 0). Similarly, someone holding segment B but not A is equally uninformed as to the key's actual value.
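The same point can be stated in a few lines of code. This is only a sketch of the XOR-splitting idea, not of the actual escrow programming facility:

```python
import secrets

# Split an 80-bit key into two escrow segments by XOR. Either segment
# alone is statistically indistinguishable from random noise; only the
# XOR of both segments recovers the key.
key = secrets.token_bytes(10)            # the 80-bit unit key
segment_a = secrets.token_bytes(10)      # escrow agent A's share (random)
segment_b = bytes(k ^ a for k, a in zip(key, segment_a))  # agent B's share

recovered = bytes(a ^ b for a, b in zip(segment_a, segment_b))
assert recovered == key
```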
204. See Attorney General's Key Escrow Announcements, supra note 196, at 1.
205. Currently a company called Mykotronx is the only supplier authorized to produce Clipper Chips. The secure, compartmented information facility is located at Mykotronx. See Key Escrow Initiative Q&A, supra note 134, at 6. This may impose barriers to entry for potential competitors.
206. NIST has devised a fairly elaborate procedure for the key generation process. Someone working for each of the escrow agents will type 80 characters into a computer, which will store the characters, the amount of time between the keystrokes, the date, and the time. The computer will then feed these values into NIST's secure hash algorithm to produce a number. For a discussion of the secure hash algorithm, see Approval of Federal Information Processing Standards Publication 180, Secure Hash Standard (SHS), 58 Fed. Reg. 27,712 (1993); Proposed Revision of Federal Information Processing Standard (FIPS) 180, Secure Hash Standard, 59 Fed. Reg. 35,317 (1994) (correcting a technical flaw and confirming the algorithm's security reliability). See also Dorothy E. Denning, The Clipper Encryption System, Am. Scientist, July-Aug. 1993, at 319, 321-22 (describing how two escrow agents and a computer are needed to create the unit key, thus increasing public confidence that the failure of one escrow agent cannot compromise the system); Denning & Smid, supra note 194, at 60-61 (describing how escrow agents generate a key number and a random seed number for use in each programming session). Currently, the system is able to program about 120 Clipper Chips per hour, although NIST contemplates switching to a higher volume system at some future date. See id. at 64.
The procedure for generating the random numbers is important because anyone who knows which pseudorandom number generator was used and who also knows the "seed" could use this information to recreate all the keys without going to the trouble of consulting the escrow agents.
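A greatly simplified sketch of this style of seed generation follows. SHA-1 serves here as a stand-in for the Secure Hash Standard, the sample keystrokes and timings are fabricated, and the input is abbreviated from the 80 characters the actual procedure calls for.

```python
import hashlib
import struct

# Simplified sketch of the seed generation described above: typed
# characters, inter-keystroke timings, and the date/time are hashed
# together to produce key material. SHA-1 stands in for the Secure
# Hash Standard; the inputs are fabricated for illustration.
def seed_from_operator(chars, intervals_ms, timestamp):
    h = hashlib.sha1()
    h.update(chars.encode())
    for ms in intervals_ms:
        h.update(struct.pack(">I", ms))   # keystroke timing, milliseconds
    h.update(timestamp.encode())          # date and time of the session
    return h.digest()                     # 160 bits of seed material

seed = seed_from_operator("correct horse battery", [120, 95, 203, 88], "1994-02-04 10:00")
print(seed.hex())
```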
207. Technically, the two 80-bit segments held by the escrow agents are XORed to produce the actual key. See Denning & Smid, supra note 194, at 64-65.
208. The procedures were devised in collaboration with the Department of Justice, the FBI, NIST, the NSA, the Department of the Treasury Automated Systems Division, and Rapid System Solutions, Inc. See id. at 58.
209. See OTA Information Security, supra note 97, at 65 n.5 (Box 2-7) (citing presentation by NIST Security Technology Manager Miles Smid in June 1994).
210. Title III Authorization Procedures, supra note 196, at 3; FISA Authorization Procedures, supra note 196, at 3; State Authorization Procedures, supra note 196, at 3. The government is completely correct to warn users of EES that their rights to exclude illegally seized or tainted evidence in any criminal proceeding are unchanged by EES.
211. Traffic analysis using pen registers (which record the numbers called by a telephone) and trap and trace devices (which record numbers calling the telephone) does not implicate the Fourth Amendment. See Smith v. Maryland, 442 U.S. 735, 741-46 (1979). Under Title III, however, both pen registers and trap and trace devices require a court order, although an actual warrant is not required. See 18 U.S.C. §§ 3121-3123 (1988); Criminal Procedure Project, Twenty-Second Annual Review of Criminal Procedure: United States Supreme Court and Court of Appeals 1991-1992, 81 Geo. L.J. 853, 952-54 (1993). Because decrypting the LEAF with the family key involves listening to at least a few seconds of the conversation, the act of intercepting and decrypting the LEAF constitutes wiretapping. This is so even though the information thus gathered is no better than could be obtained by a trap and trace device or a pen register. Perversely, however, even though the decryption of the LEAF is a wiretap, it may not violate the Fourth Amendment if the telephone user has no reasonable expectation of privacy for the LEAF. Whether a chip user would have a reasonable expectation of privacy in her LEAF, as the term is used in Fourth Amendment cases, is not as clear as it should be. The difficulty arises because the user is aware that the government has the information needed to decrypt the LEAF. Although the government has promised to use that information only in specific circumstances, the government cannot be estopped and is therefore free to renege. See, e.g., Office of Personnel Management v. Richmond, 496 U.S. 414, 434 (1990) (holding that payments from the Federal Treasury may only be made if authorized by statute, and that erroneous advice given to a claimant by a government employee does not therefore estop the government's denial of the claim); Heckler v. Community Health Servs., Inc., 467 U.S. 51, 63 (1984) (noting "the general rule that those who deal with the Government are expected to know the law and may not rely on the conduct of Government Agents contrary to law"); Federal Crop Ins. Corp. v. Merrill, 332 U.S. 380, 385 (1947) (holding that claimants' lack of knowledge of regulations published in the Federal Register does not prevent those claimants from being bound by such regulations).
212. See 18 U.S.C. § 2520 (1988).
213. If the agent knowingly released a key improperly, the agent might be a co-conspirator or abettor of the illegal wiretapper.
214. If the executive order were not classified, it would presumably have to be disclosed pursuant to the Administrative Procedure Act. See Administrative Procedure Act, 5 U.S.C. § 552(a)(2)(C) (1988) (requiring agencies to make publicly available instructions to staff that affect members of the public).
215. See 18 U.S.C. § 1030(a)(3) (1988) (codifying the Computer Fraud and Abuse Act, which makes it illegal to trespass into federal computer systems). Section 1030(a)(4) proscribes the use of federal computers to defraud. Section 1030(a)(5) makes illegal any unauthorized access to a computer system used in interstate commerce, as well as the alteration or destruction of records. This last provision applies only to those acting without authority. See § 1030(a)(5). Thus, the "plumber" would violate the statute, but arguably the escrow agent would not.
216. See Smith v. Maryland, 442 U.S. 735, 743-44 (1979) ("[A] person has no legitimate expectation of privacy in information he voluntarily turns over to third parties.").
217. See infra note 329 and accompanying text (discussing the Electronic Communications Privacy Act of 1986).
218. H.R. 5199, 103d Cong., 2d Sess. (1994).
219. Id. § 31(h)(2). The Encryption Standards and Procedures Act of 1994, if enacted, would have provided:
The United States shall not be liable for any loss incurred by any individual or other person resulting from any compromise or security breach of any encryption standard established under subsection (b) or any violation of this section or any regulation or procedure established by or under this section by--
(1) any person who is not an official or employee of the United States; or
(2) any person who is an official or employee of the United States, unless such compromise, breach, or violation is willful.
Id.
220. Some have argued that the process also violates the Computer Security Act of 1987. See infra part II.A.3.
221. U.S. Const. art. I, § 8, cl. 5.
222. NIST issues FIPS pursuant to § 111(d) of the Federal Property and Administrative Services Act of 1949, ch. 288, 63 Stat. 379 (the Brooks Act), as amended by the Computer Security Act of 1987, Pub. L. No. 100-235, 101 Stat. 1724. Relevant parts of the authority are codified at 40 U.S.C. § 759(d) (1988 & Supp. V 1993) and 15 U.S.C. § 278g-3 (1988). Arguably, neither of these statutes gives either NIST or the Secretary of Commerce the authority over telecommunications required to issue FIPS 185, because neither statute mentions telecommunications equipment. See National Bureau of Standards Act of 1901, 15 U.S.C. §§ 271-278h (1988) (describing NIST's powers prior to the Computer Security Act of 1987); Computer Security Act of 1987, 15 U.S.C. § 278g-3 (1988) (giving NIST power to develop "standards, guidelines, and associated methods and techniques for computer systems" and to make standards for the "cost-effective security and privacy of sensitive information in Federal computer systems," and defining the latter to include "automatic data processing equipment" (ADPE)); Federal Property and Administrative Services Act of 1949, 40 U.S.C. § 759(d)(1) (1988) (giving the Secretary of Commerce authority to "promulgate standards and guidelines pertaining to Federal computer systems"); Federal Property and Administrative Services Act of 1949, 40 U.S.C. § 759(a)(2) (1988) (defining ADPE to include "any equipment or interconnected system or subsystems of equipment that is used in the automatic acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception, of data or information").
NIST, however, obtained a delegation of authority from the General Services Administration (GSA) to issue a FIPS relating to telecommunications systems, although the GSA itself argued that the delegation was unnecessary. See Letter from Francis A. McDonough, Assistant Commissioner, Federal Information Resources Management, General Services Administration, to Michael R. Rubin, Deputy Chief Counsel, NIST (Jan. 28, 1994) (included in volume 3 of the official record of FIPS 185); see also 41 C.F.R. § 201-20.303(b)(2)(i)(B) (1993) (stating, per GSA regulation, that NIST has substantial telecommunications authority, which is arguably based on an incorrect reading of the Paperwork Reduction Reauthorization Act of 1986, Pub. L. No. 99-591, § 101(m), 100 Stat. 3341-335).
223. See Department of Commerce, Semiannual Agenda of Regulations, 59 Fed. Reg. 20,135, 20,136 (1994) [hereinafter Agenda of Regulations] (noting that FIPS "apply only to the Federal Government" and that in FIPS' development, NIST "works closely with private industry standard-setting organizations"); Mitch Ratcliffe, Security Chips Trigger Alarm: Clipper and Capstone Open Digital Back Door, MacWeek, Apr. 26, 1993, at 1, 1 (stating that FIPS often become de facto standards because the U.S. government is the largest computer customer in the world).
For an economic analysis of the costs and benefits of standards, see Stanley M. Besen & Joseph Farrell, Choosing How to Compete: Strategies and Tactics in Standardization, J. Econ. Perspectives, Spring 1994, at 117, 117-18 (asserting that firms manipulate standards for competitive advantage); Michael L. Katz & Carl Shapiro, Systems Competition and Network Effects, J. Econ. Perspectives, Spring 1994, at 93, 93-95 (warning that pervasive standards lead to inefficient market outcomes in "systems markets" characterized by products that require other conforming products to function). But see S.J. Liebowitz & Stephen E. Margolis, Network Externality: An Uncommon Tragedy, J. Econ. Perspectives, Spring 1994, at 133, 133-35 (arguing that the negative effects of standards identified by Katz and Shapiro are infrequent, if they exist at all).
224. See, e.g., USACM Position on the Escrow Encryption Standard, Comm. ACM, Sept. 1994, at 16, 16 (reporting a press release by the Association for Computing Machinery stating that "[i]ncreasingly, the standards set through the FIPS process directly affect non-federal organizations and the public at large").
225. See FIPS 185, supra note 14, at 6002.
229. Id. at 6005. Apparently, individuals who are not members of organizations, or organizations that do not already supply products or services to the government, need not apply.
232. 5 U.S.C. § 553(b)-(d) (1988). There is no reason other than long-standing practice by the NBS and NIST to believe that a notice and comment procedure was actually required. But see American College of Neuropsychopharmacology v. Weinberger, [1975 Developments] Food Drug Cosm. L. Rep. (CCH) § 38,025 (D.D.C. July 31, 1975) (holding that publication in the Federal Register combined with the complexity of the rules themselves meant that the rules in question were subject to the notice and comment procedures of § 553 of the APA).
233. Publication in the Federal Register is only required if the President should disapprove or modify a FIPS. See 40 U.S.C. § 759(d)(1) (1988).
234. A legislative rule is an exercise of power delegated by Congress to an administrative agency. It can create, modify or remove legal duties, rights or exemptions. Agencies may make legislative rules through formal or informal rule making. Formal rule making is rarely used. Informal rule making ordinarily requires publication of a notice of the proposed rule in the Federal Register, a request for comments, and then a reasoned attention to those comments before the final rule is promulgated in the Federal Register. See 5 U.S.C. § 553(b)-(d) (1988) (detailing the rule-making procedures for administrative agencies).
Most FIPS which affect federal procurement are mandatory in the sense that only federal agencies, but not the public, are required to adhere to them. See Agenda of Regulations, supra note 223, at 20,136; see also FIPS 46-2, supra note 106, at 69,347 (reaffirming FIPS 46-1 "for Federal Government use"); cf. Delegation of Authority for Waivers for Federal Information Processing Standards (FIPS), and of Procedures for Waivers for FIPS, 54 Fed. Reg. 4322 (1989) [hereinafter Waivers for FIPS] (establishing waiver procedures for agencies seeking exemptions from FIPS's requirements). FIPS 185 states, however, that it is "totally voluntary," even for federal agencies. FIPS 185, supra note 14, at 5998.
235. A nonlegislative rule is a rule which does not exercise a power delegated by Congress to an administrative agency. It cannot create, modify, or remove legal duties, rights, or exemptions. See Michael Asimow, Nonlegislative Rulemaking and Regulatory Reform, 1985 Duke L.J. 381, 383 (defining nonlegislative rules as those which "do not exercise delegated lawmaking power," but only "provide guidance to the public and to agency staff and decisionmakers"); Charles H. Koch, Jr., Public Procedures for the Promulgation of Interpretative Rules and General Statements of Policy, 64 Geo. L.J. 1047, 1048 (1976) (using the term "nonlegislative rules" to refer to rules not promulgated under the direction of the legislature and not in compliance with the APA's notice and comment procedures). If FIPS 185 is a rule at all, it is formally a nonlegislative rule in the sense that it does not attempt to create any legal obligations that bind the public.
FIPS 185 is barely a rule within the APA's definition because the only way in which it constitutes "the whole or a part of an agency statement of general or particular applicability and future effect designed to implement, interpret, or prescribe law or policy," 5 U.S.C. § 551(4) (1988), is that it allows other agencies to substitute EES products for DES products. See FIPS 185, supra note 14, at 5999 (suggesting, but not mandating, that federal managers use EES instead of DES).
236. FIPS 185 is far too formal to fall into the miscellaneous category of agency products. This category includes press releases, informational publications, letters, etc. Such informal documents are not published in the Federal Register, which contains only documents "having general applicability and legal effect." Industrial Safety Equip. Ass'n, Inc. v. EPA, 837 F.2d 1115, 1121 (D.C. Cir. 1988); cf. Brock v. Cathedral Bluffs Shale Oil Co., 796 F.2d 533, 539 (D.C. Cir. 1986) (noting that the Federal Register, unlike the Code of Federal Regulations, also contains "policy statements" that have no legal effect).
237. See Asimow, supra note 235, at 383.
238. Peter L. Strauss, An Introduction to Administrative Justice in the United States 157 (1989); see also Peter L. Strauss, The Rulemaking Continuum, 41 Duke L.J. 1463, 1467 (1992) (noting that "publication rulemaking" is typically effected by agency staff without participation by the agency's head).
239. Cf. Robert A. Anthony, Interpretive Rules, Policy Statements, Guidances, Manuals, and the Like--Should Federal Agencies Use Them to Bind the Public?, 41 Duke L.J. 1311, 1333-40 (1992) (discussing nonlegislative documents on which agencies rely for these categories of cases).
240. For the APA exception for policy statements, see 5 U.S.C. § 553(b)(3)(A), (d)(2) (1988).
241. Prior to FIPS 185, agencies that did not procure waivers were required to use DES for sensitive nonclassified information. See FIPS 46-2, supra note 106, at 69,348. "Sensitive information" is defined as:
[A]ny information, the loss, misuse, or unauthorized access to or modification of which could adversely affect the national interest or the conduct of Federal programs, or the privacy to which individuals are entitled under section 552a of Title 5 [United States Code] (the Privacy Act), but which has not been specifically authorized under criteria established by an Executive order or an Act of Congress to be kept secret in the interest of national defense or foreign policy.
15 U.S.C. § 278g-3(d)(4) (1988).
242. FIPS 185, supra note 14, at 6003.
243. See Office of the Press Secretary, The White House, Fact Sheet: Public Encryption Management 2 (Apr. 16, 1993), in "Key Escrow" Information Packet, supra note 20.
244. See 28 U.S.C. § 524(c)(4) (1988 & Supp. V 1993) (listing the financial sources of this fund). The Attorney General has discretion to use this fund for law enforcement purposes and is not required to return money in the fund to the Treasury. Legitimate uses of the fund include paying informants, equipping government vehicles for law enforcement functions, and purchasing evidence. See § 524(c)(1). The fund is substantial, with about $1 billion in the pipeline at any time, including money due to be paid to state law enforcement agencies. See William P. Barr, Attorney General's Remarks, Benjamin N. Cardozo School of Law, Nov. 15, 1992, in 15 Cardozo L. Rev. 31, 33 (1993). The expected income alone from the Assets Forfeiture Fund in 1987 was estimated at $150 million. See David J. Fried, Rationalizing Criminal Forfeiture, 79 J. Crim. L. & Criminology 328, 365 n.167 (1988) (citing Budget Appropriations: Hearings Before the Subcomm. of the House Comm. on Appropriations, 99th Cong., 2d Sess. 114 (1986)).
245. The Pentagon plans to purchase about two million Capstone PCMCIA cards for the Defense Message System. See Messmer, supra note 16, at 20; see also OTA Information Security, supra note 97, at 127 n.29 (citing Clinton Brooks, Special Assistant to the Director, NSA, May 25, 1994, for the statement that the Pentagon is using Tessera (now renamed Fortezza) cards in the Defense Message System).
246. See Dorothy E. Denning, The Clipper Chip Will Block Crime, Newsday (N.Y.), Feb. 22, 1994, at 35 (noting that "[t]he Justice Department has ordered $8 million worth of Clipper scramblers in the hope that they will become so widespread and convenient that everyone will use them").
247. Travelers desiring communications security while abroad should take note that exemption from export control does not equal exemption from the paperwork attendant to even a temporary export license. Temporary export licenses for exportable secure telephones or other telephone security devices require a shipper's export declaration (SED) which must be acquired before the trip and presented (in duplicate) to Customs officers upon export and re-import. Unfortunately, Customs officials who handle passengers have no familiarity with this form, do not know where it can be obtained, and are not necessarily willing to sign the SED because this function is allocated to the cargo department. At best, attempts to follow the regulations impose a minimum of an hour's delay in each direction, and probably more. See E-mail from Matt Blaze, Senior Research Scientist, AT&T Bell Laboratories, to Michael Froomkin (Jan. 6, 1995) (on file with author) (relating an unsuccessful attempt to go through regular channels and concluding "it just isn't possible for an individual traveler to follow all the rules").
248. See Office of the Press Secretary, The White House, Statement of the Press Secretary 2 (Feb. 4, 1994) (explaining that the Department of State "will streamline export licensing procedures for [these] encryption products"), in Key Escrow Announcements, supra note 196; U.S. Dep't of State, Statement of Dr. Martha Harris, Deputy Assistant Secretary of State for Political-Military Affairs: Encryption--Export Control Reform (Feb. 4, 1994) (detailing reforms of the licensing process), in Key Escrow Announcements, supra note 196.
249. See Dorothy E. Denning, Encryption and Law Enforcement § 5 (Feb. 21, 1994) (unpublished manuscript, on file with author). Although government procurement regulations are designed to award contracts to the lowest conforming bidder, without regard to past services rendered on other matters, cynics may find this action to be evidence of AT&T's desire to remain in the government's good graces. Paranoids may point to then-NSA Director Lincoln D. Faurer's statement in 1981 that "our intention is to significantly reward those DOD suppliers who produce the computer security products that we need." Bamford, supra note 17, at 362. Or, it may be that AT&T believed this was the patriotic, or commercially sensible, thing to do.
250. Anthony, supra note 239, at 1328; see Robert A. Anthony, "Well, You Want the Permit, Don't You?" Agency Efforts to Make Nonlegislative Documents Bind the Public, 44 Admin. L. Rev. 31, 37 (1992); cf. Asimow, supra note 235, at 382 (suggesting that postadoption public participation is the best way to deal with practically binding nonlegislative rules).
251. But cf. Anthony, supra note 239, at 1328-29 (suggesting that most nonlegislative documents with a "practical binding effect" achieve this end via one of the three means described in the text).
252. These are sometimes called "non-rule rules." Anthony, supra note 250, at 32 n.2 (defining "non-rule rules" as those that meet the APA's definition of "rules" but are not promulgated through legislative rule-making procedures).
253. See Motor Vehicle Mfrs. Ass'n v. State Farm Mut. Auto. Ins. Co., 463 U.S. 29, 33, 34 (1983) (emphasizing the duty of administrative agencies to consider all important aspects of a problem and to "articulate a satisfactory explanation for its action").
254. See A Proposed Federal Information Processing Standard for an Escrowed Encryption Standard (EES), 58 Fed. Reg. 40,791 (1993).
255. See FIPS 185, supra note 14, at 5998 (stating that comments were received from "22 government organizations in the United States, 22 industry organizations and 276 individuals").
256. See id. NIST ignored comments from five industry organizations and 200 individuals who stated that guarantees were needed to assure that EES would not be a first step towards prohibition of other forms of encryption. NIST responded that the standard was voluntary. See id. Eight industry organizations and 181 individuals said that it was premature to adopt EES as a standard until policy decisions on encryption had been made. NIST responded that the standard was voluntary. See id. at 5999. Seven individuals proposed alternate technologies that they believed would be more cost effective than EES. NIST responded that the standard was voluntary. See id. at 6000.
257. Section 553's notice and comment requirements reflect Congress's "judgment that . . . informed administrative decisionmaking require[s] that agency decisions be made only after affording interested persons" an opportunity to communicate their views to the agency. Chrysler Corp. v. Brown, 441 U.S. 281, 316 (1979). By requiring "openness, explanation, and participatory democracy" in the rule-making process, notice and comment assures the legitimacy of administrative norms. Weyerhaeuser Co. v. Costle, 590 F.2d 1011, 1027 (D.C. Cir. 1978).
258. By bringing the case as a protest to a specific contract award, ideally one in which the competitor had made a tender of goods which conformed to the preexisting standard, the competitor might be able to distinguish Control Data Corp. v. Baldridge, 655 F.2d 283 (D.C. Cir.), cert. denied, 454 U.S. 881 (1981). In Baldridge, the D.C. Circuit held, effectively, that no one has standing to sue to overturn a FIPS outside of the bid protest context because the public is outside the "zone of interests to be protected or regulated by" the Brooks Act. Id. at 290. Bid protests of this sort go initially to the Board of Contract Appeals of the General Services Administration. See 40 U.S.C. § 759(f) (1988 & Supp. V 1993); see also Contract Disputes Act of 1978, 41 U.S.C. §§ 601-613 (1988 & Supp. V 1993).
259. To have standing, a plaintiff must demonstrate "injury that fairly can be traced to the challenged action of the defendant, and not injury that results from the independent action of some third party not before the court." Simon v. Eastern Ky. Welfare Rights Org., 426 U.S. 26, 41-42 (1976); see also Valley Forge Christian College v. Americans United for Separation of Church and State, Inc., 454 U.S. 464, 473 (1982) (applying the "injury in fact" element of the standing requirement).
260. Cf. 40 U.S.C. § 759(d) (1988) (creating waiver power); Waivers for FIPS, supra note 234, at 4322 (permitting delegation of waiver power).
261. 481 U.S. 465, 475 (1987) (holding that plaintiff office-holder's allegation that his constituents would be "influenced against him" by government action labeling films he sponsored as "political propaganda" sufficed to create standing).
262. 655 F.2d 283, 295-97 (D.C. Cir.) (applying the zone of interest test to hold that plaintiff lacked standing to challenge a FIPS), cert. denied, 454 U.S. 881 (1981).
263. See supra notes 254-57 and accompanying text.
264. See, e.g., International Tel. & Tel. Corp. v. Local 134, 419 U.S. 428, 442-48 (1975) (determining that an agency process without binding effect, even if it leads to significant practical consequences, is not reviewable under APA § 551); Industrial Safety Equip. Ass'n v. EPA, 837 F.2d 1115, 1121 (D.C. Cir. 1988) (holding that the joint publication and dissemination of a "Guide" by the National Institute for Occupational Safety and Health and the EPA, branding petitioner's wholly EPA-compliant protective device much less safe than a competitor's device, was not a reviewable action, nor a legislative rule: the Guide "established no rule that the regulated industry must obey"); American Trucking Ass'n, Inc. v. United States, 755 F.2d 1292, 1296-98 (7th Cir. 1985) (concluding that a report was an "educational undertaking" and did not "impose an obligation, determine a right or liability or fix a legal relationship," and was therefore not reviewable agency action, despite allegations of revenue loss to parties resulting from the report).
265. See supra notes 234-41 and accompanying text (questioning, in the context of the APA, the government's seeming nonaccountability regarding FIPS).
266. Cf. Valley Forge Christian College v. Americans United for Separation of Church and State, Inc., 454 U.S. 464, 485-86 (1982) (holding that "psychological" injury is insufficient to confer standing).
267. This is not to suggest that abuses of the standard-setting process are not properly actionable. See, e.g., Allied Tube & Conduit Corp. v. Indian Head, Inc., 486 U.S. 492, 503-07 (1988) (denying Noerr antitrust immunity to parties who manipulated the standard-setting process of a private association without official authority).
268. "[S]tandards are essential to
the achievement of full competition and to the saving of large sums
of money by the Government." Control Data Corp. v. Baldridge,
655 F.2d 283, 286 (D.C. Cir.), cert. denied, 454 U.S. 881
(1981). On the benefits of standardization, see Michael A.
Epstein, Standards and Intellectual Property,
in[PAGE 776]
Intellectual Property/Antitrust
1993 (PLI Patents,
Copyrights, Trademarks, and Literary Property Course Handbook
Series No. G4-3903, 1993), available in WESTLAW, TP-All
Database.
269. Vermont Yankee Nuclear Power Corp. v. Natural Resources Defense Council, Inc., 435 U.S. 519, 549 (1978), presents a particularly great hurdle by its holding that courts cannot impose on an agency procedural requirements not found in the APA.
270. Narrowing the terms of the Asset Forfeiture Super Surplus Fund is very much a second-best solution. Not only would suitable amendments to the authorizing legislation be difficult to draft, but the terms of the fund are already narrow enough to force money to be spent in inappropriate ways. See generally Alison R. Solomon, Comment, Drugs and Money: How Successful Is the Seizure and Forfeiture Program at Raising Revenue and Distributing Proceeds?, 42 Emory L.J. 1149, 1166-91 (1993) (examining the benefits, drawbacks, and management of federal asset forfeiture programs as law enforcement and revenue- raising tools).
271. NIST takes the position that all the interesting information is classified or confidential. Computer Professionals for Social Responsibility (CPSR) filed a FOIA request to obtain documents relating to the NSA's role in FIPS 185. CPSR's challenge to the denial of their request was dismissed with prejudice on summary judgment in Computer Professionals for Social Responsibility v. National Inst. of Standards & Technology, No. 92-0972-RCL (D.D.C. Apr. 11, 1994). CPSR is currently appealing the district court's summary judgment ruling. See Computer Professionals for Social Responsibility v. National Inst. of Standards & Technology, No. 94-5153 (D.C. Cir. filed June 27, 1994).
272. Pub. L. No. 100-235, 101 Stat. 1724 (codified as amended at 15 U.S.C. §§ 271, 272, 278g-3 to g-4, 278h (1988 & Supp. V 1993) and 40 U.S.C. § 759 (1988 & Supp. V 1993)).
273. See Telecommunications Network Security: Hearings Before the Subcomm. on Telecommunications and Finance of the House Comm. on Energy and Commerce, 103d Cong., 1st Sess. 133-35 (1993) (prepared testimony of Marc Rotenberg, Director of Washington Office, Computer Professionals for Social Responsibility); see also Plaintiff's Memorandum in Opposition to Defendant's Motion for Summary Judgment and in Support of Plaintiff's Cross-Motion for Partial Summary Judgment at 3-7, Computer Professionals for Social Responsibility v. National Inst. of Standards and Technology (D.D.C. filed May 18, 1993) (No. 92-0972-RCL) [hereinafter CPSR Motion] (suggesting that, contrary to the intent of Congress, NIST may have retained "surface" control of the development of DSS, but allowed the NSA to develop the technical guidelines).
274. Nor is it clear, if the Act were violated, who would have standing to complain. See supra text accompanying notes 258-64.
275. See supra note 271 (discussing CPSR's lawsuit against NIST).
276. One slight complication is that NIST's authority to promulgate FIPS 185, insofar as it relates to the Clipper Chip itself (as opposed to Capstone), probably does not derive from the Computer Security Act. The Act relates to computer systems and related equipment, not telephones. NIST's authority to promulgate a telecommunications standard that applies beyond modems derives from a delegation of authority from the GSA. See supra note 222. The discussion in the text undoubtedly applies to more directly computer-related devices such as the Capstone Chip and the Fortezza PCMCIA card.
277. H.R. Rep. No. 153(I), 100th Cong., 1st Sess., pt. 2, at 6 (1987), reprinted in 1987 U.S.C.C.A.N. 3120, 3158. For a summary of NSDD 145, see Renae A. Franks, Note, The National Security Agency and Its Interference with Private Sector Computer Security, 72 Iowa L. Rev. 1015, 1020-24 (1987).
278. See H.R. Rep. No. 153(I), supra note 277, at 22, 25-26, reprinted in 1987 U.S.C.C.A.N. at 3137, 3141 (noting that "[g]reater emphasis should be given to cooperation between the military and civil agencies as well as the private sector in setting computer security and training goals," and stating that, although the NBS (now NIST) should work closely with other agencies such as the NSA, the NBS/NIST should retain "final authority" over the development of guidelines).
281. Computer Security Act § 2(b)(1).
282. Computer Security Act § 3(b)(6)(A), 15 U.S.C. § 278g-3(b)(6)(A) (1988).
283. See Computer Security Act § 3(c)(2), 15 U.S.C. § 278g-3(c)(2) (1988).
284. Memorandum of Understanding Between the Director of the National Institute of Standards and Technology and the Director of the National Security Agency Concerning the Implementation of Pub. L. No. 100-235 (Mar. 23, 1989), reprinted in Schneier, supra note 12, at 442-44 [hereinafter the NSA-NIST MOU].
287. For example, according to a document dated March 26, 1990, obtained by CPSR under FOIA, at one Technical Working Group meeting the NSA provided NIST with a position paper, classified "TOP SECRET CODEWORD," that discussed "reasons for the selection of certain algorithms." CPSR Motion, supra note 273, at 7.
288. See Computer Security Act § 2(b)(1) (stating that the "specific purposes" of the Act include assigning to NIST, and no other agency, "responsibility" for standards and guidelines for the security and privacy of federal computer systems that have non- classified information).
290. Agencies are allowed to choose to defer to other opinions, so long as they make the final decision. See Delta Data Sys. Corp. v. Webster, 744 F.2d 197, 201-02 (D.C. Cir. 1984) (stating that an agency may accept recommendations from the GAO); City of Alexandria v. United States, 737 F.2d 1022, 1025-27 (Fed. Cir. 1984) (stating that the separation of powers doctrine requires administrative agencies to be open to persuasion by congressional committees); Westinghouse Elec. Corp. v. United States Nuclear Regulatory Comm'n, 598 F.2d 759, 775-76 (3d Cir. 1979) (holding that an independent agency may allow itself to be persuaded by the President or Congress); M. Steinthal & Co. v. Seamans, 455 F.2d 1289, 1304-05 (D.C. Cir. 1971) (noting that the GAO's significant experience in procurement contracts makes it a persuasive source of information in procurement cases); A.G. Schoonmaker Co. v. Resor, 445 F.2d 726, 728 (D.C. Cir. 1971) (upholding the Army's adoption of the Comptroller General's opinion to set aside the awarding of a bid); John Reiner & Co. v. United States, 325 F.2d 438, 442-43 (Ct. Cl. 1963) (holding it was not arbitrary or capricious for an executive agency to defer to the GAO, an arm of the legislature, in order to promote interbranch comity, even if at first the agency disagreed with GAO's views), cert. denied, 377 U.S. 931 (1964); Henry Spen & Co. v. Laird, 354 F. Supp. 586, 588 (D.D.C. 1973) (allowing a procurement officer to be convinced by the Comptroller General, even when the latter lacks jurisdiction); United States ex rel. Brookfield Constr. Co. v. Stewart, 234 F. Supp. 94, 100 (D.D.C.) (holding that a disbursement officer properly and prudently followed the advice of the Comptroller General), order aff'd, 339 F.2d 753 (D.C. Cir. 1964).
291. A deadlocked vote does not in itself require NIST to change its mind. Nevertheless, it is not difficult to imagine why an agency might choose to compromise rather than involve the head of the entire department in a battle with the Secretary of Defense. In any case, the MOU's involvement of the Secretary of Defense seems contrary to the Act because the Act envisions no decision-making role for anyone outside NIST. NIST is part of a chain of command that goes up through the Secretary of Commerce to the President. Both the President and the Secretary of Commerce are free to consult anyone in the Cabinet, if they desire, for advice, but the Act provides no authority for NIST to turn over actual decision-making power, even in shared form, to the Secretary of Defense.
292. "The National Security Council, the Justice Department, the Commerce Department, and other key agencies were involved in this decision [to propose the Clipper Chip]. This approach has been endorsed by the President, the Vice President, and appropriate Cabinet Officials." Office of the Press Secretary, The White House, Questions and Answers About the Clinton Administration's Telecommunications Initiative 1 (Apr. 16, 1993), in "Key Escrow" Information Packet, supra note 20.
293. See Letter from NIST and the NSA to the Hon. John Conyers, Jr. and the Hon. Frank Horton, House Comm. on Gov't Operations (Dec. 22, 1989), reprinted in OTA Information Security, supra note 97, app. B at 201, 205-09.
298. See OTA Information Security, supra note 97, at 14.
299. See id. at 14-15 (discussing the NSA's advisory role in working with NIST).
300. Cf. id. at 16-18 (proposing seven options for congressional oversight and action on cryptography).
301. See Digital Privacy and Security Working Group, supra note 31, at 7 (critiquing guidelines set forth by the Clinton Administration for the Information Infrastructure Task Force).
302. 366 F. Supp. 104, 108 (D.D.C. 1973) (holding that the Acting Attorney General violated Justice Department regulations in firing Watergate Special Prosecutor Archibald Cox without first changing the rules giving the prosecutor limited tenure in office or finding that Cox acted with "extraordinary impropriety").
303. 418 U.S. 683, 697 (1974) (rejecting the argument that an action brought by a Special Prosecutor against the President was nonjusticiable because both parties were officers of the executive branch); see Michael Herz, United States v. United States: When Can the Federal Government Sue Itself?, 32 Wm. & Mary L. Rev. 893, 952-53 (1991) (noting the limited ability of the President to control executive and independent agencies). See generally Note, Violations by Agencies of Their Own Regulations, 87 Harv. L. Rev. 629 (1974) (examining agencies' ability to depart from existing regulations).
304. 5 U.S.C. § 552 (1988); cf. APA § 552(a)(2)(c) (1988) (requiring agencies to disclose changes in regulations that will affect such disclosures).
305. See, e.g., Silvio Micali, Fair Public-Key Cryptosystems, in Advances in Cryptology-CRYPTO '92, at 113, 116 (Ernest F. Brickell ed., 1993).
306. Others see the issues differently. See, e.g., Letter from Johnny H. Killian, Senior Specialist in American Constitutional Law, Congressional Research Service, to Joan D. Winston, Office of Technology Assessment 1 (March 3, 1994) (concluding that "placing custody of one of the keys in a federal court or in an agency of the Judicial Branch would almost certainly pass constitutional challenge"). In earlier drafts of this Article, I argued that holding keys was outside the judicial function because it was not "incidental" to any task specified in Article III. I am grateful to Larry Lessig and other participants in the LEXIS Counsel Connect on-line cryptography seminar for persuading me that there are two sides to the question.
307. See Metropolitan Wash. Airports Auth. v. Citizens for the Abatement of Aircraft Noise, Inc., 501 U.S. 252, 276-77 (1991) (holding that the participation of members of Congress on a committee possessing the power to veto decisions regarding local airports violated the doctrine of separation of powers); Bowsher v. Synar, 478 U.S. 714, 727-32 (1986) (holding that the Comptroller General could not be considered an executive branch official because Congress reserved the right to remove him by legislation, and, therefore, he could not constitutionally exercise budget-cutting powers given to him by the Deficit Control Act); Buckley v. Valeo, 424 U.S. 1, 126-33 (1976) (holding that members of Congress could not constitutionally appoint the members of the Federal Election Commission).
308. See Morrison v. Olson, 487 U.S. 654, 679 (1988) (stating that the Special Division may constitutionally exercise power to determine jurisdiction of Special Counsel only if this power is "truly `incidental' to" its appointment power).
309. See id. at 680-81 (noting that separation of powers ensures that "judges do not . . . undertake tasks that are more properly accomplished" by other branches).
310. 2 U.S. (2 Dall.) 409 (1792).
311. Id. at 410 n. (reporter's note quoting from the judgment of the Circuit Court for the District of New York, a panel that included Chief Justice Jay and Justice Cushing riding circuit); see United States v. Ferreira, 54 U.S. (13 How.) 40, 50-51 (1852) (relying on Hayburn's Case); see also Buckley, 424 U.S. at 123 (citing Hayburn's Case and Ferreira for the proposition that "executive or administrative duties of a nonjudicial nature may not be imposed on judges holding office under Article III of the Constitution"); National Mut. Ins. Co. v. Tidewater Transfer Co., 337 U.S. 582, 591 (1949) (Jackson, J., plurality opinion) (noting that courts properly are not asked to "participate in any legislative, administrative, political or other nonjudicial" functions).
312. Chandler v. Judicial Council of the Tenth Circuit, 398 U.S. 74, 111 (1970) (Harlan, J., concurring in denial of writ).
313. See Morrison v. Olson, 487 U.S. 654, 681 (1988) (discussing federal judicial control of the disclosure of federal grand jury matters).
314. The membership of judges on the Federal Sentencing Commission was upheld against a separation of powers challenge in Mistretta v. United States, 488 U.S. 361, 371-412 (1989).
315. In Morrison, 487 U.S. at 684, the Supreme Court upheld the judiciary's role under the Ethics in Government Act of 1978 in the selection and supervision of independent counsel.
316. See 50 U.S.C. §§ 1801-1811 (1988); see also supra note 196 and accompanying text.
317. See Letter from Johnny H. Killian to Joan D. Winston, supra note 306, at 1-3 (discussing the probable constitutionality of placing custody of keys in the federal judiciary).
318. The proposed "Encryption Standards and Procedures Act of 1994" falls far short of this objective because it allows the President to designate any technologically qualified agency to hold key segments, so long as such agency lacks the authority to conduct wiretaps. See H.R. 5199, supra note 218, § 31(d)(1)-(2).
319. Compare Steven G. Calabresi, The Vesting Clauses as Power Grants, 88 Nw. U. L. Rev. 1377, 1389-1400 (1994) (describing the unitary executive theory, which suggests that there is only limited congressional power to restructure the executive department because the President is vested with the power to control and direct subordinate officials in their execution of statutory provisions) and Steven G. Calabresi & Kevin H. Rhodes, The Structural Constitution: Unitary Executive, Plural Judiciary, 105 Harv. L. Rev. 1155, 1155-71 (1992) (same) and Kevin H. Rhodes, A Structure Without Foundation: A Reply to Professor Froomkin, 88 Nw. U. [Page 786] L. Rev. 1406, 1416-17 (1994) (same) with A. Michael Froomkin, The Imperial Presidency's New Vestments, 88 Nw. U. L. Rev. 1346, 1347-49, 1366-69 (1994) (arguing that the Constitution gives Congress broad power to structure the President's control over the executive department) and A. Michael Froomkin, Still Naked After All These Words, 88 Nw. U. L. Rev. 1420, 1427-30 (1994) (same) and A. Michael Froomkin, Note, In Defense of Administrative Agency Autonomy, 96 Yale L.J. 787 (1987) (same).
320. Even more complex, and elegant, solutions exist. See, e.g., Silvio Micali, Fair Cryptosystems 7-8 (Aug. 1994) (unpublished manuscript, on file with author). Micali proposes a scheme in which the key can be broken up into any number of parts, and in which every part of the key is required to decrypt the message. See id. at 7. Micali's scheme includes a number of elegant but complex refinements, notably a scheme for making the keyholder "oblivious." Id. at 18, 40-41. By "oblivious" Micali means that even the trustee need not know the identity of the person whose key has been requested by the government. See id. at 18. In this way the trustees are unable to notify the person whose communications are being wiretapped. See id.
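By way of illustration only, the following Python sketch captures the "every part required" property with a simple XOR split. It is not Micali's construction, which additionally allows the trustees to verify that their shares are genuine pieces of the user's key; the function names and parameters here are invented for the example.

    # Illustration only: split a key into n shares so that every share is
    # needed to reconstruct it; any n-1 shares reveal nothing about the key.
    import os
    from functools import reduce

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def split_key(key: bytes, n: int) -> list:
        shares = [os.urandom(len(key)) for _ in range(n - 1)]
        last = reduce(xor, shares, key)      # key XOR share_1 XOR ... XOR share_(n-1)
        return shares + [last]

    def recover_key(shares: list) -> bytes:
        return reduce(xor, shares)           # XOR of all shares gives back the key

    key = os.urandom(10)                     # stand-in for a user's secret key
    parts = split_key(key, 3)                # e.g., three escrow trustees
    assert recover_key(parts) == key         # all three shares recover the key
    assert recover_key(parts[:2]) != key     # two shares alone do not (with overwhelming probability)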
321. See Telephone Interview with Lynn McNulty, Associate Director, NIST (Aug. 5, 1994).
322. France, for example, prohibits the use of unregistered cryptographic algorithms. See James P. Chandler, et al., National Intellectual Property Law Inst. & George Washington Univ., Identification and Analysis of Foreign Laws and Regulations Pertaining to the Use of Commercial Encryption Products for Voice and Data Communications § 2.7.1 (Jan. 1994).
323. Recall that, according to the FBI, industrial espionage by friendly foreign governments is a growing threat to U.S. businesses. See supra note 43 and accompanying text.
324. The NSA has long-standing and close relationships with some of its foreign counterparts. See Bamford, supra note 17, at 309-37 (discussing BRUSA and UKUSA agreements with UK, Canada, Australia). The texts of the agreements, which date back to 1947, remain classified. John Gilmore has filed a FOIA request seeking information as to these agreements. See Posting from John Gilmore to USENET Group sci.crypt (Dec. 10, 1993) (on file with author). The NSA has yet to provide significant documents in response to this request. See Telephone Interview with Lee Tien (July 27, 1994) (notes on file with author) (Tien represents Gilmore in his FOIA request).
325. H.R. 5199, supra note 218, § 31(e)(2)(B).
326. One newspaper reported as follows:
The US plan for a Clipper chip . . . has raised fears among European businesses that sensitive information would no longer be secret if it were vetted by the CIA [or] the FBI . . . .
. . . .
. . . [T]he European organisation representing users of computer security has rejected the Clinton initiative as "totally unacceptable."[Page 789]
. . . [T]he Information Security Business Advisory Group (Ibag), warns European governments to ignore overtures from the US government aimed at restricting access to the information superhighway to users who use encryptions that the government agencies can decode.
Leonard Doyle, Spooks All Set to Hack It on the Superhighway, Independent (London), May 2, 1994, at 10.
327. See supra note 43 and accompanying text.
328. See S. Rep. No. 541, 99th Cong., 2d Sess. 12 (1986), reprinted in 1986 U.S.C.C.A.N. 3555, 3566 ("The conversion of a voice signal to digital form for purposes of transmission does not render the communication non-wire."). A wire communication is an "aural transfer" made in whole or in part by wire, cable, or other like connection (for example, a telephone call). 18 U.S.C. § 2510(1) (1988). "Aural transfer" means "a transfer containing the human voice at any point between and including the point of origin and the point of reception." § 2510(18).
329. The Electronic Communications Privacy Act of 1986, Pub. L. No. 99-508, 100 Stat. 1848 (codified at 18 U.S.C. §§ 2510-2521 (1988 & Supp. V 1993)), defines an "electronic communication" as "any transfer of signs, signals, writing, images, sounds, data, or intelligence of any nature transmitted in whole or in part by a wire, radio, electromagnetic, photoelectronic or photooptical system that affects interstate or foreign commerce, but . . . not includ[ing] . . . any wire or oral communication." 18 U.S.C. § 2510(12).
330. A LEAF followed by an e-mail does not present the statutory problem discussed in this subsection because both the LEAF and the e-mail are electronic communications under 18 U.S.C. § 2510(12). The Fourth Amendment analysis, however, does apply to a LEAF preceding an e-mail message.
333. See Fishman Supplement, supra note 199, § 7.3.
334. See id. § 42.1 (Supp. 1993). This difference is more significant than it may sound. See United States v. Giordano, 416 U.S. 505, 524-29 (1974) (holding that a warrant application initialed by the Attorney General's executive assistant, apparently without the Attorney General's knowledge, was invalid).
335. The statutory exclusionary rule appears at 18 U.S.C. § 2515; see also § 2511(1)(d) (making it unlawful to use the contents of any wire or oral communication obtained in violation of the statute). Unlike the constitutional exclusionary rule, the statutory rule reaches private action, applies in civil and regulatory proceedings as well as in criminal cases, and is unaffected by the growing body of exceptions the Supreme Court has placed on the constitutional exclusionary rule, such as good faith exceptions, the inevitable discovery exception, and the exception for use in rebuttal. See § 2515. I am indebted to Charles C. Marson for pointing this out to me.
336. See § 2520 (providing for damages and injunctive relief in civil actions). Congress deliberately omitted an exclusionary remedy. See S. Rep. No. 541, supra note 328, at 23, reprinted in 1986 U.S.C.C.A.N. at 3577; Fishman Supplement, supra note 199, §§ 252.1, 253. I am again indebted to Charles C. Marson for pointing this out to me.
337. See supra note 211.
338. See 18 U.S.C. §§ 3121-3123 (1988); supra note 211.
339. See, e.g., United States v. Miller, 425 U.S. 435, 442-43 (1976) (holding that by voluntarily conveying information to a bank and its employees, the respondent did not have a legitimate expectation of privacy); Katz v. United States, 389 U.S. 347, 353 (1967) ("The Government's activities in electronically listening to and recording the petitioner's words violated the privacy upon which he justifiably relied while using the telephone booth . . . .").
340. See Katz, 389 U.S. at 361 (Harlan, J., concurring).
341. See Smith v. Maryland, 442 U.S. 735, 745-46 (1979).
342. See Miller, 425 U.S. at 443 (holding that a depositor had no legitimate expectation of privacy, and hence no protectable Fourth Amendment interest, in copies of checks and deposit slips retained by his bank because the depositor, by writing the checks and making the deposits, had taken the risk that "the information [would] be conveyed . . . to the Government").
343. For an argument that Miller should be reversed on the theory that the Right to Financial Privacy Act of 1978, Pub. L. No. 95-630, 92 Stat. 3697 (codified at 12 U.S.C. §§ 3401-3422 (1988 & Supp. V 1993)), creates a reasonable expectation of privacy in bank records, see Bercu, supra note 90, at 407-09.
344. See supra text accompanying note 155 (noting that intelligence agencies learn important information by tracking who calls whom).
345. Because telephone traffic carries with it switching information regarding the destination of the call (information that is used by the service provider's routing system), a sophisticated eavesdropper may in any event have access to some of this information with less effort.
346. See Communications Assistance for Law Enforcement Act, Pub. L. No. 103-414, § 103(a)(1), 108 Stat. 4279, 4280 (1994) (requiring telephone-service providers to make systems wiretap-ready).
347. See, e.g., Corn Exch. Bank v. Coler, 280 U.S. 218, 223 (1930) (allowing seizure of absconding husband's property without prior notice); Henry P. Monaghan, Our Perfect Constitution, 56 N.Y.U. L. Rev. 353, 396 (1981) (arguing that, contrary to arguments of "due substance" theorists, the Constitution does not protect some external concept of morality and does not guarantee perfect government).
348. The legal issues raised by publication of FIPS 185 are discussed above, see supra parts II.A.1-3, and will not be repeated here.
349. See FIPS 185, supra note 14, at 6004 ("The National Security Agency maintains these classified specifications and approves the manufacture of devices which implement the specifications.").
350. The only members of the public who have had access to the inner workings of SKIPJACK are a committee of five outside experts who were asked to examine SKIPJACK so that they could opine on its security. See SKIPJACK Interim Report, supra note 187.
351. See, e.g., Greene v. McElroy, 360 U.S. 474, 508 (1959) (holding that absent explicit authorization from either the President or Congress, an executive agency may not create a security program that deprives a civilian of employment without an opportunity to challenge an adverse determination of security clearance); Adams v. Laird, 420 F.2d 230, 235, 238-39 (D.C. Cir. 1969) (finding no due process violation when an applicant for security clearance is afforded a noncustodial interview and is able to cross-examine witnesses supplying adverse testimony, and when the agency follows clearly enunciated standards and makes adequate findings with respect to such standards), cert. denied, 397 U.S. 1039 (1970).
352. See, e.g., 18 U.S.C. § 793 (1988) (criminalizing the unauthorized disclosure of cryptographic information).
353. See, e.g., Reeves, Inc. v. Stake, 447 U.S. 429, 439 n.12 (1980) (describing the government's "`unrestricted power . . . to fix the terms and conditions upon which it will make needed purchases'" (quoting Perkins v. Lukens Steel Co., 310 U.S. 113, 127 (1940))).
354. Congress thus far has made no such choice in this case. Congress has given the Attorney General discretion to spend monies in the Asset Forfeiture Super Surplus Fund. See supra note 244 and accompanying text (describing the Fund). Conceivably, a court might imply a limit to this delegation and might find that the attempt to determine industrial policy in its use of the Fund exceeded the implicit limit. Because there appears to be no one with standing to sue, this must remain speculation.
355. See Meese v. Keene, 481 U.S. 465, 479-80 (1987) (holding that government labeling of environmental films as "political propaganda" is permissible government speech); Thomas I. Emerson, The System of Freedom of Expression 699-708 (1970) (stating that government has the same freedom of speech as individuals); [Page 795] Mark G. Yudof, When Government Speaks: Politics, Law, and Government Expression in America 301 (1983) (stating that courts "create more problems than they solve" when they attempt to limit government expression); Steven Shiffrin, Government Speech, 27 UCLA L. Rev. 565, 622 (1980) (encouraging application of a balancing test when analyzing government subsidies such as election funding and free speech); Mark G. Yudof, When Governments Speak: Toward a Theory of Government Expression and the First Amendment, 57 Tex. L. Rev. 863, 917 (1979) (arguing for legislative rather than judicial control of government speech); cf. Beth Orsoff, Note, Government Speech as Government Censorship, 67 S. Cal. L. Rev. 229, 234 (1993) ("[A]ll government criticism carries with it an implied threat. Thus, the test should be whether the average reasonable person receiving the government criticism would perceive it as a threat, not whether the government official can legitimately execute the threat."). By this standard, cheerleading for Clipper seems to be permissible, although I have qualms about speaking for "the average reasonable person." If, however, the government pressured AT&T into abandoning its plans to manufacture a DES-based secure telephone and to substitute a Clipper telephone instead, then the cheerleading stepped over the line to impermissible coercion.
356. See, e.g., 15 U.S.C. § 272(c)(22) (1988) (catchall provision for authorized NIST activities).
357. See FIPS 185, supra note 14, at 6000.
358. The Justice Department has available the Asset Forfeiture Super Surplus Fund. See supra note 244 and accompanying text (describing the Fund). NIST has a cost recovery fund and a working capital fund. See 15 U.S.C. § 278b (1988).
359. See supra text accompanying note 165 (discussing the prohibition of the export of cryptographic software and hardware).
360. Preencrypting a message with an ordinary, non-escrowed cipher, then feeding it to an EES-compliant device preserves the user's ability to make it appear that the message complies with EES while, in fact, partially subverting it. A casual inspection of the message will reveal a valid LEAF, and decrypting the LEAF with the family key will reveal a valid chip serial number. Decrypting the message with the session key obtained from the escrow agents, however, will reveal nothing more than a new ciphertext. Eavesdroppers remain able to do traffic analysis by logging the serial numbers of chips as they communicate, but they cannot hear or read the message [Page 796] without cracking the additional cipher.
FIPS 185 prohibits the postencryption of an EES message. Because FIPS 185 is only a nonbinding standard, however, it remains legal to postencrypt output from a Clipper or Capstone Chip, making the LEAF unintelligible even to a public servant armed with the family key. Although legal, postencryption is also fairly pointless: if one is going to run another system on top of Clipper/Capstone, why bother using the latter at all? And because postencryption violates FIPS 185, an EES-compliant device will refuse to decrypt a postencrypted message, which further limits its utility.
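A minimal sketch may make the preencryption idea concrete. Because SKIPJACK is classified, a toy hash-based keystream stands in for both the escrowed and the non-escrowed ciphers, and nothing below reflects the actual EES interface; all names are illustrative assumptions.

    # Toy illustration of preencryption; the "cipher" is a hash-based
    # keystream XOR and is NOT a real cipher.
    import hashlib, os

    def toy_cipher(key: bytes, data: bytes) -> bytes:
        """Symmetric toy stream cipher: the same call encrypts and decrypts."""
        stream = b""
        counter = 0
        while len(stream) < len(data):
            stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
            counter += 1
        return bytes(a ^ b for a, b in zip(data, stream))

    message     = b"meet at the usual place"
    private_key = os.urandom(16)   # non-escrowed key known only to the parties
    session_key = os.urandom(16)   # escrowed session key used by the EES device

    inner = toy_cipher(private_key, message)   # preencryption with the private cipher
    wire  = toy_cipher(session_key, inner)     # what the (simulated) EES device transmits

    # A wiretapper who obtains the escrowed session key recovers only the
    # inner ciphertext, not the plaintext.
    recovered = toy_cipher(session_key, wire)
    assert recovered == inner
    assert toy_cipher(private_key, recovered) == message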
361. See supra text following note 139; supra text accompanying note 152.
362. See supra text accompanying note 141.
363. See supra text accompanying notes 242, 246.
364. Although forceful, this argument ignores the difference between illicit government surveillance that requires an intrusion into the home or office, and illicit surveillance that does not. If the White House "plumbers" who committed the Watergate burglary had been able to wiretap the Democratic National Committee from the outside, there would never have been a "third-rate burglary" detected by an alert security guard, and President Nixon would have completed his second term.
365. See supra text accompanying note 143.
366. "You shouldn't over estimate the I.Q. of crooks." Stewart A. Baker, Data Encryption: Who Holds the Keys?, Address Before the Fourth Conference on Computers, Freedom and Privacy 8 (Mar. 24, 1994) (transcript on file with author) [hereinafter Baker Talk]. Indeed, criminals often use ordinary telephones, which can be wiretapped.
367. As Stewart Baker, then the General Counsel of the National Security Agency, put it:
The concern is not so much what happens today when people go in and buy voice scramblers; it is the prospect that in five years or eight years or ten years every phone you buy that costs $75 or more will have an encrypt button on it that will interoperate with every other phone in the country and suddenly we will discover that our entire communications network, sophisticated as it is, is being used in ways that are profoundly anti-social. That's the real concern, I think, that Clipper addresses. If we are going to have a standardized form of encryption that is going to change the world we should think seriously about what we are going to do when it is misused.
Id. at 9.
368. See Denning, supra note 206, at 322.
369. See supra note 73 and accompanying text (describing how to get military-grade cryptography on the Internet).
370. A program called AquaFone (so named because the imperfect voice quality makes it sound as if the user is speaking under water) is now available from Cogon Electronics of Culpeper, Virginia. The program uses RSA encryption under license from RSA Data Security, Inc., and sells for $129. The hardware requirements are minimal, as the two parties to the conversation need only a personal computer, a sound card, and a modem. The company markets a demonstration disk via its 800 number.
Phil Zimmermann, the creator of the popular shareware encryption program PGP, and a development team are currently working on a voice version of the PGP encryption program, nicknamed "Voice-PGP." The program will be released early in 1995, although the actual name has not yet been selected. See Telephone Interview with Philip Zimmermann (Dec. 6, 1994) (notes on file with author).
371. The classic line, now almost a battle cry, is John Barlow's proclamation, "[Y]ou won't pry my fingers from its private key until you pull it from my cold dead hands." Steven Levy, Crypto Rebels (June 1994), available online URL http://www.cygnus.com/~gnu/crypto.rebels.html.
372. The Vigenère cipher, which was well-known by the 17th century, was still considered unbreakable at the time of the American Revolution. See Kahn, supra [Page 799] note 6, at 214-21. Indeed, Scientific American wrongly described the cipher as uncrackable as late as 1917. See id. at 148. For a fascinating discussion of Thomas Jefferson's creation of a cryptosystem still good enough to be used by the U.S. Navy in 1967, see id. at 192-95.
373. See Nelson, supra note 138, at 1139-42 (describing preliminary draft of digital telephony legislation, requiring the alteration of electronic communications equipment to enable the government to maintain its current wiretapping capabilities).
374. The individual's right to remain silent harms the prosecution; England recently abridged the right for that reason. See Criminal Justice and Public Order Act, 1994, ch. 33, §§ 34-37 (Eng.) (providing that courts may draw adverse inferences from the silence of suspects). As a result of this legislation, the warning to be given to suspects upon arrest has been tentatively redrafted to read:
You do not have to say anything. But if you do not mention now something which you later use in your defence the court may decide that your failure to mention it now strengthens the case against you. A record will be made of anything you say and it may be given in evidence if you are brought to trial.
Jason Bennetto, New Police Caution Alarms Legal Experts, Independent (London), Aug. 20, 1994, at 4 (quoting the draft text of the revised caution to suspects). Whether this change will ever take effect--or survive judicial review if it does--is open to question because the European Court of Human Rights recently ruled that the right to remain silent is guaranteed under the European Convention on Human Rights (formally known as the Convention for the Protection of Human Rights and Fundamental Freedoms), Nov. 4, 1950, art. 6(1), 213 U.N.T.S. 221. See Funke v. France, 256 Eur. Ct. H.R. (ser. A) at 8 (1993) (holding that Article 6(1) of the European Convention on Human Rights guarantees the right against self-incrimination); Ying H. Tan, Use of DTI Interviews Unfair, Independent (London), Sept. 30, 1994, at 30 (reporting the decision of the European Commission of Human Rights in Saunders v. United Kingdom).
375. The important difference between the 18th and 20th centuries is that rapid communication is possible over much greater distances.
376. See, e.g., PGP(TM) User's Guide, supra note 73 (maintaining that the encryption habit increases the supply of privacy, a public good, to everyone).
377. "It is very important to change keys frequently to minimize" the problem of key compromise. Schneier, supra note 12, at 27. In A software-based cryptographic system, changing the key is as easy as pressing a button. The Clinton Administration has repeatedly said that it would be pleased to consider software-based escrow systems if they could be designed in a way that prevented users from using non-escrowed keys.
378. See 18 U.S.C. § 2518(8)(d) (1988) (requiring that "[w]ithin a reasonable time but not later than ninety days after" the termination of a wiretap, the persons named in the wiretap order and "such other parties to intercepted communications as the judge may determine in his discretion that is in the interest of justice," must be given notice of the wiretap and of the fact that communications were intercepted; and providing that upon the filing of a motion, the judge has discretion to allow access to such portions of the intercepted communications for inspection as the judge believes to be warranted by the interests of justice). [Page 801]
There is no comparable notification duty for those wiretaps governed by the Foreign Intelligence Surveillance Act of 1978. See 50 U.S.C. §§ 1801-1811 (1988). The EES proposal requires no additional reporting to the subjects of such wiretaps.
379. In practice, replacement is likely to require getting a whole new telephone because one of the aims of the EES program is to make it difficult to obtain chips to reverse engineer.
380. Upon the expiration of the authority for a wiretap, the public servants are supposed to destroy the key information stored in the Decrypt Processor. See Denning & Smid, supra note 194, at 68.
381. Capstone is the e-mail version of Clipper, based on the Fortezza chip. Capstone provides both encryption and digital signatures. See supra note 16.
382. See Capstone Chip Technology, supra note 16.
383. See 18 U.S.C. § 2518(8)(d).
384. See Letter from Dorothy Denning, Professor and Chair, Computer Sciences Department, Georgetown University, to Michael Froomkin 3 (Sept. 17, 1994) (stating that the Tessera/Fortezza card stores separate keys for signatures) (on file with author).
385. "[I]t must be presumed that federal officers will adhere to the law . . . ." Sanchez- Espinoza v. Reagan, 770 F.2d 202, 208 n.8 (D.C. Cir. 1985).
386. Threat analysis is a long-established intelligence approach in which one assumes the worst about everyone and attempts to measure their capabilities for harm without regard to their likely or actual motives. See, e.g., Andrew Cockburn, The Threat: Inside the Soviet Military Machine 6 (1983) (describing American threat assessment of the Soviet Union's military capabilities).
387. The Federalist No. 51, at 322 (James Madison) (Clinton Rossiter ed., 1961).
388. See Albert O. Hirschman, The Passions and the Interests: Political Arguments for Capitalism Before Its Triumph 30 (1977) (discussing Federalist No. 51, in which Madison justified the separation of powers as necessary to control the abuses of government).
389. So too, of course, is the counterbalancing impulse that government is pointless if it is not effective. See, e.g., McCulloch v. Maryland, 17 U.S. (4 Wheat.) 316, 421 (1819) (rejecting a strict construction of the Necessary and Proper Clause in favor of a construction recognizing broad discretion in the means Congress may adopt to achieve its legitimate ends).
390. Taking the keys out of escrow and using them might constitute a taking under the Fifth Amendment. In addition, if the government promises the public secure communications, and then attempts to go back on its promise, there may be grounds for arguing that the government violated the Due Process Clause of the Fifth Amendment by its bait and switch tactics.
391. "Key management is the hardest part of cryptography, and often the Achilles heel of an otherwise secure system." Schneier, supra note 12, at xvi. For examples of Cold War NSA security breaches, see Kahn, supra note 6, at 690-97.
392. See infra note 767 and accompanying text (discussing the cryptological community's mistrust of secret algorithms).
393. See Key Escrow Initiative Q&A, supra note 134, at 2-3.
394. The SKIPJACK algorithm was reviewed by a panel of five distinguished outside experts who gave it their interim seal of approval. See SKIPJACK Interim Report, supra note 187, at 1, 7.
395. See, e.g., Baker Talk, supra note 366, at 6-10 (noting that communications protected by SKIPJACK cannot be intercepted without access to the escrow keys).
396. See, e.g., Digital Privacy and Security Working Group, supra note 31, at 4.
397. See Robert García, "Garbage In, Gospel Out": Criminal Discovery, Computer Reliability, and the Constitution, 38 UCLA L. Rev. 1043, 1053 (1991) (noting that "changes in technology are likely to increase the use of electronic eavesdropping significantly").
398. See Steven Emerson, Where Have All His Spies Gone?, N.Y. Times, Aug. 12, 1990, § 6 (Magazine), at 16, 16, 19; see also Stephen Kinzer, German Lawmakers Back Steps to End Spy Taint, N.Y. Times, Oct. 18, 1991, at A6 (stating that the Stasi had "about 85,000 agents and several million part-time informers").
399. See Emerson, supra note 398, at 19, 30.
400. See Ferdinand Protzman, German Overhaul Is Led by Phones, N.Y. Times, Mar. 11, 1992, at D1 (reporting 1.8 million telephones in East Germany before unification--one for every 10 citizens); see also 1993 U.S. Statistical Abstract, supra note 38, at 563 (reporting 141.2 million telephone lines in the United States and an average of 9.773 billion telephone conversations per day).
401. See 1993 U.S. Statistical Abstract, supra note 38, at 563.
402. In 1993, the average cost of installing and monitoring a wiretap on a single subject (including those who may have had more than one telephone) was $57,256. See Wiretap Report, supra note 145, at 5.
403. See Communications Assistance for Law Enforcement Act, Pub. L. No. 103-414, 108 Stat. 4279 (1994); supra note 138 and accompanying text. For a discussion of an earlier version of the Digital Telephony initiative, see Nelson, supra note 138.
404. The government has already connected the databases of the Customs Service, [Page 806] the Drug Enforcement Agency, the IRS, the Federal Reserve, and the State Department. In addition, the Counter Narcotics Center, based at CIA headquarters, "includes agents from the FBI, the DEA, the NSA, the Defense Department, the State Department, and the Coast Guard." García, supra note 397, at 1065. For an alarming account of the sweeping information compiled by the Treasury Department for its Financial Crimes Enforcement Network (FinCEN) and the few legal controls applicable, see Bercu, supra note 90. The existence of a large, and linked, database is potentially alarming because the United States has relatively few data protection statutes along the lines of the European and Canadian models. See Paul Schwartz, Data Processing and Government Administration: The Failure of the American Legal Response to the Computer, 43 Hastings L.J. 1321, 1324 (1992) (stating that from an international perspective, the American legislative response to computer processing of personal data is incomplete); see also Office of Technology Assessment, U.S. Congress, Making Government Work: Electronic Delivery of Federal Services 144 (OTA-TCT-578 1993) (warning that the "extensive use of computer matching can lead to a `virtual' national data bank, even if computer records are not centralized in one location").
405. See John Markoff, A Spy Agency Gives Contract to Cray Computer, N.Y. Times, Aug. 18, 1994, at D3 (reporting that Colombian police were able to track down drug-cartel leader Pablo Escobar Gaviria by programming U.S.-supplied computers to monitor cell-phone frequencies for his voice).
406. See García, supra note 397, at 1056 n.39 (collecting sources that detail technological advances).
407. See Blaze, supra note 16 (manuscript at 131) (announcing the discovery of a method enabling cryptographic communication among EES processors without the transmission of a valid LEAF).
408. Spoofing has no effect on the security of the communication other than to block access by eavesdroppers armed with the family key and the chip unique key. See id. (manuscript at 138-39).
410. See supra note 193.
411. See Blaze, supra note 16 (manuscript at 141). Blaze cautions that the test machine was not optimized for speed. See id. (manuscript at 140). On the probabilistic nature of this trial-and-error approach, see supra text accompanying notes 123-24.
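A back-of-the-envelope calculation conveys the probabilistic character of the attack. Blaze reports that the LEAF checksum is 16 bits long, so a randomly generated bogus LEAF passes the check with probability 2^-16; the trial rate assumed below is purely illustrative and is not Blaze's measured figure.

    # Expected work to forge a LEAF that passes a 16-bit checksum.
    checksum_bits = 16
    expected_tries = 2 ** checksum_bits        # 65,536 candidate LEAFs on average
    assumed_tries_per_second = 30              # hypothetical device speed, for illustration

    minutes = expected_tries / assumed_tries_per_second / 60
    print(f"expected candidates: {expected_tries}")
    print(f"at {assumed_tries_per_second} tries/second: roughly {minutes:.0f} minutes")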
412. See National Inst. Standards & Technology, supra note 193, at 1.
413. See Posting from David Koontz to Cypherpunks Mailing List (Aug. 25, 1994) (on file with author).
414. This monitoring capability might become particularly significant in the event that the government attempts to make key escrow mandatory.
415. See supra note 245 (noting large Defense Department orders of EES-compliant devices).
416. In a poll of one thousand Americans, two-thirds found it more important to protect the privacy of phone calls than to preserve the ability of police to conduct wiretaps. When informed about the Clipper Chip, 80% said they opposed it. See Philip Elmer-Dewitt, Who Should Keep the Keys?, Time, Mar. 14, 1994, at 90. Doubt about the Clipper has already become part of popular culture. See, e.g., D.G. Chichester et al., Tree of Knowledge: Conclusion: Softwar, Daredevil, Sept. 1994, at 1, 5 (describing Clipper Chip as a "suspicious tool"); People Are Talking About: Big Donut, Vogue, Sept. 1994, at 172, 172 (asking: "How to cope?" with the Clipper Chip).
417. See supra text accompanying note 131.
418. See infra note 791.
419. In either case, the government may choose to augment the hardware-based EES [Page 809] with a software key escrow standard. A software key escrow system seeks to achieve the same ends as the Clipper Chip without requiring that users purchase expensive and potentially inflexible hardware. Software-based systems are potentially more vulnerable to reverse engineering, thus increasing the danger that the cryptosystem might be converted to non-escrowed uses. Although adding software key escrow would increase the consumer appeal of escrowed encryption, there is a good chance that even this would not suffice to create a widely used standard.
420. Office of the Press Secretary, The White House, supra note 292, at 1.
421. Hoffman et al., supra note 26, at 112 (comparing a ban on unescrowed cryptography to the prohibition of alcohol in the 1920s).
422. See, e.g., Brock Meeks, Cyberwire Dispatch (Feb. 22, 1994), available online URL gopher://cyberwerks.com:70/00h/cyberwire/cwd/cwd.9402.22b (describing a classified April 30, 1993 memo from the Assistant Secretary of Defense stating that law enforcement and national security agencies "propose that cryptography be made available and required which contains a `trap door' that would allow law enforcement and national security officials, under proper supervision, to decrypt enciphered communications"); John Mintz & John Schwartz, Chipping Away at Privacy?, Wash. Post, May 30, 1993, at H1 (describing the Administration's contingency plan to ban unescrowed encryption).
423. Lance J. Hoffman et al., Cryptography: Trends in Technology and Policy 8 (1993) (quoting Memorandum from John Podesta, Assistant to the President and Staff Secretary, The White House, to Jerry Berman, Digital Privacy and Security Working Group on Key Escrow Encryption Technology (July 29, 1993)).
424. Office of the Press Secretary, The White House, supra note 292, at 2.
425. See supra note 138 and accompanying text (discussing the Digital Telephony initiative).
426. Digital Privacy and Security Working Group, supra note 31, at 8.
427. Louis Freeh, Keynote Luncheon Address at the International Cryptography Institute (Sept. 23, 1994) (excerpt on file with author).
429. See Memorandum from Robert D. Poling, Specialist in American Public Law, American Law Division, Congressional Research Service 2-5 (Oct. 4, 1994) (discussing current legal authority to mandate private use of the Clipper Chip, and noting that, although the Computer Security Act allows the government to set cryptographic standards for its computers, this authority applies only to the federal government's computer systems and not to civilian computer systems) (on file with author).
430. Olmstead v. United States, 277 U.S. 438, 474 (1928) (Brandeis, J., dissenting).
431. Trusted Information Systems has proposed a software key escrow protocol to NIST. See Telephone Interview with David Balenson, Senior Computer Scientist, Trusted Information Systems (June 10, 1994).
Some of the constitutional questions discussed below arguably might be avoided by focusing on standing. It might be said that because the chip (or commercial software) is produced by a corporation, and it is the corporation that is required to give the government the keys, the ultimate user lacks standing to complain (and, in some cases, the corporation lacks the rights enjoyed by natural persons). Because, however, much encryption occurs in software rather than hardware, because software typically generates new session keys for each communication, and because these keys would not ordinarily be escrowed but for the legislation, the discussion in the text assumes that it is the end-user, whether a person or a corporation, who will have to give the government the information it seeks.
For ease of exposition, the text uses the phrase "chip key" to refer to any unique identifying key that allows LEAF-like access to a session key, whether the "chip key" is implemented in hardware or software.
432. The right to privacy also derives from the Third and Ninth Amendments. See Griswold v. Connecticut, 381 U.S. 479, 484-85 (1965) (describing how the Third and Ninth Amendments create "zones of privacy"). This Article does not discuss the claim that the Second Amendment protects cryptography. That argument gains some force from the ITAR's classification of cryptography as a "munition," although the extent to which an administrative classification should have constitutional implications is certainly debatable. The topic would, however, require a discussion of the federal power to regulate weaponry that is beyond the scope of this Article.
434. See generally Laurence H. Tribe, American Constitutional Law § 12-1 (2d ed. 1988) (discussing whether freedom of speech is a means to some end or is an end in itself). For the view that the most important function of the First Amendment is to promote and protect democracy, see Alexander Meiklejohn, Free Speech and Its Relation to Self-Government (1972).
435. Whether mandatory key escrow is compelled speech does not turn on how the government gets the keys. Although under EES the keys are provided to the government before the user buys the product, the user is still forced to send a LEAF to use the encryption. Similarly, with software encryption, users will be required to communicate the session key to the government in some fashion.
436. See Riley v. National Fed'n of the Blind, 487 U.S. 781, 795 (1988) ("Mandating speech that a speaker would not otherwise make necessarily alters the content of the speech."). Thus, compelled disclosures of fact enjoy the same protection as the compelled expressions of opinion in Wooley v. Maynard, 430 U.S. 705, 713 (1977) (holding that requiring cars to display license plates bearing New Hampshire's state motto is unconstitutional), and West Virginia State Board of Education v. Barnette, 319 U.S. 624, 642 (1943) (holding that compelling individuals to recite the pledge of allegiance and salute the flag violates the First Amendment). But see R. George Wright, Free Speech and the Mandated Disclosure of Information, 25 U. Rich. L. Rev. 475, 496 (1991) (arguing that a less stringent standard would have been more appropriate in Riley).
437. See Riley, 487 U.S. at 798.
440. See id. at 720 (Rehnquist, J., dissenting) (stating that citizens are not "forced to affirm or reject that motto").
441. In addition to Wooley, these include Miami Herald Publishing Co. v. Tornillo, 418 U.S. 241, 258 (1974) (holding unconstitutional a state law requiring newspapers to provide a right of reply to political candidates), and Barnette, 319 U.S. at 642 (finding a compulsory flag salute and recital of the pledge of allegiance unconstitutional).
442. Buckley v. Valeo, 424 U.S. 1, 64 (1976) (citations omitted); see also Gibson v. Florida Legislative Investigation Comm., 372 U.S. 539, 546 (1963) (holding that disclosure of membership lists requires a "substantial relation between the information sought and a . . . compelling state interest").
443. This was the issue in Riley v. National Fed'n of the Blind, 487 U.S. 781, 781 (1988).
444. The Supreme Court described the protection of national security as a compelling state interest in Aptheker v. Secretary of State, 378 U.S. 500, 509 (1964) ("That Congress . . . has power to safeguard our Nation's security is obvious and unarguable."). See generally Developments in the Law--The National Security Interest and Civil Liberties, 85 Harv. L. Rev. 1130 (1972) (surveying whether national security claims justify the use of secrecy, surveillance, and emergency police powers). But see National Fed'n of Fed. Employees v. Greenberg, 789 F. Supp. 430, 436 (D.D.C. 1992) ("[S]ecurity concerns do not, under the American system of ordered liberty, ipso facto override all constitutional and privacy considerations. The purpose of national security is to protect American citizens, not to overwhelm their rights."), vacated, 983 F.2d 286 (D.C. Cir. 1993).
445. See Ward v. Rock Against Racism, 491 U.S. 781, 799 (1989) (noting that regulation is not narrowly tailored when a substantial portion of the burden on speech does not advance the state's content-neutral goals).
446. The Supreme Court's practice of balancing constitutional rights against public needs has attracted considerable criticism. For a survey of the issues, see Symposium, [Page 815] When Is a Line as Long as a Rock Is Heavy?: Reconciling Public Values and Individual Rights in Constitutional Adjudication, 45 Hastings L.J. 707 (1994).
447. See Tribe, supra note 434, § 12-24 (discussing the "public forum" freedom of speech doctrine).
448. Scheppele, supra note 1, at 302; see also Shoshana Zuboff, In the Age of the Smart Machine: The Future of Work and Power 344-45 (1988) (describing the phenomenon of "anticipatory conformity" among persons who believe they are being observed).
449. See Turner Broadcasting Sys., Inc. v. FCC, 114 S. Ct. 2445, 2459-62 (1994) (holding that a must-carry provision that distinguished between speakers solely by the technical means used to carry speech is not a content-based restriction); Clark v. Community for Creative Non-Violence, 468 U.S. 288, 293 (1984) (allowing reasonable time, place, and manner restrictions on speech, provided such restrictions are not content-based); City Council of L.A. v. Taxpayers for Vincent, 466 U.S. 789, 804 (1984) (describing an antisign ordinance as content-neutral); Heffron v. Int'l Soc'y for Krishna Consciousness, Inc., 452 U.S. 640, 648-49 (1981) (holding a time, place, and manner regulation on all solicitations at a state fair to be content-neutral).
The act of disclosing the key might be viewed as compelled speech, in which case the compulsion would be subjected to strict scrutiny. See supra part III.A.1. Merely recording public information may not rise to the level of a chilling effect on speech. The Ninth Circuit rejected a Free Exercise challenge to warrantless government tape recordings of public church services at which parishioners discussed smuggling Central Americans into Arizona. See United States v. Aguilar, 883 F.2d 662, 694-96 (9th Cir. 1989), cert. denied, 498 U.S. 1046 (1991). Aguilar is inapposite, however, because the chip key is not public.
450. See generally David S. Day, The Incidental Regulation of Free Speech, 42 U. Miami L. Rev. 491 (1988) (discussing the development of the less-exacting incidental regulation doctrine for examining free speech concerns); Geoffrey R. Stone, Content-Neutral Restrictions, 54 U. Chi. L. Rev. 46 (1987) (exploring the nature of content-neutral review); Ned Greenberg, Note, Mendelsohn v. Meese: A First Amendment [Page 816] Challenge to the Anti-Terrorism Act of 1987, 39 Am. U. L. Rev. 355, 369 (1990) (distinguishing between regulations that incidentally restrict speech, which are subject to a lower level of scrutiny, and those that directly curtail speech, which are subject to a higher level of scrutiny).
451. See City of Ladue v. Gilleo, 114 S. Ct. 2038, 2046 (1994) (applying the balancing test); Clark, 468 U.S. at 293 (same); Consolidated Edison Co. v. Public Serv. Comm'n, 447 U.S. 530, 535 (1980) (same); Tribe, supra note 434, § 12-23, at 979 (stating that the Supreme Court's balancing test examines "the degree to which any given inhibition . . . falls unevenly upon various groups").
The discussion in the text assumes that a court would not find that mandatory key escrow has shut down a traditional public forum. Although mandatory key escrow most severely affects private conversation, it also affects USENET--which may be a public forum or, in universities, at least, a series of linked public fora--and other bulletin board services which may be considered private fora, cf. Allen S. Hammond, IV, Regulating Broadband Communications Networks, 9 Yale J. Reg. 181, 219 (1992) (including "certain subscription technologies" within the definition of private fora), by making anonymous posting of messages less secure. If a court were to find that mandatory key escrow seriously inhibited a traditional public forum, the court would likely find the statute unconstitutional. See Tribe, supra note 434, § 12-24, at 987 (noting that the designation "public forum" serves as "shorthand for the recognition that a particular context represents an important channel of communication in the system of free expression").
452. On the use of computers for political speech, see Eric C. Jensen, Comment, An Electronic Soapbox: Computer Bulletin Boards and the First Amendment, 39 Fed. Comm. L.J. 217, 218-24 (1987) (noting that the growth of various types of computer bulletin boards "`brings back the era of the pamphleteer'" (quoting Lee Dembart, The Law Versus Computers: A Confounding Terminal Case, L.A. Times, Aug. 11, 1985, at D3)). Leaving aside the special case of anonymous speech, discussed below, see infra part III.A.3, the extent to which encrypted speech (for example, on Clipper telephones) is likely to be chilled is an empirical question on which it would be difficult to collect evidence. It is hard to measure how many people will not use encrypted telephones or e-mail if they are not confident the system is secure. It is harder still to measure how their speech changes as a result. A court considering this issue is likely to assume that the government will act legally and decrypt EES communications only when authorized. Courts are unlikely to accept that reasonable people might disagree, although whether they would, and how much, is the central empirical question.
453. See Tribe, supra note 434, § 12-23, at 979-80 (describing how the Court seeks [Page 817] to avoid upholding communicative limits with a disproportionate impact on the poor, because the poor have the fewest alternative communication channels).
454. City of Ladue, 114 S. Ct. at 2045 n.13 (1994) (quoting Geoffrey R. Stone, Content-Neutral Restrictions, 54 U. Chi. L. Rev. 46, 58 (1987)); see also Wayte v. United States, 470 U.S. 598, 611 (1985) (noting that part of the test is whether an "incidental restriction on alleged First Amendment freedoms is no greater than is essential to the furtherance of that interest" (quoting United States v. O'Brien, 391 U.S. 367, 377 (1968))).
455. Perhaps as a result, in a recent First Amendment case no five justices were able to agree on a disposition, even though the Court unanimously agreed that the intermediate standard applied. See Turner Broadcasting Sys., Inc. v. FCC, 114 S. Ct. 2445, 2475 (Stevens, J., concurring in part and concurring in judgment) (voting to remand so that five justices would agree on a disposition of the appeal, despite his belief that the Court should affirm). See generally T. Alexander Aleinikoff, Constitutional Law in the Age of Balancing, 96 Yale L.J. 943 (1987) (discussing the implications of Supreme Court "balancing").
456. William Shakespeare, Love's Labour's Lost act 5, sc. 2, ll. 228-29 (Richard David ed., 1956).
457. Tribe, supra note 434, § 12-26, at 1019; see also Brown v. Socialist Workers '74 Campaign Comm., 459 U.S. 87, 91 (1982) ("The Constitution protects against the compelled disclosure of political associations and beliefs."); Gilmore v. City of Montgomery, 417 U.S. 556, 575 (1974) (noting that the right to associate freely promotes democracy); NAACP v. Button, 371 U.S. 415, 431 (1963) (refusing to permit compelled disclosure of political affiliation); Talley v. California, 362 U.S. 60, 65 (1960) (striking down a statute forbidding distribution of anonymous handbills); NAACP v. Alabama ex rel. Patterson, 357 U.S. 449, 462 (1958) (warning that forced disclosure of affiliation with certain groups may inhibit freedom of association).
458. Hynes v. Mayor of Oradell, 425 U.S. 610, 628 (1976) (Brennan, J., concurring in part).
459. See Bates v. City of Little Rock, 361 U.S. 516, 522-24 (1960) (holding, on freedom of assembly grounds, that the NAACP did not have to disclose its [Page 818] membership lists).
460. See Timothy C. May, The Cyphernomicon §§ 2.9, 8.5 (Sept. 10, 1994), available online URL ftp://ftp.netcom.com/pub/tc/tcmay/cyphernomicon. A hypertext version of this document is available from URL: http://www.apocalypse.org/pub/nelson/bin.cgi/cypernomicon.
462. See, e.g., Brown v. Socialist Workers '74 Campaign Comm., 459 U.S. 87, 91 (1982) (holding that the "Constitution protects against the compelled disclosure of political associations"); Hynes, 425 U.S. at 623 (Brennan, J., concurring in part) (asserting that a disclosure requirement puts an impermissible burden on political expression); Shelton v. Tucker, 364 U.S. 479, 485-87 (1960) (holding invalid a statute that compelled teachers to disclose associational ties because it deprived them of their right of free association); Talley v. California, 362 U.S. 60, 64-65 (1960) (voiding an ordinance that compelled the public identification of group members engaged in the dissemination of ideas); NAACP v. Alabama ex rel. Patterson, 357 U.S. 449, 462 (1958) ("It is hardly a novel perception that compelled disclosure of affiliation with groups [Page 819] engaged in advocacy may constitute . . . restraint on freedom of association . . . ."); Joint Anti-Fascist Refugee Comm. v. McGrath, 341 U.S. 123, 145 (1951) (Black, J., concurring) (expressing the fear that dominant groups might suppress unorthodox minorities if allowed to compel disclosure of associational ties). But see Communist Party of the United States v. Subversive Activities Control Bd., 367 U.S. 1, 85 (1961) (declining to decide whether forced disclosure of the identities of Communist Party members was an unconstitutional restraint on free association); New York ex rel. Bryant v. Zimmerman, 278 U.S. 63, 77 (1928) (holding that a required filing of group members' names with the state constituted a legitimate exercise of police power).
463. Patterson, 357 U.S. at 462.
464. See Brown, 459 U.S. at 91-92; see also Buckley v. Valeo, 424 U.S. 1, 143 (1976) (upholding compulsory disclosure to FEC of names of persons donating more than $10 to campaigns, and public disclosure of contributors of over $100); Griset v. Fair Political Practices Comm'n, 884 P.2d 116, 126 (Cal. 1994) (upholding state statute banning political candidates from sending anonymous mass political mailings). In McIntyre v. Ohio Elections Comm'n, 618 N.E.2d 152, 156 (Ohio 1993), cert. granted, 114 S. Ct. 1047 (1994), the Ohio Supreme Court let stand a state statute forbidding the circulation of anonymous leaflets pertaining to the adoption or defeat of a ballot issue.
465. Board of Directors of Rotary Int'l v. Rotary Club of Duarte, 481 U.S. 537, 544 (1987); see also New York State Club Ass'n v. City of New York, 487 U.S. 1, 13 (1988) (stating that freedom of expression is a powerful tool used in the exercise of First Amendment rights); Roberts v. United States Jaycees, 468 U.S. 609, 617-19 (1984) (recognizing that an individual's First Amendment rights are not secure unless those rights may be exercised in the group context as well); Moore v. City of E. Cleveland, 431 U.S. 494, 503-04 (1977) (plurality opinion) (citing examples of intimate association).
468. See id. at 548 (stating that the protections of the First Amendment imply a right to associate); see also Citizens Against Rent Control/Coalition for Fair Hous. v. City of Berkeley, 454 U.S. 290, 299 (1981) (holding an ordinance limiting the amount of money that may be contributed to certain political organizations to be an impermissible restraint on free association).
471. Id. at 99-101 (describing "massive" harassment of the Socialist Workers Party by the FBI).
472. See City of Ladue v. Gilleo, 114 S. Ct. 2038, 2046 (1994) (holding that flyers are not a substitute for cheap and convenient signs in front of a house).
473. 618 N.E.2d 152 (Ohio 1993), cert. granted, 114 S. Ct. 1047 (1994).
474. Similar questions arose in Griset v. Fair Political Practices Comm'n, 884 P.2d 116, 126 (Cal. 1994) (upholding the constitutionality of a state ban on anonymous political mailings).
475. The parallel to mandatory key escrow is imperfect because antimask laws owe their existence to efforts to curb a specific group, the Ku Klux Klan, see Oskar E. Rey, Note, Antimask Laws: Exploring the Outer Bounds of Protected Speech Under the First Amendment--State v. Miller, 260 Ga. 669, 398 S.E.2d 547 (1990), 66 Wash. L. Rev. 1139, 1145 (1991), rather than a more generalized desire to catch criminals. The existence of a specific animus aimed at one group may itself be a First Amendment violation. See infra note 478 (listing cases in which courts expressed disapproval of antimask laws on free speech grounds). Furthermore, most reported cases have concentrated on whether mask-wearing constitutes symbolic speech rather than on the associational freedom claims that would most likely be the centerpiece of a First Amendment challenge to any mandatory key escrow plan.
476. See Wayne R. Allen, Note, Klan, Cloth and Constitution: Anti-Mask Laws and the First Amendment, 25 Ga. L. Rev. 819, 821 n.17 (1991) (citing statutes from 10 states); Rey, supra note 475, at 1144 n.43 (citing additional statutes). A related type of antimask statute makes it a part of the offense to intend to interfere with the civil rights of another. See Allen, supra, at 821 n.16. Additionally, 18 U.S.C. § 241 makes it a felony for two or more persons to go in disguise on public highways or on the premises of another with the intent to prevent the free exercise and enjoyment of any legal right or privilege by another citizen. See 18 U.S.C. § 241 (1988).
477. See Allen, supra note 476, at 828-29.
478. Compare State v. Miller, 398 S.E.2d 547, 553 (Ga. 1990) (rejecting challenge to antimask statute) and Walpole v. State, 68 Tenn. 370, 372-73 (1878) (same) and Hernandez v. Commonwealth, 406 S.E.2d 398, 401 (Va. Ct. App. 1991) (same) with Hernandez v. Superintendent, 800 F. Supp. 1344, 1351 n.14 (E.D. Va. 1992) (noting that the statute might have been held unconstitutional if petitioner had demonstrated that unmasking himself would have restricted his ability to enjoy free speech and freedom of association) and Aryan v. Mackey, 462 F. Supp. 90, 91 (N.D. Tex. 1978) (granting temporary restraining order preventing enforcement of antimask law against Iranian students demonstrating against Shah) and Ghafari v. Municipal Court, 150 Cal. Rptr. 813, 819 (Cal. Ct. App. 1978) (holding statute prohibiting wearing masks in public overbroad and finding that state's fear that violence will result from the mere presence of anonymous persons is "unfounded"); compare also Allen, supra note 476, at 829-30 (arguing for the validity and retention of antimask laws) with Rey, supra note 475, at 1145-46 (arguing antimask laws are unconstitutional). One way to describe the cases cited above, of course, is that the KKK loses, but Iranian students win.
479. See Miller, 398 S.E.2d at 551.
480. Compare Aryan, 462 F. Supp. at 93-94 (requiring concrete evidence supporting the prediction that violence will occur) with Miller, 398 S.E.2d at 580 (accepting history of violence as sufficient evidence).
482. The Supreme Court of Georgia rejected an associational freedom argument presented by the KKK in Miller. See 398 S.E.2d at 552-53. Relying on associational freedom, Judge Higginbotham granted a temporary restraining order preventing enforcement of an antimask law against Iranian students in Aryan. See 462 F. Supp. at 92-94.
484. See California v. Greenwood, 486 U.S. 35, 40 (1988) (holding that the Fourth Amendment does not prohibit a warrantless search and seizure of garbage left for trash collection); see also United States v. Scott, 975 F.2d 927, 928-30 (1st Cir. 1992) (holding that the warrantless seizure and reconstruction of 5/32-inch pieces of shredded documents in the trash did not violate the Fourth Amendment), cert. denied, 113 S. Ct. 1877 (1993); United States v. Comeaux, 955 F.2d 586, 589 (8th Cir.) (permitting a warrantless search of garbage within the curtilage of the home because the garbage was readily accessible to the public), cert. denied, 113 S. Ct. 135 (1992); United States v. Hedrick, 922 F.2d 396, 400 (7th Cir.) (same), cert. denied, 112 S. Ct. 147 (1991).
485. See Smith v. Maryland, 442 U.S. 735, 743 (1979) (holding that the installation and use of a pen register by a telephone company does not constitute a search within the meaning of the Fourth Amendment). The rationale is that because people are aware that the telephone company keeps this information for billing purposes, they cannot reasonably expect that the information will be kept secret. See id. at 742. This is neither necessarily true, nor timelessly true, nor beyond the ability of persons and service providers to change by contract, but it is still the rule.
486. See Florida v. Riley, 488 U.S. 445, 451-52 (1989) (plurality opinion) (holding valid a warrantless aerial surveillance of a greenhouse from four hundred feet); see also California v. Ciraolo, 476 U.S. 207, 215 (1986) (holding valid a warrantless aerial surveillance of a yard enclosed by a 10-foot fence).
487. See Dow Chem. Co. v. United States, 476 U.S. 227, 239 (1986) (holding that warrantless aerial photography of a factory taken with a commercial camera from navigable airspace does not violate the Fourth Amendment).
488. See Lisa J. Steele, Comment, The View from on High: Satellite Remote Sensing Technology and the Fourth Amendment, 6 High Tech. L.J. 317, 327-33 (1991) (discussing warrantless searches by satellite and the applicable constitutional implications).
489. See United States v. Pinson, 24 F.3d 1056, 1059 (8th Cir.) (holding that a warrantless use of infrared sensing devices did not violate the Fourth Amendment because any defendant's subjective expectation of privacy in heat emanating from her house is not one that society is prepared to recognize as objectively reasonable), cert. denied, 115 S. Ct. 664 (1994); United States v. Kerr, 876 F.2d 1440, 1443-44 (9th Cir. 1989) (considering the absence of heat a sign of suspiciously good insulation); United States v. Domitrovich, 852 F. Supp. 1460, 1472 (E.D. Wash. 1994) (holding that thermal imaging does not constitute a "search"); United States v. Penny-Feeney, 773 F. Supp. 220, 225-28 (D. Haw. 1991) (holding that warrants are not required for the use of an infrared sensing device from navigable air space above defendant's house, and noting that heat emanating from a house may be considered a sign that the occupants are growing marijuana within), aff'd on other grounds sub nom. United States v. Feeney, 984 F.2d 1053 (9th Cir. 1993); cf. United States v. Kyllo, 37 F.3d 526, 530 (9th Cir. 1994) (remanding for a hearing on the "intrusiveness" of thermal imaging in order to lay a factual foundation for a ruling on whether thermal imaging is a search within the meaning of the Fourth Amendment). But see United States v. Ishmael, 843 F. Supp. 205, 209-10, 212 (E.D. Tex. 1994) (holding that defendants had a reasonable expectation of privacy in a building and its surrounding property, and therefore thermal images that were not the product of naked eye observations amounted to an illegal, warrantless search); State v. Young, 867 P.2d 593, 601 (Wash. 1994) (holding that the use of an infrared thermal detection device to perform warrantless surveillance of the defendant's home violated the Washington state constitution's protection of defendant's private affairs and its protection against warrantless invasion of his home, as well as the Fourth Amendment of the U.S. Constitution); cf. Lisa J. Steele, Waste Heat and Garbage: The Legalization of Warrantless Infrared Searches, 29 Crim. L. Bull. 19 (1993) (arguing that a warrant should be required for the use of infrared photography to determine activity within a dwelling).
490. See United States v. Place, 462 U.S. 696, 705-07 (1983) (finding the use of "canine sniff[s]" for narcotics detection to be inoffensive to the Fourth Amendment unless the governmental interest is outweighed by the effect of the search on the individual's liberty interest). For a review of the Supreme Court cases regarding sense-enhanced searches such as dog sniffs, wiretaps, and overflights, as well as a proposed reformulation of the warrant requirement that focuses on sense-enhanced searches most susceptible to abuse, see generally David E. Steinberg, Making Sense of Sense-Enhanced Searches, 74 Minn. L. Rev. 563 (1990).
491. United States v. Karo, 468 U.S. 705, 715-18 (1984) (noting that, although the monitoring of a beeper is not per se unconstitutional, such monitoring of a person's home is a violation of the Fourth Amendment if the individual has a justifiable interest in the privacy of the residence); see also United States v. Knotts, 460 U.S. 276, 282-85 (1983) (holding that police monitoring of signals does not constitute a search if the police could legitimately monitor the same activities by other legal means). But cf. Note, Tying Privacy in Knotts: Beeper Monitoring and Collective Fourth Amendment Rights, 71 Va. L. Rev. 297 (1985) (criticizing the Knotts and Karo decisions).
492. Warrantless wiretaps are authorized by the Foreign Intelligence Surveillance Act (FISA), 50 U.S.C. § 1802(a) (1988). The President, acting through the Attorney General, may authorize electronic surveillance for up to one year if the surveillance is directed solely at communications between or among foreign powers, there is no substantial likelihood of acquiring communications of U.S. citizens, and minimization procedures have been followed. See id. Title 18 of the U.S. Code also permits warrantless surveillance in emergency situations involving immediate danger of death or serious physical injury to any person; conspiratorial activities threatening the national security interest; or conspiratorial activities characteristic of organized crime. See 18 U.S.C. § 2518(7) (1988).
493. All of the court's activities are classified. It is widely believed, however, that the FISA court, as it is known, has yet to turn down a wiretap request.
The Attorney General's reports indicate that not one of the more than 4200 FISA wiretap requests was turned down during the Court's first 10 years. See Cinquegrana, supra note 196, at 814-15; see also ACM Report, supra note 15, at 18. The Court did turn down a request for authorization for a break-in, denying it on the dual jurisdictional grounds that the Court lacked the statutory authority to issue such an order and that the President has the inherent authority to order domestic national security surveillance without need of a court order. See Cinquegrana, supra note 196, at 823.
Congress recently authorized the FISA court to issue warrants for national security break-ins and inspections of the interior of buildings by "technical means." Intelligence Authorization Act for Fiscal Year 1995, Pub. L. No. 103-359, tit. VIII, § 807(a), § 301(5), 108 Stat. 3423, 3444 (1994) (to be codified at 50 U.S.C. § 1821). This authority can be used against American citizens if the Justice Department persuades the FISA court that the suspects are agents of a foreign power. See id. § 301(b), 108 Stat. 3423, 3445; see also United States v. Humphrey, 629 F.2d 908, 912-14 (2d Cir. 1980) (holding that a warrantless search did not violate the Fourth Amendment because it was related to national security); In re Application of the United States for an Order Authorizing the Physical Search of Nonresidential Premises and Personal Property (F.I.S.C. 1981) (holding that a FISA order was not required, and was at any rate unavailable due to lack of jurisdiction, for a warrantless national security break-in), reprinted in S. Rep. No. 280, 97th Cong., 1st Sess. 16 (1981).
494. The Supreme Court has allowed warrantless searches of homes occupied by parolees, probationers, or welfare recipients. See infra text accompanying note 524. Lower courts have sanctioned two additional exceptions to this rule. First, some courts have approved warrantless national security break-ins (presumably, however, the premises were not specifically designed to resist such break-ins). Second, as described supra note 489, several lower courts have allowed warrantless infrared inspections of properties, including at least one property that was carefully insulated.
495. See United States v. United States Dist. Court (The Keith Case), 407 U.S. 297, 314-21 (1972) (holding that a warrantless wiretap violated Fourth Amendment rights and implicated First Amendment policies).
496. Fourth Amendment privacy in this context begins with the premise that people have control over who knows what about them and "the right to shape the `self' that they present[] to the world." Tribe, supra note 434, § 15-16, at 1389-90. This control is protected by the Fourth Amendment freedom from unlawful searches and seizures. See id.
497. "[T]he purpose of the Fourth Amendment was to protect the people of the United States against arbitrary action by their own Government . . . ." United States v. Verdugo-Urquidez, 494 U.S. 259, 266 (1990); see also O'Connor v. Ortega, 480 U.S. 709, 730 (1987) (Scalia, J., concurring) (stating that the Fourth Amendment serves primarily to protect the right of privacy); Warden v. Hayden, 387 U.S. 294, 303- 05 (1967) (same).
498. U.S. Const. art. I, § 8, cl. 3.
499. U.S. Const. art. I, § 8, cl. 18. Technically, federal courts are involved solely in the adjudication of crimes that the legislative authority has defined and to which it has affixed punishments, because they cannot define common-law crimes. See United States v. Hudson, 11 U.S. (7 Cranch) 32, 34 (1812) (holding that the circuit courts of the United States cannot exercise common-law jurisdiction in criminal cases). Note that Article II is also involved in a different type of enforcement because some searches against agents of foreign powers operating either in the United States or abroad can be conducted pursuant to the President's national security powers. See 50 U.S.C. § 1802(a) (1988) (granting the President a limited power to authorize electronic surveillance for up to one year).
500. For an argument that a mandatory key escrow scheme is a search, and thus would violate the Fourth Amendment's particularity requirement, see Mark I. Koffsky, Comment, Choppy Waters in the Surveillance Data Stream: The Clipper Scheme and the Particularity Clause, 9 High Tech. L.J. 131 (1994).
501. Oliver v. United States, 466 U.S. 170, 178 (1984); see also Katz v. United States, 389 U.S. 347, 361 (1967) (holding that the Fourth Amendment protects against violations of subjective expectations of privacy that society is prepared to recognize as reasonable). Items in plain view are not considered private. See Horton v. California, 496 U.S. 128, 133 (1990) ("If an article is already in plain view, neither its observation nor its seizure would involve any invasion of privacy."); Coolidge v. New Hampshire, 403 U.S. 443, 464-66 (1971) (plurality opinion) (describing certain circumstances in which police may, without a warrant, seize evidence in "plain view"); see also Minnesota v. Dickerson, 113 S. Ct. 2130, 2136-37 (1993) (extending the Horton rationale to items in "plain touch").
502. See Katz, 389 U.S. at 357 (noting that "searches conducted outside the judicial process, without prior approval by judge or magistrate, are per se unreasonable under the Fourth Amendment--subject only to a few specifically established and well-delineated exceptions" (footnotes omitted)); cf. Intelligence Authorization Act for Fiscal Year 1995, Pub. L. No. 103-359, § 807(a), §§ 301-309, 108 Stat. 3423, 3443-53 (1994) (to be codified at 50 U.S.C. §§ 1821-1829) (amending FISA to grant the FISA court power to issue in camera, ex parte orders authorizing physical searches and "examination of the interior of property by technical means" on a lesser showing of need than would be required for a warrant); Benjamin Wittes, Surveillance Court Gets New Powers, Legal Times, Nov. 7, 1994, at 1 (noting the ACLU's claim that the extension of the FISA court's power is "a clear violation of the Fourth Amendment").
503. See Florida v. Riley, 488 U.S. 445, 451-52 (1989) (plurality opinion) (holding that aerial police surveillance of an individual's house from a helicopter flying at four hundred feet does not offend the Fourth Amendment); California v. Ciraolo, 476 U.S. 207, 215 (1986) (finding constitutional the aerial surveillance of a yard enclosed by a 10-foot fence).
504. See United States v. Place, 462 U.S. 696, 707 (1983) (holding that the use of canines in narcotics searches is not per se unconstitutional, but is subject to reasonableness requirements).
505. 277 U.S. 438, 468-69 (1928) (holding that the admission of evidence obtained through telephone taps offends neither the Fourth nor the Fifth Amendment).
506. See Katz, 389 U.S. at 353; Berger v. New York, 388 U.S. 41, 51 (1967).
507. See infra text accompanying note 514.
508. See, e.g., New York v. Belton, 453 U.S. 454, 457 (1981) (stating that "a lawful custodial arrest creates a situation which justifies the contemporaneous search without a warrant of the person arrested and of the immediately surrounding area").
509. See, e.g., Cupp v. Murphy, 412 U.S. 291, 295-96 (1973) (justifying a warrantless search on the grounds that an arrestee will be "sufficiently apprised of his suspected role in the crime to motivate him to destroy what evidence he [can]"); Schmerber v. California, 384 U.S. 757, 769-71 (1966) (concluding that the rapid depletion of alcohol in the human bloodstream justifies a warrantless blood-alcohol test).
510. See United States v. Montoya de Hernandez, 473 U.S. 531, 537-38 (1985) (noting that the longstanding tradition of conducting warrantless searches of persons entering the United States reflects a concern for "the protection of the integrity of the border"); United States v. Martinez-Fuerte, 428 U.S. 543, 557 (1976) (stating that the need to make routine stops at border checkpoints in order to prevent the entrance of smugglers and illegal aliens outweighs any intrusion on Fourth Amendment interests); California Bankers Ass'n v. Shultz, 416 U.S. 21, 62-63 (1974) (dictum) (stating that those "leaving the country may be examined as to their belongings and effects, all without violating the Fourth Amendment").
511. See 18 U.S.C. § 2518 (1988 & Supp. V 1993); see also supra notes 502-06 and accompanying text (discussing the Fourth Amendment's warrant requirement).
512. But see supra note 211 and text following note 338 (describing a reason to doubt user expectations of privacy).
513. See infra Technical Appendix, part B (describing a public- key cryptographic system).
514. Craig M. Cornish & Donald B. Louria, Employment Drug Testing, Preventive Searches, and the Future of Privacy, 33 Wm. & Mary L. Rev. 95, 98 (1991).
515. 489 U.S. 656 (1989); see also Florida v. Bostick, 501 U.S. 429, 434 (1991) (holding that random approaches to passengers in buses, conducted pursuant to passengers' consent, are not per se unconstitutional); Michigan Dep't of State Police v. Sitz, 496 U.S. 444, 447 (1990) (classifying suspicionless sobriety checkpoints to deter drunk driving as reasonable under the Fourth Amendment); Skinner v. Railway Labor Executives' Ass'n, 489 U.S. 602, 634 (1989) (finding drug and alcohol tests mandated by Federal Railroad Administration regulations reasonable under the Fourth Amendment); Marshall v. Barlow's, Inc., 429 U.S. 1347, 1347 (1977) (granting stay of injunction against further warrantless searches of workplaces permitted under the Occupational Safety and Health Act of 1970, Pub. L. No. 91-596, 84 Stat. 1590 (codified as amended in scattered sections of 5 U.S.C., 15 U.S.C., 18 U.S.C., 29 U.S.C., and 42 U.S.C. (1988 & Supp. V 1993))). But see Camara v. Municipal Court, 387 U.S. 523, 540 (1967) (finding that the defendant had a constitutional right to deny a housing inspector entry into a leasehold without a warrant in a nonemergency situation).
516. See Von Raab, 489 U.S. at 665.
518. Id. at 665-66 (citing Skinner, 489 U.S. at 619-20).
519. Id. at 668 (offering hidden conditions and impracticality as examples of "compelling" special needs) (citing Skinner, 489 U.S. at 624).
520. The special needs standard has received strong criticism from academic commentators. See William J. Stuntz, Implicit Bargains, Government Power, and the Fourth Amendment, 44 Stan. L. Rev. 553, 554 & n.10 (1992) (summarizing criticisms).
521. See Von Raab, 489 U.S. at 666 (explaining that the Customs Service's mandatory employee drug testing program was not designed to further criminal prosecutions, but to ensure that drug users did not ascend to certain positions in the service).
522. Cf. United States v. Martinez-Fuerte, 428 U.S. 543, 557 (1976) (stating that requiring particularized suspicion before routine stops on major highways near the Mexican border "would be impractical because the flow of traffic tends to be too heavy to allow the particularized study of a given car that would enable it to be identified as a possible carrier of illegal aliens").
523. "A determination of the
standard of reasonableness applicable to a particular [PAGE
832]
class of searches requires `balanc[ing] the nature and
quality of the intrusion on the individual's Fourth Amendment
interests against the importance of the governmental interests
alleged to justify the intrusion.'" O'Connor v. Ortega, 480
U.S. 709, 719 (1987) (citations omitted).
524. See Wayne R. LaFave & Jerold H. Israel, Criminal Procedure § 3.9 (2d ed. 1992).
528. See id. at 536-39. The Camara Court relied on several factors absent from the mandatory key escrow scenario for its holding, including the "long history of judicial and public acceptance" of housing code inspections. Id. at 537.
529. 400 U.S. 309, 316-18 (1971) (holding that a mandatory (and warrantless) home visit by a welfare caseworker does not violate any of the welfare recipient's Fourth Amendment rights). The breadth of the special needs justification becomes particularly clear when one considers the means by which the Wyman Court permitted warrantless intrusions into welfare recipients' homes. See id.
530. Given the plasticity of the special needs doctrine, it is possible that the Court would extend the regulatory search exception to the home user of encryption. Extending the logic of Von Raab to the home, however, would gut much of what remains of the Fourth Amendment, and is a result to be avoided at all costs.
532. Andresen v. Maryland, 427 U.S. 463, 470-71 (1976) (holding that business records are outside the Fifth Amendment privilege) (quoting United States v. White, 322 U.S. 694, 701 (1944)).
533. "Although conduct by law enforcement officials prior to trial may ultimately impair that right, a constitutional violation occurs only at trial." United States v. Verdugo-Urquidez, 494 U.S. 259, 264 (1990) (citations omitted).
534. Fisher v. United States, 425 U.S. 391, 399 (1976) (citations omitted).
535. See, e.g., Nixon v. Administrator of Gen. Servs., 433 U.S. 425, 459 & n.22 (1977) (noting that the most personal of documents are entitled to special protection). The protection is from subpoenas only, not search warrants.
536. The required records doctrine came into full flower in Shapiro v. United States, 335 U.S. 1, 32-33 (1948), which upheld a subpoena for incriminatory records that were required under a wartime price control statute. Later cases made clear, however, that there are limits to the government's power to define records as "required" and hence outside the protections of the Fifth Amendment. See Marchetti v. United States, 390 U.S. 39, 48-49 (1968) (holding that a registration requirement violated the Fifth Amendment because it materially increased the chances of prosecution); Grosso v. United States, 390 U.S. 62, 64-67 (1968) (holding that a statute requiring the reporting of, and payment of an excise tax on, earnings from wagering violated the Fifth Amendment because it was inherently self-incriminatory). Other cases have cast doubt on the firmness of these limits. See, e.g., California v. Byers, 402 U.S. 424, 434 (1971) (plurality opinion) (requiring that a hit-and-run motorist identify himself). One thing beyond dispute, however, is that the government needs a court order to get access to required records. Because the point of a mandatory key escrow scheme would be to get access to the keys without a court order, the required records exception is irrelevant.
539. Id. at 632. Judge Friendly criticized this statement as "ringing but vacuous" because it "tells us almost everything, except why." Henry J. Friendly, The Fifth Amendment Tomorrow: The Case for Constitutional Change, 37 U. Cin. L. Rev. 671, 682 (1968).
541. Id. at 91 (quoting Couch v. United States, 409 U.S. 322, 327 (1973)).
542. See Samuel A. Alito, Jr., Documents and the Privilege Against Self-Incrimination, 48 U. Pitt. L. Rev. 27, 29 (1986) (examining the new framework used by the Supreme Court in applying the Fifth Amendment privilege against self-incrimination to compulsory process for documents); Note, Formalism, Legal Realism, and Constitutionally Protected Privacy Under the Fourth and Fifth Amendments, 90 Harv. L. Rev. 945, 964-85 (1977) (detailing the modern approach to the Fourth and Fifth Amendments).
543. See United States v. Doe, 465 U.S. 605, 613-14 (1984) (finding that the act of producing the documents at issue would involve testimonial self-incrimination, and that requiring such production therefore violated the Fifth Amendment); Fisher v. United States, 425 U.S. 391, 398-99 (1976) (holding that requiring relinquishment of the documents at issue was not a Fifth Amendment violation because no testimonial incrimination was compelled); see also Doe, 465 U.S. at 618 (O'Connor, J., concurring) (contending that "the Fifth Amendment provides absolutely no protection for the contents of private papers of any kind").
In Baltimore City Department of Social Services v. Bouknight, 493 U.S. 549, 561 (1990) (holding that a mother could not invoke her Fifth Amendment privilege against a court order to produce the child she had allegedly abused), the Supreme Court, analogizing the mother's care of the child to a required record, held that producing a child was not testimonial, and therefore the Fifth Amendment did not apply. Id. at 556-60. In light of this decision it is fair to ask whether the Fifth Amendment applies to anything other than oral testimony.
544. See, e.g., Braswell v. United States, 487 U.S. 99, 109-10 (1988) (holding that a custodian of corporate records may not withhold them on the grounds that such production will incriminate him in violation of the Fifth Amendment); Andresen v. Maryland, 427 U.S. 463, 472-73 (1976) (holding that a legal search of the petitioner's office resulting in the seizure of voluntarily recorded business records authenticated by a prosecution witness was not a violation of the Fifth Amendment); Bellis v. United States, 417 U.S. 85, 101 (1974) (holding that a dissolved law partnership had its own institutional identity, and its records were held in a representative capacity; therefore a grand jury subpoena for those records could not be ignored on Fifth Amendment grounds); United States v. White, 322 U.S. 694, 698-99 (1944) (holding that an officer of an unincorporated labor union could not refuse, based on Fifth Amendment protections, to produce the union's records); Hale v. Henkel, 201 U.S. 43, 56-58 (1906) (holding that a witness who, because of statutory immunity, cannot invoke the Fifth Amendment as to oral testimony cannot invoke it against the production of books and papers).
545. See Couch v. United States, 409 U.S. 322, 335-36 (1973) (holding that petitioner had no legitimate expectation of privacy when she handed her papers over to her accountant); Bellis, 417 U.S. at 92-93 (same, when papers were handed to a partner in a small law firm). The attorney-client privilege is an exception to this general rule.
546. See supra note 536 (discussing Shapiro v. United States, 335 U.S. 1 (1948)).
547. See Gilbert v. California, 388 U.S. 263, 266-67 (1967). This rule has also been applied to voice samples, see United States v. Wade, 388 U.S. 218, 222-23 (1967), and blood samples, see Schmerber v. California, 384 U.S. 757, 767 (1966).
548. See Johnson v. Eisentrager, 339 U.S. 763, 771, 782-83 (1950); cf. United States v. Tiede, 86 F.R.D. 227, 259 (D. Berlin 1979) (holding that friendly aliens have Fifth Amendment rights when charged with civil offenses in a U.S. court outside the territory of the United States). U.S. citizens abroad, however, do have Fifth Amendment rights. See Reid v. Covert, 354 U.S. 1, 5 (1957) (rejecting the idea that "when the United States acts against citizens abroad it can do so free of the Bill of Rights").
549. See LaFave & Israel, supra note 524, § 8.12, at 701-02.
550. Marchetti v. United States, 390 U.S. 39, 48 (1968) (holding that requiring a frequent gambler to report illegal gambling income created a reasonable basis for fear of incrimination).
553. Recall that for the purposes of this discussion "chip key" means either the hardwired chip's unique key in a Clipper Chip (which can lead the government to the encrypted session key buried in a LEAF) or the information needed to decrypt the equivalent information generated by a software package.
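To make the mechanics described in note 553 concrete, the following sketch illustrates, in schematic Python, how an escrowed-key scheme of the Clipper type is generally described as working: the per-conversation session key is wrapped under the device's unique chip key and bundled with a device identifier into a law enforcement access field (LEAF), so that whoever holds the escrowed chip key can recover the session key. The function names, the four-byte identifier, and the XOR-based "cipher" below are purely illustrative placeholders, not the classified SKIPJACK algorithm or the actual LEAF format.

```python
# Toy sketch of an escrowed-key LEAF; the XOR "cipher" stands in for the real
# (classified) cipher so only the structure of the scheme is shown.
import os

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """Placeholder cipher: XOR data with a repeated key (NOT secure)."""
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

toy_decrypt = toy_encrypt  # XOR is its own inverse

def make_leaf(chip_id: bytes, chip_key: bytes, session_key: bytes) -> bytes:
    """Wrap the session key under the chip's unique key and prepend the chip ID."""
    return chip_id + toy_encrypt(chip_key, session_key)

def government_recover(leaf: bytes, escrow_db: dict) -> bytes:
    """With the escrowed chip key (released, in theory, only after legal process),
    the session key buried in the LEAF can be unwrapped."""
    chip_id, wrapped = leaf[:4], leaf[4:]
    return toy_decrypt(escrow_db[chip_id], wrapped)

if __name__ == "__main__":
    chip_id = b"CH01"
    chip_key = os.urandom(16)          # unique key burned into the device
    escrow_db = {chip_id: chip_key}    # held (split, in reality) by escrow agents
    session_key = os.urandom(16)       # negotiated per conversation
    leaf = make_leaf(chip_id, chip_key, session_key)
    assert government_recover(leaf, escrow_db) == session_key
```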
554. See Hoffman v. United States, 341 U.S. 479, 486 (1951) (noting that a witness's response is incriminating if it might furnish a link in the chain of evidence needed to prosecute).
555. See, e.g., Doe v. United States, 487 U.S. 201, 208 n.6 (1988) (noting that a communication does not become privileged just because "`it will lead to incriminating evidence'" (quoting In re Grand Jury Subpoena, 826 F.2d 1166, 1172 n.2 (2d Cir. 1987) (concurring opinion))).
556. That at least was Justice Harlan's view in Griswold v. Connecticut, 381 U.S. 479, 499-500 (1965) (Harlan, J., concurring) (stating that privacy derives not from penumbras in the Bill of Rights, but from fundamental ideas of ordered liberty); cf. Roe v. Wade, 410 U.S. 113, 152 (1973) (relying on penumbras in the Bill of Rights).
557. For a taxonomy of taxonomies, see Tribe, supra note 434, § 15-1, and Ken Gormley, One Hundred Years of Privacy, 1992 Wis. L. Rev. 1335, 1340. For an argument that the three strands of the right to privacy are actually inimical to each other, at least in the eyes of their advocates on the Supreme Court, see generally David M. Smolin, The Jurisprudence of Privacy in a Splintered Supreme Court, 75 Marq. L. Rev. 975 (1992).
558. See supra parts III.A-C.
559. See, e.g., Hampton v. Mow Sun Wong, 426 U.S. 88, 103 (1976) (holding the federal government's denial of a resident alien's right to work unconstitutional under the Fifth Amendment); Lamont v. Postmaster General, 381 U.S. 301, 305 (1965) (invalidating, under the First Amendment, a statutory requirement that persons wishing to receive "communist propaganda" identify themselves to the post office); Shelton v. Tucker, 364 U.S. 479, 490 (1960) (striking down a statute requiring teachers at state-supported schools and colleges to list every organization they had joined during the prior five years); NAACP v. Alabama ex rel. Patterson, 357 U.S. 449, 462 (1958) (holding that the NAACP had the right to refuse to disclose its membership list on behalf of its members' rights to prevent the State from compelling them to reveal their affiliation with the group); see also, e.g., John H. Ely, Democracy and the Right to Be Different, 56 N.Y.U. L. Rev. 397, 405 (1981) (arguing that the right to be different is not constitutionally protected, but that the lack of protection is not a problem because the right generally will be invoked by those who do not need protection).
560. See supra part III.D.
561. Olmstead v. United States, 277 U.S. 438, 478 (1928) (Brandeis, J., dissenting); see also Stanley v. Georgia, 394 U.S. 557, 564 (1969) (finding a constitutional right to "receive information and ideas, regardless of their social worth").
562. Whalen v. Roe, 429 U.S. 589, 598-99 (1977) (acknowledging the existence of the right, but finding that it could be overcome by a narrowly tailored program designed to serve the state's "vital interest in controlling the distribution of dangerous [prescription] drugs"); see Gary R. Clouse, Note, The Constitutional Right to Withhold Private Information, 77 Nw. U. L. Rev. 536, 547-57 (1982) (collecting and dissecting inconsistent circuit court cases dealing with the right to withhold private information). The right to be left alone, however, is insufficiently compelling to prevent a large number of physical intrusions to bodily integrity when the police seek forensic evidence relating to a criminal investigation. See Tribe, supra note 434, at 1331 nn.4-11 (collecting cases); supra note 547 (same).
563. See, e.g., Francis S. Chlapowski, Note, The Constitutional Protection of Informational Privacy, 71 B.U. L. Rev. 133, 155 (1991) (concluding that because most theories of personhood assume personal information is a crucial part of a person's identity, there must be a recognized "right to informational privacy based on personhood" and that information is property protected by the Fifth Amendment); Clouse, supra note 562, at 541-47 (tracing the development of the right to informational privacy, and noting the Supreme Court's use of a balancing test to determine whether an individual's constitutional rights have been infringed by a government-mandated disclosure of information).
568. An extreme statute, requiring broad data collection combined with a requirement that reports be available to the public, was held unconstitutional in Thornburgh v. American College of Obstetricians and Gynecologists, 476 U.S. 747 (1986). What limits there might be to data collection and the safeguards required against disclosure were issues left open in Whalen: "We . . . do not[] decide any question which might be presented by the unwarranted disclosure of accumulated private data--whether intentional or unintentional--or by a system that did not contain comparable security provisions." 429 U.S. at 605-06.
570. Paul v. Davis, 424 U.S. 693, 713 (1976); see also Roberts v. United States Jaycees, 468 U.S. 609, 618-22 (1984) (describing types of "personal bonds" and relationships entitled to heightened constitutional protection); Moore v. City of E. Cleveland, 431 U.S. 494, 499 (1977) (plurality opinion) (recognizing a right to choose which relatives to live with); Roe v. Wade, 410 U.S. 113, 152 (1973) (protecting the reproductive decisions of women); Doe v. Bolton, 410 U.S. 179, 197-98 (1973) (recognizing the right to make reproductive decisions without interference from a hospital committee); Eisenstadt v. Baird, 405 U.S. 438, 452-55 (1972) (protecting the procreative decisions of unmarried opposite-sex couples); Loving v. Virginia, 388 U.S. 1, 12 (1967) (endorsing the right to engage in an interracial marriage); Griswold v. Connecticut, 381 U.S. 479, 482-86 (1965) (establishing the right of married opposite-sex couples to make procreative decisions); Poe v. Ullman, 367 U.S. 497, 551-54 (1961) (Harlan, J., dissenting) (arguing that the Constitution protects the procreative decisions of married opposite-sex partners); Skinner v. Oklahoma, 316 U.S. 535, 541 (1942) (recognizing the right not to be sterilized); Pierce v. Society of Sisters, 268 U.S. 510, 534-35 (1925) (holding that parents have the right to determine the schooling of their children); Meyer v. Nebraska, 262 U.S. 390, 399-400 (1923) (recognizing a parental right to determine what language children may learn); Kenneth L. Karst, The Freedom of Intimate Association, 89 Yale L.J. 624, 637-38 (1980) (arguing that divorce--the freedom of disassociation--is a fundamental privacy right).
571. See Cleveland v. United States, 329 U.S. 14, 18-20 (1946) (rejecting an argument that polygamous practices should be excluded from the prohibitions of the Mann Act); Reynolds v. United States, 98 U.S. 145, 165-67 (1878) (rejecting a First Amendment challenge to a statute forbidding polygamy). Both decisions remain good law.
572. See supra note 570 and accompanying text (discussing cases defining zones of privacy). In addition, many states have laws prohibiting adultery that remain on the books. These laws are not currently enforced, but there is reason to believe that if they were enforced they could survive a constitutional challenge based on privacy principles. See Commonwealth v. Stowell, 449 N.E.2d 357, 360 (Mass. 1983) (rejecting constitutional attack against a Massachusetts adultery statute). But see Martin J. Siegel, For Better or for Worse: Adultery, Crime & the Constitution, 30 J. Fam. L. 45, 58-86 (1991/1992) (arguing that laws criminalizing adultery are unconstitutional).
573. 478 U.S. 186 (1986). The Supreme Court refused to extend the vision of privacy set out in the cases above to protect the sexual choices of an unmarried same-sex couple in Bowers and did so in a way that casts doubt on the entire strand of privacy protection for intensely personal and intimate associations. Professor Tribe describes the decision in Bowers as erroneous and unprincipled and predicts that it will not be followed. See Tribe, supra note 434, § 15-21. For a thoughtful reformulation of privacy doctrines after Bowers, see Rubenfeld, supra note 4, at 750-807.
574. See Planned Parenthood v. Casey, 112 S. Ct. 2791, 2824-33 (1992) (allowing certain state restrictions on abortion); Rust v. Sullivan, 500 U.S. 173, 196 (1991) (allowing government to ban the use of federal public funds for abortions and related activities).
575. On the psychological and moral importance of allowing individuals to make voluntary choices in matters vitally affecting them, see Bruce J. Winick, On Autonomy: Legal and Psychological Perspectives, 37 Vill. L. Rev. 1705, 1755-68 (1992); see also Bercu, supra note 90, at 402-03 (asserting that "information privacy is essential to our development and self-fulfillment as individuals").
576. Tribe, supra note 434, § 15-21.
577. See, e.g., Planned Parenthood, 112 S. Ct. at 2831 (striking down a statutory provision that required spousal notification prior to abortion, but upholding the statute's informed consent and reporting requirements), overruling Thornburgh v. American College of Obstetricians and Gynecologists, 476 U.S. 747 (1986) (invalidating the informed consent and reporting requirements of a statute restricting abortions).
578. E-mail also allows people to meet and exchange ideas, thus increasing their chances of forming lasting relationships. See Steve Lohr, Therapy on a Virtual Couch, N.Y. Times, Aug. 28, 1994, at C7 (interviewing psychiatrist and novelist Avodah Offit). Indeed, in a few cases e-mail apparently has become a substitute for sex, as some of Dr. Offit's patients have consulted her about their "E-mail love relationships." Id.
579. But see Lovisi v. Slayton, 539 F.2d 349, 351-52 (4th Cir.) (en banc) (holding that a marital couple's right to bring a privacy challenge to a conviction under a Virginia sodomy statute was waived due to the presence of an invited third party), cert. denied, 429 U.S. 977 (1976). The majority conceded, however, that the Lovisis "would [have] remain[ed] protected in their expectation of privacy" if they had only spoken or written about their activities to third parties. Id. at 351.
580. The even thornier problem of the intimate, international, interspousal e-mail is beyond the scope of this Article. The question is complex because it will turn on the citizenship of the parties, their location, and other factors.
581. "Certain intimate personal documents--a diary is the best example--are like an extension of the individual's mind. They are a substitute for the perfect memory that humans lack. Forcing an individual to give up possession of these intimate writings may be psychologically comparable to prying words from his lips." Alito, supra note 542, at 39.
582. See Bamford, supra note 17, at 357-63 (quoting remarks of the NSA Director Bobby Ray Inman); supra note 162 (collecting articles on governmental attempts to suppress academic speech with arguably harmful consequences).
583. Vincent Blasi, The Checking Value in First Amendment Theory, 1977 Am. B. Found. Res. J. 523, 525.
584. On a related point, consider that in colonial times, "nothing even remotely resembling modern law enforcement existed." Carol S. Steiker, Second Thoughts About First Principles, 107 Harv. L. Rev. 820, 824 (1994).
585. See supra notes 484-92 and accompanying text.
586. See supra notes 398-402 and accompanying text.
587. One counterargument contends that the technical change that really matters is that the world is a more dangerous place than ever imagined in 1791. The size and nature of the threat to the nation's existence, including nuclear, chemical, and bacteriological weaponry, as well as systems for their rapid and/or surreptitious delivery, means that national security interests require compromises of some rights that might never have been imagined in 1791. Although this argument is powerful, I think it mistaken for two reasons. First, it understates the degree to which the fledgling United States was at risk from military and economic assault by European powers. The new Constitution was adopted precisely because the country appeared to be falling apart under the Articles of Confederation. The Spanish, French, and British each posed substantial military and economic threats. Second, granting that modern risks include extinction where previously the greatest danger was subjugation, it would take a far more immediate danger than we currently face to make it even worth considering subjecting ourselves to a surveillance state.
See generally Oscar H. Gandy, Jr., The Panoptic Sort: A Political Economy of Personal Information (1993) (analyzing panoptic thinking by integrating several social science perspectives); Harold Edgar & Benno C. Schmidt, Jr., Curtiss-Wright Comes Home: Executive Power and National Security Secrecy, 21 Harv. C.R.-C.L. L. Rev. 349 (1986) (discussing the enlarged role of the executive in national security matters).
588. See, e.g., Frank Odasz, Big Sky Telegraph, 71 Whole Earth Rev. 32, 32 (1991) (describing the use of the Big Sky Telegraph in linking Women's Centers in distant parts of Montana); Graeme Browning, Zapping the Capitol, 43 Nat'l J. 2446, 2446 (1994) (discussing the use of "the worldwide web of computer networks" to lobby Congress); John H. Fund, We Are All Pundits Now, Wall St. J., Nov. 8, 1994, at A22 (reporting on the role of e-mail in the campaign to defeat Speaker of the House Tom Foley in the 1994 elections).
589. In Hester v. United States, 265 U.S. 57, 59 (1924), the Supreme Court held that an open field was not a protected area for Fourth Amendment purposes. This does not detract from the point in the text, which refers to attitudes that long predate both Olmstead v. United States, 277 U.S. 438 (1928), and Hester. The open fields doctrine was restated in Oliver v. United States, 466 U.S. 170, 176-77 (1984) (finding no Fourth Amendment violation stemming from a search of a field where marijuana was growing), which appears to create a safe harbor for eavesdropping. See Stephen A. Saltzburg, Another Victim of Illegal Narcotics: The Fourth Amendment (As Illustrated by the Open Fields Doctrine), 48 U. Pitt. L. Rev. 1, 25 n.105 (1986) (criticizing Oliver for permitting new means of surveillance that are "used to invade areas which people have traditionally believed were closed to outsiders").
590. See supra note 372.
591. See supra part III.D.2.
592. Seth F. Kreimer, Sunlight, Secrets, and Scarlet Letters: The Tension Between Privacy and Disclosure in Constitutional Law, 140 U. Pa. L. Rev. 1, 13 (1991). The archetype can be seen as a special type of idealized cognitive model, usually concerning something feared or loved. See Steven L. Winter, The Cognitive Dimension of the Agon Between Legal Power and Narrative Meaning, 87 Mich. L. Rev. 2225, 2233-34 (1989). Like other idealized cognitive models, archetypes serve as reference points used to categorize experiences.
593. For a discussion of McCarthyism as an archetype, see Kreimer, supra note 592, at 14 (finding that McCarthyism "has achieved the status of a negative archetype in contemporary political discourse . . . as a term of opprobrium, of classic political impropriety"). See generally Robert A. Burt, The Constitution in Conflict (1992) (deriving archetypes implicitly from American constitutional history from the figures of Hamilton, Madison, and Lincoln).
594. See generally Herbert J. Storing, What the Anti-Federalists Were for (1981) (explaining the role of the Anti-Federalists and arguing that they should be counted among the Founding Fathers); Herbert J. Storing, The Complete Anti-Federalist (1981).
595. See, e.g., Wickard v. Filburn, 317 U.S. 111, 128 (1942) (holding that Congress has the power under the Commerce Clause to regulate home-grown wheat); United States v. Darby, 312 U.S. 100, 113 (1941) (finding that Congress may regulate interstate shipment of goods under Commerce Clause power).
596. By referring to the paradigmatic citizen, I mean to indicate that where once the free, white, usually property-owning male was the person whose political rights were the subject of rights talk, the pool of relevant rights claimants now has expanded to include all adult residents.
597. I owe the example to Mark Eckenwiler, Letter to the Editor: This Chip Can "Clip" Americans' Civil Liberties, Nat'l L.J., Aug. 1, 1994, at A18.
598. Even here, alas, a small qualification may be in order. Five years ago I would have written only slightly less emphatically that a proposal to require that every telephone switching system be modified to make government wiretapping easy would be constitutional, but of course would have no chance of passage. Now this country is committed to paying at least half a billion dollars to make that plan a reality. See supra notes 138, 425 and accompanying text (discussing the Digital Telephony initiative).
599. It is disturbing to note, however, that most household locks use reproducible patterns. Anyone armed with the serial number can have the manufacturer fabricate a duplicate with a simple phone call. The keys to most household locks are in effect held in "escrow" by their manufacturers.
600. George Orwell, 1984 (1949).
601. A similarly chilling vision is found in Bentham's concept of the Panopticon--although Bentham himself found his vision of pervasive surveillance utopian rather than dystopian. See Michel Foucault, Power/Knowledge: Selected Interviews & Other Writings 1972-1977, at 146-48 (Colin Gordon ed., 1980). The Orwellian archetype also gathers some of its power from the existence of surveillance states such as North Korea, the People's Republic of China (at least in some times and regions), and the former Soviet Union. See David B. Davis, The Fear of Conspiracy 265 (1971) (noting that Americans often find it "easier to blame communist conspirators for every conflict in the world than to study the origins and complexities of civil [strife]"). Orwell's 1984 is powerful because it rings too true.
602. See, e.g., Planned Parenthood v. Casey, 112 S. Ct. 2791, 2882 (1992) (Scalia, J., dissenting) (stating that the effect of Roe's sudden elimination of abortion's moral stigma from the minds of many is "nothing less than Orwellian"); Austin v. Michigan State Chamber of Commerce, 494 U.S. 652, 679 (1990) (Scalia, J., dissenting) (describing a state law that prohibits the Chamber of Commerce from advertising support for political candidates as Orwellian); County of Allegheny v. ACLU, 492 U.S. 573, 678 (1989) (Kennedy, J., concurring in the judgment in part and dissenting in part) (describing as Orwellian the Court's screening out of religious symbols from public displays); Florida v. Riley, 488 U.S. 445, 466 (1989) (Brennan, J., dissenting) (comparing the Court's allowance of police surveillance from aircraft to a passage from Orwell's 1984).
603. See supra note 416 (describing popular opposition to Clipper). But see supra note 138 (describing the Digital Telephony initiative).
604. See Letter from Ron Rivest, E.S. Webster Professor of Computer Science, Massachusetts Institute of Technology, to Dorothy E. Denning, Professor and Chair, Computer Sciences Department, Georgetown University 1 (Feb. 25, 1994) (on file with author) ("There are all kinds of wonderfully stupid things one could do with modern technology that could `help' law enforcement. But merely being of assistance to law enforcement doesn't make a proposal a good thing; many such ideas are objectionable and unacceptable because of the unreasonably large cost/benefit ratio (real or psychological cost).").
605. Katz v. United States, 389 U.S. 347, 361 (1967) (Harlan, J., concurring) (contending that Fourth Amendment jurisprudence should accommodate contemporary standards of reasonable privacy).
606. See, e.g., William N. Eskridge, Jr., Dynamic Statutory Interpretation, 135 U. Pa. L. Rev. 1479, 1479 (1987) (noting that the Constitution, like common law, is interpreted dynamically).
607. See supra note 77.
608. There are, of course, magnificent examples of executive and judicial resistance to security paranoia. See, e.g., New York Times Co. v. United States, 403 U.S. 713, 718-19 (1971) (allowing a newspaper to publish the contents of classified documents); Brandenburg v. Ohio, 395 U.S. 444, 448 (1969) (striking down Ohio's criminal syndicalism statute); Kent v. Dulles, 357 U.S. 116, 130 (1958) (holding that passports cannot be denied on the basis of past or present membership in the Communist party). It was resistance by the Department of the Army--in the executive branch--that triggered the fall of McCarthy. See Richard M. Fried, Men Against McCarthy 282 (1976) (describing the climactic moment of the Army-McCarthy hearings).
609. Edgar & Schmidt, supra note 587, at 349.
610. In fact, the worry has deep English roots, even if these are sometimes exaggerated. See Francis B. Sayre, Criminal Conspiracy, 35 Harv. L. Rev. 393, 397 (1922) (discussing the period between the reigns of Edward III and Elizabeth I, during which a number of statutes were passed to suppress combinations for various specific purposes, such as treasonable designs, breaches of the peace, raising prices, and the like); Developments in the Law--Criminal Conspiracy, 72 Harv. L. Rev. 920, 922-23 (1959) (discussing the conspiracy indictment). The first conspiracy statutes, which defined the crime in narrow terms, were enacted around 1300 in the reign of Edward I. See Wayne R. LaFave & Austin W. Scott, Jr., Criminal Law 525 (2d ed. 1986).
611. See supra note 416 and accompanying text.
612. In fact, wartime has often provided the impetus for U.S. government censorship of communications between the United States and foreign countries. See, e.g., Exec. Order No. 8985 (Dec. 19, 1941), reprinted in 3 C.F.R. 1047 (1938-1943) (establishing the Office of Censorship to censor, at the "absolute discretion" of the Director, "mail, cable, radio, or other means of transmission" to or from other countries), revoked Exec. Order No. 9631 (Sept. 28, 1945), reprinted in 3 C.F.R. 435 (1943-1948) (abolishing the Office of Censorship); see also Matthew J. Jacobs, Assessing the Constitutionality of Press Restrictions in the Persian Gulf War, 44 Stan. L. Rev. 675, 679-86 (1992) (outlining the history of U.S. wartime battle-zone censorship).
The World War II domestic censorship regulations required that all written messages--including those hand-carried--be passed to a censor. See U.S. Censorship Regulations, 32 C.F.R. § 1801.3 (1945). Letters from the United States to foreign countries or U.S. possessions were to be written "in English, if possible" but if written in another language then "the name of the language used should be written in English on the face of the envelope." Id. § 1801.21(b). Cables and radio traffic were permitted only in English, French, Portuguese, or Spanish without special authorization. See id. § 1801.48.
Letters employing codes and ciphers were specifically prohibited unless authorized. See id. § 1801.22. Cable transmissions could use any one of nine specified commercial codes, but private codes required a special license from the Department of Censorship. Applicants for these licenses were required to provide 15 copies of their code with the application. See id. § 1801.49.
Telephone calls to Mexico were permitted to be in Spanish, while French was allowed in calls to Canada. Radiotelephones could use English, Spanish, French, and Portuguese "except in the event that translators are not available at the censorship point." Id. § 1801.74. Anonymous international calls were prohibited. All callers had to identify themselves to the censors in advance. See id. § 1801.71. Callers from hotels had to be identified by the management, whereas calls from pay phones were banned. See id. §§ 1801.72-.73.
613. See Robert S. Levine, Conspiracy and Romance 5 (1989) (describing various fears of conspiracy in early America); Paul Marcus, Criminal Conspiracy Law: Time to Turn Back from an Ever Expanding, Ever More Troubling Area, 1 Wm. & Mary Bill Rts. J. 1, 3-4 (1992) (discussing the generally accepted notion that conspiracy is punished because joint action is more dangerous than individual action). Webster's defines conspiracy as "[a]n agreement between two or more persons to commit a crime or accomplish a legal purpose through illegal action." Webster's II New Riverside University Dictionary 302 (1983).
614. See Davis, supra note 601, at xiii ("If the United States has enjoyed uncommon security from the time of independence, the Americans have also been subjected to continual alarms and warnings of imminent catastrophe.").
616. See Richard Hofstadter, The Paranoid Style in American Politics, in The Paranoid Style in American Politics and Other Essays 3, 6-7 (1965) (noting that "Americans have no monopoly on the gift for paranoid improvisation").
617. Davis, supra note 601, at xiii.
618. See Levine, supra note 613, at 6-8 (arguing that conspiratorial fears helped New England colonists to define and create communities); Perry Miller, The New England Mind: From Colony to Province 395 (1953).
619. See generally Bernard Bailyn, The Ideological Origins of the American Revolution 95, 144-159 (1967) (arguing that the sentiment that the American colonists were faced with a conspiracy to deprive them of their freedom had deep roots in Anglo-American political culture predating the events of the struggle with England); Gordon S. Wood, Conspiracy and the Paranoid Style: Causality and Deceit in the Eighteenth Century, 39 Wm. & Mary Q. 411 (1982) (discussing the prevalence of conspiratorial fears in colonial America).
620. George Washington, Farewell Address (1796), in A Compilation of the Messages and Papers of the Presidents 205, 207 (James D. Richardson ed., 1897).
621. See Davis, supra note 601, at 68 (pointing to anti-Catholic writers who observed that the "key to Catholic strategy" was the maxim "`[d]ivide and conquer,'" by which Catholics supposedly wished to "keep the diverse groups and interests of society from fusing into `a bona fide American character'"); Washington, supra note 620, at 207 (warning of the dangers of factions).
622. Peter Amazeen, The Bavarian Illuminati Scare 3 (1988) (unpublished B.A. thesis, Harvard University); see also Vernon Stauffer, New England and the Bavarian Illuminati 229-43 (1918) (noting that the panic, which lasted almost two years, was touched off by a sermon given by Reverend Jedidiah Morse on May 9, 1798, based on his reading of John Robison's 1797 book, Proofs of a Conspiracy Against All the Religions and Governments of Europe).
623. See Davis, supra note 601, at 35.
624. See id. at 36. For a discussion of the Alien and Sedition Acts, see supra note 75 and accompanying text.
625. Bray Hammond, Banks and Politics in America from the Revolution to the Civil War 379 (1957) (quoting President Jackson); see also id. at 395-96, 405-09 (describing the various evils of the Bank as perceived by Jacksonians).
626. See David B. Davis, Some Themes of Countersubversion: An Analysis of Anti-Masonic, Anti-Catholic, and Anti-Mormon Literature, 47 Miss. Valley Hist. Rev. 205 (1960), reprinted in Davis, supra note 601, at 9, 10-11. Fear of Freemasons gave rise to a major political party, the anti-Masonic party.
627. See id. at 14 (citing Richard Rush for the idea that "[o]f all governments . . . ours was the one with the most to fear from secret societies, since popular sovereignty by its very nature required perfect freedom of public inquiry and judgment").
628. See David B. Davis, The Slave Power Conspiracy and the Paranoid Style 62-86 (1969) (describing various formulations of the "Slave Power" thesis).
629. Jacob Epstein, The Great Conspiracy Trial 5-7 (1970).
630. See Davis, supra note 601, at 153.
631. See, e.g., Robert M. La Follette, A Small Group of Men Hold in Their Hands the Business of This Country, 42 Cong. Rec. 3434-36, 3450-51 (1908), reprinted in Davis, supra note 601, at 200. Consider too this lawyer's view of the danger:
We have heard much of the dangers of corporations in late years; but, while our publicists had hardly whetted their swords to meet this question, we are confronted with a new monster a thousand times more terrible. Every student knows how corporations have grown from a monastic institution to the predominance they now occupy in the business world; but American ingenuity has invented a legal machine which may swallow a hundred corporations or a hundred thousand individuals; and then, with all the corporate irresponsibility, their united power be stored, like a dynamo, in portable compass, and wielded by one or two men. Not even amenable to the restraints of corporation law, these "trusts" may realize the Satanic ambition,--infinite and irresponsible power free of check or conscience.
F.J. Stimson, Trusts, 1 Harv. L. Rev. 132, 132 (1887-1888).
632. See Davis, supra note 601, at 205-10 ("The years from 1917 to 1921 are probably unmatched in American history for popular hysteria, xenophobia, and paranoid suspicion."). At its peak in 1924, the Ku Klux Klan, which blended nativism with its anti-Black, anti-Catholic, and anti-Jewish ideology, had about 4.5 million members. See id. at 215.
633. See generally David M. Oshinsky, A Conspiracy So Immense: The World of Joe McCarthy (1983) (tracing Senator McCarthy's life and political career).
634. See generally Harry Kalven, Jr., A Worthy Tradition: Freedom of Speech in America 340-67 (1988) (discussing Supreme Court decisions regarding loyalty oaths).
635. See, e.g., Brandenburg v. Ohio, 395 U.S. 444, 447 (1969) (per curiam) (reformulating the test first outlined in Schenck v. United States, 249 U.S. 47, 51 (1919), to require objectively inciting language in a context that makes it likely to produce direct lawless behavior in order to regulate speech); Dennis v. United States, 341 U.S. 494, 516-17 (1951) (upholding a conviction under the Smith Act for the mere advocacy of Communism); Schenck v. United States, 249 U.S. 47, 51 (1919) (creating the "clear and present danger" test for any attempted regulation of speech). See generally Thomas I. Emerson, Toward a General Theory of the First Amendment (1963) (tracing the Supreme Court's development of jurisprudence protecting freedom of expression); Note, Conspiracy and the First Amendment, 79 Yale L.J. 872 (1970) (discussing the conflict between conspiracy law and First Amendment rights).
636. 341 U.S. 494, 516-17 (1951) (upholding petitioner's conviction under the Smith Act despite the absence of evidence of any overt act other than the advocacy of Communism); cf. 3 Ronald D. Rotunda et al., Treatise on Constitutional Law: Substance and Procedure § 20.14, at 56 (1986) (attributing Dennis to the Supreme Court's bowing to the "tone of the times . . . as it managed to avoid direct confrontation" with Congress and the Executive).
637. Martin M. Shapiro, Freedom of Speech: The Supreme Court and Judicial Review 65 (1966).
638. See, e.g., Hess v. Indiana, 414 U.S. 105, 109 (1973) (per curiam) (emphasizing the requirement that speech must be "intended to produce, and likely to produce, imminent disorder" to be punished by the state); Brandenburg, 395 U.S. at 447 (reformulating Schenck to require objectively inciting language in a context that makes it likely to produce direct lawless behavior in order to regulate speech). See generally Lee C. Bollinger, The Tolerant Society: Freedom of Speech and Extremist Speech in America (1986) (arguing that over the past few decades freedom of speech has developed a new significance that helps to account for the extremes to which the principle has been taken).
639. See David Yallop, Tracking the Jackal: The Search for Carlos, the World's Most Wanted Man (1993); see also Claire Sterling, The Terror Network 129-46 (1981) (tracing the career of Carlos the Jackal).
640. "Big Brother is dead. The only serious likelihood of his resurrection lies in reaction to the chaos and disintegration" that might be caused by "[t]errorists with secure phones . . . [who] could bring down not just a few buildings but large sections of a modern economy." Nicholas Wade, Method and Madness: Little Brother, N.Y. Times, Sept. 4, 1994, § 6 (Magazine), at 23.
641. Steven B. Duke & Albert C. Gross, America's Longest War: Rethinking Our Tragic Crusade Against Drugs at xv (1993).
642. Diane-Michele Krasnow, To Stop the Scourge: The Supreme Court's Approach to the War on Drugs, 19 Am. J. Crim. L. 219, 224 (1992) (discussing Supreme Court cases dealing with the "War on Drugs" in relation to the Fourth, Sixth, and Eighth Amendments).
643. Sandra R. Acosta, Imposing the Death Penalty upon Drug Kingpins, 27 Harv. J. on Legis. 596, 596 (1990) (quoting Representative James A. Traficant, a Democrat from Ohio).
644. Krasnow, supra note 642, at 221 n.3.
645. See Pub. L. No. 91-513, tit. II, § 408, 84 Stat. 1236, 1265 (1970) (codified as amended at 21 U.S.C. § 848 (1988 & Supp. V 1993)). A person engages in a continuing criminal enterprise if she "occupies a position of organizer, a supervisory position, or any other position of management" of five or more persons who act feloniously, in concert, to violate drug laws. 21 U.S.C. § 848(c)(2)(A) (1988).
646. See Pub. L. No. 100-690, tit. VII, § 7001, 102 Stat. 4181, 4387 (1988) (codified as amended at 21 U.S.C. § 848 (1988 & Supp. V 1993)) (making the death penalty applicable to convictions for killings committed during illicit drug-related activity).
647. See Pub. L. No. 91-513, tit. II, § 406, 84 Stat. 1236, 1265 (1970) (codified as amended at 21 U.S.C. § 846 (1988)).
648. See Michael Isikoff, Federal Study Shocks Drug Experts: "Casual" Use of Pot, Cocaine Plummets, but Coke Addiction Rises, Sacramento Bee, Aug. 1, 1989, at A1, A12 (noting that casual use of marijuana and cocaine fell from 23 million people in 1985 to 14.5 million in 1988); see also Randy E. Barnett, Bad Trip: Drug Prohibition and the Weakness of Public Policy, 103 Yale L.J. 2593, 2613 n.65 (1994) (book review of Duke & Gross, supra note 641).
649. See Duke & Gross, supra note 641, at 160-61 (discussing the social costs of drug prohibition); Barnett, supra note 648, at 2610-14 (same).
650. See Barnett, supra note 648, at 2613 (noting that demonization is made easier by the relatively small number of drug users).
651. Duke & Gross, supra note 641, at 107.
652. Cf. Barnett, supra note 648, at 2612 (describing the view of some commentators that police are given so much deference in searching and arresting drug suspects that there is a de facto "drug exception" to the Bill of Rights).
653. See Wiretap Report, supra note 145, at 4.
654. See Cornish & Louria, supra note 514, at 95 (discussing the effects that mass drug testing will have on our culture by examining employment drug testing as a means of surveillance). The authors report that mass drug testing entails:
(1) Fourth Amendment tolerance of systematic preventive searches; (2) increased use of biochemical surveillance as a means of monitoring and deterring undesired behavior; (3) increased use of the workplace and economic sanctions as a tool of regulating undesirable behavior; (4) privatization of traditional law enforcement functions; (5) shrinkage of our expectations of personal privacy; (6) increased use of "profiles"; (7) erosion of the presumption of innocence; and (8) erosion of dignity and autonomy.
Id. at 96.
655. United States v. Place, 462 U.S. 696, 704 n.5 (1983).
656. See Krasnow, supra note 642, at 240; see also Silas J. Wasserstrom, The Incredible Shrinking Fourth Amendment, 21 Am. Crim. L. Rev. 257, 264 (1984) (arguing that the Court has not only weakened the warrant and probable cause requirements, but has also avoided them by expanding Terry v. Ohio, 392 U.S. 1 (1968), and contracting the definition of a search); Steven Wisotsky, Crackdown: The Emerging "Drug Exception" to the Bill of Rights, 38 Hastings L.J. 889, 907 (1987) (noting that "[i]n recent years . . . the courts have almost always upheld the government" in search and seizure cases).
658. James B. White, Judicial Criticism, 20 Ga. L. Rev. 835, 854 (1986).
659. See id. (arguing that Taft's opinion is drafted to evoke a "sense that the fourth amendment has nothing to do with what is really going on in the case").
660. Another possible trigger might be the use of cryptography to hide a child pornography ring. For an example of the likely reaction, see generally John C. Scheller, Note, PC Peep Show: Computers, Privacy, and Child Pornography, 27 J. Marshall L. Rev. 989 (1994).
661. 588 A.2d 145 (Conn.), cert. denied, 112 S. Ct. 330 (1991).
662. See id. at 152. The Court did not decide whether the abutment was the defendant's home for Fourth Amendment purposes. See id. at 155. Compare Teryl S. Eisenberg, Note, Connecticut v. Mooney: Can a Homeless Person Find Privacy Under a Bridge?, 13 Pace L. Rev. 229 (1993) (arguing that a homeless person may be afforded an expectation of privacy in the area the individual reasonably considers "home" based on societal understandings of the privacy associated with a "home") with David H. Steinberg, Note, Constructing Homes for the Homeless? Searching for a Fourth Amendment Standard, 41 Duke L.J. 1508 (1992) (arguing that the reasonable-expectation-of-privacy inquiry is based on property interests, and concluding that a homeless defendant could not have a reasonable expectation of privacy in belongings left beneath a public bridge abutment).
663. See, e.g., Edward H. Levi, An Introduction to Legal Reasoning 1-3 (1948) (outlining the case method of legal reasoning); K.N. Llewellyn, The Bramble Bush 70 (1951) ("There was a view, and I suppose some hold it still, that law is made up of principles and rules. A master craftsman would be able to arrange them in one great hierarchical scheme.").
664. See, e.g., D. Marvin Jones, Darkness Made Visible: Law, Metaphor, and the Racial Self, 82 Geo. L.J. 437, 447-87 (1993) (applying an analysis of social and legal uses of metaphor to illuminate social construction and significance of race); Steven L. Winter, The Metaphor of Standing and the Problem of Self-Governance, 40 Stan. L. Rev. 1371, 1382-94 (1988) (describing cognitive and legal functions of metaphor).
665. Winter, supra note 664, at 1383; see also Donald A. Schön, Generative Metaphor: A Perspective on Problem-Setting in Social Policy, in Metaphor and Thought 137, 137 (Andrew Ortony ed., 2d ed. 1993) (discussing the use of "metaphor" as both a kind of product and a kind of process by which "new perspectives on the world come into existence").
666. Others have made this point before in the context of computers. See I. Trotter Hardy, The Proper Legal Regime for "Cyberspace," 55 U. Pitt. L. Rev. 993, 996-1015 (1994) (distinguishing between problems such as defamation, for which existing legal categories work, and problems such as reasonableness of behavior in cyberspace, for which existing legal categories do not offer clear-cut solutions); David R. Johnson & Kevin A. Marks, Mapping Electronic Data Communications onto Existing Legal Metaphors: Should We Let Our Conscience (and Our Contracts) Be Our Guide?, 38 Vill. L. Rev. 487, 489-90 (1993) (arguing for a contract regime in cyberspace); Henry H. Perritt, Jr., Tort Liability, the First Amendment, and Equal Access to Electronic Networks, Harv. J.L. & Tech., Spring 1992, at 65, 95-113 (arguing that the common law of defamation should inform courts' treatment of tort claims arising out of communications on electronic networks); Henry H. Perritt, Jr., Metaphors and Network Law (Oct. 15, 1992), available online URL gopher://ming.law.vill.edu:70/00/.chron/.papers/.files/Metaphors.and.Network.Law.txt (arguing that print shop, broadcasting, and telephone metaphors are inadequate, and proposing alternatives based on a tort system).
667. John Perry Barlow suggests that the development of the railroads provides a better metaphor for the growth of the Internet than does the information superhighway, because in both cases large private industries sought government-imposed standards and regulations that served to give early entrants commanding market positions. See John P. Barlow, Stopping the Information Railroad, available online URL http://www.ora.com/gnn/bus/ora/features/barlow/index.html.
668. See Hinman v. Pacific Air Transp. Corp., 84 F.2d 755, 759 (9th Cir. 1936), cert. denied, 300 U.S. 654 (1937) (giving airplanes right of way over private property); Vincent M. Brannigan, Biotechnology: A First Order Technico-Legal Revolution, 16 Hofstra L. Rev. 545, 549 (1988) (noting that some new technologies, like railroads, did not require an adjustment in legal conceptions, whereas others, like the airplane, required fundamental adjustments); Brannigan & Dayhoff, supra note 52, at 27 (noting that the advent of air travel "required fundamental shifts in the nature of the right to property").
669. See Michael J. Reddy, The Conduit Metaphor: A Case of Frame Conflict in Our Language About Language, in Metaphor and Thought, supra note 665, at 164, 165.
670. See id. at 189-201 (giving hundreds of examples).
671. Reddy argues that the conduit metaphor is dysfunctional because it obscures a reality in which the auditor/recipient of the communication must actively construct a new text with meaning in light of the recipient's own referents. See id. at 184-87; see also Roland Barthes, The Pleasure of the Text 3-67 (Richard Miller trans., 1975) [hereinafter Barthes, Pleasure] (posing questions about and offering commentary on a reader's finding pleasure in the text she reads); Roland Barthes, S/Z at 4 (Richard Miller trans., 1974) [hereinafter Barthes, S/Z] ("Our literature is characterized by the pitiless divorce . . . between the producer of the text and its user, [which leaves the reader] with no more than the poor freedom either to accept or reject the text . . . ."). See generally David Holdcroft, Saussure: Signs, System, and Arbitrariness (1991) (discussing signs as a semantic system). The validity of this critique is not at issue here; what matters for present purposes is the accuracy of the claim that the conduit metaphor is pervasive.
672. Because the message remains unreadable, there is no way to tell whether it was preencrypted with unescrowed cryptography.
673. See supra text preceding note 413.
674. This check does not detect preencryption of e-mail. See supra text following note 413.
675. See Chambers v. Maroney, 399 U.S. 42, 52 (1970) (applying the exigency exception to the warrant requirement to an automobile); Carroll v. United States, 267 U.S. 132, 153 (1925) (allowing an exception to the Fourth Amendment's warrant requirement in the case of movable vessels).
676. The ECPA presumably protects against the warrantless interception and recording of encrypted communications even when the public servants recording the message lack the capability to decrypt it. Otherwise, public servants would routinely be able to record encrypted messages and then decrypt them if they were later able to amass sufficient evidence to convince a judge to issue a valid warrant. Any other construction of the ECPA would make it almost a dead letter.
From this perspective, the Fifth Circuit's decision in Steve Jackson Games, Inc. v. United States Secret Serv., 36 F.3d 457 (5th Cir. 1994), is troubling but inapposite. The Fifth Circuit imposed a narrow construction on the ECPA prohibition against "intentionally intercepting" e-mails, 18 U.S.C. § 2511(1)(a) (1988), to exclude e-mail communications residing on a computer bulletin board but not yet received by the intended recipient. In the Fifth Circuit's stunted view, an "intercept" can only occur if the communication is overheard during transmission. Id. at 461-62. The court held that the capture of an unread e-mail stored on a bulletin board is governed by the less stringent provisions of Title II of the ECPA, which covers the electronic storage of messages. See id. at 461-63.
The Steve Jackson decision is worrisome because it suggests that courts will interpret the ECPA narrowly. It is inapposite because the recording of an encrypted communication during transmission clearly falls under Title I of the ECPA. See also id. at 462 n.7 (explaining that a search warrant would be required to obtain access to the contents of a stored electronic communication).
681. The test originated in Brown v. Texas, 443 U.S. 47 (1979).
682. See 496 U.S. at 454. The original formulation of the test weighed the degree to which the procedure advances the public interest, but the Sitz Court appears to have lowered its scrutiny of this factor and to have looked instead to some evidence of effectiveness. See id. at 454-55.
683. See Brown, 443 U.S. at 50-51.
684. Cf. California v. Acevedo, 500 U.S. 565 (1991) (discussing the "automobile exception" to the Fourth Amendment).
685. See Kahn, supra note 6, at 549-50.
686. See, e.g., Farrington v. Tokushige, 273 U.S. 284, 298-99 (1927) (discussing the constitutionality of regulations aimed at foreign language schools); Yu Cong Eng v. Trinidad, 271 U.S. 500, 525 (1926) (same); Bartels v. Iowa, 262 U.S. 404, 411 (1923) (finding unconstitutional a state statute prohibiting the teaching of foreign languages in public schools); Meyer v. Nebraska, 262 U.S. 390, 400 (1923) (recognizing a parent's right to determine the language to be spoken by her child); Yniguez v. Arizonans for Official English, 1994 U.S. App. LEXIS 37650, *30-*47 (9th Cir. Jan. 17, 1995) (amending 1994 U.S. App. LEXIS 34195 (9th Cir. Dec. 7, 1994)) (holding that Article XXVIII of the Arizona constitution, which requires that English be used for all official business, violates the First Amendment of the U.S. Constitution); Asian Am. Business Group v. City of Pomona, 716 F. Supp. 1328, 1330-32 (C.D. Cal. 1989) (holding that an ordinance requiring one-half of the space of a foreign alphabet sign to be devoted to English alphabetical characters violated First Amendment free speech rights and the Equal Protection Clause); see also Antonio J. Califa, Declaring English the Official Language: Prejudice Spoken Here, 24 Harv. C.R.-C.L. L. Rev. 293, 330-46 (1989) (arguing that English-only laws violate the Equal Protection Clause); Donna M. Greenspan, Florida's Official English Amendment, 18 Nova L. Rev. 891, 908-16 (1994) (arguing that a lack of enforcement saves the constitutionality of Florida's Official English amendment, and warning that some day Spanish-speaking citizens might seek to use similar laws against English speakers); Joseph Leibowicz, The Proposed English Language Amendment: Shield or Sword?, 3 Yale L. & Pol'y Rev. 519, 542-50 (1985) (arguing that the English Language Amendment should be rejected because it embraces a pure form of Anglo-conformity and uses a language issue as a weapon against those already the objects of cultural or racial prejudice); Wendy Olson, The Shame of Spanish: Cultural Bias in English First Legislation, 11 Chicano-Latino L. Rev. 1, 23-28 (1991) (arguing that English-only laws stigmatize language minorities and violate their constitutional rights to equal protection and privacy); Juan F. Perea, Demography and Distrust: An Essay on American Languages, Cultural Pluralism, and Official English, 77 Minn. L. Rev. 269, 356-57 (1992) (asserting that official English laws should be subject to heightened scrutiny); Hiram Puig-Lugo, Freedom to Speak One Language: Free Speech and the English Language Amendment, 11 Chicano-Latino L. Rev. 35, 44-46 (1991) (arguing that English-only laws would be unconstitutional); Michele Arington, Note, English-Only Laws and Direct Legislation: The Battle in the States over Language Minority Rights, 7 J.L. & Pol. 325, 339-42 (1991) (arguing that official-English laws should be interpreted narrowly rather than as broad restraints on bilingual programs); Note, "Official English": Federal Limits on Efforts to Curtail Bilingual Services in the States, 100 Harv. L. Rev. 1345, 1352-56 (1987) (arguing that English-only laws violate the Equal Protection Clause and unconstitutionally limit the access of language minorities to the political process); Leo J. Ramos, Comment, English First Legislation: Potential National Origin Discrimination, 11 Chicano-Latino L. Rev. 77, 92-93 (1991) (arguing that language discrimination is facial discrimination deserving strict scrutiny); Carol Schmid, Comment, Language Rights and the Legal Status of English-Only Laws in the Public and Private Sector, 20 N.C. Cent. L.J. 65, 72-76 (1992) (analyzing issues raised by English-only laws).
687. I have been unable to find a single criminal case in which the government has attempted to force a defendant to translate her message. There are cases in which the government provides translations of an eavesdropped conversation and then in effect challenges the defendant to explain what is wrong with the government's incriminating rendition. See, e.g., United States v. Briscoe, 896 F.2d 1476, 1490-93 (7th Cir. 1990) (involving a translation from a Nigerian dialect). Similarly, some courts have held that parties cannot be required to translate foreign-language documents as part of civil discovery governed by the Federal Rules of Civil Procedure. See, e.g., In re Korean Airlines Disaster of Sept. 1, 1983, 103 F.R.D. 357, 357-58 (D.D.C. 1984) (denying a motion to direct Korean Airlines to provide English translations of Korean documents). The Supreme Court has held that in cases under the Hague Evidence Convention a federal court may require a party providing documents to provide translations as well as descriptions of documents. See Société Nationale Industrielle Aérospatiale v. United States Dist. Court for the S. Dist. of Iowa, 482 U.S. 522, 546 (1987).
688. See supra part III.A.2.
689. Dorothy E. Denning, Encrypted Speech Is Not Speech 1 (Jan. 16, 1994) (unpublished manuscript, on file with author). In addition to the items discussed in the text, Denning argues that all languages are capable of direct translation to all other languages without the intermediation of a third language, but that a ciphertext, which often consists of strings of ones and zeros, must first be decrypted to its plaintext before being translated into another language. See id. In an e-mail to the author, Professor Denning qualified her claim that encrypted speech "is not speech" by adding: "My conclusion was that encryption must be regarded as a manner of speech rather than speech (or manner of expression) in a more fundamental sense. This, of course, does not rule out its protection." E-mail from Dorothy Denning, Professor and Chair, Computer Science Department, Georgetown University, to Michael Froomkin (Dec. 7, 1994) (on file with author).
690. Denning, supra note 689, at 1-3. Denning also argues that all languages have syntactic malleability--that is, the ability to use semantic building blocks in different orders and combinations that produce meanings--but that ciphertext lacks this property because reordering the ones and zeros will usually produce gibberish. See id.
691. Because American English, like many languages, borrows heavily from other languages, this statement presupposes a very robust conception of the parameters of a living language.
692. See Denning, supra note 689, at 1-3.
693. In considering the extent to which ciphertext resembles speech protected by the First Amendment, I am not assuming that it is so protected. To invoke the First Amendment directly to resolve the constitutional status of ciphertext would be to beg the question that the invocation of the "language" metaphor is supposed to answer. The argument in the text seeks instead to use First Amendment cases to classify differences as relevant to whether a communication is protected speech or not.
694. See Texas v. Johnson, 491 U.S. 397 (1989) (holding that burning the American flag for expressive reasons falls under the protection of the First Amendment); Tinker v. Des Moines Indep. Community Sch. Dist., 393 U.S. 503 (1969) (holding that the First Amendment protects the wearing of an armband for expressive reasons); United States v. O'Brien, 391 U.S. 367 (1968) (finding that the First Amendment safeguards the burning of a draft card for expressive reasons); Stromberg v. California, 283 U.S. 359 (1931) (recognizing that the display of a red flag was protected speech).
695. See Doran v. Salem Inn, Inc., 422 U.S. 922, 932 (1975). But see Barnes v. Glen Theatre, Inc., 501 U.S. 560 (1991) (qualifying Doran).
696. See Brown v. Louisiana, 383 U.S. 131 (1966).
697. See generally Barthes, S/Z, supra note 671; Holdcroft, supra note 671.
700. Cf. Texas v. Johnson, 491 U.S. 397, 404 (1989) (stating that whether particular conduct is protected by the First Amendment depends on "whether `[a]n intent to convey a particularized message was present, and [whether] the likelihood was great that the message would be understood by those who viewed it[]'" (quoting Spence v. Washington, 418 U.S. 405, 410-11 (1974))).
701. Commentators disagree as to whether the First Amendment protects communications or communicative intent. Compare Frederick Schauer, Speech and "Speech"--Obscenity and "Obscenity": An Exercise in the Interpretation of Constitutional Language, 67 Geo. L.J. 899, 918 (1979) ("[A]ny rational justification for the principle of free speech requires both a communicator and an intended object of the communication.") with Melville B. Nimmer, The Meaning of Symbolic Speech Under the First Amendment, 21 UCLA L. Rev. 29, 36 (1973) ("The right to engage in verbal locutions which no one can hear and in conduct which no one can observe may sometimes qualify as a due process `liberty,' but without an actual or potential audience there can be no first amendment speech right.").
702. "Seen as a prosthetic device, the personal computer extends the limits of the individual human body, whether within the privatized microworlds of computer simulations, or through the interactive exchange of messages across global computer networks." Deborah Heath, Computers and Their Bodies: Sex, War and Cyberspace 1 (1992), available online URL gopher://gopher.cpsr.org:70/00/cpsr/gender/clark/Heath.Deborah.
703. See supra text following note 468; supra text accompanying note 686.
704. The important qualifications to this statement, including regulatory searches, valid subpoenas, and searches with valid warrants appear supra part III.B.
705. A number of participants in the cypherpunks mailing list, notably cypherpunks' co-founder Eric Hughes, have argued in their postings to the list that cryptography can only become safe from regulation if it becomes ubiquitous. This analysis suggests that they are onto something. If nothing else, the push to provide ubiquitous and user-friendly cryptography could serve to shorten the "cultural lag," for court decisions defining the legal regime for a new technology often are made before the technology is well understood by the public or judges. See Diane C. Maleson, The Historical Roots of the Legal System's Response to Nuclear Power, 55 S. Cal. L. Rev. 597, 617-18 (1982) (giving as an example of "cultural lag" the issue of nuclear power plant safety).
706. See Doe v. United States, 487 U.S. 201, 210 n.9 (1988) (analogizing to the forced production of a strongbox key).
708. See id. at 409 (stating that the Fifth Amendment ordinarily protects against the compulsion to "restate, repeat, or affirm the truth of the contents of documents sought").
709. See infra Technical Appendix, part A.
710. See 56 Crim. L. Rep. (BNA) No. 12, at 2023 (Dec. 21, 1994).
711. Id. at 2038 (emphasis added). It is difficult to see, however, how under the Fifth Amendment limited immunity could be given to the suspect without preventing the prosecution from using any information directly resulting from the use of the password.
712. Words frequently achieve the same effect with regard to the meanings they "carry," but that is not, one hopes, their primary purpose.
713. Arkansas v. Sanders, 442 U.S. 753, 764-65 n.13 (1979). What a kit of burglar tools looks like, or how its looks differ from an ordinary tool box's, the Supreme Court did not explain.
714. A safe on the move also resembles an armored car. If so, it is constitutionally unsafe, compared to the stationary safe, because there seems little likelihood of an armored car exception to the Fourth Amendment's automobile exception. See supra note 675 and accompanying text.
715. See California v. Acevedo, 500 U.S. 565, 575 (1991) ("Law enforcement officers may seize a container and hold it until they obtain a search warrant.").
716. See id. at 575 (stating that "the police often will be able to search containers without a warrant, despite the Chadwick-Sanders rule, as a search incident to a lawful arrest"). The prior rule recognized a privacy interest in luggage. See United States v. Place, 462 U.S. 696, 706-07 (1983) (citing United States v. Chadwick, 433 U.S. 1, 13 (1977) for the proposition that persons possess "a privacy interest in the contents of personal luggage that is protected by the Fourth Amendment"); see also United States v. Ross, 456 U.S. 798, 824 (1982) (expanding the warrantless search doctrine under Carroll v. United States, 267 U.S. 132, 153 (1925), to containers in cars upon probable cause); Arkansas v. Sanders, 442 U.S. 753, 764 (1979) (finding luggage to be a common "repository for personal items" inevitably associated with an expectation of privacy), rev'd sub nom. California v. Acevedo, 500 U.S. 565 (1991); United States v. Chadwick, 433 U.S. 1, 13 (1977) (stating that movable luggage has a greater reasonable expectation of privacy than an automobile), rev'd sub nom. California v. Acevedo, 500 U.S. 565 (1991).
717. Or at least testimony from which a court might infer consent. See, e.g., United States v. Cox, 762 F. Supp. 145 (E.D. Tex. 1991) (concluding that by unlocking the combination lock of a suitcase, the defendant consented to a search); United States v. Miller, 442 F. Supp. 742, 748 n.5, 753 (D. Me. 1977) (stating that the fact that an officer told the suspect that a lock would be opened with or without his help did not vitiate consent); cf. supra note 366 and accompanying text (suggesting that the intelligence of criminals is not to be overestimated).
718. How brief is unclear. Ninety minutes is too long. See Place, 462 U.S. at 709-10 (stating that the Court has "never approved a seizure of the person for the prolonged 90-minute period involved here").
719. Id. at 707 ("We are aware of no other investigative procedure that is so limited both in the manner in which the information is obtained and in the content of the information revealed by the procedure.").
720. On whether the LEAF is inside or outside the message for ECPA purposes, see supra notes 330- 36 and accompanying text.
721. Once one considers movement, the "safe" begins to look like another conduit metaphor, in which meanings are placed into a safe/cipher that "holds meaning" and then "conveys" the meaning to another.
723. See id. at 394 & n.3. In Soldal v. Cook County, Ill., 113 S. Ct. 538, 549 (1992), the Supreme Court treated the movement of a trailer home affixed to a foundation as a seizure, but this would apply whether the object was a home or a car.
724. Olmstead v. United States, 277 U.S. 438, 466 (1928).
725. The exception for messages entrusted to the Postal Service, deriving from Ex parte Jackson, 96 U.S. 727, 733 (1877), which held that the Fourth Amendment protects sealed letters in the mail, was explained as owing to the government's control of the mails. See also United States v. Chadwick, 433 U.S. 1, 10 (1977) (reaffirming application of the Fourth Amendment to mails).
726. Katz v. United States, 389 U.S. 347, 351 (1967).
727. Id. at 361 (Harlan, J., concurring).
729. Id. at 372 n.* (Black, J., dissenting).
731. Id. (noting Olmstead v. United States, 277 U.S. 438 (1928)).
732. 389 U.S. at 365 (Black, J., dissenting).
733. See supra text accompanying note 656 (noting that government has prevailed in most of the recent search and seizure cases before the Supreme Court).
734. See supra text accompanying notes 467-70 (describing ways in which the Boyd decision has been limited).
735. The Supreme Court's test for resolving extent-of-curtilage questions demonstrates the resonance of the idea of "house": the area claimed to be curtilage will be placed under the home's "umbrella" of protection if intimately tied to the home by proximity, an enclosure surrounding the home, the nature and uses to which the area is put, and the steps taken by the resident to protect the area from observation by passersby. See United States v. Dunn, 480 U.S. 294, 301 (1987) (listing the four factors with which the curtilage question should be resolved).
736. See United States v. White, 401 U.S. 745, 786 (1971) (Harlan, J., dissenting) (arguing that the privacy "analysis must . . . transcend the search for subjective expectations"); Terry v. Ohio, 392 U.S. 1, 19 (1968) (noting that "the central inquiry under the Fourth Amendment [is] the reasonableness in all the circumstances of the particular governmental invasion of a citizen's personal security"); see also Dunn, 480 U.S. at 300 ("[T]he extent of curtilage is determined by factors that bear upon whether an individual reasonably may expect that the area in question should be treated as the home itself.").
737. The same Court that looked to community standards in order to determine the line between obscenity and legal pornography has not chosen to apply this amorphous standard to determine objectively reasonable expectations of privacy.
738. For a fuller description, see Daniel B. Yeager, Search, Seizure and the Positive Law: Expectations of Privacy Outside the Fourth Amendment, 84 J. Crim. L. & Criminology 249 (1993) (surveying and analyzing the Court's Fourth Amendment jurisprudence).
739. See California v. Greenwood, 486 U.S. 35, 39-40 (1988) ("An expectation of privacy does not give rise to Fourth Amendment protection, however, unless society is prepared to accept that expectation as objectively reasonable.").
740. See Florida v. Riley, 488 U.S. 445, 451-52 (1989) (plurality opinion) ("[T]here is nothing . . . to suggest that helicopters flying at 400 feet are sufficiently rare in this country to lend substance to respondent's claim that he reasonably anticipated that his greenhouse would not be subject to observation from that altitude."); see also California v. Ciraolo, 476 U.S. 207, 215 (1986) (holding valid a warrantless aerial surveillance of yard enclosed by a 10- foot fence).
741. See Oliver v. United States, 466 U.S. 170, 179 (1984) (finding that "[i]t is not generally true that fences or `No Trespassing' signs effectively bar the public from viewing open fields in rural areas"); see also United States v. Dunn, 480 U.S. 294 (1987) (refining the concept of a protected zone within curtilage); Ciraolo, 476 U.S. at 215 (finding a reasonable expectation of privacy lacking with regard to a property surrounded by a 10-foot fence in an age in which private and commercial flight in public airways is routine).
744. See Yeager, supra note 738, at 252-53; Heather L. Hanson, Note, The Fourth Amendment in the Workplace: Are We Really Being Reasonable?, 79 Va. L. Rev. 243, 262-73 (1993) (suggesting a return to a property rights basis for privacy in the workplace); see also Soldal v. Cook County, Ill., 113 S. Ct. 538, 543-45 (1992) (explaining that the Fourth Amendment protects property interests as well as privacy interests). But see Rawlings v. Kentucky, 448 U.S. 98, 105-06 (1980) (refusing to extend constitutional protection in the context of a defendant's ownership interest in illegal drugs by rejecting "the notion that `arcane' concepts of property law ought to control the ability to claim the protections of the Fourth Amendment"); Kreimer, supra note 592, at 89-94 (discussing the virtues of disclosure); Stephen J. Schnably, Property and Pragmatism: A Critique of Radin's Theory of Property and Personhood, 45 Stan. L. Rev. 347, 378 n.153 (1993) (warning that one danger of the reliance upon a property theory of rights is that the home-as-fortress becomes a refuge for harmful acts that might be prevented by exposure).
745. The argument in the text has mixed implications for other forms of privacy. Records may be protected to the extent that they are encrypted and that production of the key cannot be compelled. Note that compulsion may be a subpoena, a grand jury inquiry, or even civil discovery instigated by the government on pain of tax forfeiture. See Ann L. Iijima, The War on Drugs: The Privilege Against Self-Incrimination Falls Victim to State Taxation of Controlled Substances, 29 Harv. C.R.-C.L. L. Rev. 101, 127-34 (1994) (describing how records and other information may be compelled by taxing authorities to the detriment of drug dealers).
746. See infra Technical Appendix, part B.
747. The full ramifications of this question are beyond the scope of this Article.
748. Digital Privacy and Security Working Group, supra note 31, at 6.
749. The Supreme Court's decision in California v. Carney suggests that this day is still some ways off. See California v. Carney, 471 U.S. 386, 393-94 (1985) (holding that a moving recreational vehicle is a car and not a house).
750. Bok, supra note 2, at 16.
753. See, e.g., Gandy, supra note 587, at 15 (noting that the collection, processing, and sharing of information about individuals and groups is widespread and continues to expand); see also Kevin Fogarty, Data Mining Can Help to Extract Jewels of Data, Network World, June 6, 1994, at 40, 40 (describing the practice of "data mining" by which corporations accumulate and manipulate enormous data bases).
754. Osborn v. United States, 385 U.S. 323, 341 (1966) (Douglas, J., dissenting).
755. See supra text accompanying notes 582-91. This may give cause to invoke another archetype-- Frankenstein's monster.
756. See supra part I.C.1.c.i (discussing the ITAR). A law banning unescrowed cryptography, or even only unescrowed commercial cryptography, would provide some advantages for law enforcement. However, under such a regime it might be easier to prove that someone has used unescrowed cryptography than to prove the offense that the secret message would tend to prove. If the defendant will not decrypt the message, she may still be subject to prosecution for the (lesser?) offense of using unregistered cryptography. Although this smacks of prosecuting Al Capone for tax evasion, it may still be an effective technique.
Defining some types of cryptography as contraband would be another approach to the problem, again one with First Amendment problems. Courts have upheld a variety of contraband statutes, but none affected free speech. For example, courts have upheld statutes banning radar detectors (fuzzbusters). See generally Nikolaus F. Schandlbauer, Comment, Busting the "Fuzzbuster": Rethinking Bans on Radar Detectors, 94 Dick. L. Rev. 783, 785-89 (1990) (listing the jurisdictions which prohibit radar detectors). Statutes banning burglars' tools, such as N.Y. Penal Law § 140.35 (McKinney 1988) and Fla. Stat. ch. 810.06 (West 1994), have been upheld also, see, e.g., People v. Atson, 526 N.Y.S.2d 618, 619 (N.Y. App. Div.) (requiring more than "purely circumstantial" evidence to sustain a conviction of possession of burglar's tools), appeal denied, 528 N.E.2d 896 (N.Y. 1988); Thomas v. State, 531 So. 2d 708, 709-10 (Fla. 1988) (requiring a specific intent to commit burglary when the state burglary tool statute criminalizes common household tools or devices), as have drug paraphernalia statutes, such as 21 U.S.C. § 863 (Supp. V 1993), see, e.g., Posters `N' Things, Ltd. v. United States, 114 S. Ct. 1747, 1749 n.5, 1754-55 (1994) (upholding the constitutionality of 21 U.S.C. § 857 (1988), which Congress repealed and replaced in 1990 with the virtually identical § 863).
757. See supra text accompanying note 367.
758. See supra part I.C.1.c.i (discussing the ITAR's lack of effectiveness).
759. See generally E-mail from Rishab A. Ghosh to Michael Froomkin (Jan. 11, 1995) (on file with author) (arguing, by quoting from his article in Asian Age magazine of January 2, 1995, that in cyberspace "distance--as we usually understand it--disappears").
760. See supra part I.A.
761. Schneier, supra note 12, at xv.
762. Note that the best scrambler phones in the world will not protect you from a listening device in your room.
763. Successful cryptographic systems combine two basic principles: confusion and diffusion. Confusion is nothing more than some form of substitution--letters or, more commonly, sets of information bits representing other letters or information bits in some fashion. Replacing every letter with the one next to it on the typewriter keyboard is an example of confusion by substitution. Diffusion means mixing up the pieces of the message so that they no longer form recognizable patterns. Jumbling the letters of a message is a form of diffusion. Because modern ciphers usually operate at the bit level, many modern cryptographic systems produce confusion and diffusion that depend on every bit of the plaintext. Changing a single character of the plaintext thus changes the operations performed during encryption, making the encryption process more complex, but also more difficult to crack. See Schneier, supra note 12, at 193; Feistel, supra note 7, at 15.
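As an illustration only (the substitution alphabet, block width, and sample message in the sketch below are invented for this example, and the result is in no sense a secure cipher), the two principles can be seen in a few lines of Python: a fixed substitution table supplies the confusion, and a simple columnar transposition supplies the diffusion.

    # Toy illustration of confusion (substitution) and diffusion (transposition).
    # A teaching sketch only -- not a secure cipher.
    SUB = str.maketrans(
        "abcdefghijklmnopqrstuvwxyz",
        "qwertyuiopasdfghjklzxcvbnm",  # an arbitrary substitution alphabet
    )

    def confuse(plaintext: str) -> str:
        """Confusion: replace each letter according to a fixed substitution table."""
        return plaintext.lower().translate(SUB)

    def diffuse(text: str, columns: int = 4) -> str:
        """Diffusion: write the text in rows of a fixed width, then read it out by columns."""
        rows = [text[i:i + columns] for i in range(0, len(text), columns)]
        return "".join(row[c] for c in range(columns) for row in rows if c < len(row))

    message = "attackatdawn"
    print(diffuse(confuse(message)))  # scrambled output bears no obvious pattern of the original

Modern ciphers such as DES apply far more elaborate versions of these two steps, repeated over many rounds and operating on bits rather than letters.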
764. Cf. Note, 17 L.Q. Rev. 223, 223 (1901) (urging "prudent persons not to write defamatory statements of any kind on postcards, even in the decent obscurity of a learned language").
765. As we will see, this vulnerability may have been unknown to the cryptographers who designed the cipher, or it may have been inserted intentionally.
766. The two can sometimes be combined: a mathematical attack might, for example, demonstrate that only certain types of keys need to be tried, thus lowering the computational effort involved.
767. Algorithms whose security depends on the algorithm being kept secret have "historical interest, but by today's data security standards they provide woefully inadequate security. A large or changing group of users cannot use them, because users will eventually reveal the secret. When they do, the whole security of the system fails. . . . [They are also] trivial to break by experienced cryptanalysts." Schneier, supra note 12, at 2.
768. Ordinarily, in systems using multiple keys, only one of the keys need be kept secret.
769. The only type of algorithm guaranteed to be secure against all forms of mathematical and brute-force attacks is known as the "one-time pad." A one-time pad is "nothing more than a nonrepeating set of truly random key letters . . . . The sender uses each key letter on the pad to encrypt exactly one plaintext character. The receiver has an identical pad and uses each key on the pad, in turn, to decrypt each letter of the ciphertext." Schneier, supra note 12, at 13; see also Gardner, supra note 128, at 120 (stating that ciphers that provide "absolute secrecy" are not always used because it is "too impractical"). The critical features of a one-time pad are that the pad must be kept from the enemy, the characters on the pad must be truly random, and the pad must never be used twice. Because large pads are difficult to generate, must be communicated to the recipient in utmost secrecy, and are unwieldy, the one-time pad is difficult to use for anything other than short messages of the highest security. See Schneier, supra note 12, at 14-15.
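In Python terms, a one-time pad amounts to little more than the following sketch (the operating system's random source stands in here for the truly random pad the scheme requires, and a byte-by-byte XOR stands in for the letter-by-letter substitution described above; the sample message is invented):

    import os

    def otp_encrypt(plaintext: bytes):
        """Generate a pad as long as the message and XOR the two together."""
        pad = os.urandom(len(plaintext))  # the pad must be unpredictable and used only once
        ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
        return ciphertext, pad

    def otp_decrypt(ciphertext: bytes, pad: bytes) -> bytes:
        """XORing with the same pad recovers the plaintext exactly."""
        return bytes(c ^ k for c, k in zip(ciphertext, pad))

    ciphertext, pad = otp_encrypt(b"ATTACK AT DAWN")
    assert otp_decrypt(ciphertext, pad) == b"ATTACK AT DAWN"

The code also makes the practical burden plain: the pad is exactly as long as the message, must be delivered to the recipient in secret, and can never be reused.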
770. See Eric Bach et al., Cryptography FAQ (06/10: Public-key Cryptography) § 6.6 (June 7, 1994), available online URL ftp://rtfm.mit.edu/pub/usenet/news.answers/cryptography-faq/part06 ("Historically even professional cryptographers have made mistakes in estimating and depending on the intractability of various computational problems for secure cryptographic properties.").
771. A bit is a binary unit of information that can have the value zero or one. Computers organize bits into bytes, often 8, 16, or 32 bits in length. For example, DOS-based personal computers use eight-bit bytes to represent alphanumeric characters.
772. Although a 16-character message on a PC has 128 bits, most 8-bit-to-a-byte PCs limit the characters that can be represented to fewer than 256 per byte because one bit is used for error-checking. Hence, although the amount of information in a 128-bit key is equal to a 16-character text on a PC, such a text itself would in effect be a much shorter key on most personal computers because an attacker would know that many possible values for certain bits could be ignored. Limiting the possible keys only to lowercase letters and digits would have even more drastic effects: a 56-bit key, which if unconstrained would produce 2^56 (approximately 10^16) possible keys, would be limited to 10^12 possible keys, making it 10,000 times easier to crack. See Schneier, supra note 12, at 141.
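The arithmetic can be checked directly; the sketch below (illustrative only) reproduces the order-of-magnitude comparison in the text, treating the restricted key as eight characters drawn from the 36 lowercase letters and digits:

    unconstrained = 2 ** 56  # every possible 56-bit key
    restricted = 36 ** 8     # eight characters limited to lowercase letters and digits

    print(f"unconstrained keyspace: {unconstrained:.1e}")   # about 7.2e16, on the order of 10^16
    print(f"restricted keyspace:    {restricted:.1e}")      # about 2.8e12, on the order of 10^12
    print(f"ratio: {unconstrained / restricted:.1e}")       # roughly 10^4 -- the "10,000 times easier" of the text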
773. See id. at 7, 129. The universe is estimated to be about 10^10 years old. See id. at 7.
774. See Garon & Outerbridge, supra note 26, at 179-81; Schneier, supra note 12, at 7.
775. See Ronald L. Rivest, Responses to NIST's Proposal, Comm. ACM, July 1992, at 41, 44-45 (estimating that today a 512-bit key can be broken with about $8.2 million worth of equipment, and noting that the cost will continue to shrink).
776. See Bamford, supra note 17, at 346-47 (describing the "closed door negotiations" between the NSA and IBM that resulted in the key size reduction from 128 to 56 bits).
777. See supra text accompanying note 108.
778. See Bamford, supra note 17, at 348; Whitfield Diffie & Martin E. Hellman, Exhaustive Cryptanalysis of the NBS Data Encryption Standard, Computer, June 1977, at 74, 74.
779. See Diffie & Hellman, supra note 778, at 74 (predicting that the rapidly decreasing cost of computation should have, by 1987, reduced the solution cost to the $50 range); Rivest, supra note 775, at 45 (estimating the cost to break a 512-bit key as $8.2 million); see also Bamford, supra note 17, at 348.
780. See Bamford, supra note 17, at 348-49 (comparing arguments about the cost of a computer that could break a 56-bit key).
781. See FIPS 46-2, supra note 106, at 69,348.
782. See H.R. Rep. No. 153 (Part I), 100th Cong., 1st Sess. 18 (1987), reprinted in 1987 U.S.C.C.A.N. 3120, 3133.
783. See FIPS 46-2, supra note 106, at 69,347. NIST suggested, however, that it may not recertify DES when its current five-year certification expires. See id. at 69,350 (noting that in 1998, when the standard will be over 20 years old, NIST will consider alternatives that offer a higher level of security).
784. For the original papers describing public-key cryptography, see Whitfield Diffie & Martin E. Hellman, New Directions in Cryptography, IT-22 IEEE Transactions Info. Theory 644 (1976), and Ralph C. Merkle, Secure Communication over Insecure Channels, Comm. ACM, Apr. 1978, at 294. For further information about public- key cryptography, see generally Schneier, supra note 12, at 29. More concentrated descriptions can be found in Bach, et al., supra note 770, § 6; RSA Cryptography Today FAQ, supra note 129; and in Whitfield Diffie, The First Ten Years of Public-Key Cryptography, 76 Proc. IEEE 560 (1988) (discussing the history of public key cryptography).
785. This proved to be a real problem during World War II, resulting in the capture of several spy rings. See Kahn, supra note 6, at 530.
786. Bach et al., supra note 770, § 6.2. Despite the prediction of ubiquity, the fact remains that to date nongovernmental commercial uses of public-key cryptography have been very limited. "It is easy to build a case for buying cryptography futures. . . . Nonetheless, cryptography remains a niche market in which (with the exception of [sales to the government]) a handful of companies gross only a few tens of millions of dollars annually." ACM Report, supra note 15, at 12.
787. The ASCII version of the author's public key for his 1024-bit key in Pretty Good Privacy is:
----BEGIN PGP PUBLIC KEY BLOCK----
Version: 2.6.2

mQCNAi4ztDUAAAEEAMjCVU3S9YJDTYD3f3XO1bZGCC0+zGLXrUE3ww0YwCktzp5r
OCR1sE4OxoLlGrECH9A/BVw0KAm7mpwb7n3wIg7TfasRmbDEKcc9jZfc9xlPpavD
TSXAx3a3Ab3R5PTJEl76EF2lU2jnVE7wo2GI1wZQuRDYFPWHwXpsXYZGTrN1AAUR
tC9NaWNoYWVsIEZyb29ta2luIDxtZnJvb21raUB1bWlhbWkuaXIubWlhbWkuZWR1
Pg==
=qpGN
----END PGP PUBLIC KEY BLOCK----
788. See Schneier, supra note 12, at 284-85 (stating that security of RSA evaporates if someone discovers a rapid means of factoring large numbers); id. at 318-20 (explaining that security of certain other public-key algorithms depends on the continuing inability of mathematicians to solve the long-standing problem of calculating discrete logarithms).
789. Clearly, the security of the system evaporates if the private key is compromised, that is, transmitted to anyone.
790. For example, RSA, one of the leading public-key programs, is at least 100 times slower than DES in software implementations, and up to 10,000 times slower in hardware. See RSA Cryptography Today FAQ, supra note 129, § 2.3.
791. Another popular single-key cipher, which is not hampered by a 56-bit limit on key length, is called IDEA. See Schneier, supra note 12, at 260-66.
792. If Bob does not have a public key on record somewhere that Alice considers reliable, then Bob needs to authenticate it in a manner that protects against a "man in the middle" attack when he sends his public key to Alice. In a "man in the middle" attack, a third party intercepts Bob's first message. The man in the middle substitutes his public key for Bob's. Now Alice thinks she has Bob's key, and sends messages that are easily decrypted by the third party. The third party then reencrypts them with Bob's public key and sends them on to Bob who may never know the difference. A "man in the middle" attack will be prevented if Bob signs his public key with a digital signature that Alice can recognize. But this requires either that Bob register his public key for digital signatures somewhere trustworthy or that he find someone whose digital signature Alice already knows and who can affix a digital signature to Bob's transmission of his public key, thus attesting to the fact that it really comes from Bob.
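The attack is easy to sketch in code (an illustration only: the toy RSA-style arithmetic, the tiny primes, and the named parties are assumptions made for readability, and the keys are far too small to be secure; the point is the key substitution, not the mathematics):

    # Toy public-key arithmetic (RSA-style) -- illustrative only, hopelessly insecure.
    def make_keypair(p, q, e=17):
        n, phi = p * q, (p - 1) * (q - 1)
        d = pow(e, -1, phi)            # modular inverse (Python 3.8+)
        return (e, n), (d, n)          # (public key, private key)

    def encrypt(m, public_key):
        e, n = public_key
        return pow(m, e, n)

    def decrypt(c, private_key):
        d, n = private_key
        return pow(c, d, n)

    bob_public, bob_private = make_keypair(61, 53)
    intruder_public, intruder_private = make_keypair(67, 71)

    # Bob sends his public key to Alice, but the man in the middle intercepts
    # the transmission and substitutes his own key. Alice now encrypts to "Bob."
    key_alice_received = intruder_public
    secret = 42
    ciphertext = encrypt(secret, key_alice_received)

    # The man in the middle reads the message, then re-encrypts it under Bob's
    # real key and passes it along, so neither party notices anything amiss.
    recovered = decrypt(ciphertext, intruder_private)
    forwarded = encrypt(recovered, bob_public)

    assert recovered == secret                        # the intruder learns the secret
    assert decrypt(forwarded, bob_private) == secret  # Bob still receives it intact

A digital signature over Bob's key defeats the substitution only because the interloper cannot forge the signature Alice expects.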
793. See ACM Report, supra note 15, at 8 (noting that the logarithmic computation would "demand more than 2^100 (or approximately 10^30) operations" and that "today's supercomputers . . . would take a billion billion years to perform this many operations"); Schneier, supra note 12, at 275-77 (discussing Diffie-Hellman); see also Diffie & Hellman, supra note 784, at 644 (noting that in "a public key cryptosystem, enciphering and deciphering are governed by distinct keys, E and D, such that computing D from E is computationally infeasible" (emphasis omitted)).
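For concreteness, the Diffie-Hellman exchange itself can be written in a few lines (a sketch only: the modulus and generator below are toy values chosen for readability, whereas real parameters run to hundreds of digits, which is what makes the discrete-logarithm computation described above infeasible):

    import secrets

    # Publicly known parameters: a prime modulus p and a generator g (toy values).
    p, g = 23, 5

    a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
    b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

    A = pow(g, a, p)                   # Alice transmits g^a mod p
    B = pow(g, b, p)                   # Bob transmits g^b mod p

    # Each side combines the other's public value with its own secret exponent.
    shared_alice = pow(B, a, p)        # (g^b)^a mod p
    shared_bob = pow(A, b, p)          # (g^a)^b mod p
    assert shared_alice == shared_bob  # both arrive at the same shared secret

    # An eavesdropper sees p, g, A, and B; recovering a or b from them is the
    # discrete-logarithm problem whose difficulty the scheme relies upon.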
794. See OTA Information Security, supra note 97, at 55-56. For a thorough survey of the legal and policy issues involved in setting up and running a certification authority, see generally Michael Baum, National Institute of Standards and Technology, Federal Certification Authority Liability and Policy: Law and Policy of Certificate-Based Public Key and Digital Signatures (1994).
795. See PGP(TM) User's Guide, supra note 73.
796. See Garfinkel, supra note 73, at 235-36.
797. This is not irrefutable proof because a third party could obtain the key from the authorized user by stealth, purchase, accident, or torture (also known as "rubber hose cryptanalysis").
798. Consider the following example: To sign a message, Alice does a computation involving both her private key and the message itself; the output is called the digital signature and is attached to the message, which is then sent. Bob, to verify the signature, does some computation involving the message, the purported signature, and Alice's public key. If the results properly hold in a simple mathematical relation, the signature is verified as genuine; otherwise, the signature may be fraudulent or the message altered, and they are discarded. See RSA Cryptography Today FAQ, supra note 129, § 2.13.
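A toy version of the computation described above might look like the following sketch (the RSA-style arithmetic and the tiny primes are assumptions made for illustration; real implementations pad the message digest rather than simply reducing it, and use keys hundreds of digits long):

    import hashlib

    # Toy RSA-style keys -- illustrative only, far too small to be secure.
    def make_keypair(p, q, e=17):
        n, phi = p * q, (p - 1) * (q - 1)
        return (e, n), (pow(e, -1, phi), n)   # (public key, private key)

    def sign(message: bytes, private_key) -> int:
        d, n = private_key
        digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
        return pow(digest, d, n)              # signature = digest^d mod n

    def verify(message: bytes, signature: int, public_key) -> bool:
        e, n = public_key
        digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
        return pow(signature, e, n) == digest  # does signature^e mod n match the digest?

    alice_public, alice_private = make_keypair(61, 53)
    message = b"I agree to the contract."
    signature = sign(message, alice_private)

    print(verify(message, signature, alice_public))             # True: the genuine signature verifies
    print(verify(b"I do NOT agree.", signature, alice_public))  # almost certainly False: the digest changed

As in the text, anyone holding Alice's public key can check the signature, but only the holder of the private key could have produced it.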
799. See Schneier, supra note 12, at 35 (noting that a digital signature using a 160-bit checksum has only a one in 2^160 chance of misidentification).
800. See, e.g., Sherry L. Harowitz, Building Security into Cyberspace, Security Mgmt., June 1994, available in LEXIS, News Library, Curnws File (noting that the U.S. government "wants to discourage" the use of digital signatures other than the Clipper Chip for encryption purposes); Robert L. Hotz, Sign on the Electronic Dotted Line, L.A. Times, Oct. 19, 1993, at A1 ("Federal officials refused to adopt the earlier technique, called RSA, as a national standard because they were concerned that it could be used to conceal clandestine messages that could not be detected by law enforcement or national security agencies.").
801. See A Proposed Federal Information Processing Standard for Digital Signature Standard (DSS), 56 Fed. Reg. 42,980, 42,980 (1991). The DSS was to be "applicable to all federal departments and agencies for the protection of unclassified information," and was "intended for use in electronic mail, electronic funds transfer, electronic data interchange, software distribution, data storage, and other applications which require data integrity assurance and data origin authentication." Id. at 42,981.
802. See Kevin Power, Use DSS with No Fear of Patent Liability, NIST Says, Gov't Computer News, Oct. 17, 1994, at 64 (noting overwhelming opposition to NIST's proposal to give a patent group an exclusive license to the Digital Signature Algorithm).
803. See Approval of Federal Information Processing Standards Publication 186, Digital Signature Standard (DSS), 59 Fed. Reg. 26,208, 26,209 (1994).
805. A subliminal message is invisible to the user because it is encrypted and mixed in with the garbage- like stream of characters that constitutes the signature.
806. Schneier, supra note 12, at 313; see also id. at 390-92 (explaining how the subliminal channel works). The DSA is the algorithm used in the DSS.
807. Simmons, supra note 56, at 218.
809. See Capstone Chip Technology, supra note 16 (The Capstone Chip "implements the same cryptographic algorithm as the CLIPPER chip. In addition, the CAPSTONE Chip includes . . . the Digital Signature Algorithm (DSA) proposed by NIST . . . .").
810. See ACM Report, supra note 15, at 3 (noting that "information that is authenticated and integrity-checked is not necessarily confidential; that is, confidentiality can be separated from integrity and authenticity").