1998 Congressional Hearings
Intelligence and Security


Testimony of Deirdre Mulligan

Staff Counsel

Center for Democracy and Technology

before the

House Committee on the Judiciary

Subcommittee on Courts and Intellectual Property

March 26, 1998

I. Introduction and Summary

The Center for Democracy and Technology (CDT) is pleased to have this opportunity to testify on the issue of privacy protection in the online environment.

CDT is a non-profit, public interest organization dedicated to developing and implementing public policies to protect and advance civil liberties and democratic values on the Internet. One of our core goals is to enhance privacy protections for individuals in the development and use of new communications technologies.

To focus my testimony this morning, I will begin by outlining five trends in technology with ramifications for the existing framework of privacy protections in electronic communications. The current mix of legal and self-regulatory protections for privacy has not kept pace with technology and its growing role in society. The core of my testimony is a series of policy recommendations:

identifying areas in which Congress should enhance existing privacy protections;

recommending the creation of an institutional structure for addressing privacy concerns in a proactive and ongoing manner; and,

urging the US government (and others) to engage in several non-traditional methods of developing and implementing privacy policy that are of particular relevance to the global, decentralized networks that comprise our communications infrastructure.

It is critically important to ensure that privacy protections keep pace with changes in technology. This requires a periodic assessment of whether changes in technology pose new threats to privacy that must be addressed through changes in law. Many of our existing laws were constructed to meet dual purposes, such as protecting privacy and meeting legitimate law enforcement needs, or protecting privacy and promoting the cost-effective operation of the health care system. We must examine whether they continue to set the bounds of permissible government and private sector action in a fashion consistent with privacy protection. In addition, we should evaluate whether technology itself can be used to advance privacy in this new environment. Finally, the globalization of the communications system requires us to consider alternative methods for achieving policy goals, be they self-regulation or international agreements.



II. Technology trends with ramifications for individual privacy in electronic communications

A. The explosive growth of the Internet is transforming our methods of communicating and of gathering, processing, and sharing information and knowledge. In 1986, when Congress updated the communications privacy laws,(1) the Internet was comprised of approximately 50,000 computers. Today the Internet is comprised of upwards of 20 million host computers globally, and estimates of individual users hover around 100 million people worldwide. Unlike traditional media, the Internet supports interactions ranging from banking to dating, and from one-to-one communications to town hall meetings, political events, and commercial transactions.

B. The transactional data generated through the use of new technologies is a rich source of information about individuals' habits of association, speech, and commercial activities. This vast new data is essential to the operation of the packet-switching medium and provides the raw material for many of the unique functions the Internet offers, yet it poses significant privacy concerns. Interactive media generate, capture, and store a tremendous amount of information. At the same time, the flexibility of new media is blurring the distinction between the content of a communication and the transactional data used to route the message to its destination. Transactional data in this new medium is more detailed, descriptive, and identifying than ever before. Aggregated, it is capable of revealing as much about the individual as the content of a message.

C. The globalization of communications technology is eroding national borders. Governments are finding it increasingly difficult to enforce laws -- be they laws to protect or repress their citizens. The fluidity of borders on the Internet promises to promote pluralism, the free flow of information and ideas, diverse associations, and, we hope, democracy. On the other hand, enforceable, workable privacy protections for the global information infrastructure have yet to emerge, leaving individuals' communications and personal data vulnerable.

D. The lack of centralized control mechanisms. The distributed nature of the Internet's infrastructure distinguishes it, at least in degree, from existing communications systems. Its decentralized nature allows it to cope with problems and failures in any given computer network by simply routing information along alternate paths. This makes the Internet quite robust. However, the lack of centralized control mechanisms may frustrate those seeking to regulate activities on the network.(2) Decentralized systems are also inherently less secure, posing new challenges to protecting data during storage and transmission.

E. The decrease in computing costs and the focus on client-side controls over network interactions present new opportunities to empower individuals. The Internet continues to shift control over interactions away from the government and large private sector companies. The ability to build privacy protections into the user's interface with the network offers the opportunity to craft protections that shield individuals regardless of jurisdictional law and policy. Providing individuals with technical means to control and secure their communications and personal information may pave the way for privacy protections that are as decentralized and ubiquitous as the networks themselves.

III. Policies from the pre-network world

Current policies protecting individual privacy in electronic communications are built upon Fourth Amendment principles designed to protect citizens from government intrusion. While premised on Fourth Amendment concepts, the contours of existing statutory protections are also a product of the technical and social "givens" of specific moments in history. Some of these historical givens have changed dramatically, with implications for the effectiveness and relevance of existing statutory protections for privacy.



Crafting proper privacy protections in the electronic realm has always been a complex endeavor. It requires a keen awareness of not only changes in technology, but also changes in how the technology is used by citizens, and how those changes are pushing at the edges of existing laws. From time to time these changes require us to reexamine our fabric of privacy protections. The issues raised below indicate that it is time for such a review.

A. From phones to email: The existing framework

In response to Supreme Court decisions finding that electronic surveillance was a search and seizure covered by the Fourth Amendment(3) and law enforcement's arguments that it was a needed weapon against organized crime,(4) Congress passed Title III of the Omnibus Crime Control and Safe Streets Act of 1968.(5) The wiretap provisions of Title III authorized law enforcement wiretapping of telephones within a framework designed to protect privacy and compensate for the uniquely intrusive aspects of electronic surveillance.(6)

In brief, the legislation Congress enacted in 1968 had the following components: the content of wire communications could be seized by the government in criminal cases pursuant to a court order issued upon a finding of probable cause;(7) wiretapping would be otherwise outlawed;(8) wiretapping would be permitted only for specified crimes;(9) it would be authorized only as a last resort, when other investigative techniques would not work;(10) surveillance would be carried out in such a way as to "minimize" the interception of innocent conversations;(11) notice would be provided after the investigation had been concluded;(12) and there would be an opportunity prior to introduction of the evidence at any trial for an adversarial challenge to both the adequacy of the probable cause and the conduct of the wiretap.(13) "Minimization" was deemed essential to satisfy the Fourth Amendment's particularity requirement, compensating for the fact that law enforcement was receiving all of the target's communications, including those that were not evidence of a crime. The showing of a special need, in the form of a lack of other reasonable means to obtain the information, was viewed as justification for the failure to provide advance or contemporaneous notice of the search.(14)

Due to privacy considerations arising from changes in technology, primarily the advent of wireless services and the growing use of email, in 1986 Congress adopted the Electronic Communications Privacy Act (ECPA).(15) Congress' action was in part spurred by the recognition that individuals would be reluctant to use new technologies unless privacy protections were in place.(16)

ECPA did recognize the importance of transactional data. ECPA set forth rules for the use of pen registers and trap and trace devices, which capture outgoing and incoming phone numbers respectively.(17) It also established rules for law enforcement access to information identifying subscribers of electronic communication services.(18) For transactional information relating to e-mail, ECPA requires a warrant; for other transactional data, it requires a court order, a mere subpoena, or consent.

To a large degree ECPA extended the Title III protections to the interception of wireless voice communications and to non-voice electronic communications such as fax and email while in transit. However, ECPA did not extend all of Title III's protections to electronic communications. Unlike Title III, which restricts wiretaps to a specified list of crimes, court orders authorizing interceptions of electronic communications can be based upon the violation of any federal felony. And while constitutional challenges to the introduction of information obtained in violation of ECPA may succeed, ECPA contains no statutory exclusionary rule as Title III does.(19)

Moreover, Congress set very different rules for access to electronic communications while they are in storage incident to transmission.(20) When the government goes to AOL or another service provider and asks it to provide a copy of a person's email messages from the AOL server where they sit waiting to be read, an ordinary search warrant is enough, without the special protections of minimization, judicial supervision, and notice to the individual found in Title III.

B. Assumptions of the existing framework

In drafting ECPA, Congress began the process of dealing with fundamental changes in technology. It recognized that transactional data needed privacy protections. However, the framework of Title III and the advances of ECPA did not envision the World Wide Web and the pervasive role technology would come to play in our daily lives. Underlying Title III and ECPA were a number of assumptions about both the nature and the use of electronic communications:

The transmission of private communications and records stored with third parties, including records of such communications, raise different privacy considerations.

The majority of electronic communications are by nature ephemeral.

The private sphere of personal communications and interactions would be located at the end-points, not in the medium itself.

The government's collection and use of information about individuals' activities and communications is the greatest threat to individual privacy.

Transactional data is not rich in intimate, personal detail.

Congress has only begun to wrestle with the fact that some of these assumptions, while perhaps accurate at one point in history, have changed dramatically since the initial framework for protecting electronic communications was articulated in 1986.

Congress took a first small step towards recognizing the changing nature of transactional data in the networked environment with amendments to ECPA enacted as part of the Communications Assistance for Law Enforcement Act of 1994 (CALEA).(21) The 1994 amendments recognized that transactional data was emerging as a hybrid form of data, somewhere between addressing information and content, and was becoming increasingly revealing of personal patterns of association. For example, addressing information was no longer just a number and name, but contained the subject under discussion and information about the individual's location. Therefore, Congress raised the legal bar for government access to transactional data by eliminating subpoena access and requiring a court order, albeit one issued on a lower relevance standard.(22) Some issues were left unanswered, and new ones continue to arise as communications technology advances.

IV. Four examples reveal the current weaknesses of existing statutory protections for privacy in light of the shifts in electronic communications technology and its use in society.

A. Personal papers in cyberspace

Individuals traditionally kept their diaries under their mattress, in the bottom drawer of their dresser, or at their writing table. Situated within the four walls of the home, these private papers are protected by the Fourth Amendment. With the advent of home computers, individual diaries moved to the desktop and the hard drive. Writers, poets, and average citizens quickly took advantage of computers to manage and transcribe their important records and thoughts. Similarly, pictures moved from the photo album to the CD-ROM.

Today, network computing allows individuals to rent space outside their home to store personal files and personal World Wide Web pages. The information has remained the same. A diary is a diary is a diary. But storing those personal thoughts and reflections on a remote server eliminates many of the privacy protections they were afforded when they were under the bed or on the hard drive. Rather than the Fourth Amendment protections -- including a warrant based on probable cause, judicial oversight, and notice -- the individual's recorded thoughts may be obtained from the service provider through a mere court order with no notice to the individual at all.

B. Medical records in cyberspace

To bring home what this means in a business setting, let's look at medical records. Hospitals, their affiliated clinics, and physicians are using intranets to enable the sharing of patient, clinical, financial, and administrative data. Built on Internet technologies and protocols, these private networks link the hospital's information system to pharmacy and laboratory systems, transcription systems, doctors' and clinic offices, and others. The U.S. government is contemplating the development of a federal government-wide computer-based patient record system.(23) According to news reports, Internet and World Wide Web-based interfaces are under consideration.(24) The private sector is moving to integrate network computing into a sensitive area of our lives -- the doctor's office.(25)

As computing comes to medicine, the detailed records of individuals' health continue to move not just out of our homes, but out of our doctors' offices. While the use of network technology promises to bring information to the fingertips of medical providers when they need it most, and to greatly ease billing, prescription refills, and insurance pre-authorizations, it raises privacy concerns.

In the absence of comprehensive federal legislation to protect patient privacy, the protections afforded by ECPA and other statutes are of utmost importance. Unfortunately, the protections afforded to patient data may vary greatly depending upon how the network is structured, where data is stored, and how long it is kept. If records are housed on the computer of an individual doctor, then access to that data will be governed by the Fourth Amendment.(26) Law enforcement would be required to serve the doctor with a warrant or subpoena, and the doctor would receive notice and have the chance to halt an inappropriate search. Under federal law, however, the patient would receive no notice and have no opportunity to contest the production of the records. When information is in transit between a doctor and a hospital through a network, law enforcement's access is governed by the warrant requirements of ECPA, and neither doctor nor patient receives prior or contemporaneous notice. If the records are stored on a server leased from a service provider, the protections are unclear. They may be accessible by mere subpoena. If they are covered by the "remote computing" provisions of ECPA, this would severely undermine privacy in the digital age.(27)

In addition to concerns about government access to personal health information, recent news stories have focused the public on the misuse of personal health information by the private sector -- particularly when it is digitized, stored, and manipulated. Recently the Washington Post reported that CVS drug stores and Giant Food were disclosing patient prescription records to a direct mail and pharmaceutical company. The company was using the information to track customers who failed to refill prescriptions -- sending them notices encouraging them to refill and to consider other treatments. Due to public outrage -- and perhaps the concern expressed by Senators crafting legislation on the issue of health privacy -- CVS and Giant agreed to halt the marketing disclosures.(28) But the sale and disclosure of personal health information is big business. In one recent advertisement, Patient Direct Metromail claimed to have 7.6 million names of people suffering from allergies, 945,000 suffering from bladder-control problems, and 558,000 suffering from yeast infections.(29)

The sale and disclosure of what many perceive as less sensitive information is also raising privacy concerns.(30) This past summer AOL announced plans to disclose its subscribers' telephone numbers to business partners for telemarketing.(31) AOL heard loud objections from subscribers and advocates opposed to this unilateral change in the "terms of service agreement" covering the use and disclosure of personal information.(32) In response, AOL decided not to follow through with its proposal.(33)

As we move forward we must ask: Will personal records be afforded differing levels of privacy protection merely because of where and how they are stored? Will individuals be the arbiters of their own privacy, able to make decisions about who knows what about them? How will individual privacy be protected in interactions with the private sector?

C. The case of Timothy R. McVeigh(34)

In January news stories broke about a highly decorated seventeen-year veteran of the U.S. Navy who was to be discharged based on information obtained by the Navy from America Online.(35) The facts surrounding the incident raise many concerns about privacy in the online world. Using the AOL screenname "boysrch," Timothy McVeigh sent an email to a civilian Navy volunteer. The curious volunteer looked up the screenname in AOL's member profile directory and discovered that the subscriber identified himself as "Tim, from Honolulu, Hawaii, employed by the military, and gay." The volunteer passed the screen name and profile information on to her husband, a Navy officer. It eventually landed in the hands of the Judge Advocate General, who undertook an investigation. A Navy paralegal called AOL's customer service and asked for information about the subscriber belonging to the screenname "boysrch." AOL identified Timothy R. McVeigh as the subscriber.

According to the administrative separation proceedings, the Navy paralegal had not obtained a warrant, a court order, a subpoena, or Timothy McVeigh's consent prior to contacting AOL, and was therefore in violation of ECPA. In its statement arguing against Timothy McVeigh's request for an injunction, the Navy asserted that ECPA puts the obligation on AOL to withhold information, not on the government to follow appropriate procedures.(36) Equally troubling, because the statute penalizes only "knowing or intentional" violations, it is unclear whether a cause of action will succeed for this violation of privacy and ECPA.

This case illustrates a number of weaknesses of ECPA. ECPA limits the disclosure of information to the government but allows online service providers and others to disclose information, other than the contents of communications, about subscribers to other parties.(37) Is the disclosure of information to the Navy, or more generally the government, an individual's only privacy concern? We can certainly imagine scenarios in which information tying a screenname, and possibly online activities, to an individual's real-world identity would substantially invade an individual's privacy and potentially enable further harm to befall him. Of specific concern would be the disclosure of information about children in such a setting. While the government's access to this information, and subsequent actions based upon it, are the source of harm in the McVeigh incident, it is quite possible to imagine an equally troubling situation involving the disclosure of such information to a private party.(38)

A second troubling aspect of ECPA revealed by the McVeigh case is that the lack of a statutory exclusionary rule, coupled with penalties that focus only on intentional violations, does not create incentives for parties to effectively implement its requirements. In the McVeigh case ECPA itself may not limit the use of the illegally obtained information. While the Constitution may, the lack of a statutory exclusionary rule undermines the goal of ensuring that the government follow appropriate procedures designed to protect privacy at the front end. Similarly, the existing penalty structure set out in ECPA does not encourage proactive behavior to protect privacy. In the incident involving McVeigh, AOL claimed that it did not know it was providing information to a government agent, and therefore under the existing statutory penalties it may not be liable.

D. We know where you are and what you're doing.

An example of the power of transactional data comes from the "location" information available through many cellular networks. In the course of processing calls, many wireless communications systems collect information about the cell site (location) of the person making or receiving a call. Location information can be useful, as Ted Rappaport, the inventor of the hand-held cell phone locator, stated: "If you could know accurately where things are, not only would you feel safer because emergency services could find you, but law enforcement could use it more easily to track the bad guys."(39) But as one reporter put it, "Cellular telephones, long associated with untethered freedom, are becoming silent leashes..."(40) The technology is proceeding in the direction of providing more precise location information, a trend that has been boosted by the rulings of the Federal Communications Commission in its "E911" (enhanced 911) proceeding, which requires service providers to develop a locator capability for medical emergency and rescue purposes.(41) Location information may be captured when the phone is merely on, even if it is not handling a call.(42) Private sector uses of this information are also under consideration. A company in Japan is experimenting with a World Wide Web site that allows anyone to locate a phone and the person carrying it by merely typing in the phone number.(43)

In the online environment, transactional data can do more than just track the individual's location. It can provide insight into a person's thoughts, affiliations, and politics. It can reveal whether they are at home or at work. In a world where transactional data captures the full contours of a person's life, it is time to provide it with stronger privacy protections.

V. Recommendations

As we consider privacy in the changing communications environment we must ask whether the assumptions of a previous time and technology, and the legal distinctions based upon them, continue to make logical sense -- or, more importantly, whether they provide protections reflective of our commitment to individual privacy, autonomy, dignity, and freedom. Policies designed to implement the Fourth Amendment developed in a 20th-century world of paper records -- even as extended to protect transient voice communications -- may not be applicable to 21st-century technologies, where many of our most important records are not "papers" in our "houses" but "bytes" stored electronically, and our communications, rather than disappearing into thin air, are captured and stored at distant "virtual" locations for indefinite periods of time.

To address privacy in the electronic communications environment the Congress should:

Reexamine the need for limits on the disclosure and use of personal information by private entities. Both the Federal Trade Commission and the Department of Commerce are engaged in initiatives designed to promote "fair information practice principles" in the online environment. We are encouraged that Congress is exploring protections for individual privacy during private sector activities. In considering this issue we recommend that discussions focus on the Code of Fair Information Practices developed by the Department of Health, Education and Welfare (HEW) in 1973(44) and the Guidelines for the Protection of Privacy and Transborder Flows of Personal Data, adopted by the Council of the Organization for Economic Cooperation and Development in 1980.(45)

Reconsider how the lines have been drawn between records entitled to full Fourth Amendment protection and business records(46) that fall outside the protection of the Fourth Amendment. There are now essentially four legal regimes for access to electronic data: (i) the traditional Fourth Amendment standard, for records stored on an individual's hard drive or floppy disks; (ii) the Title III-ECPA standard, for records in transmission; (iii) the standard for business records held by third parties, available on a mere subpoena with no notice to the individual subject of the record; and (iv) a fourth, the scope of which is unclear, for records stored on a remote server, such as the research paper (or the diary) of a student stored on a university server or the records (including the personal correspondence) of an employee stored on the server of the employer. As the third and fourth categories of records expand because people find it more convenient to store records remotely, the legal ambiguity and lack of strong protection grow more significant and pose grave threats to privacy in the digital environment.

Heighten the standard for access to transactional data. Transactional data are in many ways a person's digital fingerprints, although far more easily captured. Transactional records provide unprecedented information about the places, people, and activities that comprise the individual's daily life.

Create a privacy entity to provide expertise and institutional memory, a forum for research and exploration, and a source for guidance and policy recommendations on privacy issues. The existing crisis-driven approach to responding to privacy concerns has hindered the development of sound, rational policy and failed to keep pace with changes in technology. The US needs an independent voice empowered with the scope, expertise, and authority to guide public policy. Such an entity has important roles to play on both the domestic and international fronts. Without an independent voice, privacy rights in the United States will not be afforded adequate consideration and protection in emerging media.

Encourage the development and implementation of technologies that support privacy on global information networks. Technological mechanisms for protecting privacy are critically important on the Internet and other global media. Developing meaningful privacy protections in the online environment requires us to recognize that our laws and constitutional protections may not follow our citizens, their communications, or their data as they travel through distant lands. Technology can provide protections regardless of the legal environment.

Strong encryption is the backbone of technological protections for privacy. Today technical tools are available to send anonymous email, browse the World Wide Web anonymously, and purchase goods with the anonymity of cash. The World Wide Web Consortium's Platform for Privacy Preferences, currently under development, will provide an underlying framework for privacy -- allowing Web sites to make their information practices available to visitors and individuals to set privacy rules that control the flow of data during interactions with Web sites.(47) This effort has involved non-profit, for-profit, and government representatives.

The U.S. should encourage the development of privacy-enhancing technologies that address the need either to eliminate data collection or, where data collection occurs, to limit the data collected, to communicate data practices, and to facilitate individualized decision-making where consistent with policy.(48)

Collaborate with other governments, the public interest community and the business community to develop global solutions for the decentralized network communications environment.

Traditional top-down methods of implementing policy and controlling behavior, be they international agreements, national legislation, or sectoral codes of conduct enforced by the private sector, offer incomplete responses to the privacy issues arising on the global information infrastructure. Implementing privacy policy in the decentralized, global, and borderless environs of international networks raises difficult questions of effectiveness and enforcement. The U.S. should work with all parties -- other governments, international bodies, the public interest and for-profit communities -- to build consensus on appropriate policy. Providing a seamless web of privacy protection for individuals' data and communications as they flow along this international network may require new tools -- legal, policy, technical, and self-regulatory -- for implementing policy. The U.S. should actively participate in their crafting.

Thank you for the opportunity to participate in this important discussion about protecting privacy in the online environment.



1. Electronic Communications Privacy Act of 1986, Pub. L. No. 99-508, 100 Stat. 1848 (codified in sections of 18 U.S.C. including 2510-21, 2701-10, 3121-26).

2. Attempts to regulate the availability of encryption on the Internet highlight the frustrations that regulators may experience. As many scholars and advocates have pointed out, national attempts to restrict the availability of encryption are likely to be ineffective. For if even one jurisdiction (or one network in one jurisdiction) fails to restrict it, individuals world-wide will be able to access it over the Internet and use it.

3. See Berger v. New York, 388 U.S. 41, 56 (1967); Katz v. United States, 389 U.S. 347 (1967).

4. See Controlling Crime Through More Effective Law Enforcement: Hearings on S. 300, S. 552, S. 580, S. 674, S. 675, S. 678, S. 798, S. 824, S. 916, S. 917, S. 992, S. 1007, S. 1094, S. 1194, S. 1333, and S. 2050 Before the Subcomm. on Criminal Laws and Procedures of the Senate Comm. on the Judiciary, 90th Cong. (1967), passim.

5. 18 U.S.C. 2510-22 (1996).

6. In 1978, Congress enacted the Foreign Intelligence Surveillance Act (FISA) to regulate wiretapping in national security cases. It provides more limited protections than those afforded under Title III, and was meant to be used primarily in foreign intelligence and counter-intelligence cases. Of importance, FISA does not require that the subject of the surveillance ever be given notice, and for individuals who are not U.S. citizens or permanent residents it does not require the government to show probable cause that the target is engaged in criminal conduct. Pub. L. No. 95-511, tit. I, 101, 92 Stat. 1783 (1978) (codified at 50 U.S.C. 1801-11 (1996)).

7. 18 U.S.C. 2518 (3) (1996).

8. 18 U.S.C. 2511 (1996).

9. 18 U.S.C. 2516 (2) (1996).

10. 18 U.S.C. 2518 (3)(c) (1996).

11. 18 U.S.C. 2518 (5) (1996).

12. 18 U.S.C. 2518 (8)(d) (1996).

13. 18 U.S.C. 2518 (9), (10) (1996).

14. S. Rep. No. 90-1097, at 66 (1968).

15. Electronic Communications Privacy Act of 1986, Pub. L. No. 99-508, 100 Stat. 1848 (codified in sections of 18 U.S.C. including 2510-21, 2701-10, 3121-26).

16. See generally S. Rep. No. 99-541, at 5 (1986); and H.R. Rep. No. 99-647, at 19 (1986).

17. 18 U.S.C. 3121-27 (1996).

18. 18 U.S.C. 2703 (c).

19. See 18 U.S.C. 2515 (1996) (exclusionary rule refers to wire or oral communications, not electronic communications).

20. 18 U.S.C. 2703.

21. Communications Assistance for Law Enforcement Act, Pub. L. No. 103-414, 108 Stat. 4279 (1994) (codified at 47 U.S.C. 1001 and scattered sections of 18 U.S.C. and 47 U.S.C.).

22. 18 U.S.C. 2703 (b)(A)-(B), (c)(1)(B), (d).

23. "Why the Government Wants a Computerized Patient Record," Health Data Network News, Vol. 7, No. 6, March 20, 1998, p. 1. "The development of a federal

24. Id. at 8.

25. See generally "Six Boston Hospitals Turn to the Internet as a Clinical Network Tool," Health Data Network News, Vol. 6, No. 6, June 20, 1997, p. 1; "More Clearinghouses Conclude the Internet Makes Economic Sense," Id.; and "Hospital Banks on Web Technology for Integration," Health Data Network News, Vol. 6, No. 16, Nov. 20, 1997, p. 3.

26. The record-keeper would have Fourth Amendment protections. Whether the patient's privacy is protected at all would largely depend upon state law, which is scattered and inconsistent. Until a federal law is crafted to protect individuals' privacy in health information regardless of where the data are stored or whose control they are under, privacy is in danger.

27. 18 U.S.C. 2703 (b).

28. "Prescription Fear, Privacy Sales," Washington Post, February 15, 1998, p. A1.

29. "Medical Privacy is Eroding, Physicians and Patients Declare," San Diego Union-Tribune, February 21, 1998, B1.

30. "Internet power feeds public fear," USA Today, August 13, 1997, A1.

31. "AOL will share users' numbers for telemarketing," Washington Post, July 24, 1997, E1; "Soon AOL users will get junk calls, not just busy signals and email ads," July 24, 1997, B6.

32. See letter to Steve Case, President of AOL, from the Center for Democracy and Technology, Electronic Frontier Foundation, EFF-Austin, National Consumers League, Privacy Rights Clearinghouse, and Voters Telecommunications Watch.

33. "AOL cancels plan for telemarketing: Disclosure of members' numbers protested," July 25, 1997, G1.

34. On January 26, 1998, the United States District Court for the District of Columbia issued a preliminary injunction barring the Navy from dismissing McVeigh.

35. "Don't chat, don't tell? Navy case tests privacy limits," Wall Street Journal, January 14, 1998, B1.

36. "AOL says it shouldn't have identified sailor," Wall Street Journal, January 22, 1998, B10.

37. 18 U.S.C. 2703 (c).

38. Privacy concerns with the disclosure of personal information about a specific individual to private citizens and institutions were the impetus behind two recent tightenings of privacy protections. In 1994 the Driver's Privacy Protection Act (DPPA) was passed in response to the murder of Rebecca Schaeffer, whose killer used department of motor vehicles records to locate her. The law sets limits on the disclosure of motor vehicle operator permits, motor vehicle titles, and motor vehicle registrations by motor vehicle departments. Under the DPPA, individuals must be informed of and given the opportunity to prohibit a) requests for their individual record (an "individual look-up"); and b) disclosures for the bulk distribution of surveys, marketing, or solicitations. More recently, the Individual Reference Services Group (IRSG), a group of companies that provide composite profiles of individuals based on data from both public and private sources, crafted a set of self-regulatory guidelines that limit access to their "look-up services." One service offered by IRSG member companies is the ability to access profiles of specific individuals. Like the individualized look-ups possible at motor vehicle departments or through the IRSG member companies, the disclosure to private parties of information that links an individual to her online identity (screen name) raises privacy concerns. If such information is provided to the wrong person, at the wrong time, it may lead to additional harm to the individual.

39. "Using cell phones to reach out and find someone: evolving technology will soon be able to pinpoint all mobile dialers," USA Today, December 16, 1997, 6D.

40. "Technology that tracks cell phones draws fire," New York Times, February 23, 1998, p. D3.

41. In June 1996, the FCC adopted a Report and Order and Notice of Proposed Rulemaking in Docket 94-102, requiring wireless service providers to modify their systems within 18 months to enable them to relay to public safety authorities the cell site location of 911 callers. Further, the FCC ordered carriers to take steps over the next 5 years to deploy the capability to provide latitude and longitude information of wireless telephone callers within 125 meters. Finally, the FCC proposed requiring, at the end of the 5-year period, that covered carriers have the capability to locate a caller within a 40-foot radius for longitude, latitude, and altitude, thereby, for example, locating the caller within a tall building. In re Revision of the Commission's Rules to Ensure Compatibility with Enhanced 911 Emergency Calling Sys., CC Docket No. 94-102, Report and Order and Further Notice of Proposed Rulemaking (last modified Jan. 2, 1997) [hereinafter FCC E-911 Order] <http://www.fcc.gov/Bureaus/Wireless/Orders/1996/fcc96264.txt>.

42. Albert Gidari, Locating Criminals by the Book, CELLULAR BUS. (June 1996) at 70.

43. "The scariest phone system," Fortune, October 13, 1997, p. 168.

44. 1. There must be no personal data record-keeping systems whose very existence is secret;

2. There must be a way for an individual to find out what information is in his or her file and how the information is being used;

3. There must be a way for an individual to correct information in his or her records;

4. Any organization creating, maintaining, using, or disseminating records of personally identifiable information must assure the reliability of the data for its intended use and must take precautions to prevent misuse; and

5. There must be a way for an individual to prevent personal information obtained for one purpose from being used for another purpose without his or her consent.

Report of the Secretary's Advisory Committee on Automated Personal Data Systems, Records, Computers and the Rights of Citizens, U.S. Dept. of Health, Education & Welfare, July 1973.

45. 1. Collection limitation: There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.

2. Data quality: Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.

3. Purpose specification: The purposes for which personal data are collected should be specified not later than at the time of data collection, and the subsequent use limited to the fulfillment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.

4. Use limitation: Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with the "purpose specification" except: (a) with the consent of the data subject; or (b) by the authority of law.

5. Security safeguards: Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification or disclosure of data.

6. Openness: There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.

7. Individual participation: An individual should have the right: (a) to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to him; (b) to have communicated to him, data relating to him:

- within a reasonable time;

- at a charge, if any, that is not excessive;

- in a reasonable manner; and,

- in a form that is readily intelligible to him; (c) to be given reasons if a request made under subparagraphs (a) and (b) is denied, and to be able to challenge such denial; and (d) to challenge data relating to him and, if the challenge is successful, to have the data erased, rectified, completed or amended.

8. Accountability: A data controller should be accountable for complying with measures which give effect to the principles stated above.

46. In 1976, with United States v. Miller, the Supreme Court began a line of cases holding that individuals have no constitutionally protected privacy interests in personal information contained in business records held by third parties. In 1979, in Smith v. Maryland, the Court applied Miller to the electronic world, ruling that the use of a pen register to collect the phone numbers dialed on a surveilled line did not implicate Fourth Amendment interests. While Congress responded to both decisions by crafting procedural rules to govern law enforcement access to bank and telephone records, the Miller and Smith decisions leave personal information divulged or generated during business transactions without privacy protections -- unless Congress steps in to craft them. United States v. Miller, 425 U.S. 435 (1976); Smith v. Maryland, 442 U.S. 735 (1979).

47. Public drafts of the specification and implementation guide should be available shortly at http://www.w3c.org/.

48. These incorporate the basic concepts of three recommendations of the Danish and Canadian Privacy Commissioners: eliminate the collection of identity information, or if it is needed, keep it separate from other information; minimize the collection and retention of identifiable personal information; and make data collection and use transparent to data subjects, providing them with the ability to control the disclosure of their personal information, particularly identity information.