Introduction
Since its inception, the Internet has done more than place endless amounts of information at the fingertips of computer users; it has forever altered the way human beings communicate and coexist. With relatively few hindrances, the Internet truly ushered in the information age and, with it, a sea of possibilities to explore. The one-to-one nature of communication that the Internet offered was, and continues to be, appealing to computer users and business entrepreneurs alike. The Internet did not discriminate; anyone with a computer and a telephone line could gain access. As more and more computers became World Wide Web accessible, the business potential of the Web became readily apparent and sites sprang up overnight. Recognizing the utility and practicality of the Internet, Web sites and services began to aggressively target Web users for personal, demographic, and consumer-related information. Sites began to predicate access upon the disclosure of such information, and Internet users either had to comply or look elsewhere for the information they craved. While the ultimate choice remained with the user, Web sites seemed to have the upper hand in this bargain for information. Pandora's Box was opened.
Those attuned to the benefits of the Internet soon realized that potential users might never choose to access Web services if measures were not taken to garner user confidence and trust. One such measure was PICS, the Platform for Internet Content Selection, which enables Internet sites to apply a corresponding rating tag to the content contained within the site. In turn, Internet users could configure their systems to recognize only Web sites whose ratings corresponded with their set preferences. Users were soon presented with an array of labeling and filtering options which could easily be downloaded and stored on a user's computer. The user could then create a personal viewing preference statement of sorts and limit what information could be accessed. As a content filtering system, PICS provided the reassurance that undesired information would be withheld from unintended viewers and, as a result, increased consumer confidence in the Web.
While PICS seemed to constrain the Internet monster, concerns over user privacy came to the fore. Web sites were asking for more and more personal information to be released, and users had to repeatedly supply the same information to secure access. To address the numerous privacy concerns and further secure user trust in the Web, the Platform for Privacy Preferences Project (P3P) was, and continues to be, developed. While PICS made it quick and easy for Internet users to decide when and if a rating scheme was to be employed, P3P is not as clear cut. Due to its highly technical nature, P3P may make users increasingly dependent upon outside sources for guidance and recommendations regarding how to configure individual privacy preferences. However, setting one's preferences is only the tip of the proverbial iceberg. What remains is a complex schema for the secure storage and dissemination of user personal information and browsing habits.
How P3P Works
Currently being developed by the World Wide Web Consortium (W3C), P3P addresses the privacy concerns of Internet users while accommodating the business practices of Web sites and electronic services. At its core, P3P consists of two developmental stages: architecture and technical components. Think of the architecture as the shell and the technical aspects as the filling. In simple form, P3P's architecture contains two basic entities, the user agent and the service (although I believe that a third entity is present: the user).{1} The technical component consists of a grammar (or syntax) and a vocabulary for user privacy statements and the interaction between user and service. While each has been developed separately, P3P as a whole is best understood when both are examined together.
For P3P to function, there needs to be a user, a user agent, and a service.{2} Anyone using the Web is considered a user. As the term implies, a user is 'using' the Web to access some kind of information, whether educational, recreational, or commercial. However, using the Web often comes at a price (in addition to any fee paid for the service in the first instance): the user may have to reveal personal information to a Web site to receive desired information and/or products from that site. Users are asked to provide personal, demographic, and consumer-related information that the contacted party (service) seeks for any number of reasons. If the user fails to provide the desired information, the user, in all likelihood, will be denied access to the Web site or service. However, if a user chooses to provide the desired information, that user may well get more than he or she initially contemplated or even wanted. After the transaction has been completed, the user's supplied personal information could be freely distributed to other interested parties or retained by the service for future use. As a result, Internet users may falsify personal information as a protective measure, or may simply refuse to supply the information out of concern that unwelcome parties could gain access to the supplied data.{3}
Under P3P, before an exchange of personal information can occur, the user must create a personal profile as well as a privacy preference statement. The personal profile contains basic background information such as name, age, and date of birth, but can also be extremely detailed. In addition to personal and demographic information, the profile will also contain a privacy statement dictating when such information can be disseminated - this is the heart of P3P. The preference statements are then stored on the user's computer within a 'data repository,' which can be accessed by both the user and the user agent. Storing the information with the user allows for easier user access to that information for updating or changing (in theory) and provides a feeling of enhanced security to the user. Without established privacy preferences, the client software or user agent would not know what information can be shared and when such information can be provided to interested parties. Simply put, the privacy statement provides the user with assurances (whether accurate or unfounded remains to be determined) that their information will only be passed along to contacted services that meet the established user guidelines.
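By way of illustration, the following sketch (in Python) shows what a stored profile and preference statement might look like. The structures, field names, and purpose labels are this author's hypothetical constructions and do not reproduce the actual P3P schema.

    # Hypothetical sketch only - the fields, purposes, and structures are
    # illustrative and do not reproduce the actual P3P data schema.
    from dataclasses import dataclass, field

    @dataclass
    class PrivacyPreference:
        data_field: str        # e.g., "name" or "clickstream"
        allowed_purposes: set  # purposes for which release is permitted

    @dataclass
    class UserProfile:
        persona: str
        data: dict = field(default_factory=dict)         # personal data
        preferences: list = field(default_factory=list)  # release rules

    # A minimal data repository entry: the personal data plus the rules
    # governing its release.
    profile = UserProfile(
        persona="general consumer",
        data={"name": "Jane Doe", "age": 34},
        preferences=[
            PrivacyPreference("name", {"completing a transaction"}),
            PrivacyPreference("clickstream", set()),  # never release this
        ],
    )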
However, establishing a set of standards and guidelines to be followed is not an easy task. The user must either manually configure the profile (using an existing template){4} or rely on recommended settings created by third parties (to be downloaded or purchased).{5} Each choice has problems (see sections 'Creating a Privacy Profile - How Technical is Too Technical' and 'Recommended Settings By Third Parties'). In creating a privacy statement, the user is configuring the user agent that will act on their behalf; the less accurate or complete the privacy statement, the more personal information may be accessible, or overall access to desired sites may be curtailed. It is in the best interest of the user to choose the option that best represents their computer comfort level and trust in outside sources for guidance.{6} Choosing the wrong option could have potentially devastating effects on the overall effectiveness of the user's browsing experience and willingness to use the Web in the future, both key rationales behind the development of P3P.
In a P3P-less world (as is the case right now), Internet users communicate directly with the Web site or service. With P3P, however, the direct chain of communication is severed and an intermediary of sorts is established to handle the varied and numerous requests for user information. Rather than having to repeatedly supply the same personal information to various services when seeking information, under P3P users need only enter personal information once. Without going into great detail (addressed in 'Creating a Privacy Profile - How Technical is Too Technical'), the user supplies personal information and from then on all discussions are essentially automated.{7} Whenever the Internet user attempts to access information, any requests directed to the user by the Web site will proceed through the user agent. The user agent, a program acting on the user's behalf which can either be downloaded or purchased, receives the requests for personal information and then cross-references the request with the privacy preferences supplied by the user. If the service-side request for information falls within the user's established privacy preferences, the user agent begins discussion with the service/Web site to exchange the information. If the service-side request requires that additional user information be supplied, the user agent will then prompt the user with the request.{8} At that point the user decides whether or not to permit the dissemination of the additional information and, if not, the communication between the user agent and the service will cease.
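The automated exchange just described can be reduced to a short sketch. The following Python function is this author's simplified model of the user agent's decision, not W3C code; the message shapes and purpose labels are invented.

    # A minimal sketch of the automated exchange described above. All
    # names and purpose labels are this author's invention; the actual
    # P3P proposal defines its own formats.

    def handle_request(profile_data, preferences, requested_fields,
                       purpose, prompt_user):
        """Decide what personal data, if any, to release to a service.

        profile_data: stored personal data, e.g. {"name": "Jane Doe"}
        preferences: field name -> set of purposes the user permits
        Returns the data to release, or None if communication ceases."""
        release = {}
        for field_name in requested_fields:
            if purpose in preferences.get(field_name, set()):
                release[field_name] = profile_data.get(field_name)
            elif prompt_user(field_name, purpose):   # additional info
                release[field_name] = profile_data.get(field_name)
            else:
                return None  # user declined; the exchange ceases
        return release

    # Example: a service asks for name and email "for marketing purposes",
    # which exceeds the stated preferences, and the user declines.
    prefs = {"name": {"completing a transaction"}, "email": set()}
    data = {"name": "Jane Doe", "email": "jane@example.org"}
    print(handle_request(data, prefs, ["name", "email"],
                         "for marketing purposes", lambda f, p: False))
    # prints None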
Web sites require different user-side information before granting access to the site. For example, if the user is interested in receiving a free sample of toothpaste via the Internet, the contacted Web site will likely solicit consumer habit information, including the toothpaste brand the user typically purchases. Such information helps the Web site understand what factors go into a user's consumer habits so that it can better target consumers in the future. If the user's profile did not contain a desired piece of information, the Web site would then request the desired information from the user agent. In turn, the user agent would prompt the user with the request.{9} While this seems simple enough, there is more at stake for the user than initially meets the eye. The user may be thinking that a free toothpaste sample would be very nice to have without understanding the implications of agreeing to the service's information requests. Over time, the user may supply a substantial amount of personal information, from which a very detailed profile of that person could be developed. It is with this in mind that P3P was, and continues to be, developed: to protect Internet users from a barrage of specifically targeted online, as well as offline, advertisements and solicitations.
As client software, a user agent relies on grammar and vocabulary to make automated decisions. If the Web site uses P3P privacy vocabulary, then the user agent can easily determine if there is a match between the user and the site. However, if the Web site does not use familiar terms and syntax, the user agent will not understand the request and must prompt the user for his/her consent.{10} As such, vocabulary use and implementation is important on both the user and the service end. The proper balance between descriptive vocabulary on the one hand and subjective vocabulary on the other can make a tremendous difference. For example, a privacy vocabulary will most likely include terms such as 'used for system administration' and/or 'used for marketing purposes.'{11} While the meaning of such terms may be apparent to the vocabulary creators and knowledgeable computer users, many users may not understand what the terms connote and, as such, may make uninformed decisions or decide not to formulate a privacy statement.
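The point can be made concrete with a small sketch. The purpose vocabulary below is invented by this author; the actual P3P vocabulary was still being drafted when this material was collected.

    # The purpose vocabulary here is hypothetical, not the W3C's.
    KNOWN_PURPOSES = {"used for system administration",
                      "used for marketing purposes",
                      "completing a transaction"}

    def classify_purpose(stated_purpose):
        """Decide whether the agent can act automatically on a purpose."""
        if stated_purpose in KNOWN_PURPOSES:
            return "automatic"   # shared vocabulary: agent decides alone
        return "prompt user"     # unfamiliar term: fall back to the user

    print(classify_purpose("used for system administration"))  # automatic
    print(classify_purpose("synergy-driven outreach"))         # prompt user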
Up until this point, all communication between a user (via the user agent) and a Web site has been based upon informed consent by the user. However, without a verification system, Web sites could intentionally use (or misuse) the P3P vocabulary to acquire user information.{12} Due to such a possibility, the need for a 'trust agent' or 'trust engine' to vouch for the honesty of a Web site is apparent.{13} Rather than placing this burden on the user, the Web site would subscribe to a trust mechanism. Similar to a site using RSACi for labeling purposes, the trust mechanism would verify that the statements put forth by the site were accurate in both content and style. The premise behind this stance is sound policy: the more secure a Web site can make a user (or user agent) feel, the more trust a user will have in that site and in the Internet in general.
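A crude sketch suggests how such vouching might operate. The digest-based endorsement scheme below is purely this author's illustration; a real trust engine would presumably rely on cryptographic signatures or certificates issued by the verifying organization.

    # Crude illustration of a trust engine: the agent consults a
    # hypothetical trust authority's endorsements before relying on a
    # site's privacy statements.
    import hashlib

    ENDORSED = set()  # digests of statements the authority has verified

    def endorse(statement: str) -> None:
        """Called by the trust authority after auditing a statement."""
        ENDORSED.add(hashlib.sha256(statement.encode()).hexdigest())

    def is_vouched_for(statement: str) -> bool:
        """Called by the user agent before trusting the statement."""
        return hashlib.sha256(statement.encode()).hexdigest() in ENDORSED

    policy = "Data collected is used only for completing a transaction."
    endorse(policy)                               # authority verifies site
    print(is_vouched_for(policy))                 # True: safe to proceed
    print(is_vouched_for("We never sell data."))  # False: prompt the user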
The Basics - Why P3P?
The need for P3P is twofold: (1) to give users the final word on the dissemination, or withholding, of their personal information; and (2) to create an environment that does not stymie business activities on the Web in the name of securing user trust.{14} On a cursory level, P3P would protect information such as a user's name, address, age, and gender, all of which can be unscrupulously used for targeting purposes by a site. Whereas such biographical and demographic information is oft sought after, the silent prize is 'click stream' data, which can accurately describe a user's surfing and browsing preferences. In many cases, highly detailed personal profiles can be developed by combining such personal data with browsing behavior.
The ultimate goal of P3P is the 'seamless' access of sites that fall within a user's set preferences, without encountering unwanted information.{15} Depending upon how one views PICS, measures to control user access can be seen as either a necessary evil or an unnecessary intrusion. What is clear, however, is that the unchecked proliferation of personal profiles on the Web subjects users to a barrage of oftentimes useless and cumbersome advertisements and information generated solely from a profile. Furthermore, users are left in the dark as to whom, where, and how often their supplied personal information has been disseminated into the Internet void.
Are PICS and P3P the Same Animal?
With the creation of PICS, the specter of censorship on the Internet became a reality. While the intended effect was to shield indecent materials from the prying eyes of children, the true effect of PICS was much grander. Comparing such efforts to the ill-fated Communications Decency Act, Larry Lessig, a Harvard law professor specializing in cyberspace law, argued that PICS, by its very nature, would build censorship into the very fabric of the Internet.{16} According to Professor Lessig, 'truth in filtering' could not be achieved by PICS because artistic, literary, and educational materials dealing with difficult issues such as sexual orientation and societal violence would often be lumped together with pornographic and other suggestive materials not intended to be viewed by minors.{17} As such, these concerns would often fall by the wayside because parents were looking for a 'quick fix' to keep Junior from fulfilling his or her sexual fantasies and desires on the Internet.
The 'quick fix' that Lessig referred to is also present with P3P. As will be discussed in a later section, the potential for users to unduly rely on outside sources for privacy recommendations is a very real and ominous threat. However, the 'animal' is much different between the two technologies. Whereas the danger of misuse and abuse is extremely prevalent with PICS (based upon perceived threats), the true danger in the privacy realm lies in not taking measures to protect users from commonplace practices (a real threat). Stated another way, PICS was developed to keep unintended eyes away from Web sites; P3P was developed to protect Internet users' personal information from aggressive Web sites. Misuse by unintended participants (in the PICS context) can be reduced by increased parental education and better supervision. Web site practice and policy relating to gathering consumer data can only be curbed and monitored by equipping Internet users with the means to protect themselves and their personal information.
Requests for Personal Data
The theory behind the user agent-service relationship is sound in principle. Before a service can gain access to a user's personal information contained in a data repository, the service must first initiate discussion with the user agent. In simple form, the user agent and the service, through a series of requests, determine if the site's practices concerning the use of the personal data fall within the user's established guidelines. If so, the user agent will transfer the user's personal data either to a search engine (a 'trusted intermediary') or to the service directly. In so doing, the user agent, on behalf of the user, agrees to the practice terms supplied by the service. However, questions remain as to whether the data will reach the intended target or make unintended pit stops along the way. As the P3P Architecture Working Group validly points out, the negotiations between the user agent and the service have not been specifically dealt with, and it is an open question whether such agreements are legally binding.{18} If the negotiation and subsequent agreement are not legal contracts, then what protections does the user have against the undesired use of their personal information? Without providing a justifiable legal basis for redress by the user, P3P only circumvents the issue of user privacy by establishing a de facto contract upon which the user ultimately relies.
The ultimate question here is whether, and to what extent, the interchange between user agent and service will correspond with actual user preferences. Even if the user agent only provides personal information as designated by the user, will that, by itself, be enough to establish user confidence and trust in P3P and, more specifically, in the Internet? A request by a user to access a site is more than a simple request for information. If a user agent does no more than verify that a site's access requirements fall within the parameters established by the user, then the goal behind P3P fails. P3P is more than a vehicle to provide information where, when, and how often to a desired user - P3P is a statement about the utility and practicality of the Internet in everyday life. Stated plainly, what the user has agreed to in a privacy statement may not be what the user really desires or had even contemplated.
Creating a Privacy Profile - How Technical is Too Technical
For P3P to be successful, the profile to be created must be descriptive enough to cover all the bases of user privacy concerns without being too complex for the user to understand and configure. In a perfect computerized society, all computer users would be created equal and have a strong working knowledge of how a computer functions. However, this is not the case. In today's society, consumers in general are less likely to use or buy items that contain complex instructions and are difficult to use and implement in everyday life. As was the concern with PICS, for the application to shield unintended viewers from obscene material it first had to be understandable to the common computer user. The relative ease of use and implementation was one of the reasons behind the success of PICS: simply answer 'yes' or 'no' to a few brief questions and a filter has been created - it's that easy! But would that approach work for P3P? In short, the answer is a resounding 'no.'
While individuals most likely have an understanding of what they believe constitutes violence, language, nudity, and sex for content and labeling purposes, those same individuals may be unfamiliar with 'privacy' vocabulary. For example, a person utilizing filtering software can select an option making sites containing 'nudity' inaccessible. In making that choice, the person will most likely feel confident that they have a working understanding of the term 'nudity' and what it connotes. Switching to the P3P context, users may be unfamiliar with such terms as 'click-stream data' and 'system administration' and, as such, may make uninformed choices that are contrary to their actual privacy preferences.{19} Without further explanation, a computer user wishing to establish privacy preferences may be left guessing as to what the selected preferences actually dictate. Profiles employing a simple vocabulary will be easy to use but will convey less information to the user. Profiles employing a more descriptive vocabulary will be more difficult to use but will convey more information. Finding the right balance is absolutely essential if P3P is to succeed.
Recommended Settings By Third Parties
A viable alternative to manual configuration is reliance upon a trusted third-party organization for a user's privacy profile. However, relying upon someone else to make decisions as a substitute for individualistic choices is problematic. Labeling systems are the perfect analogy: a user essentially makes an all-or-nothing choice. While the choice remains with the user, the user must subscribe to the organization's notions of morality.{20} For example, if a user subscribes to a labeling system utilized by the Religious Right, that user does not have the ability to make meaningful choices on their own. Applying this to the P3P context, simply presenting users with a set of recommended settings is not enough. The user must be able to make meaningful choices, both in selecting a recommended privacy setting and in having the ability to configure that setting. Whereas labeling systems are employed to keep unintended eyes from viewing material (something that can be easily corrected upon discovery), disclosed personal information can, and will, remain in the Internet void without the ability to remove it.
As is the case with rating and filtering software, recommended privacy settings provide a user with a choice (albeit a much larger one): adopt a recommended setting or manually configure. While user behavior is difficult to predict, a large percentage of users may opt for pre-established choices due to time, energy, and knowledge constraints. Simply put, relying on someone else has always been easier than doing something for yourself. As stated previously, many computer users simply do not have the technical background and experience to make informed decisions when programming or configuring a system. As such, many users will be pushed involuntarily towards reliance upon an outside source for a recommended privacy profile. If a user must select a setting that 'most closely' corresponds to their privacy preferences, then their privacy preferences are not truly being expressed. Assuming that a large number of privacy settings will be made available for downloading by 'trusted' organizations, those settings must be both descriptive and understandable by even the most unsophisticated computer user.
As an example of sample privacy vocabulary and recommended settings, refer to the diagrams provided in Figure 1.{21} The user has the choice of four possible privacy settings from which to choose: no privacy, no disclosure, moderate privacy, or anonymous surfing. While the settings appear straightforward enough, the vocabulary used to help make a selection is anything but clear cut. If the common, everyday computer user does not understand the vocabulary, then it is essentially worthless - an informed choice is not possible if the language used is not understood. It is unclear to this author whether the vocabulary used is explained in detail at some point during the process or whether the user is left to his/her own resources to determine what the terms connote. If the former applies, the vocabulary could appear as hypertext, enabling the user to click on the vague term for a complete and accurate description. If the latter applies, the user must make an uninformed decision or decide not to establish a privacy profile - both contrary to the purpose behind P3P.
[Figure 1: sample privacy vocabulary and recommended settings]
[Figure 2]
The creators of P3P recognize that most users either will not take the time to manually configure a profile or do not possess the knowledge to do so. Depending upon the response to P3P by Web sites and services, the quantity and quality of recommended settings available on the Internet may vary. With the focus on utilizing recommended settings, users must be able to change their privacy profiles as easily as possible. Additionally, organizations furnishing privacy recommendations should continuously update the settings and make users aware of any changes made. In the privacy realm, simply providing recommended settings is not enough. As more and more computer users begin to experience the Internet, Web sites and services will target those potential consumers more aggressively. The creators of P3P should actively encourage third-party organizations to contribute to the development of the project rather than take a 'wait and see' approach.
Dr. Jekyll and Mr. Hyde - The Use of Multiple Personas
When establishing privacy preferences, a user will have a choice to individually configure each option to specifically address the desired preferences or to rely on pre-established recommended settings created by a trusted third party{22} (the latter was addressed in the previous section). Assuming a user possesses the requisite knowledge, skill, and patience to individually configure each privacy option, will that preference statement apply to all situations of Internet navigation? As the P3P Architecture Working Group validly points out, a user may have one persona while at work and another in a recreational capacity.{23} For example, Mike establishes a work persona that delineates occupational information so that his Internet browsing experience is more tailored to his work needs. In so doing, Mike will most likely avoid sites that are outside the scope of his occupational interests and needs. However, if Mike were not allowed to create a separate persona, his Internet browsing would be extremely limited and Mike might quickly lose interest in the Internet. By having separate personas, Mike will be able to fully enjoy the Internet without being subjected to unwanted information at a specific time. Not only does the use of distinct personas aid individual users, but such use also fosters commerce on the Internet by allowing sites to better target potential consumers.
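As a rough illustration of the 'Mike' example, consider the following sketch; the contexts, fields, and values are invented by this author for demonstration.

    # Hypothetical personas: the agent draws on the profile matching the
    # browsing context, so a work request never reaches recreational data.
    PERSONAS = {
        "work": {"name": "Mike Smith", "occupation": "analyst",
                 "employer": "Acme Corp"},
        "recreational": {"name": "Mike Smith",
                         "interests": ["hockey", "film"]},
    }

    def data_for_context(context, field_name):
        """Release a field only if the active persona discloses it."""
        return PERSONAS.get(context, {}).get(field_name)

    print(data_for_context("work", "employer"))          # Acme Corp
    print(data_for_context("recreational", "employer"))  # None: withheld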
Besides using different personas to create an 'occupational' profile as opposed to a 'general consumer' profile, such personas could be utilized to shield children from having to release personal information.{24} However, what may be seen as a protective measure by some parents can, in some situations, actually enable the child to gain access to sites that they otherwise would not have been able to reach. For example, a child may be able to circumvent the privacy statements established by the parent, either for personal use or for use by the child, if the child creates an alternative persona. This situation brings to the fore such issues as Web access by children and the ease, or lack thereof, of creating a profile in the first instance. Forbidding children to browse the Internet is not only foolish but would be counterintuitive to creating user trust in the Internet - a central goal behind the creation of P3P. Parents must feel confident that their children will not be able to override established privacy preferences by creating an unknown persona. Whereas in a PICS context parents try to shield their children from indecent material, in a P3P context parents are trying to safeguard against the release of a child's personal information. While labeling and filtering measures may hurt a child in the short term, the release of personal information may have profound and lasting effects on a child that cannot be easily remedied.
The Data Repository: Control Over User Profile Data
After a privacy preference has been established by the user or, in the alternative, adopted upon reliance on a third party, the focus shifts to how that data will be stored, who can access the data, and how the data will be transported. Rather than creating a new set of governing standards from scratch, the Open Profiling Standard (OPS) will be utilized, for the time being, to control the release of personal data in a secure fashion.{25}
The data does not leave the user's computer, and access to the data is controlled by a 'user agent' selected by the user. Rather than requiring a user to be notified each and every time his or her browsing preferences and personal data are sought, the user agent intercedes on the user's behalf. Basic issues surrounding the relationship between a user and a user agent are: what information the user agent can access, the accessibility of the data by the user, and what data can be transported to interested 'service[s].' Trust between the user and the user agent is essential; the user must feel completely confident that the user agent will abide by the terms of the contract entered. Of primary importance in such an arrangement is the ability to access personal data and make changes when appropriate.{26} However, if a user has relied upon a third party regarding his privacy preference statement, his ability to actively change that information may be complicated and/or restricted.
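A small sketch may clarify this division of responsibilities. The interface below is this author's simplification and not the OPS specification: the user inspects and updates the local store directly, while the agent releases fields one at a time in response to a service's request.

    # Simplified local repository, not the OPS specification. The data
    # stays on the user's machine; the user edits it directly, and the
    # agent only reads what it needs to release.
    class DataRepository:
        def __init__(self):
            self._store = {}

        # User-side operations: direct inspection and updating.
        def update(self, field_name, value):
            self._store[field_name] = value

        def inspect(self):
            return dict(self._store)

        # Agent-side operation: release one field at a time on request.
        def release(self, field_name):
            return self._store.get(field_name)

    repo = DataRepository()
    repo.update("email", "jane@example.org")  # user keeps data current
    print(repo.release("email"))              # agent forwards to a service
    print(repo.inspect())                     # user reviews what is stored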
P3P may make the overall Internet browsing experience more enjoyable for many users. The simple fact that users will not have to reenter the same personal data on numerous occasions to access sites and information is a victory in itself. Under the current regime, not only does the user have to make on-the-spot decisions about the utility of the desired information, but sites most likely lose a considerable number of potential viewers because of their access requirements. Herein lies the dilemma. Once a user specifies his or her privacy preferences, the process becomes 'out of sight, out of mind.' In exchange for a 'seamless' browsing experience, the user turns over control to a user agent. The user may be pleased with the performance of the user agent, but it is the lack of apparent control that is bothersome. For practical purposes, how often will a user communicate with or question the service of the user agent? So long as the user has not been bothered excessively with messages for additional personal information in order to gain access, the user may not even recall what personal preferences they selected.
The Verdict: P3P - Not Quite the Devil
Several critical steps must be taken for P3P to enhance user confidence in the Web without adversely affecting business activities conducted via the Web. First and foremost, a vocabulary must be designed and implemented which every Internet user can understand and work from. As of the date the material for this paper was collected, this had not occurred. Without an understandable vocabulary, users will be forced to make uninformed privacy decisions, which could result in the unintended dissemination of personal information - precisely what P3P seeks to prevent. As P3P's Architecture Working Group validly discussed, the proper balance between descriptive and subjective language is extremely important. This problem could be easily solved by hypertexting each descriptive term used in a privacy statement so that a user would, if desired, have additional information on which to base a decision.
Another point of contention is user reliance upon outside sources for recommended privacy settings. While it can be argued that this is the same as using a rating or filtration program, this author believes the stakes are much higher in the P3P context. At present, once a user's personal information is disseminated into the Internet void, it is no longer within the user's control. However, due to the complexity of manually configuring a personal privacy statement, users, in a majority of instances, will have to rely on such outside sources for guidance. To this end, the minds behind P3P should encourage organizations from all walks of life to create recommended settings from which users can choose. Greater selection may encourage Internet users with privacy concerns to create a privacy statement with increased confidence. But simply having a large selection to choose from is not enough. Those supplying recommended privacy settings must constantly modify their settings to correspond with the information being sought by Web sites.
This author believes that P3P can benefit both users and Web sites. As such, the creators of P3P must actively encourage and solicit Web sites, both small and large, to adopt the technology. Simply put, if Web sites do not implement P3P, then P3P is essentially worthless. As the Web continues to evolve, businesses and services operating via the Web will continue to seek consumer information to bolster sales and better target consumers in general. No matter how it is conducted, business is still business. Money is as green over the Internet as it is in real life. While this author has reservations about some aspects of P3P, the possible benefits far outweigh any negatives. To succeed, P3P must focus on users rather than Web sites. Minimal user safeguards mean greater chances for uninformed privacy decisions on the user end. To this end, every privacy statement, whether provided by an outside source or through an existing template on a user's computer, should contain information about the dangers of releasing personal information over the Web. Only when Web sites and services believe that users are in fact more informed will user-directed behavior change.
2. For W3C definitions of these terms refer to P3P Architecture Working Group which can be located at http://www.w3.org/TR/WD-P3P-arch-971022. Back to text at note 2
3. Kehoe, C. & Pitkow, J. (1997, June). GVU's Seventh WWW User Survey. Available online: http://www.gvu.gatech.edu/user_surveys/survey-1997-04/. See also Privacy and Profiling on the Web, submitted by the Microsoft Corporation. http://www.w3.org/TR/NOTE-Web-privacy.html. Back to text at note 3
4. See e.g., P3 Prototype Script, Joseph Reagle. (under section 1) http://www.w3.org/Talks/970612-ftc/ftc-mast.html. (valid 10/26/97). Back to text at note 4
5. Id. (under section 2.2) Back to text at note 5
6. See Lorrie Faith Cranor and Joseph Reagle, Jr., Designing a Social Protocol: Lessons Learned from the Platform for Privacy Preferences. Proceedings of the Telecommunications Policy Research Conference, Alexandria, VA, September 27-29, 1997. (valid 10/26/97). Back to text at note 6
7. An example of a personal privacy template can be found at http://www.w3.org/TR/NOTE-Web-privacy.html. (under '[t]he schema for a persona') Back to text at note 7
8. See P3 Prototype Script. (under section 4.2) Back to text at note 8
9. There is a difference between additional requested information and information contained in a user's profile. When a Web site communicates with the user agent, the site can request information already contained in the personal profile or information not present in the profile. The user agent will only prompt the user when information not contained in the profile is sought. Back to text at note 9
10. See Privacy and Profiling on the Web, submitted by the Microsoft Corporation (under Client-server exchange of personal information). http://www.w3.org/TR/Note-Web-privacy.html. (valid 11/08/97). Back to text at note 10
11. See e.g., Designing a Social Protocol, under Recommended Settings. Back to text at note 11
12. Privacy and Profiling on the Web, under Client-server exchange of personal information. Back to text at note 12
13. See id. See also, P3P Architecture Working Group, General Overview of the P3P Architecture, W3C Working Draft 10/22/97. (under Trust engines and the P3P sphere of coverage). http://www.w3.org/TR/WD-P3P-arch-971022. (valid 11/08/97). Back to text at note 13
14. Cranor and Reagle, Designing a Social Protocol. See also, W3C Executive Summary of Platform for Privacy Preferences (P3) Project, April 2, 1997. http://www.w3.org/P3/exec-P3Brief.html. (valid 10/26/97). Back to text at note 14
15. Id. Back to text at note 15
16. http://www.nytimes.com/library/cyber/law/103097law.html. Back to text at note 16
17. Id. Back to text at note 17
18. P3P Architecture Working Group, under recommendations for future action and open issues. Back to text at note 18
19. See Cranor and Reagle, Designing a Social Protocol. See also P3 Prototype Script, Joseph Reagle. http://www.w3.org/Talks/970612-ftc/ftc-mast.html. (valid 10/26/97). Back to text at note 19
20. See Electronic Frontier Foundation, Public Interest Principles for Online Filtration, Ratings and Labeling Systems. February 28, 1997. http://www.eff.org/pub/Net_info/Tools/Ratings_filters/eff_filter.principles. Back to text at note 20
21. The diagrams have been taken from 'Designing a Social Protocol' without permission and are only being used as an illustrative reference. Back to text at note 21
22. Cranor and Reagle, Designing a Social Protocol. See also, P3P Architecture Working Group. http://www.w3.org/TR/WD-P3P-arch-971022. Back to text at note 22
23. Id. See also Privacy and Profiling on the Web. http://www.w3.org/TR/Note-Web-privacy.html. Back to text at note 23
24. Cranor and Reagle, Designing a Social Protocol. Back to text at note 24
25. Joseph Reagle, Jr., P3P and Privacy on the Web FAQ, August 4, 1997. http://www.w3.org/P3P/P3FAQ.html. (valid 10/26/97). Back to text at note 25
26. See P3P Architecture Working Group, General Overview. Back to text at note 26