Privacy enhancing technologies


Privacy Enhancing Technologies (PETs) is a general term for a set of computer tools, applications and mechanisms that, when integrated into online services or applications, or when used in conjunction with such services or applications, allow online users to protect the privacy of the personally identifiable information (PII) they provide to, and that is handled by, such services or applications.

Goals of PETs

PETs aim to allow users to take one or more of the following actions related to their personal data sent to, and used by, online service providers, merchants or other users:
* increase control over their personal data (self-determination)
* data minimisation: minimise the personal data collected and used by service providers and merchants
* choose the degree of anonymity (e.g. by using pseudonyms, anonymisers or anonymous data credentials)
* choose the degree of unlinkability (e.g. by using multiple virtual identities)
* give informed consent before disclosing their personal data to online service providers and merchants
* provide the possibility to negotiate the terms and conditions of giving their personal data to online service providers and merchants (data handling/privacy policy negotiation). [The EU PRIME research project's [https://www.prime-project.eu/about/vision/ Vision on privacy enhanced identity management] ] For example, it can be negotiated that personal data must not be handed out to third parties, or that the data is to be deleted three months after the end of the contract.
* provide the possibility to have these negotiated terms and conditions technically enforced by the infrastructures of online service providers and merchants (i.e. not just having to rely on promises, but being confident that it is technically impossible for service providers to violate the agreed upon data handling conditions)
* provide the possibility to remotely audit the enforcement of these terms and conditions at the online service providers and merchants (assurance)
* data tracking: allow users to log, archive and look up past transfers of their personal data, including what data has been transferred, when, to whom and under what conditions
* facilitate the use of their legal rights of data inspection, correction and deletion

Existing PETs

Examples of existing privacy enhancing technologies are:
* Communication anonymisers hiding the real online identity (email address, IP address, etc.) and replacing it with a non-traceable identity (disposable / one-time email address, random IP address of hosts participating in an anonymising network, pseudonym, etc.). They can be applied to email, Web browsing, P2P networking, VoIP, Chat, instant messaging, etc.
* Shared bogus online accounts. One person creates an account for a service such as MSN, providing bogus data for name, address, phone number, preferences, life situation, etc., and then publishes the user ID and password on the Internet. Everybody can now use this account comfortably. The user can thus be sure that the account profile contains no personal data about them. (Moreover, they are freed from the hassle of having to register at the site themselves.)
* Access to personal data: The service provider's infrastructure allows users to inspect, correct or delete all their data stored at the service provider.
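The pseudonyms used by communication anonymisers can, for instance, be derived deterministically from a single master secret so that different services see different, unlinkable identifiers. The following is a minimal illustrative sketch; the HMAC construction and all names are assumptions for illustration, not a specific PET product:

```python
# Sketch: deriving unlinkable per-service pseudonyms from one master secret.
import hmac
import hashlib

def pseudonym(master_secret: bytes, service_name: str) -> str:
    """Derive a stable pseudonym for one service.

    Different services see different, unlinkable identifiers,
    but the same user/service pair always yields the same one.
    """
    digest = hmac.new(master_secret, service_name.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

secret = b"user-master-secret"
print(pseudonym(secret, "shop.example"))   # identifier shown to the shop
print(pseudonym(secret, "forum.example"))  # a different, unlinkable identifier
```

Because the derivation is deterministic, the user needs to store only the master secret, yet no two services can link their views of the user by comparing identifiers.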

Future PETs

Examples of privacy enhancing technologies that are being researched or developed are: [The EU PRIME research project's [https://www.prime-project.eu/prime_products/whitepaper/PRIME-Whitepaper-V2.pdf White Paper] (Version 2)]
* Wallets of multiple virtual identities; ideally unlinkable. Such wallets allow the efficient and easy creation, management and usage of virtual identities.
* Anonymous credentials: asserted properties, attributes or rights of the credential holder that do not reveal the holder's real identity and that reveal only as much information as the holder is willing to disclose. The assertion can be issued by the user herself, by the provider of the online service, or by a third party (another service provider, a government agency, etc.). For example:
** Online car rental. The car rental agency does not really need to know the true identity of the customer. It only needs to make sure that the customer is, say, over 23, holds a driving licence, has health insurance covering accidents, and pays. There is thus no real need to know the customer's name, address or any other personal information. Anonymous credentials allow both parties to be comfortable: the customer reveals only as much data as the car rental agency needs to provide its service (data minimisation), while the agency can verify its requirements and get its money. When ordering a car online, the user, instead of providing the classical name, address and credit card number, provides the following credentials, all issued to pseudonyms, i.e. not to the real name of the customer:
*** An assertion of minimal age, issued by the state, proving that the holder is older than 23 (i.e. the actual age is not provided)
*** A driving licence, i.e. an assertion, issued by the motor vehicle control agency, that the holder is entitled to drive cars
*** A proof of insurance, issued by the health insurer
*** Digital cash
With this data, the car rental agency is in possession of everything it needs to rent out the car; it can thus, for example, provide the customer with the unlocking code to the closet where the car key is kept. Similar scenarios are buying wine at an Internet wine store or renting a movie at an online movie rental store.
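The credential flow above can be sketched in code. The HMAC "signature" below is a toy stand-in for a real anonymous-credential scheme (which would use blind signatures or zero-knowledge proofs so that even the issuer cannot link uses); all issuer names, keys and predicates are hypothetical:

```python
# Illustrative sketch of the car-rental flow: each credential asserts one
# predicate about a pseudonym, never the real identity or the actual age.
import hmac
import hashlib
from dataclasses import dataclass

@dataclass(frozen=True)
class Credential:
    pseudonym: str   # issued to a pseudonym, not a real name
    predicate: str   # e.g. "age_over_23" -- the actual age stays hidden
    issuer: str
    signature: str

# Toy issuer keys; a real scheme would use public-key cryptography.
ISSUER_KEYS = {"state": b"state-key", "dmv": b"dmv-key", "insurer": b"ins-key"}

def issue(issuer: str, pseudonym: str, predicate: str) -> Credential:
    msg = f"{pseudonym}|{predicate}".encode()
    sig = hmac.new(ISSUER_KEYS[issuer], msg, hashlib.sha256).hexdigest()
    return Credential(pseudonym, predicate, issuer, sig)

def verify(cred: Credential) -> bool:
    msg = f"{cred.pseudonym}|{cred.predicate}".encode()
    expected = hmac.new(ISSUER_KEYS[cred.issuer], msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred.signature)

# The rental agency checks all three predicates without learning who "nym-42" is.
creds = [
    issue("state", "nym-42", "age_over_23"),
    issue("dmv", "nym-42", "licensed_driver"),
    issue("insurer", "nym-42", "accident_insured"),
]
print(all(verify(c) for c in creds))  # True -> the unlocking code can be released
```

Note what the verifier learns: three true/false predicates bound to a pseudonym, and nothing else, which is exactly the data minimisation property described above.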

* Negotiation and enforcement of data handling conditions. Before ordering a product or service online, the user and the online service provider or merchant negotiate the type of personal data that is to be transferred to the service provider. This includes the conditions that shall apply to the handling of the personal data, such as whether or not it may be sent to third parties (profile selling) and under what conditions (e.g. only while informing the user), or at what time in the future it shall be deleted (if at all). During this negotiation, the online service provider communicates its requirements about the minimum amount of data it needs to provide the wanted service. Additional personal data may be asked for, too, but will be clearly labelled as optional. After the transfer of personal data has taken place, the agreed upon data handling conditions are technically enforced by the infrastructure of the service provider, which is capable of managing and processing data handling obligations. Moreover, this enforcement can be remotely audited by the user, for example by verifying chains of certification based on Trusted computing modules or by verifying privacy seals/labels that were issued by third party auditing organisations (e.g. data protection agencies). Thus, instead of having to rely on the mere promises of service providers not to abuse personal data, users can be more confident that the service provider adheres to the negotiated data handling conditions.
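A minimal sketch of such a negotiation, assuming a toy policy model rather than a real policy language such as P3P or EPAL; all field names are illustrative:

```python
# Toy model: the provider declares required and optional data items plus
# proposed handling terms; the user's preferences decide what is disclosed.
from dataclasses import dataclass

@dataclass
class ProviderPolicy:
    required: set            # minimum data needed to deliver the service
    optional: set            # clearly labelled as optional
    retention_days: int      # proposed deletion deadline
    third_party_sharing: bool

@dataclass
class UserPreferences:
    willing_to_share: set
    max_retention_days: int
    allow_third_parties: bool

def negotiate(policy: ProviderPolicy, prefs: UserPreferences):
    """Return the agreed (data items, terms) or None if no agreement is possible."""
    if not policy.required <= prefs.willing_to_share:
        return None  # user withholds data the service cannot do without
    if policy.third_party_sharing and not prefs.allow_third_parties:
        return None  # user rejects profile selling
    data = policy.required | (policy.optional & prefs.willing_to_share)
    terms = {
        "retention_days": min(policy.retention_days, prefs.max_retention_days),
        "third_parties": policy.third_party_sharing,
    }
    return data, terms

policy = ProviderPolicy({"email"}, {"phone", "birthday"}, 365, False)
prefs = UserPreferences({"email", "phone"}, 90, False)
print(negotiate(policy, prefs))
```

In this sketch the user shares the required item plus only those optional items she is willing to share, and the shorter of the two retention periods wins; the resulting terms object is what the provider's infrastructure would then have to enforce.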

* Data transaction log. Users can log what personal data they sent to which service provider, when and under what conditions. These logs are stored and allow users to determine what data they have sent to whom, or to establish what data is in the possession of a specific service provider. This leads to more transparency, which is a prerequisite for being in control.
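Such a log can be sketched as follows (an in-memory toy; a real implementation would persist the log and protect it against tampering):

```python
# Sketch of a data transaction log: every transfer is recorded with
# recipient, data items, time, and conditions, and can be queried later.
from datetime import datetime, timezone

class TransactionLog:
    def __init__(self):
        self._entries = []

    def record(self, recipient: str, data_items: list, conditions: dict):
        self._entries.append({
            "when": datetime.now(timezone.utc),
            "recipient": recipient,
            "data": list(data_items),
            "conditions": dict(conditions),
        })

    def sent_to(self, recipient: str) -> set:
        """All data items a given service provider holds, according to the log."""
        items = set()
        for entry in self._entries:
            if entry["recipient"] == recipient:
                items.update(entry["data"])
        return items

log = TransactionLog()
log.record("shop.example", ["email", "address"], {"retention_days": 90})
log.record("shop.example", ["phone"], {"retention_days": 30})
print(sorted(log.sent_to("shop.example")))  # ['address', 'email', 'phone']
```

The `sent_to` query is exactly the "what does provider X know about me" question mentioned above; further queries (by date, by condition) follow the same pattern.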

The business case for PETs

Companies will usually only invest in technologies enhancing the privacy of their customers if they see a financial benefit, i.e. if they anticipate a positive business case. (The other main reason is compliance with legal requirements, which can also be viewed as a financial benefit: avoiding a fine for non-compliance.) The anticipated financial benefit is the anticipated increase in income due to privacy enhancing technologies, minus the anticipated increase in cost of implementing and running them in the infrastructure. This comparison is usually made over several years, with the income and cost of each year cumulated.

In other words, if the anticipated additional income cumulated over several years is larger than the anticipated additional cost cumulated over the same number of years, then there is a positive business case and it makes sense for the company to consider implementing and deploying the privacy enhancing technologies in question.

Note that the business case outlined here is a 'differential business case', assuming that privacy functions are added to an existing service and taking into account the additional benefits and costs caused by this added functionality. For example, it would be wrong to count all operational costs, including those that existed before the privacy enhancing functions were added. Instead, only the additional costs incurred when operating the infrastructure with the implemented privacy enhancements must be counted. If, however, the service in question is a pure privacy enhancing service, i.e. if the privacy enhancement is not part of or added to the service but instead is the only component of the service, then the business cost and benefit factors below become absolute (delete "additional" and "increased" in all benefit and cost components).
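The differential business case above reduces to simple arithmetic: cumulate the additional income and the additional cost over the period and compare. The yearly figures below are invented for illustration:

```python
# Differential business case: positive result -> the PET investment pays off.
def business_case(additional_income_per_year, additional_cost_per_year):
    """Cumulated additional income minus cumulated additional cost."""
    return sum(additional_income_per_year) - sum(additional_cost_per_year)

income = [20_000, 60_000, 80_000]   # extra yearly income from privacy features
cost = [90_000, 25_000, 25_000]     # project costs up front, then operations
print(business_case(income, cost))  # 20000 -> positive business case
```

Note how the project costs fall mostly in year one while the benefits accumulate later, which is why the comparison must span several years rather than one.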

Cost components

The anticipated additional cost components for an online service due to enhancing it with privacy protecting technologies are:
* additional hardware
* additional software licences
* personnel costs for designing, developing, implementing, testing and deploying the privacy enhanced service (project costs)
* additional personnel costs for
** running / operating and maintaining the privacy enhanced service (with respect to what it would be if there were no such privacy enhancements)
** fixing additional system failures or problems due to increased system complexity (more functionality means higher complexity which leads to higher vulnerability)
** product management of the additional privacy enhancing functions (more functions require more time spent to manage them)
** more complex new developments of the infrastructure used to run the service
** training customer support and supporting customers
* additional marketing communications costs
* loss of income as a consequence of additional service downtimes or problems due to increased service / system complexity

Benefit components

The anticipated additional income for an online service due to enhancing it with privacy protecting technologies divides into the following components:
* Increased usage of online services by existing customers and increased number of new customers due to
** fulfilment of customers' need for privacy (Some customers may only use the service if their privacy needs are fulfilled; others may use the service more often.)
** higher trust of customers in the service
** increased public image and trust (especially if the privacy friendly attitude is advertised)
** competitive advantage (if the competition doesn't have a similar offer)
* increased customer retention (Customers appreciate the privacy enhancing functions of the service and don't like the idea of not finding them with competing services.)
* lower risk of being fined for violating legal data protection requirements

See also

*Data privacy
*Digital credentials
*Information processing
*Information security
*Privacy

References

External links

PETs in general:
* [http://www.prime-project.eu The EU PRIME research project] (2004 to 2008) aiming at studying and developing novel PETs
* [http://www.cdt.org/privacy/pet/ About PETs from the Center for Democracy and Technology]
* [http://petworkshop.org/ Annual symposium on PETs]
* [http://www.itst.dk/image.asp?page=image&objno=198999309 Report about PETs from the META Group, published by the Danish ministry of science]
* [http://ec.europa.eu/information_society/activities/privtech/index_en.htm Activities of the EU Commission in the area of PETs]

Anonymous credentials:
*IBM Zürich Research Lab's [http://www.zurich.ibm.com/security/idemix/ idemix]
*Stefan Brands' [http://www.credentica.com/ 'credentica']

Privacy policy negotiation:
*The W3C's P3P
*IBM's EPAL

