
Business Cases for Privacy-Enhancing Technologies

Roger Clarke **

Version of 12 June 2007

Chapter 7 in Subramanian R. (Ed.) 'Computer Security, Privacy and Politics: Current Issues, Challenges and Solutions' IDEA Group, 2008, pp. 135-155

Republished as Chapter 3.15 in Lee I. (Ed.) 'Electronic Business: Concepts, Methodologies, Tools, and Applications' (4 Volumes), IDEA Group, 2008

© Xamax Consultancy Pty Ltd, 2005-07

Available under an AEShareNet Free for Education licence or a Creative Commons 'Some Rights Reserved' licence.


Many categories of eBusiness continue to under-achieve. Their full value cannot be unlocked while key parties distrust the technology or other parties, particularly the scheme's sponsors. Meanwhile, the explosion in privacy-intrusive technologies has resulted in privacy threats looming ever larger as a key impediment to adoption.

Technology can be applied in privacy-enhancing ways, variously to counter invasive technologies, to enable untraceable anonymity, and to offer strong, but more qualified pseudonymity. After their first decade, it is clear that privacy-enhancing technologies (PETs) are technically effective, but that their adoption lags far behind their potential. As a result, they have not delivered the antidote to distrust in eBusiness.

If individuals are not spontaneously adopting PETs, then the opportunity exists for corporations and government agencies to harness PETs as a core element of their privacy strategies. The financial investment required is not all that large. On the other hand, it is challenging to attract the attention of executives to an initiative of this nature, and then to adapt corporate culture to ensure that the strategy is successfully carried through. This chapter examines PETs, their application to business needs, and the preparation of a business case for investment in PETs.


1. Introduction

A substantial technical literature exists that describes privacy-enhancing technologies (PETs). On the other hand, there is a very limited literature on why organisations should encourage the adoption of PETs, invest in their development, and provide channels for their dissemination. The purpose of this chapter is to present a framework within which organisations can develop a business case for PETs.

The chapter commences by considering contexts in which trust and distrust of organisations by individuals are important factors in the achievement of organisational objectives. An examination is then undertaken of how an organisation's privacy strategy can make significant contributions to overcoming distrust and achieving trust. The role of information technology is then considered, including both privacy-invasive technologies ('the PITs'), and those that protect and enhance privacy. A taxonomy of PETs is presented, which distinguishes among mere pseudo-PETs, PETs that are designed as countermeasures against specific PITs, tools for uncrackable anonymity ('savage PETs'), and 'gentle PETs' that seek a balance between nymity and accountability. Opportunities for organisations to incorporate PET-related initiatives within their privacy strategies are examined, and the development of business cases is placed within a broader theory of cost-benefit-risk analysis.

2. Trust and Distrust

This chapter is concerned with how organisations construct business cases for the application of technology in order to preserve privacy. The need for this arises in circumstances in which firstly either trust is lacking or distrust inhibits adoption, and secondly effective privacy protections can be a significant factor in overcoming the trust gap.

Trust is confident reliance by one party on the behaviour of other parties (Clarke 2002). It originates in social settings. Many of the elements evident in social settings are difficult for organisations to replicate in merely economic contexts. Hence a great deal of what organisations call trust is merely what a party has to depend on when no other form of risk amelioration strategy is available to them.

If trust can be achieved, then it may become a positive driver of behaviour. A more common pattern, however, is for distrust to exist. This represents an impediment to fulfilment of the organisation's objectives, because it undermines the positive impacts of other drivers such as cost reductions and convenience.

In their headlong rush onto the Internet over the last decade, many organisations have overlooked the importance of human values to the parties they deal with. Both consumers and small businesspeople feel powerless when they deal with larger organisations. They would like to have 'friends in high places' who can help them when they encounter difficulties. They also fear the consolidation of power that they see going on around them, as governments integrate vast data collections, corporations merge and enter into strategic alliances, and 'public-private partnerships' blur organisational boundaries across sectors. As a result, distrust is more commonly encountered than trust.

One context within which trust is critical is the relationship between employers on the one hand, and employees and contractors on the other. In some countries, particularly the USA, employers have been intruding into their employees' data, into their behaviour - not only in the workplace but also beyond it - and even into their employees' bodies in the form of substance-abuse testing, and even the insertion of identity chips. Such measures substitute a power-relationship for loyalty, with the result that employees become exactly what the employer treats them as - sullen opponents who are likely to disclose company secrets and even to commit sabotage. The negative impact on corporate morale and performance is even more marked in the case of staff-members on whose creativity the organisation depends for innovation, because a climate of surveillance and distrust chills behaviour and stultifies creative thought and action (Clarke 2006a).

Other contexts in which trust is critical are external to the organisation: the various aspects of eBusiness, particularly business-to-consumer (B2C) eCommerce, but also eGovernment (government-to-citizen - G2C), and even business-to-business (B2B) eCommerce if there is considerable disparity between the parties' size and hence market power.

The adoption of eBusiness depends on the parties perceiving benefits in adoption that are sufficient to overcome the disbenefits. The costs involved include the effort of turning one's attention to a new way of doing things, understanding it, acquiring and installing relevant software, and learning how to use it. But widespread cynicism exists about the reasons why eBusiness is being introduced. There are well-founded fears that large organisations will seek opportunities to reduce their level of service, and to transfer costs and effort to the other party - particularly where that other party is less powerful, such as a consumer/citizen, or a small business enterprise.

Organisations do indeed apply eBusiness to achieve those essentially negative purposes, but they have more constructive aims as well, including:

Achieving progress in the application of electronic tools is important to many organisations. One of the greatest impediments to the adoption of the various categories of eBusiness has been lack of trust in other parties or the technologies involved. Credible privacy protections are a key factor in ameliorating the poor relationships that derive from distrust.

3. Privacy Strategy

The activities of large organisations do not naturally protect the privacy of employees, nor of customers and suppliers. On the contrary, the increase in the scale of corporations and government agencies through the twentieth century, the greater social distance between institution and individual, the greater dependence on data instead of human relationships, and the de-humanising nature of computer-based systems, have together resulted in large organisations both being perceived to be, and being, seriously threatening to privacy.

If organisations are to avoid distrust arising from their privacy-invasive behaviour, and particularly if they wish to use their behaviour in relation to people as a means of inculcating trust, then they need to adopt a strategic approach to privacy. This section introduces privacy strategy and outlines key techniques.

3.1 Concepts

Organisations are ill-advised to consider privacy, or indeed any other potentially significant social factor, in isolation. Rather, privacy should be considered within the context of the organisation's mission and corporate strategy. Because the primary dimension of privacy is that relating to personal data, strategic information systems theory provides an appropriate basis for analysis (Clarke 1994a).

Fundamentally, people want some space around themselves. Privacy is most usefully understood as the interest that individuals have in sustaining a 'personal space', free from interference by other people and organisations (Clarke 2006a).

People do not identify with 'privacy in the abstract', so the full power of public opinion is seldom brought to bear. One result of this has been that American legislators have been able to ignore public concerns and instead satisfy their donors by sustaining the myth that 'self-regulation' is good enough. The substantial protections embodied in the OECD Guidelines (OECD 1980) and the EU Directive (EU 1995 and its several successors) have been reduced to a limited and entirely inadequate sub-set referred to as the 'safe harbor' provisions (FTC 2000, DOC 2000).

The flaw in this approach is that people identify very strongly with 'privacy in the particular'. The statute books of the U.S. and its States are flooded with over 700 laws, most of them knee-jerk responses to privacy problems that exploded into the public eye (Smith 2002, Rotenberg 2004). Even countries that have broad information privacy protections are beset by these flurries from time to time. Public concern about privacy invasions continues to grow, as organisations harness technology and its applications with ever more enthusiasm. Demands for personal data are teaching people to be obstructionist: when dealing with organisations, many obfuscate and lie in order to protect their private space. As irresponsible applications of technology continue to explode, and continue to be subject to inadequate protections and even less adequate regulation, these flurries are occurring more frequently (Clarke 2006b).

Given this pervasive distrust, organisations that are dependent on reasonable behaviour by the individuals they deal with need to implement a privacy strategy, in order to dissociate themselves from the mainstream of privacy-invasive corporations and government agencies. The foundations of privacy strategy were laid out in Clarke (1996), and expanded and updated in Clarke (2006c). The principles are:

Key elements of a process to develop a privacy strategy are:

3.2 Privacy-Sensitive Business Processes

A minimalist privacy plan involves a privacy policy statement that goes beyond the limited assurances dictated by the law. People appreciate clear, direct statements that are not qualified by large volumes of bureaucratic, lawyer-dictated expressions. Guidance is provided in Clarke (2005).

Real credibility, however, depends on more than mere statements. There is a need for organisations' undertakings to be backed up by indemnities in the event that the organisation breaches them. Complaints-handling processes are needed, to provide unhappy clients with an avenue to seek redress. Constructive responses to complaints are essential. Indeed, these are stipulated by industry standards relating to complaints-handling (ISO 10002 2004). A self-confident organisation goes further, and explains the laws that regulate the organisation, links to the sources of the law, and provides contact-points for relevant regulators.

To underpin privacy statements and indemnities, an organisation needs to ensure that its business processes are privacy-sensitive. This is a non-trivial task. Firstly, it is necessary for all business processes to be reviewed against a comprehensive set of privacy requirements. Secondly, it requires that privacy impact assessments (PIAs) be undertaken for each new project that is undertaken that involves impositions on individuals or the use of personal data. A PIA is a process whereby the potential privacy impacts and implications of proposals are surfaced and examined (Clarke 1998a).

Together, these measures can enable an organisation to at least reduce distrust by individuals, and, if well conceived and executed, can deliver the organisation a reputation among its employees and clientele that encourages appropriate behaviour, and even provides it with competitive advantage.

4. Technology's Role

The remainder of this chapter looks beyond the base level of privacy-sensitive business processes, and focusses on the role of organisations' use of technology in order to reduce the distrust held by the organisation's employees and eBusiness partners, or even enhance the degree of trust.

Information technologies have largely had a deleterious impact on privacy. Those that have a particularly negative impact, such as visual and data surveillance, person location and tracking, and applications of RFID tags beyond the retail shelf, are usefully referred to as 'privacy-invasive technologies' ('the PITs'). The first sub-section below addresses the PITs.

A further and more constructive way of treating privacy as a strategic variable is to apply technology in order to actively assist in the protection of people's privacy, hence 'privacy-enhancing technologies' or 'PETs'.

The history of the PETs is commonly traced back to applications of cryptography by David Chaum (1981, 1985, 1992). The term 'privacy-enhanced mail' (PEM) was used at least as early as the mid-1980s, in the RFC series 989 (February 1987), 1040 (January 1988), and 1113-1115 (August 1989), which defined a 'Privacy Enhancement for Internet Electronic Mail'. PEM proposed the use of cryptography to protect the content of email from being accessed by anyone other than the intended recipient. The more general term 'privacy enhancing technology' (at that stage without the acronym) has been traced by EPIC's Marc Rotenberg to CPSR (1991).
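The core idea of PEM, content readable only by the intended recipient, can be illustrated with a toy symmetric scheme. This is a sketch only: PEM itself specified DES for content encryption and RSA for key management, and the shared-key arrangement below is a simplifying assumption.

```python
import secrets

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR the data with a repeating key.
    # Illustration only -- not a real cryptographic algorithm.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Sender and recipient share a secret key out of band (simplifying assumption).
key = secrets.token_bytes(32)
plaintext = b"Meet at noon."
ciphertext = xor_stream(key, plaintext)            # unreadable without the key
assert xor_stream(key, ciphertext) == plaintext    # recipient recovers the content
```

The point of the sketch is the property, not the mechanism: anyone intercepting the message in transit sees only the ciphertext.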

The first use of the acronym to refer to a defined category of technologies appears to have been by John Borking of the Dutch Data Protection Authority in 1994. A report was published as ICPR (1995). See also Goldberg et al. (1997), Burkert (1997), Borking & Raab (2001) and Borking (2003). Annual PET Workshops have been held since 2000, with significant contributions from computer scientists in Germany and Canada as well as the USA. These diverge somewhat in their interpretation of PETs from that of the Data Protection Commissioners of The Netherlands, Ontario and Germany, in particular in that they focus strongly on nymity.

A wide variety of tools exist (EPIC 1996-). More are being devised. It is useful to distinguish several broad categories. Some are used as countermeasures against PITs. Others provide users with anonymity on the Internet. Because anonymity is, by definition, unbreakable, there is an inevitable conflict with accountability. For this reason, tools for anonymity are referred to here as 'savage PETs'. An alternative is to promote tools that provide pseudonymity. This must be breakable in order to enable the investigation of suspected criminal behaviour; but it must be breakable only with sufficient difficulty, in order to attract people to use it and to overcome distrust. This group of tools is referred to in this chapter as 'gentle PETs'. Finally, some measures have been referred to by their proponents as PETs, but deliver little of substance, and are accordingly referred to in this chapter as 'pseudo-PETs'. Each of these categories of technology is addressed below.

4.1 The PITs

There are many applications of technology whose primary function is to gather data, collate data, apply data, or otherwise assist in the surveillance of people and their behaviour. A useful collective term is 'privacy-invasive technologies', or 'the PITs'. Among the host of examples are data-trail generation and intensification through the denial of anonymity (e.g. identified phones, stored-value cards, and intelligent transportation systems), data warehousing and data mining, video-surveillance, stored biometrics, and imposed biometrics (Clarke 2001a, 2001d).

A current concern is the various categories of 'spyware' (Stafford & Urbaczewski 2004). Spyware is applied by corporations to protect their copyright interests, to gather personal data about customers, and to project high-value advertising at consumers; and by fraudsters to capture authentication data such as passwords. The cumulative impact of PITs on consumers and citizens is heightened distrust of both large organisations and information technology.

One aspect of an organisation's privacy strategy is the examination of the technologies the organisation uses in order to appreciate the extent to which they are privacy-intrusive, and the extent to which that privacy-intrusiveness may militate against achievement of the organisation's objectives.

4.2 Pseudo-PETs

There have been attempts to take advantage of the PET movement by applying the label to techniques that provide only nominal protection. The most apparent of these is so-called 'privacy seals', such as TRUSTe, Better Business Bureau and WebTrust. They are mere undertakings that have no enforcement mechanism, and are just 'meta-brands' - images devised in order to provide an impression of protection (Clarke 2001c).

Another 'pseudo-PET' is Platform for Privacy Preferences (P3P - W3C 1998-). P3P was originally envisaged as a means whereby web-sites could declare their privacy undertakings, and web-browsers could compare the undertakings with the browser-user's requirements, and block access, or limit the transmission of personal data accordingly. But P3P was implemented server-side only, with the result that it contributes very little to privacy protection (Clarke 1998a, 1998c, EPIC 2000, Clarke 2001b).
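The matching that P3P envisaged can be sketched as below. The `purpose` and `retention` fields echo elements of the P3P vocabulary, but the structures are hypothetical simplifications; real P3P policies are richer XML documents, and the comparison logic here is invented for illustration.

```python
# A site declares its data practices; the browser compares them with the
# user's preferences and blocks or warns on a mismatch.
site_policy = {"purpose": {"current", "admin", "marketing"},
               "retention": "indefinitely"}
user_prefs  = {"purpose": {"current", "admin"},
               "retention": "stated-purpose"}

def acceptable(policy: dict, prefs: dict) -> bool:
    """Accept only if the site collects for no more purposes than the user
    allows, and its retention practice matches the user's preference."""
    return (policy["purpose"] <= prefs["purpose"]
            and policy["retention"] == prefs["retention"])

assert not acceptable(site_policy, user_prefs)   # this browser would block or warn
```

The weakness the chapter notes is visible here: with server-side declaration only, nothing performs the client-side comparison, so the declaration has no practical effect.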

4.3 Counter-PITs

Many PETs assist people to defeat or neutralise privacy-invasive technologies and hence are usefully referred to as 'Counter-PITs'. Examples include SSL/TLS for channel encryption, spam-filters, cookie-managers, password managers, personal firewalls, virus protection software and spyware-sweepers.
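A cookie-manager, one of the Counter-PITs listed above, can be reduced to a small policy function. The rules below (block third-party cookies, accept only session cookies first-party, allow a user whitelist) are one plausible default set, not a description of any particular product.

```python
def allow_cookie(cookie_domain: str, page_domain: str, persistent: bool,
                 whitelist: frozenset = frozenset()) -> bool:
    """Toy cookie policy: allow whitelisted domains; otherwise block
    third-party cookies, and accept only session (non-persistent)
    cookies from the site actually being visited."""
    if cookie_domain in whitelist:
        return True
    if cookie_domain != page_domain:   # third-party cookie: a tracking vector
        return False
    return not persistent              # first-party: session cookies only

assert allow_cookie("shop.example", "shop.example", persistent=False)
assert not allow_cookie("ads.example", "shop.example", persistent=False)
```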

Although many protections are already productised, opportunities remain for organisations to contribute. For example, there is a need for services that display to the browser-user information about the owner of an IP-address before connecting to it, and for the monitoring of inbound traffic for patterns consistent with malware and hacking, and outbound traffic for spyware-related transmissions (DCITA 2005).

4.4 Savage PETs

For many people, that first category of PETs is unsatisfactory because they still permit organisations to accumulate personal data into dossiers and profiles. A much more aggressive approach is available. One class of PETs sets out to deny identity and to provide untraceable anonymity. Examples include genuinely anonymous ('Mixmaster') remailers and web-surfing schemes, and genuinely anonymous e-payment mechanisms. (The inclusion of 'genuinely' is necessary, because some remailers and payment mechanisms have been incorrectly described as 'anonymous', even though they are actually traceable).

Such techniques exist, and will always exist, no matter what countermeasures are developed. Major literature in this area includes Chaum (1981, 1985, 1992), Onion (1996-), Syverson et al. (1997), Clarke et al. (2002) and Dingledine et al. (2004). See also Freehaven (2000-). For a critical review of policy aspects, see Froomkin (1995).
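The layered-encryption principle behind Chaum-style mixes and onion routing can be sketched as follows. This is a toy: the relays share symmetric keys out of band and the XOR 'cipher' is for illustration, whereas real mix networks use public-key cryptography plus batching and re-ordering to defeat traffic analysis.

```python
import secrets

def xor(key: bytes, data: bytes) -> bytes:
    # Toy cipher standing in for real encryption.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# One key per relay (hypothetical names; keys shared out of band in this toy).
keys = {"n1": secrets.token_bytes(32),
        "n2": secrets.token_bytes(32),
        "n3": secrets.token_bytes(32)}

def wrap(message: bytes, route: list) -> bytes:
    """Apply one encryption layer per relay, innermost for the last relay."""
    blob = message
    for node in reversed(route):
        blob = xor(keys[node], blob)
    return blob

def peel(node: str, blob: bytes) -> bytes:
    """A relay can remove only its own layer; no single relay sees both
    the sender and the plaintext."""
    return xor(keys[node], blob)

onion = wrap(b"hello", ["n1", "n2", "n3"])
assert peel("n3", peel("n2", peel("n1", onion))) == b"hello"
```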

4.5 Gentle PETs

Where they are successful, 'Savage PETs' work against accountability, because they reduce the chances of retribution being wrought against people who use them to assist in achieving evil ends. It would be highly beneficial if a balance could be found between nymity on the one hand, and accountability on the other.

The means of achieving this is through 'protected pseudonymity'. It is the most technically challenging, and at this stage the least developed of the categories. The essential requirement of a gentle PET is that very substantial protections are provided for individuals' identities, but in such a manner that those protections can be breached when particular conditions are fulfilled.

Underlying this approach is a fundamental principle of human freedom that appears not yet to have achieved mainstream understanding: people have multiple identities, and to achieve privacy-protection those identities must be sustained. This favours single-purpose identifiers, and militates against multi-purpose identifiers (Clarke 1994b, 1999).
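Single-purpose identifiers of the kind favoured above can be derived mechanically from a secret held only by the individual. The sketch below is an assumption-laden illustration (the derivation scheme and truncation length are invented, not drawn from the cited works), but it shows the key property: identifiers are stable per service, yet unlinkable across services.

```python
import hashlib
import hmac
import secrets

master = secrets.token_bytes(32)   # held only by the individual

def identity_for(service: str) -> str:
    """Derive a stable, service-specific pseudonym. Without the master
    secret, two services cannot link their pseudonyms to each other."""
    return hmac.new(master, service.encode(), hashlib.sha256).hexdigest()[:16]

assert identity_for("tax-office") != identity_for("bookshop")   # unlinkable
assert identity_for("bookshop") == identity_for("bookshop")     # stable
```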

The protections against breach of protected pseudonymity must be trustworthy, and must comprise an inter-locking network of legal, organisational and technical features. If the power to override the protections is in the hands of a person or organisation that flouts the conditions, then pseudonymity's value as a privacy protection collapses. Unfortunately, governments throughout history have shown themselves to be untrustworthy when their interests are too seriously threatened; and corporations are dedicated to shareholder value alone, and will only comply with the conditions when they are subject to sufficiently powerful preventative mechanisms and sanctions. The legal authority to breach pseudonymity must therefore be in the hands of an independent judiciary, and the case for breach must be demonstrated to the court.

A range of technical protections is needed. The creation and controlled use of identities needs to be facilitated. The traffic generated using protected pseudonyms needs to be guarded against traceability, because that would enable inference of an association between a person and the identity. In addition, there must be technical support for procedures to disclose the person's identity, which must involve the participation of multiple parties, which in turn must be achieved through the presentation of reliable evidence (Goldberg 2000).
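The requirement that disclosure involve multiple parties can be enforced technically by splitting the identity record into shares, so that no single custodian can reconstruct it. A minimal sketch, using a simple XOR-based scheme (real designs might use Shamir secret sharing for threshold rather than all-of-n disclosure):

```python
import secrets

def split(secret: bytes, n: int) -> list:
    """Split a secret into n shares; all n are needed to reconstruct.
    Any n-1 shares together reveal nothing about the secret."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def combine(shares: list) -> bytes:
    """XOR all shares back together to recover the secret."""
    out = shares[0]
    for s in shares[1:]:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

# e.g. one share each to the court, a regulator, and the scheme operator
parts = split(b"true name", 3)
assert combine(parts) == b"true name"
```

Only when every custodian participates, on presentation of reliable evidence, can the pseudonym be linked back to the person.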

These features are unlikely to be satisfied accidentally, but must be achieved through careful design. For example, the original 'anonymous remailer' (1993-96) was merely pseudonymous, because it maintained a cross-reference between the incoming (identified) message and the outgoing ('anonymised') message, and the cross-reference was accessible to anyone who gained access to the device - including Finnish police, who do not have to rely on judicial instruments as authority for access, because they have the power to issue search warrants themselves (Wikipedia 2002-).

The notion of 'identity management' has been prominent. The mainstream approaches, those of Microsoft Passport, and of the cutely but misleadingly named 'Liberty Alliance', are in fact privacy-invasive technologies, because they arrogantly 'provide' identities to individuals, and their fundamental purpose is to facilitate sharing of personal data among organisations. Microsoft's 'Identity Metasystem' (Microsoft 2006), based on Cameron (2005), is more sophisticated, but also fails to support protected pseudonymity.

The need is for 'demand-side' identity management tools that are PETs rather than PITs (Clauß et al. 2002, Clarke 2004). Organisations need to utilise multiple means to protect their interests, rather than imposing unjustifiable demands for strong authentication of the identity of the individuals they deal with, because that approach is inherently privacy-invasive and generates distrust.

5. Business Cases for PETs

Organisations whose staff or customers distrust them because of privacy concerns need to consider using PETs as a means of addressing the problem. This section examines how organisations can evaluate the scope for PETs to contribute to their privacy strategy, and hence to their business strategy as a whole. There appear to be very few references to this topic in the literature, but see MIKR (2004, pp. 38-45). The first sub-section clarifies the much-abused concept of 'a business case'. The second then shows how it can be applied to PETs.

5.1 Concepts

The technique that organisations use to evaluate a proposal is commonly referred to as the development of a 'business case'. The term is rather vague, however, and a variety of techniques is used. One major differentiating factor among them is whether the sponsor's interests dominate all others, or whether perspectives additional to those of the sponsor need to be considered. A further distinction is the extent to which benefits and disbenefits can be expressed in financial or other quantitative terms. Exhibit 1 maps the primary techniques against those two pairs of characteristics.

Exhibit 1: A Classification Scheme for Business Case Techniques

The top-left-hand cell contains mechanical techniques that work well in relatively simple contexts where estimates can be made and 'what-if' analyses can be used to test the sensitivity of outcomes to environmental variables. The only stakeholder whose interest is reflected is the scheme sponsor; and hence the use of these techniques is an invitation to distrust by other parties.

The bottom-left-hand cell is relevant to projects in which the interests of multiple parties need to be appreciated, and where necessary traded off. But the distrust impediment can seldom be reduced to the quantitative form that these techniques demand.

The techniques in the top-right-hand cell are applicable to a corporation that is operating relatively independently of other parties but cannot express all factors in neat, quantitative terms. Even in the public sector, it is sometimes feasible for an agency to prepare a business case as though it were an independent organisation (e.g. when evaluating a contract with a photocopier supplier, or for the licensing of an electronic document management system). Internal Cost-Benefit Analysis involves assessments of benefits and disbenefits to the organisation, wherever practicable using financial or at least quantitative measures, but where necessary represented by qualitative data (Clarke 1994, Clarke & Stevens 1997). Risk Assessment adopts a disciplined approach to considering key environmental factors, and the impact of potentially seriously disadvantageous scenarios. Once again, however, only the interests of the scheme sponsor are relevant, and the perspectives of other parties are actively excluded.
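The quantitative core of an Internal Cost-Benefit Analysis, with the 'what-if' sensitivity testing mentioned earlier, can be sketched in a few lines. All figures here are illustrative assumptions, not drawn from the chapter.

```python
def npv(rate: float, cashflows: list) -> float:
    """Net present value of period-indexed cashflows (period 0 undiscounted)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical PET project: up-front cost, then benefits (e.g. reduced churn,
# fewer complaints-handling costs) over three subsequent years.
base = [-120_000, 40_000, 55_000, 60_000]

# 'What-if' analysis: how sensitive is the outcome to the discount rate?
for rate in (0.05, 0.10, 0.15):
    print(f"rate {rate:.0%}: NPV = {npv(rate, base):+,.0f}")
```

A fuller analysis would vary the benefit estimates as well as the rate, since the qualitative benefits (trust, reputation) are the hardest to pin down.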

More complex projects require the more sophisticated (and challenging) techniques in the bottom-right quadrant of Exhibit 1. For example, a government agency cannot afford to consider only the organisation's own interests. It must at least consider the needs of its Minister, and there are usually other agencies with interests in the matter as well.

Outside the public sector, it is increasingly common for organisations to work together rather than independently. In some cases this takes the form of tight strategic partnerships, and in others looser value-adding chains. In yet others, 'public-private partnerships' inter-twine the interests of corporations and government agencies. At the very least, most organisations work within infrastructure common to all participants in the relevant industry sector, or within collaborative arrangements negotiated through one or more industry associations. Such projects therefore depend on 'win-win' solutions, and the business case must reflect the perspectives of the multiple stakeholders.

Some of the biggest challenges arise where there is significant disparity in size and market power among the participants, especially where the success of the undertaking is dependent upon the participation of many small business enterprises. Appropriate approaches for such circumstances are discussed in Cameron & Clarke (1996) and Cameron (2005).

The discussion in this sub-section has to this point assumed that all participants are organisations. There are many projects, however, in which the interests of individuals need to be considered, because their non-participation, non-adoption, or outright opposition, may undermine the project and deny return on investment. Clarke (1992) drew attention to the then-emergent concept of 'extra-organisational systems' such as ATM and EFTPOS networks, and the need to ensure that consumers' interests are reflected in the system design, by engaging with consumers and their representatives and advocates. Engagement requires information dissemination, consultation, and the use of participative design techniques. The rapid emergence of the open, public Internet in the years immediately following the publication of that paper enabled an explosion of such extra-organisational systems.

Yet corporations have seldom considered their customers as stakeholders, and even government agencies frequently leave them aside from business case evaluations. Organisations that want to avoid the distrust impediment need to apply the business case techniques in the bottom-right-hand corner of Exhibit 1, in order to reflect the perspectives of all of the important stakeholders, including human users and other individuals affected by the scheme. Impact and risk assessment activities need to encompass at least privacy, but the scope may need to extend to broader social and economic aspects such as accessibility, accidental discrimination against minorities, and the need for workplace re-training.

5.2 Application

This chapter's focus is the use of PETs as an adjunct to corporate privacy strategy. The application of PETs needs to be evaluated, and a business case developed. Because of the multi-stakeholder context, and the difficulties of quantifying many of the benefits and costs, the relevant business case techniques are those in the bottom-right-hand quadrant of Exhibit 1.

This sub-section applies to PETs the business case concepts discussed above. It firstly identifies various ways in which an organisation might seek to use PETs as a means of overcoming distrust by its staff or by relevant segments of the public, particularly its customers or prospects. It then considers the kinds of benefits that may be able to be achieved, the costs and other disbenefits that may be incurred in the process, and the risks involved. Finally, approaches to reaching a conclusion about the proposal are examined.

(1) Ways to Work with PETs

There are various ways in which organisations can utilise PETs in their privacy strategy. They include the following:

(2) Benefits

Incorporating PETs into an organisation's privacy strategy provides tangible evidence of its intentions. Such actions are likely to be rated more highly than the mere assurances set out in privacy policy statements, at least by some target segments, and by representatives of and advocates for consumers.

Areas in which benefits can be sought include the following:

Because PET-related projects signal the organisation's willingness to address negative perceptions of its activities, and involve the engagement of stakeholders, benefits may arise from the mere act of conducting business case analysis, even if the eventual decision is to not proceed with the initiative.

(3) Costs and Other Disbenefits

There are costs involved in such measures. It is unlikely that the financial costs would be high relative to the scale of any reasonably large organisation's budget. On the other hand, an initiative of this kind inevitably involves considerable executive and managerial effort, and adaptation of business processes, and, perhaps more challengingly, adaptation of organisational culture.

To have the desired effect, the initiative needs to be integrated into the organisation's marketing communications mechanisms, in order to convey the message to the targeted market-segments. Moreover, the preparation of a business case using a method with necessarily broad scope is itself potentially expensive.

(4) Risks

Many benefits and disbenefits are inevitable or at least highly likely. But some further impacts may arise, depending on various environmental factors.

One risk is that a project of this nature, and the deep analysis that accompanies it, may prove divisive among the participants, because their perspectives may differ markedly. Another possibility is that the intentions may be seen as inappropriate, perhaps by the media, or by a regulator, or by a competitor or industry association. A further concern is the possibility of failure or non-adoption, which could result in disappointment and loss of morale.

Factors that embody significant risk need to be the subject of a management strategy.

(5) The Net Effect

Each organisation, in its own context, needs to evaluate the net effect of the benefits and disbenefits, moderated by the risks. There are many circumstances in which project sponsors can extract sufficient benefit from a PET-related initiative to make it well worth the effort, investment and management of the risks. And even where the net effect of an initiative is not attractive, the effort invested in preparing a business case can pay dividends, by pointing the project team towards a variant in the approach that will overcome the primary disbenefit or risk.
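The weighing described above can be sketched as a simple scoring model. The sketch below is purely illustrative: the factor names, scores, weights and risk discount are all invented, and the chapter prescribes no such formula; it merely makes concrete the idea of benefits offset by disbenefits and moderated by risk.

```python
# Hypothetical scoring sketch of the "net effect" evaluation.
# All factor names, scores and weights are invented for illustration.

def net_effect(benefits, disbenefits, risk_discount):
    """Return a single indicative score for a PET initiative.

    benefits / disbenefits: dicts of {factor: (score, weight)},
    with scores on a 0-10 scale and weights summing to 1 per side.
    risk_discount: fraction (0-1) by which identified risks reduce
    the expected benefit.
    """
    b = sum(score * weight for score, weight in benefits.values())
    d = sum(score * weight for score, weight in disbenefits.values())
    return b * (1 - risk_discount) - d

score = net_effect(
    benefits={
        "customer trust":      (8, 0.5),
        "regulator goodwill":  (6, 0.3),
        "staff morale":        (5, 0.2),
    },
    disbenefits={
        "implementation cost": (3, 0.4),
        "process adaptation":  (5, 0.4),
        "cultural change":     (6, 0.2),
    },
    risk_discount=0.25,  # e.g. an allowance for non-adoption
)
print(f"indicative net effect: {score:+.2f}")
```

A positive score suggests the initiative merits further work; a negative one points the project team towards the dominant disbenefit or risk, which, as noted above, is itself a useful outcome of the analysis.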

Even if the costs appear high, investment in PETs may well be justified as a strategic measure, rather than one that needs to be formally justified by means of discounted cash flows. This is because such an investment is in the nature of infrastructure, or an enabler. One strategic opportunity is differentiation leading to competitive advantage, particularly first-mover advantage, such as market-share gains through the attraction of users dissatisfied with other suppliers. Another is where a PET initiative has the capacity to unblock adoption processes, such that eBusiness initiatives that would otherwise stall can instead flourish.
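The contrast with conventional discounted-cash-flow appraisal can be made concrete. The sketch below computes a net present value for a hypothetical PET project; the cash flows and discount rate are invented, and the point is that the strategic benefits identified above (differentiation, unblocked adoption) appear nowhere in such figures.

```python
# Conventional DCF appraisal of a hypothetical PET project.
# Cash flows and the discount rate are invented for illustration.

def npv(rate, cashflows):
    """Net present value: cashflows[0] now, cashflows[1] in one year, ..."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Year 0: build cost; years 1-4: the modest, measurable savings only.
project = [-250_000, 40_000, 60_000, 60_000, 60_000]
value = npv(0.10, project)
print(f"NPV at 10%: {value:,.0f}")  # negative: fails a pure DCF test
```

On these numbers the project fails a pure DCF test, which is precisely why the chapter argues for treating PET investment as strategic infrastructure rather than subjecting it to cash-flow arithmetic alone.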

6. Conclusions

Organisations need to appreciate that the achievement of their objectives may be seriously hindered by distrust of eBusiness and of the organisations that provide eBusiness services. In response, they need to adopt a positive approach to the privacy of the parties that they deal with, and to conceive, articulate and implement a privacy strategy.

For some corporations and government agencies, simple approaches based on privacy impact assessment and privacy-sensitive business practices may suffice. For others, however, avoiding distrust and instead inculcating trust demands more substantial initiatives. Initiatives in relation to PETs can make important contributions towards their overall privacy strategies.

As with any other project, a business case is needed. Care is necessary in selecting the appropriate approach to adopt, because the perspectives of other key stakeholders have to be reflected, particularly the parties whose participation is crucial. This chapter has provided an overview of the rationale and the process involved, together with indicators of benefits, disbenefits and risks.

Using the guidance in this chapter, an organisation can evaluate the potentials that PETs offer to staff, or to key customer segments, and build the business case.




This paper was stimulated by an invitation from Mike Gurski, Director of the Privacy Centre of Excellence of Bell Information & Communications Technology Solutions in Toronto, and Caspar Bowden, Chief Privacy Advisor EMEA, Microsoft, of London and Toulouse. A preliminary version was presented at the Executive Session of PETS 2005 - 5th Workshop on Privacy-Enhancing Technologies, 2 June 2005, Cavtat, Croatia. The slide-set for that presentation is available.

Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in the Baker & McKenzie Cyberspace Law & Policy Centre at the University of N.S.W., a Visiting Professor in the E-Commerce Programme at the University of Hong Kong, and a Visiting Fellow in the Department of Computer Science at the Australian National University.
