Privacy Requirements of Public Key Infrastructure

Roger Clarke

Principal, Xamax Consultancy Pty Ltd, Canberra

Visiting Fellow, Department of Computer Science, Australian National University

Version of 10 March 2000

Presented at the IIR IT Security Conference, Canberra, 14 March 2000

Republished in Internet Law Bulletin 3, 1 (April 2000) 2-6

Republished in 'Global Electronic Commerce', published by the World Markets Research Centre in collaboration with the UN/ECE's e-Commerce Forum on 'Electronic Commerce for Transition Economies in the Digital Age', 19-20 June 2000

© Xamax Consultancy Pty Ltd, 2000

This document is at


Public key cryptography was heralded as the means whereby parties could deal confidently with one another using electronic means. Many security risks are involved, and addressing them has proven to be much more challenging than was originally anticipated. It turns out that a substantial Public Key Infrastructure (PKI) is necessary.

Attention was drawn some years ago to just how privacy-intrusive a PKI would be. Despite these warnings being repeated many times, and explained in increasing degrees of detail, PKI designers are still showing lamentably poor understanding of the issues involved.

Because very little has been done to overcome the privacy-threatening nature of PKI, there is an increasing likelihood that public key cryptography will fail to become generally applied for the purposes of authentication. Substantial investments are at serious risk of delivering little or no return.


1. Introduction

Public key cryptography involves two related keys, referred to as a 'key-pair': one that only the owner knows (the 'private key') and one that anyone can know (the 'public key'). Outlines are provided in Clarke (1996), Whittle (1996-) and Schneier (1996).
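The key-pair relationship can be sketched with textbook RSA. This is a minimal illustration only: the primes and exponents below are toy values chosen for readability, and hand-rolled RSA of this kind must never be used in practice; real systems rely on vetted cryptographic libraries.

```python
# Toy textbook RSA, purely to illustrate the private/public key relationship.
# The primes are absurdly small; this is not secure.

p, q = 61, 53                # secret primes (known only to the key owner)
n = p * q                    # modulus, shared by both keys
phi = (p - 1) * (q - 1)      # Euler's totient of n
e = 17                       # public exponent: (e, n) is the public key
d = pow(e, -1, phi)          # private exponent: (d, n) is the private key

def encrypt(m: int, e: int, n: int) -> int:
    # Anyone can do this, using only the public key.
    return pow(m, e, n)

def decrypt(c: int, d: int, n: int) -> int:
    # Only the owner of the private key can do this.
    return pow(c, d, n)

c = encrypt(65, e, n)
print(c)                 # 2790
print(decrypt(c, d, n))  # 65 -- the original message is recovered
```

Note the asymmetry: knowing (e, n) does not reveal d unless n can be factored, which is the whole basis of the scheme.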

Public key cryptography suffers disadvantages compared with conventional secret-key cryptography: it involves considerably longer keys, and hence encryption and decryption require much more processing power or, for a given processor, significantly more processing time. Because messages are sent in large volumes, the resulting delays are of considerable consequence.

On the other hand, public key cryptography provides the capability to address all of the risks involved in the transmission of messages electronically. In particular, through the use of a key-pair to create 'digital signatures', it offers the prospect of authentication of assertions, such as the identity or attributes of the originator of a message.
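The digital-signature mechanism referred to above can be sketched as 'hash the message, then apply the private key', with anyone able to verify by applying the public key. The sketch below is a hedged toy illustration reusing the same tiny textbook-RSA parameters as above (n = 3233, e = 17, d = 2753); real signature schemes use large keys and proper padding.

```python
import hashlib

# Toy RSA parameters (insecure, illustration only).
n, e, d = 3233, 17, 2753

def sign(message: bytes) -> int:
    # Hash the message, then apply the private exponent.
    # Only the private-key holder can produce this value.
    h = int.from_bytes(hashlib.sha256(message).digest(), 'big') % n
    return pow(h, d, n)

def verify(message: bytes, sig: int) -> bool:
    # Anyone holding the public key (e, n) can check the signature.
    h = int.from_bytes(hashlib.sha256(message).digest(), 'big') % n
    return pow(sig, e, n) == h

msg = b'I authorise this payment'
s = sign(msg)
print(verify(msg, s))                  # True
print(verify(b'tampered message', s))  # almost certainly False
```

A valid signature thus authenticates both the originator (only the private-key holder could have produced it) and the content (any tampering changes the hash).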

In order to deliver on their promise, schemes that apply public key cryptography require a substantial set of elements and must satisfy a substantial set of requirements. The conventional term used to describe the elements is public key infrastructure (PKI).

For background information on PKI, see Clarke (1996). An early mover in the field was Utah (1995-). An early Australian proposal was PKAF (1996). The Australian Government prepared a strategy for public key technology use in the Government (OGIT 1998, commonly known as 'Gatekeeper', and now administered by the Office for Government Online, with the assistance of the Government Public Key Authority - GPKA).

This paper provides a brief review of the privacy implications of PKI, and a summary of the limited progress that has been made to date in addressing those pressing and fundamental needs.

2. PKI and Privacy

Fully three years ago, Greenleaf & Clarke (1997) presented an analysis of the privacy threats inherent in public key infrastructure (PKI), categorising the threats under a structured set of headings.

Almost two years ago, the requirements of a privacy-sensitive PKI were expressed in crisp form in Clarke (1998), which stated:

"To avoid ... seriously negative implications, and consequential public opposition, a PKI needs to have the following characteristics: ..."

3. Related Privacy Threats

Meanwhile, the threats to privacy increase. Of particular concern is the push for the use of biometrics as a means of authenticating identity. For background, see Clarke (1994) and Clarke (1997).

There are enormous risks in the use of biometrics. In essence, a biometric is comparable to a PIN which can never be changed; hence, if it is ever compromised, it is compromised for ever. Some kinds of biometric are already very easily forged, such as fingerprints. Others will also be forged with ease, such as thumb geometry. Given the vast sums being invested in DNA technologies, it has to be regarded as very likely indeed that DNA will also be readily forged.

It is critical that databases of biometrics not be permitted to develop. PINs are protected by ensuring that they never leave the secure PIN-pad at the ATM or EFT/POS terminal. Unless a similar approach is adopted with biometrics, and central storage of such identifiers precluded, individuals will be subject to masquerade, identity theft and identity denial, not only by other people, but also by the State.

This of course runs completely counter to the current law-and-order push for registers of fingerprints and DNA of (initially) convicted criminals, and (as soon as the proponents believe that they can get away with it) of suspects, and of the public-at-large.

Another area of threat is the application of location and tracking technologies to people (Clarke 1999b). By combining the tracking of devices with authenticated identities of individuals, enormously powerful social control mechanisms would become available to corporations and governments alike.

A substantial counter-movement is gathering momentum, which is intent on producing 'privacy-enhancing technologies' (PETs) that circumvent the 'privacy-invasive technologies' (the PITs), by ensuring anonymity (Clarke 1999a). A Workshop on these topics, entitled 'Privacy by Design', is being held as part of the Computers, Freedom & Privacy Conference in Toronto in early April.

4. Progress?

Progress in addressing privacy concerns has been abysmally slow. This has resulted to a considerable extent from the slowness of technologists and policy-makers alike to appreciate just how substantial the privacy issues are.

4.1 Progress in the Commonwealth Context?

In 1997, privacy advocates expressed outrage at the complete absence from the Gatekeeper drafts of so much as a mention of public policy issues. The word 'privacy' did not appear in the draft. As a result of a meeting with the then OGIT and its consultants, the final version of the report contained a short chapter that at least recognised the existence of concerns.

During 1998, the GPKA was formed to guide the implementation of infrastructure to support the Commonwealth Government's use of public keys to authenticate agencies, companies and citizens. Initially this excluded any representation of the public interest. Once again, privacy advocates expressed outrage, and once again the government added a public interest advocate to the panel. See Greenleaf (1998). The initial appointee to that role was Prof. Graham Greenleaf, and since mid-1999 it has been the author of this paper.

Despite many, many attempts by Prof. Greenleaf and myself, the GPKA has been extremely tardy in addressing the privacy aspects of the matter. During the first 18 months and 15 meetings of the GPKA's operations (in about 12 of which the public interest advocate actively participated), it has done no more than include in the accreditation guidelines for Certification Authorities (CAs) a requirement that they comply with the relevant provisions of the Privacy Act 1988.

As at early March 2000, the following matters still need to be addressed:

  1. A Statement of Privacy Requirements needs to be finalised. This has been on the agenda since July 1999;
  2. The Statement of Privacy Requirements needs to be formally incorporated as part of the Gatekeeper Specification. This may involve some further delays, because a number of agencies and CAs are proceeding on the basis of the existing version of the Specification;
  3. The Statement of Privacy Requirements needs to be formally applied to all Gatekeeper-Accredited Schemes. There appears to be an assumption by at least some members of the GPKA Board that it is relevant only to those in which consumers or citizens are keyholders. This is not the case, because schemes such as the Australian Business Number (ABN) involve unincorporated business enterprises as keyholders. Such enterprises are indistinguishable from the individual businessmen who operate them, and hence privacy concerns arise;
  4. The Statement of Privacy Requirements needs to be communicated to all current and intending sponsors of schemes that need to be Gatekeeper-compliant. This includes at least ATO, HIC, and Centrelink, but may include other agencies also;
  5. A public consultation process needs to be undertaken in relation to the Statement of Privacy Requirements, in particular with privacy and consumer representatives and advocates. The public interest representative on the Board is an adjunct to consultation, not a substitute for it. It would have been highly desirable for consultation to be held prior to promulgation of the Statement; but the lateness of the process, and the speed of development of schemes, together precluded it. A further source of serious concern is the refusal of OGO officers to use the word 'consultation', and to speak instead of 'public education';
  6. All current and intending sponsors of schemes that need to be Gatekeeper-compliant need to have formally and immediately drawn to their attention the importance of early consultation with stakeholders' representatives and public interest advocates, prior to the conceptual design of PKI schemes;
  7. The Privacy Act 1988 needs to be extended to outsourcing providers, in order to ensure effective coverage of CAs. (The Board had previously worked under the assumption that the Government's Privacy Amendment Bill 1998 would be re-introduced, but it has not been);
  8. Evaluation of privacy compliance needs to be clearly stated to be a pre-requisite to Entry-Level Accreditation as well as Full Accreditation. (This was originally the case, but it appears to have informally slipped back to the second phase of the process).

It is far from clear that these matters will be addressed in a satisfactory manner. The senior Attorney-General's Department executive on the GPKA, Peter Ford, has formally stated that he sees no need to go beyond the provisions of the Privacy Act 1988. This is despite the arguments presented to him in multiple documents and on multiple occasions that highlight the enormous differences between the technology of the 1970s (for which the OECD Guidelines were designed, and which the Privacy Act addresses) and that of the year 2000. A summary of the manifold inadequacies of current laws in the modern context is in Clarke (2000a).

4.2 Progress in the Private Sector?

The private sector appears to have held its collective breath, pending progress with the Commonwealth's efforts on PKI. A consultative group was formed during 1999 under the auspices of the National Office of the Information Economy (NOIE), with the intention of stimulating more rapid progress. It is called the National Electronic Authentication Council (NEAC). In early 2000, NEAC let a cluster of consultancies that are intended to encourage more rapid development.

NEAC's 14 members include a single public interest advocate (Charles Britton of the Australian Consumers Association). Judging by the first two meetings that have been held, it appears that privacy is not being given much consideration at all in the NEAC context.

Meanwhile, the Attorney-General is continuing to defer the introduction of the long-promised Privacy Amendment (Private Sector) Bill. Judging by the extremely privacy-unfriendly nature of the 'key provisions' that the Attorney-General's Department released on 14 December 1999, that may be good news rather than bad. A critique of that document is in Clarke (2000b). See also ACS (2000).

If the Bill reflects the 'Key Provisions' document, then it is not a measure to provide consumers with privacy protections, but rather a bald-faced attempt to legitimate a flotilla of privacy-invasive practices of corporations. My paper suggests that about 50 major amendments would be necessary if such a Bill were to become worthy of support. Public confidence will be seriously undermined by this Bill, rather than being supported by it. Passage of the Bill would be to the very serious detriment of relationships between consumers and marketers.

5. Conclusions

Regrettably, four years into the PKI development process, policy-makers and technology-providers have still failed abysmally to appreciate the privacy risks inherent in PKI. Public acceptance of public key cryptography will be seriously undermined by this failure. I fear for the safety of the investments that companies have made in PKI.


References

ACS (2000) 'Privacy Bill needs much more work', Australian Computer Society, in The Australian, 15 February 2000, at

Clarke R. (1994) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Info. Technology & People 7,4 (December 1994). At

Clarke R. (1996) 'Cryptography in Plain Text', Privacy Law & Policy Reporter 3, 2 (May 1996), pp. 24-27, 30-33, at

Clarke R. (1997) 'Chip-Based ID: Promise and Peril', for the International Conference on Privacy, Montreal (September 1997), at

Clarke R. (1998) 'Public Key Infrastructure: Position Statement', May 1998, at

Clarke R. (1999a) 'Anonymous, Pseudonymous and Identified Transactions: The Spectrum of Choice', Proc. IFIP User Identification & Privacy Protection Conference, Stockholm, June 1999, at

Clarke R. (1999b) 'Person-Location and Person-Tracking: Technologies, Risks and Policy Implications' Proc. 21st International Conf. Privacy and Personal Data Protection, Hong Kong, September 1999, at

Clarke R. (2000a) 'Beyond the OECD Guidelines: Privacy Protection for the 21st Century', January 2000, at

Clarke R. (2000b) 'Submission to the Commonwealth Attorney-General Re: 'A privacy scheme for the private sector: Release of Key Provisions' of 14 December 1999' January 2000, at

EPIC (1996-) 'Cryptography Policy', at

Froomkin A.M. (1996) 'The Essential Role of Trusted Third Parties in Electronic Commerce' Oregon L. Rev. 75,1 (Spring, 1996) 49-115

Greenleaf G.W. (1998) 'Gatekeeper leaves the door ajar on privacy' Privacy Law & Policy Reporter 5,1 (May 1998), pp. 1, 3-4

Greenleaf G.W. & Clarke R. (1997) `Privacy Implications of Digital Signatures', IBC Conference on Digital Signatures, Sydney (March 1997), at

OGIT (1998) 'Project Gatekeeper: Implementation of Public Key Technology by Commonwealth Agencies' Office of Government Information Technology (now Office for Government Online), March 1998, via

PKAF (1996) 'Strategies for the Implementation of a Public Key Authentication Framework (PKAF) in Australia', Miscellaneous Publication SAA MP75-1996, Standards Australia, October 1996, 88 pp.

Schneier B. (1996) 'Applied Cryptography' Wiley, 2nd Ed., 1996

Utah (1995-) 'Utah Digital Signature Program', Utah Division of Corporations and Commercial Code, at

Whittle R. (1996-) 'Public Key Authentication Framework: Tutorial', at


