Principal, Xamax Consultancy Pty Ltd, Canberra
Visiting Fellow, Department of Computer Science, Australian National University
Version of 10 March 2000
Presented at the IIR IT Security Conference, Canberra, 14 March 2000
Republished in Internet Law Bulletin 3, 1 (April 2000) 2-6
Republished in 'Global Electronic Commerce', published by the World Markets Research Centre in collaboration with the UN/ECE's e-Commerce Forum on 'Electronic Commerce for Transition Economies in the Digital Age', 19-20 June 2000
© Xamax Consultancy Pty Ltd, 2000
This document is at http://www.anu.edu.au/people/Roger.Clarke/DV/PKI2000.html
Public key cryptography was heralded as the means whereby parties could deal confidently with one another using electronic means. Many security risks are involved, and addressing them has proven to be much more challenging than was originally anticipated. It turns out that a substantial Public Key Infrastructure (PKI) is necessary.
Attention was drawn some years ago to how highly privacy-intrusive a PKI would be. Despite those warnings being repeated many times, and explained in increasing degrees of detail, PKI designers are still showing lamentably poor understanding of the issues involved.
Because very little has been done to overcome the privacy-threatening nature of PKI, there is an increasing likelihood that public key cryptography will fail to become generally applied for the purposes of authentication. Substantial investments are at serious risk of delivering little or no return.
Public key cryptography involves two related keys, referred to as a 'key-pair', one of which only the owner knows (the 'private key'), and the other of which anyone can know (the 'public key'). Outlines are provided in Clarke (1996), Whittle (1996-) and Schneier (1996).
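For concreteness, the following sketch shows a key-pair being generated, with the public half exported for distribution while the private half remains with its owner. It is a minimal illustration only: the choice of Python, of RSA, and of the third-party 'cryptography' package are assumptions of convenience, not features of any scheme discussed in this paper.

```python
# Minimal sketch: generate a key-pair and export only the public half.
# Assumes the third-party Python 'cryptography' package is installed.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# The owner generates the key-pair and keeps the private key secret.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# The corresponding public key can be given to anyone, e.g. published
# within a certificate issued by a Certification Authority.
public_pem = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)
print(public_pem.decode())
```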
Public key cryptography suffers disadvantages compared with conventional secret key cryptography: it involves considerably longer keys, so encryption and decryption require much more processing power or, on a given processor, significantly more processing time. Because messages are sent in large volumes, the resulting delays are of considerable consequence.
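The performance difference can be made concrete with a rough benchmark such as the sketch below, which compares repeated encrypt/decrypt round-trips under RSA with the same operation under a conventional secret-key cipher (AES). The message size, iteration count and algorithm choices are arbitrary assumptions for illustration.

```python
# Rough benchmark sketch: public key (RSA) versus secret key (AES) cost.
# Assumes the third-party Python 'cryptography' package is installed.
import os
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

message = os.urandom(190)   # near the maximum for one RSA-2048/OAEP block

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

aes = AESGCM(AESGCM.generate_key(bit_length=128))
nonce = os.urandom(12)      # nonce reuse is unsafe in practice; benchmark only

start = time.perf_counter()
for _ in range(200):
    private_key.decrypt(public_key.encrypt(message, oaep), oaep)
rsa_seconds = time.perf_counter() - start

start = time.perf_counter()
for _ in range(200):
    aes.decrypt(nonce, aes.encrypt(nonce, message, None), None)
aes_seconds = time.perf_counter() - start

print(f"RSA round-trips: {rsa_seconds:.2f}s; AES round-trips: {aes_seconds:.4f}s")
```

In practice the slower public key operations are usually confined to exchanging a secret session key, with the bulk of each message encrypted under the faster secret-key cipher.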
On the other hand, public key cryptography provides the capability to address all of the risks involved in the transmission of messages electronically. In particular, through the use of a key-pair to create 'digital signatures', it offers the prospect of authentication of assertions, such as the identity or attributes of the originator of a message.
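The digital-signature mechanism referred to above can be sketched as follows. Again this is illustrative only (Python, RSA with PSS padding and the 'cryptography' package are all assumed for convenience); the point is simply that only the private key can create the signature, while anyone holding the public key can check it.

```python
# Sketch of signing and verification with a key-pair.
# Assumes the third-party Python 'cryptography' package is installed.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"An assertion attributed to the key-pair's owner."
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

# Only the holder of the private key can produce this signature ...
signature = private_key.sign(message, pss, hashes.SHA256())

# ... but anyone with the public key can verify that the message was
# signed with the corresponding private key and has not been altered.
try:
    public_key.verify(signature, message, pss, hashes.SHA256())
    print("signature verifies")
except InvalidSignature:
    print("signature does not verify")
```

Verification establishes only a link between the message and a key-pair; linking the key-pair reliably to a person or organisation is the job of the surrounding PKI, which is where the privacy issues discussed below arise.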
In order to deliver on their promise, schemes that apply public key cryptography require a substantial set of elements and must satisfy a substantial set of requirements. The conventional term used to describe the elements is public key infrastructure (PKI).
For background information on PKI, see Clarke (1996). An early mover in the field was Utah (1995-). An early Australian proposal was PKAF (1996). The Australian Government prepared a strategy for public key technology use in the Government (OGIT 1998, commonly known as 'Gatekeeper', and now administered by the Office for Government Online, with the assistance of the Government Public Key Authority - GPKA).
This paper provides a brief review of the privacy implications of PKI, and a summary of the limited progress that has been made to date in addressing those pressing and fundamental needs.
Fully three years ago, Greenleaf & Clarke (1997) presented an analysis of the privacy threats inherent in public key infrastructure (PKI). That paper categorised the threats into the following structure:
Almost two years ago, the requirements of a privacy-sensitive PKI were expressed in crisp form in Clarke (1998), which stated that:
"To avoid ... seriously negative implications, and consequential public opposition, a PKI needs to have the following characteristics:
Meanwhile, the threats to privacy increase. Of particular concern is the push for the use of biometrics as a means of authenticating identity. For background, see Clarke (1994) and Clarke (1997).
There are enormous risks in the use of biometrics. In essence, a biometric is comparable to a PIN which can never be changed; hence, if it is ever compromised, it is compromised for ever. Some kinds of biometric are already very easily forged, such as fingerprints. Others will also be forged with ease, such as thumb geometry. Given the vast sums being invested in DNA technologies, it has to be regarded as very likely indeed that DNA will also be readily forged.
It is critical that databases of biometrics not be permitted to develop. PINs are protected by ensuring that they never leave the secure PIN-pad at the ATM or EFT/POS terminal. Unless a similar approach is adopted with biometrics, and central storage of such identifiers precluded, individuals will be subject to masquerade, identity theft and identity denial, not only by other people, but also by the State.
This of course runs completely counter to the current law-and-order push for registers of fingerprints and DNA of (initially) convicted criminals, and (as soon as the proponents believe that they can get away with it) of suspects, and of the public-at-large.
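To illustrate the architectural principle argued for above, the sketch below keeps the biometric template inside a tamper-resistant capture device and releases only an attested match/no-match result, by analogy with the PIN never leaving the secure PIN-pad. The class and method names are hypothetical, and an exact byte comparison stands in for real (fuzzy) biometric matching.

```python
# Hypothetical sketch: match-on-device, so no central template database.
import hmac
import os
from hashlib import sha256

class SecureCaptureDevice:
    """Tamper-resistant capture device (hypothetical): the biometric
    template is enrolled and matched locally, and only an attested
    yes/no result ever leaves it."""

    def __init__(self):
        self._template = None              # never exported from the device
        self._device_key = os.urandom(32)  # key used to attest results

    def enrol(self, template):
        self._template = template

    def verify(self, sample):
        # Real biometric matching is fuzzy; exact comparison stands in
        # for a similarity score purely for illustration.
        matched = (self._template is not None
                   and hmac.compare_digest(self._template, sample))
        # Only the outcome, attested by the device, is released --
        # never the template itself.
        tag = hmac.new(self._device_key,
                       b"match" if matched else b"no-match",
                       sha256).digest()
        return matched, tag

device = SecureCaptureDevice()
device.enrol(b"example-template")
print(device.verify(b"example-template")[0])   # True
print(device.verify(b"someone-else")[0])       # False
```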
Another area of threat is the application of location and tracking technologies to people (Clarke 1999b). By combining the tracking of devices with authenticated identities of individuals, enormously powerful social control mechanisms would become available to corporations and governments alike.
A substantial counter-movement is gathering momentum, which is intent on producing 'privacy-enhancing technologies' (PETs) that circumvent the 'privacy-invasive technologies' (the PITs), by ensuring anonymity (Clarke 1999a). A Workshop on these topics, entitled 'Privacy by Design', is being held as part of the Computers, Freedom & Privacy Conference in Toronto in early April.
Progress in addressing privacy concerns has been abysmally slow. This has resulted to a considerable extent from the slowness of technologists and policy-makers alike to appreciate just how substantial the privacy issues are.
In 1997, privacy advocates expressed outrage at the complete absence from the Gatekeeper drafts of so much as a mention of public policy issues. The word 'privacy' did not appear in the draft. As a result of a meeting with the then OGIT and its consultants, the final version of the report contained a short chapter that at least recognised the existence of concerns.
During 1998, the GPKA was formed to guide the implementation of infrastructure to support the Commonwealth Government's use of public keys to authenticate agencies, companies and citizens. Initially it excluded any representation of the public interest. Once again, privacy advocates expressed outrage, and once again the government responded, this time by adding a public interest advocate to the panel. See Greenleaf (1998). The initial appointee to that role was Prof. Graham Greenleaf, and since mid-1999 it has been the author of this paper.
Despite many, many attempts by Prof. Greenleaf and me, the GPKA has been extremely tardy in addressing the privacy aspects of the matter. During the first 18 months and 15 meetings of the GPKA's operations (in about 12 of which the public interest advocate actively participated), it has done no more than include in the accreditation guidelines for Certification Authorities (CAs) a requirement that they comply with the relevant provisions of the Privacy Act 1988.
As at early March 2000, the following matters still need to be addressed:
It is far from clear that these matters will be addressed in a satisfactory manner. The senior Attorney-General's Department executive on the GPKA, Peter Ford, has formally stated that he sees no need to go beyond the provisions of the Privacy Act 1988. This is despite the arguments presented to him in multiple documents and on multiple occasions that highlight the enormous differences between the technology of the 1970s (for which the OECD Guidelines were designed, and which the Privacy Act addresses) and that of the year 2000. A summary of the manifold inadequacies of current laws in the modern context is in Clarke (2000a).
The private sector appears to have held its collective breath, pending progress with the Commonwealth's efforts on PKI. A consultative group was formed during 1999 under the auspices of the National Office for the Information Economy (NOIE), with the intention of stimulating more rapid progress. It is called the National Electronic Authentication Council (NEAC). In early 2000, NEAC let a cluster of consultancy contracts intended to encourage more rapid development.
NEAC's 14 members include a single public interest advocate (Charles Britton of the Australian Consumers Association). Judging by the first two meetings that have been held, it appears that privacy is not being given much consideration at all in the NEAC context.
Meanwhile, the Attorney-General is continuing to defer the introduction of the long-promised Privacy Amendment (Private Sector) Bill. Judging by the extremely privacy-unfriendly nature of the 'key provisions' that the Attorney-General's Department released on 14 December 1999, that may be good news rather than bad. A critique of that document is in Clarke (2000b). See also ACS (2000).
If the Bill reflects the 'Key Provisions' document, then it is not a measure to provide consumers with privacy protections, but rather a bald-faced attempt to legitimate a flotilla of privacy-invasive practices of corporations. My paper suggests that about 50 major amendments would be necessary if such a Bill were to become worthy of support. Public confidence will be seriously undermined by this Bill, rather than being supported by it. Passage of the Bill would be to the very serious detriment of relationships between consumers and marketers.
Regrettably, four years into the PKI development process, policy-makers and technology-providers have still failed abysmally to appreciate the privacy risks inherent in PKI. Public acceptance of public key cryptography will be seriously undermined by this failure. I fear for the safety of the investments that companies have made in PKI.
ACS (2000) 'Privacy Bill needs much more work', Australian Computer Society, in The Australian, 15 February 2000, at http://www.anu.edu.au/people/Roger.Clarke/DV/ACS000215.html
Clarke R. (1994) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Info. Technology & People 7, 4 (December 1994), at http://www.anu.edu.au/people/Roger.Clarke/DV/HumanID.html
Clarke R. (1996) 'Cryptography in Plain Text', Privacy Law & Policy Reporter 3, 2 (May 1996), pp. 24-27, 30-33, at http://www.anu.edu.au/people/Roger.Clarke/II/CryptoSecy.html
Clarke R. (1997) 'Chip-Based ID: Promise and Peril', for the International Conference on Privacy, Montreal (September 1997), at http://www.anu.edu.au/people/Roger.Clarke/DV/IDCards97.html
Clarke R. (1998) 'Public Key Infrastructure: Position Statement', May 1998, at http://www.anu.edu.au/people/Roger.Clarke/DV/PKIPosn.html
Clarke R. (1999a) 'Anonymous, Pseudonymous and Identified Transactions: The Spectrum of Choice', Proc. IFIP User Identification & Privacy Protection Conference, Stockholm, June 1999, at http://www.anu.edu.au/people/Roger.Clarke/DV/UIPP99.html
Clarke R. (1999b) 'Person-Location and Person-Tracking: Technologies, Risks and Policy Implications' Proc. 21st International Conf. Privacy and Personal Data Protection, Hong Kong, September 1999, at http://www.anu.edu.au/people/Roger.Clarke/DV/PLT.html
Clarke R. (2000a) 'Beyond the OECD Guidelines: Privacy Protection for the 21st Century', January 2000, at http://www.anu.edu.au/people/Roger.Clarke/DV/PP21C.html
Clarke R. (2000b) 'Submission to the Commonwealth Attorney-General Re: 'A privacy scheme for the private sector: Release of Key Provisions' of 14 December 1999' January 2000, at http://www.anu.edu.au/people/Roger.Clarke/DV/PAPSSub0001.html
EPIC (1996-) 'Cryptography Policy', at http://www.epic.org/alert/
Froomkin A.M. (1996) 'The Essential Role of Trusted Third Parties in Electronic Commerce' Oregon L. Rev. 75,1 (Spring, 1996) 49-115
Greenleaf G.W. (1998) 'Gatekeeper leaves the door ajar on privacy' Privacy Law & Policy Reporter 5,1 (May 1998), pp. 1, 3-4
Greenleaf G.W. & Clarke R. (1997) 'Privacy Implications of Digital Signatures', IBC Conference on Digital Signatures, Sydney (March 1997), at http://www.anu.edu.au/people/Roger.Clarke/DV/DigSig.html
OGIT (1998) 'Project Gatekeeper: Implementation of Public Key Technology by Commonwealth Agencies' Office of Government Information Technology (now Office for Government Online), March 1998, via http://www.gpka.gov.au/working-groups/board-only/GPKA-documents/gatekeeperreport/GATEKEEPER.pdf
PKAF (1996) 'Strategies for the Implementation of a Public Key Authentication Framework (PKAF) in Australia', Miscellaneous Publication SAA MP75-1996, Standards Australia, October 1996, 88 pp.
Schneier B. (1996) 'Applied Cryptography' Wiley, 2nd Ed., 1996
Utah (1995-) 'Utah Digital Signature Program', Utah Division of Corporations and Commercial Code, at http://www.commerce.state.ut.us/digsig/dsmain.htm
Whittle R. (1996-) 'Public Key Authentication Framework: Tutorial', at http://www.ozemail.com.au/~firstpr/crypto/pkaftute.htm