Principal, Xamax Consultancy Pty Ltd, Canberra
Visiting Fellow, Department of Computer Science, Australian National University
Version of 14 October 1998
© Association for Computing Machinery Inc., 1998
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
This paper appears in the Communications of the ACM 42, 2 (February 1999) 60-67, Special Issue on Internet Privacy
This document is at http://www.rogerclarke.com/DV/CACM99.html
Public confidence in matters of online privacy seemingly lessens as the Internet grows. Indeed, there is mounting evidence that the necessary remedy may be a protective framework that includes (gulp) legislative provisions.
Cyberspace is invading private space. Controversies about spam, cookies and the clickstream are merely the tip of an iceberg. Behind them loom real-time person-location technologies including intelligent transportation systems, geo-location, biometric identification, 'hard' authentication techniques, and miniaturised processors embedded in plastic cards, anklets, watches, rings, products, product packaging, livestock, pets and people.
It's small wonder that lack of public confidence is a serious impediment to the take-up rate of consumer electronic commerce. The concerns are not merely about security of value, but about something much more significant: trust in the information society.
Conventional thinking has been that the Internet renders laws less relevant. On the contrary, this paper argues that the current debates about privacy and the Internet are the harbingers of a substantial shift. Because the United States has held off general privacy protections for so long, it will undergo much more significant adjustments than European countries.
Privacy is often thought of as a moral right or a legal right. But it's often more useful to perceive privacy as the interest that individuals have in sustaining a 'personal space', free from interference by other people and organisations.
'Personal space' has multiple dimensions, in particular privacy of the person (concerned with the integrity of the individual's body), privacy of personal behaviour, privacy of personal communications, and privacy of personal data. Information privacy refers to the claims of individuals that data about themselves should generally not be available to other individuals and organisations, and that, where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use. Definitional issues are examined in Clarke (1997a).
Information privacy has been under increasing threat as a result of the rapid replacement of expensive physical surveillance by what this author referred to in this journal over a decade ago as 'dataveillance': the systematic use of personal data systems in the investigation or monitoring of people's actions or communications (Clarke 1988).
Intensive data trails about each individual provide a basis for the exercise of power over them by public and private sector organisations. Profile data can be combined with sender-driven technologies, to 'push' customised information at each individual and thereby exercise significant influence over their behaviour, and reduce their freedom of thought and action.
Along with its dramatic impacts for social and economic good, the mainstreaming of the Internet has accelerated the privacy-negative impacts of information technology. Specific Internet privacy issues are examined in Clarke (1998). Marketing, technological and provider 'imperatives' are creating tendencies for hitherto anonymous events to be converted into identified transactions, resulting in yet more data trails available to be 'trawled' and 'mined'. To facilitate data consolidation, governments and corporations make spasmodic attempts to impose multi-purpose human identifiers (Clarke 1994). During 1998, for example, U.S. government agencies were actively seeking to establish two multi-purpose national id schemes, one building on consolidation of State driver licensing ids, and the other in the health care sector.
An important implication of the definition of privacy as an interest is that privacy has to be balanced against many other, often competing, interests, variously of the individuals themselves, of other individuals, of groups, and of society as a whole. The balancing process is political in nature, involving the exercise of power deriving from authority, markets, or any other available source.
Against the ravages of technology-driven privacy invasion, natural defences have proven inadequate. Data is increasingly collected and personalised. Storage technology ensures that it remains available. Database technologies make it discoverable. And telecommunications enables its rapid reticulation. Organisations are only faintly restrained by professional and industry association codes.
Cost constraints continue to diminish rapidly. In any case, economic limitations have simply not acted as a constraint. In the case of public sector data matching programs, for example, cost/benefit analysis is seldom performed voluntarily, there are many serious deficiencies in the few analyses that have become publicly available, and programs are continued even after they have been clearly demonstrated to be financially unjustifiable (Clarke 1995). In the private sector, many highly privacy-invasive practices are routinely undertaken by corporations precisely because they are economic, from the organisation's own, limited perspective. In the natural state, companies' interest in efficiency dominates consumer interests, and hence the exercise of corporate power over people is endemic.
Exercise of countervailing power by individuals has had limited impact. Perhaps the most effective 'campaign' to date has been the public's inaction. Business and governments in most advanced countries have attributed the slow adoption of electronic commerce to a severe lack of trust by consumers and small business in corporations and governments. Trust in electronic commerce is dependent on a complex of factors including consumer rights, freedom of expression and social equity. This article directly addresses only the central element of privacy.
Some government and corporate executives depict the increasing problems, and the inadequacies in the protective framework, as evidence that privacy not only is dead, but it ought to be dead. They perceive that the public are not to be trusted, and that the benefits of modern society must only be granted in return for substantially enhanced organisational access to personal data. The argument has been put forward in such contexts as the prevention of fraud on the public revenue, on credit grantors and on insurance companies; the efficient gathering of taxes; and the efficient marketing of goods and services. Law enforcement and national security are also recurring themes.
Claims of this kind seem to be badly out of touch with the realities of the networked world. The influence of the nation-state is under serious threat, because of the power of multi-national corporations, regionalism, globalism, the Information Infrastructure, and the new patterns of information society and information economy.
In the past, only the wealthy had ready access to trans-jurisdictionality (whereby transactions are contrived in such a manner that key elements occur in different countries or States) and extra-jurisdictionality (whereby a key element of a transaction occurs in a 'regulatory haven'). In the developed world, the Internet has dramatically changed the cost-profile of such manoeuvres, and with it their accessibility by organisations and people of lesser means.
The Internet is also enabling effective 'supra-jurisdictionality', whereby acts are subject to no jurisdiction at all. The 'high seas' have always required special legal treatment, and the law of space (such as the apportionment of liabilities for events that occur in earth-orbit) has added new challenges. The Internet creates the ability to contrive acts to take place in undefinable or undiscoverable geographic space, such that no courts (even of a powerful and bold country) could convincingly claim jurisdiction. Governments are thereby losing their existing power to impose their desires on their governed populations, and the threshold has slid down well below mega-corporations and the seriously rich.
Meanwhile, corporations are dis-integrating (in accordance with various fashions including outsourcing, downsizing, tele-commuting and virtualisation), in order to take advantage of the economies of small-organisation flexibility and adaptability, and owner-manager tendencies to under-quote and over-work. This is less evident than it might be, because ongoing concentration of financial control tends to mask the changes in organisational processes and structures. If these trends continue, power within western economies may come to be exercised through the scale and persuasiveness of alliances, rather than of single corporations.
Meanwhile, the Internet offers new opportunities for consumers and citizens to exchange information about the behaviour of organisations, and to organise actions against them. Corporations that seek to sustain collusive arrangements may be increasingly capable of withstanding government pressure, but less able to hold off consumer groups increasingly well-organised through constructive use of the Internet.
In short, rationalist economic precepts may be convincing to government officials, and the business efficiency criterion may be axiomatic to corporate executives; but they may not be convincing to their citizen-consumer adversaries. If a powerful populace of the mid-twenty-first century demands privacy, it might be quite capable of getting it. There is no irresistible force towards de-humanisation; we can choose.
A less extreme attack on the importance of privacy is the argument for greater 'transparency', expressed most lucidly in Brin (1998). Based as much on rampant, uncontrolled growth in visual surveillance as on the Internet, Brin argues that the technological imperative is irresistible; and that privacy protections are futile. He believes that privacy can only be sustained by focussing instead on freedom of information for everyone: to achieve privacy, rely on freedom, not secrecy.
Brin's argument can be most succinctly expressed as a question-answer pair: Q: Who will keep a watch on the watchers? A: The watched. His antidote is ubiquitous openness, with the powerful just as subject to visual and data surveillance as everyone else. Policemen will be judged by the viewers who (using the Internet) watch them watching others.
Brin's argument is based on the premise that the watchers will not exercise political power in order to preclude others from watching them. The history of societies suggests that there have always been uneven distributions of power, and that the powerful have had incentives, and in most cases the ability, to exercise their power, and to resist diminution of their power. It would appear that Brin's transparent society can only be achieved if the patterns repeated across millennia of human experience can be overturned in short order.
So his argument is undermined by the implicit presumptions that the less powerful are more powerful than the more powerful, that no-one will succeed in establishing enclaves of privilege, and that the actions of all will really be able to be monitored by all. Brin's counter-argument (private communication, 30 June 1998) is that the powerful will be only as successful in avoiding observation as they already are in resisting privacy laws that offend their own interests.
This (necessarily cursory) examination of alternatives to privacy protection concludes that they cannot deliver what is needed right now, which is a solution to the crisis in public confidence in information technology that is being brought to a head by the rapid growth and far-reaching impact of the Internet.
To address the unwillingness of consumers to transact electronically, some industry associations have established codes of conduct and trademarks, supplemented by audits and amended terms of contract. A cross-industry initiative of importance is TRUSTe, which is examined in an article elsewhere in this Issue [@@@INSERT CACM Feb 1999 REFERENCE]. Another is the WebTrust initiative of North American accounting associations.
In parallel, technological measures have been proposed. Some anonymous and pseudonymous mailers and web-surfing aids have been motivated by a desire to enable people to protect themselves against organisations. Others, however, have been driven by corporate appreciation that lack of confidence is bad for business, and undermines effective government. Examples of such tools include AT&T's Crowds [@@@INSERT CACM Feb 1999 REFERENCE], 'Anonymizer' [@@@INSERT CACM Feb 1999 REFERENCE], the Lucent Personalized Web Assistant (LPWA) [@@@INSERT CACM Feb 1999 REFERENCE] and Onion Routing [@@@INSERT CACM Feb 1999 REFERENCE].
An even more substantial standard has been developed by the business-funded World Wide Web Consortium (W3C). The Platform for Privacy Preferences (P3P) is an especially important architectural innovation [@@@INSERT CACM Feb 1999 REFERENCE].
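The architectural idea behind P3P can be made concrete with a sketch of a site's machine-readable privacy policy. P3P was still under development at the W3C when this paper was written; the fragment below follows the element names of the W3C's draft specification (POLICY, STATEMENT, PURPOSE, RECIPIENT, RETENTION, DATA), while the site name, URIs and data references are hypothetical, introduced purely for illustration.

```xml
<!-- Illustrative sketch only: element names follow the W3C P3P draft;
     the entity, URIs and data references are hypothetical. -->
<POLICY name="browsing" discuri="http://www.example.com/privacy.html">
  <!-- Who is making the promises -->
  <ENTITY>
    <DATA-GROUP>
      <DATA ref="#business.name">Example Pty Ltd</DATA>
    </DATA-GROUP>
  </ENTITY>
  <!-- Whether individuals can access data held about them -->
  <ACCESS><nonident/></ACCESS>
  <!-- One statement per cluster of data-handling practices -->
  <STATEMENT>
    <PURPOSE><current/></PURPOSE>          <!-- used only to complete the current activity -->
    <RECIPIENT><ours/></RECIPIENT>         <!-- not passed to third parties -->
    <RETENTION><stated-purpose/></RETENTION>
    <DATA-GROUP>
      <DATA ref="#dynamic.clickstream"/>   <!-- server logs of pages visited -->
    </DATA-GROUP>
  </STATEMENT>
</POLICY>
```

A P3P-aware browser or user agent would fetch such a policy, compare the declared practices against the user's stated preferences, and warn or withhold data where the site's practices exceed what the user is prepared to accept.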
What these various initiatives add up to is an emergent movement to recognise a form of intellectual property (IP) rights in personal data, vested in the individual, which that individual can trade. The nature of such IP rights will need to be significantly different from existing models like copyright, patent, trademark and designs. It will need to establish a default of individual control, but then envisage means whereby licences to use data may be granted by the individual subject to qualifications of the individual's choosing. It will also have to permit measured and explicit compromise of those IP rights by legal authority.
The idea of tradeable IP rights offends the purist notion of an inviolate individual, because it implicitly acknowledges the dominance of the economic model of humankind over the social perspective. On the other hand, it may well establish a more effective basis for the protection of individual rights than has previously been available. Moreover, although targeted at economic relationships, such property rights may quickly migrate towards governmental contexts as well. Public confidence in governments is under serious challenge because of their increasing capability and capacity to submit their populations to data surveillance. Property rights would reverse the onus, forcing governments to be explicit about precisely what compromises to the IP were required by law; which would in turn bring into public focus the justification offered for each legal incursion.
Much of the developed world has progressively legislated broad Fair Information Practices. Two decades ago, these were codified in the OECD's 1980 Guidelines (OECD 1980 - see sidebar). The European Union has recognised the need for a tightening of the provisions, and a Directive came into effect in October 1998 (EU 1995).
There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.
Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.
The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfilment of those purposes.
Personal data should not be disclosed, made available or otherwise used for additional purposes, except with the consent of the data subject or by the authority of law.
Personal data should be protected by reasonable security safeguards.
There should be a general policy of openness about developments, practices and policies with respect to personal data.
An individual should have the right to obtain data relating to himself, and to challenge data relating to himself.
A data controller should be accountable for complying with measures which give effect to the principles stated above.
In most economically advanced countries, particularly within Europe, OECD-compatible laws apply to the public sector, and in many they apply to the private sector as well (GILC 1998).
As regards the virile U.S. private sector, on the other hand, successive governments have heeded the demands of business, and resisted calls from the public for effective privacy protection laws. That stance has been based on the presumptions that economic efficiency is the greater good; that the market is capable of solving all problems; that administrative efficiency is a protector of privacy; and that what economists call 'moral suasion' is all that is needed to encourage corporations to act responsibly.
In the absence of a unifying framework, issues that catch public attention lead to sporadic, knee-jerk legislation. The United States has a vast number of statutes that impose privacy regulation, both at federal level, and in most States (Smith 1997), and inconsistencies abound. For example, video-rental records are subject to significant protections, whereas a vast array of significantly more sensitive personal data is not.
Market forces are subject to many imperfections. It is in the economic self-interest of individuals and corporations to exploit those imperfections, and to generate new ones. Unsurprisingly, self-regulation has continually demonstrated itself to be inadequate by itself, and only of value if it is instituted within a context. Although the Internet creates the prospect of co-ordinated consumer and citizen action, it would be premature to expect that the present imbalance of power between organisations and individuals is about to be overturned. Hence it is unrealistic to expect privacy to be adequately protected in the absence of intervention into government agency and marketplace behaviours.
Corporate marketing activities on the Internet have brought this need sharply into focus. During 1995-98, the U.S. Federal Trade Commission examined the behaviour of corporate web-sites. In June 1998, it concluded that "the Commission has not seen an effective self-regulatory system emerge". It recommended "that Congress develop legislation to require commercial Web sites that collect personal identifying information from children 12 and under to provide actual notice to the parent and obtain parental consent". The FTC had no difficulty finding a Senator to sponsor the Bill, and it made clear that this was very likely to be merely the first instalment (FTC 1998). Meanwhile, the Clinton Administration has been adapting its stance (Gore 1998). In the United States, advocating statutory privacy protections has long been politically risky; during 1998, it rapidly mutated into a politically desirable stance.
It might seem incongruous to advocate legislative intervention at a time, and in a context, in which the capacity of nation-states to enforce laws is decreasing. But the power of nation-states is decreasing, not vanishing. In the information society and economy, law, like location, will still matter. In what may well be turbulent times in the early twenty-first century, individuals and corporations alike will seek to operate in locations where they and their staff are relatively secure, and relatively confident in the rule of law.
Privacy protections demand a multi-tier approach, involving individuals, organisations, industry associations and governments, operating within a legislative framework. Comprehensive organisational, procedural and technical measures are necessary, backed by mechanisms that exercise control over non-compliant organisations. Legal stiffening is needed behind self-regulatory arrangements, in order to encourage compliance (by creating economic and social incentives for 'good citizenship'), and to discourage non-compliance (by creating social and economic disincentives, such as higher cost-profiles, and formal sanctions). The model, outlined in the sidebar, has been usefully described as 'co-regulatory'. An effective implementation is to be found in New Zealand (NZ 1993).
The features outlined in the sidebar are not alternatives, but are mutually inter-dependent. All need to be in place, and need to be in place before netizens and small businesspeople will be confident about their electronic transactions with large corporations and with government.
'Fair information practices' (FIP) principles, which originated in the late 1960s, appeared appropriate to the expanded data processing capabilities of the time. They have, however, proven unable to adapt to the ravages of technological advance, and are entirely inadequate for the dramatically more powerful, network-based I.T. of the 2000s. The precepts on which privacy-protective infrastructure for the twenty-first century needs to be built must transcend the limited principles of 'fair information practice'.
Although the OECD is currently discussing the application of its 1980 Guidelines to the Internet, it asserts that they do not require revision (OECD 1998). The OECD's motivation was always primarily economic rather than social, and it is under pressure from several quarters to ease the restrictions on access to personal data, rather than to enhance protections.
The current OECD stance is untenable. The information privacy principles on which regulatory regimes are based must be extended beyond mere FIP, in the following ways:
The legal concept of privacy is traditionally traced to a Harvard Law Review article by Warren and Brandeis at the close of the 19th century. It has come into sharp focus since the mid-twentieth century because of information-intensive practices supported by rapidly evolving information technologies.
At the close of the 20th century, the Internet is having profound impacts on aspects of our lives as diverse as our place and patterns of work, the means whereby we interact, who we interact with, and the cultures within which we live. It should not be surprising that the Internet's impacts and effects on freedoms, and on the concepts underlying our laws, are profound as well.
Privacy is one of several interests in information that are greatly affected by the Internet. They need to be re-considered in the context of the now well-established notions of information economics, and the emergent concept of information law. A form of intellectual property rights in data about oneself needs the opportunity to mature very quickly.
Privacy has always been about trade-offs, and information law will involve the formalisation of balancing processes between ownership and access, and between freedoms to know, to publish and to express, on the one hand, and freedoms to be, to hide, and to deny, on the other. The information economy is dependent on trust, trust has to be earned, and intrusion-permissive and intrusion-enabling arrangements preclude trust.
Privacy is both sustainable and a necessary focal point of the information society, firstly as a means of resisting the commoditisation of human beings, and secondly as a means of enabling electronic commerce and electronic service delivery. Industry self-regulation and the development and application of privacy-enhancing technologies are necessary, but they are not sufficient. This paper has outlined the necessary privacy-protective framework.
The paper has further argued that the principles around which this framework revolves must extend well beyond the outdated set codified in the 1980 OECD Guidelines, in order to cope with the last quarter-century's dramatic enhancements to the capabilities and capacity of information technology.
The Internet is continuing to release a great deal of pent-up energy, in areas as diverse as inter-personal relationships, the processes and even the very concepts of community, and the processes of commerce. The threats it embodies for individuals' interest in sustaining a private space are severe. The dam wall is breaking: Americans must now join the rest of the world in accepting that legislation and a publicly-funded watchdog are essential elements within a privacy-protective framework for the information society and economy.
Brin D. (1998) 'The Transparent Society' Addison-Wesley, 1998
Clarke R. (1988) 'Information Technology and Dataveillance' Commun. ACM 31,5 (May 1988), at http://www.rogerclarke.com/DV/CACM88.html
Clarke R. (1994) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Info. Technology & People 7,4 (December 1994), at http://www.rogerclarke.com/DV/HumanID.html
Clarke R. (1995) 'Computer Matching by Government Agencies: The Failure of Cost/Benefit Analysis as a Control Mechanism' Informatization and the Public Sector (March 1995), at http://www.rogerclarke.com/DV/MatchCBA.html
Clarke R. (1996a) 'Privacy and Dataveillance, and Organisational Strategy', Proc. Conf. EDPAC'96, Perth, 28 May 1996, at http://www.rogerclarke.com/DV/PStrat.html
Clarke R. (1996b) 'Identification, Anonymity and Pseudonymity in Consumer Transactions: A Vital Systems Design and Public Policy Issue', Conference on 'Smart Cards: The Issues', Sydney, 18 October 1996, at http://www.rogerclarke.com/DV/AnonPsPol.html
Clarke R. (1997a) 'Introduction to Dataveillance and Information Privacy, and Definitions of Terms' August 1997, at http://www.rogerclarke.com/DV/Intro.html
Clarke R. (1997b) 'Chip-Based ID: Promise and Peril', Proc. Int'l Conf. on Privacy, Montreal, 23-26 September 1997, at http://www.rogerclarke.com/DV/IDCards97.html
Clarke R. (1998) 'Information Privacy On the Internet: Cyberspace Invades Personal Space' Telecommunication Journal of Australia 48, 2 (May/June 1998), at http://www.rogerclarke.com/DV/IPrivacy.html. This paper is drawn from a more detailed paper, at http://www.rogerclarke.com/DV/Internet.html
EU (1995) 'The Directive on the protection of individuals with regard to the processing of personal data and on the free movement of such data', European Commission, Brussels, 25 July 1995, at http://www2.echo.lu/legal/en/dataprot/directiv/directiv.html
FTC (1998) 'Privacy Online: A Report to Congress', Federal Trade Commission, June 1998, at http://www.ftc.gov/reports/privacy3/toc.htm
GILC (1998) 'Privacy And Human Rights: An International Survey of Privacy Laws and Practice' Global Internet Liberty Campaign, Washington DC, September 1998, at http://www.gilc.org/privacy/survey/
Gore (1998) 'Vice President Gore Announces New Comprehensive Privacy Action Plan For The 21st Century', Office of the Vice-President, Washington DC, 14 May 1998, at http://www.epic.org/privacy/laws/gore_release_5_14_98.html
NZ (1993) Privacy Act 1993 (NZ) at http://www.knowledge-basket.co.nz/privacy/legislation/legislation.html
OECD (1980) 'Guidelines on the Protection of Privacy and Transborder Flows of Personal Data', Organisation for Economic Cooperation and Development, Paris, 1980, at http://www.oecd.org/dsti/sti/it/secur/prod/PRIV-en.HTM, accessed 3 April 1998
OECD (1998) 'Implementing the OECD 'Privacy Guidelines' in the Electronic Environment: Focus on the Internet', Committee for Information, Computer and Communications Policy, Organisation for Economic Cooperation and Development, Paris, May 1998, at http://www.oecd.org/dsti/sti/it/secur/news/
Smith R.E. (1997) 'Compilation of State and Federal Privacy Laws' Privacy Journal, Providence RI, 1997