
Beyond 'Fair Information Practices':
A New Paradigm for 21st-Century Privacy Protection

Roger Clarke

Principal, Xamax Consultancy Pty Ltd, Canberra

Visiting Fellow, Department of Computer Science, Australian National University

Work-in-Progress Version of 1 February 1998

© Xamax Consultancy Pty Ltd, 1997, 1998

Paper being prepared for submission to a leading journal of public policy and I.T.


Since their inception in about 1970, privacy protection laws throughout the world have been developed within a particular framework commonly referred to as 'fair information practices'.

Data surveillance technologies have undergone enormous development during the subsequent quarter-century. As a result, protective frameworks based on the fair information practices notion are no longer adequate to satisfy the public need. The public feels the need for effective laws, policies, procedures and practices ever more keenly.

This paper presents a new paradigm for information privacy protections, and proposes that all proposals for new and amended laws be tested against this framework.

It is conceived not as a revolutionary mechanism that sweeps out the old schemes and replaces them with new ones. Rather, it is conceived as an evolutionary path, whereby existing protections can be sustained, refined, strengthened and extended, and new protections can be added to existing regimes.

The key concepts underlying the new paradigm are the justification of system purposes, the development and deployment of privacy-enhancing technologies, institutionalised resistance to multi-purpose identification schemes, and the widespread application of anonymity and pseudonymity.





This paper draws on a quarter-century of my own research, including many references to prior analyses of particular parts of the whole problem. As indicated in the Postscript, it is less thoroughly referenced to other people's publications than would be desirable. This is excused on the basis of the urgency of injecting the proposal into the public policy debate.

plan of the paper


This segment of the paper provides introductions to a number of matters that underlie the argument. It commences with a review of the concept and technologies of data surveillance. It then examines the concept of information privacy. Finally, an introduction is provided to the contemporary paradigm within which privacy protective regimes have been developed, viz. 'fair information practices'.

* Data Surveillance

Surveillance is the systematic investigation or monitoring of the actions or communications of one or more persons. Its primary purpose is generally to collect information about them, their activities, or their associates. There may be a secondary intention to deter a whole population from undertaking some kinds of activity.

The basic form, physical surveillance, comprises watching and listening (visual and aural surveillance). Monitoring may be undertaken remotely in space, with the aid of image-amplification devices like field glasses, infrared binoculars, light amplifiers, and satellite cameras, and sound-amplification devices like directional microphones; and remotely in time, with the aid of image- and sound-recording devices. Several kinds of communications surveillance are practised, including mail covers and telephone interception. The popular term electronic surveillance refers both to augmentations of physical surveillance (such as directional microphones and audio bugs) and to communications surveillance, particularly telephone taps.

These forms of direct surveillance are commonly augmented by the collection of data from interviews with informants (such as neighbours, employers, workmates, and bank managers). As the volume of information collected and maintained has increased, the record collections (or personal data systems) of organisations have become an increasingly important source.

The term 'dataveillance' was coined in Clarke (1988) to refer to the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons.

Personal surveillance is the surveillance of an identified person. In general, a specific reason exists for the investigation or monitoring. Mass surveillance is the surveillance of groups of people, usually large groups. In general, the reason for investigation or monitoring is to identify individuals who belong to some particular class of interest to the surveillance organization.

Personal surveillance is an important weapon in the fight against such social evils as terrorism and organized crime. It is used to collect evidence in civil cases. It is also a means of learning sufficiently embarrassing facts about a person to assist in discrediting him or her in the eyes of some other person or group, or buying his or her silence or agreement. At its most secret, it can deny the subject natural justice, and at its most open, it can be tantamount to coercion or blackmail.

Personal-surveillance activities are undertaken by "private investigators" for corporate and personal clients. The majority of these activities, however, are undertaken by staff employed by government agencies, including police, national security, customs, and telecommunications officials.

Mass surveillance is difficult to discuss dispassionately because of the impact on our culture of the anti-utopian novels, particularly 1984. Its primitive forms include guards on raised walkways and observation turrets. More recently, closed-circuit television has offered a characteristic that significantly enhances the effectiveness of surveillance: The subjects, even if they know they are subject to monitoring, cannot know precisely when the observer is actually watching.

Dataveillance is substantially cheaper than other forms. As a result, it is tending to supplant physical monitoring. Moreover, it has made feasible the observation of far more individuals and far larger populations than ever before. Dataveillance has accordingly been increasingly implemented as a means of protecting public safety and the public revenue.

Benefits accrue from dataveillance; but so too do significant risks to the public. What is monitored is not the person, but the person's 'data shadow' or 'digital persona' (Clarke 1994a). Hence erroneous inferences are very easily drawn. At the sociological level, dataveillance generates a 'chilling effect' that restrains people from exercising their freedoms. The dangers are considered in detail in Clarke (1988).

* Information Privacy

Privacy is often discussed as though it were a moral or constitutional right. For analytical purposes, it is much more useful to define it as an interest that people have (Morison 1973):

Privacy is the interest that individuals have in sustaining a 'personal space', free from interference by other people and organisations.

It is not a single interest, however, but rather has several dimensions:

With the close coupling that has occurred between computing and communications, particularly since the 1980s, the last two aspects have become closely linked, and are commonly referred to as 'information privacy':

Information Privacy is the interest an individual has in controlling, or at least significantly influencing, the handling of data about themselves.

An important implication of the definition of privacy as an interest is that it has to be balanced against many other, often competing, interests. At the level of an individual, it may be necessary to sacrifice some privacy, in order to satisfy another interest. The privacy interest of one person may conflict with that of another person, or of a group of people, or of an organisation, or of society as a whole. Hence:

Privacy Protection is a process of finding appropriate balances between privacy and multiple competing interests.

Establish that there are considerable concerns

Analyses of privacy-intrusive behaviours and technologies are available (e.g. Rule 1974, Foucault 1977, Kling 1978, Rule et al 1980, Burnham 1983, Marx & Reichman 1984, OTA 1985, OTA 1986, Roszak 1986, Laudon 1986, Clarke 1988, Flaherty 1989, Bennett 1992, Davies 1992, Clarke 1993a).

* Fair Information Practices

Around the world, information privacy protections display a number of variants. All, however, can be classified as 'fair information practices' (FIP) legislation. The essential postulate of FIP is that the efficiency of business and government should not be hindered.

The origins of FIP lie in the foundation work of Columbia University professor Alan Westin (Westin 1967, 1971; Westin & Baker 1974). In those early years of personal data systems, the dominant school of thought, legitimised by Westin's publications, was that the invisible economic hand of business and government activity would ensure that IT did not result in excessive privacy invasion. Hence privacy regulation was unnecessary, or, to the extent that it was imposed, it was essential that the detrimental effects on business and government be minimised.

During the 1970s (which the Chair of the OECD Expert Group later described as 'the decade of privacy'), a great deal of legislative activity occurred, particularly in the legislatures of countries on the Continent of Europe. Some States of the United States of America also enacted laws around this time. At the federal level, the efforts of President Ford's administration succeeded in emasculating the U.S. Privacy Act of 1974 (Rule 1980, pp.75, 110).

The OECD, concerned that a proliferation of varied privacy protection laws might harm economic growth by creating accidental trade-barriers, codified the FIP-based regime in the OECD Guidelines (1980). Legislation passed subsequently by many other countries reflects those Guidelines. A re-structuring of the OECD Guidelines into a form suitable for the creation of new schemes or the evaluation of existing ones is at Clarke (1989).

The FIP/OECD approach is concerned with 'data protection': it protects data about people, rather than people themselves. This is justified on the pragmatic grounds that data protection is an operational concept more easily coped with by business and government agencies than the abstract notion of privacy, and that it is therefore easier to produce results. It is argued in this paper that the intervening quarter-century has demonstrated quite comprehensively that, pragmatic or not, FIP-based privacy protection laws have not delivered what humans actually need.

For reviews of the origins of FIP laws and guidelines, and collections of contemporary privacy protection regimes, see Smith (1974-), Flaherty (1986), Bennett (1992, pp.96-101), and Madsen (1992).

Inadequacies of the Fair Information Practices Approach

Privacy protection regimes that adopt the FIP approach provide, with qualifications, worthwhile protections. However they have failed to satisfy the real public need. This section catalogues the weaknesses of FIP-based schemes, as a basis upon which a new paradigm can be developed.

The headings under which the weaknesses are developed are as follows:

* Failure to Enact Comprehensive Privacy-Protective Laws

A fundamental requirement for a privacy protective regime is that national and regional legislatures recognise the need, and enact appropriate legislation. Many countries have done so, but many others have not. Most of the world's richest countries have at least some form of legislation, and many of the countries that do not are less well developed in economic terms. Coupled with privacy's emergence as a matter of consequence only during the second half of the twentieth century, this lends credence to the presumption that privacy is highly valued only once an acceptable level of economic well-being is achieved.

Moreover, privacy-protective laws evidence many weaknesses in comparison with the norms set by the OECD Guidelines (1980; see also Clarke 1989).

The following sections highlight some of the most important factors involved.

Legislation needs to:

* Failure to Preclude Unjustified Personal Data Systems

A direct implication of the FIP approach has been the legitimation of existing practices. Moreover, FIP-based privacy laws permit virtually any new practice to come into existence. Any use of any personal data is permitted, by any organisation, for any purpose, provided that it is handled 'fairly'.

A genuinely privacy-protective environment would include means whereby organisations can be called to account, and their justification for existing and intended new practices subjected to examination. Such a law would encompass several elements.

(a) Onus of Proof

The Australian Privacy Charter expresses it this way: "The onus is on those who wish to interfere with privacy to justify doing so" (Australian Privacy Charter 1994). A regime that meets the populace's needs would therefore:

(b) Failure to Require Justification

Claims that a system is justified need to be assessed using an appropriate method. Given that the tension is between a social interest and an economic one, mere financial analysis is inadequate; instead, the appropriate technique to apply is cost/benefit analysis. A description of CBA is provided in Clarke (1995a). Application of the technique to the evaluation of computer matching programs is examined in Kusserow (1984, pp.33-40), Privacy Commissioner of Canada (1989), Privacy Commissioner of Australia (1990), and Clarke (1995b).
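The weighing that CBA involves can be made concrete with a toy calculation. The sketch below is purely illustrative and is not drawn from any of the cited evaluations: the program, figures and discount rate are all invented, and, as the paper argues, a positive financial result alone would not settle the question.

```python
# Illustrative sketch only: the financial side of a cost/benefit analysis for a
# hypothetical data-matching program. All figures and the discount rate are
# invented for the example; a genuine CBA must also weigh non-quantifiable
# privacy costs against any positive financial result.

def npv(flows, rate):
    """Net present value of a list of yearly net cash flows (year 0 first)."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

benefits = [0, 400_000, 450_000, 450_000]               # recovered revenue per year
financial_costs = [250_000, 120_000, 120_000, 120_000]  # build cost, then running costs
net_flows = [b - c for b, c in zip(benefits, financial_costs)]

result = npv(net_flows, rate=0.07)
print(f"Financial NPV: ${result:,.0f}")
# Even a positive NPV does not of itself justify the program: the privacy
# detriment to the monitored population still has to be weighed against it.
```

The point of the exercise is that the calculable financial flows are only one input; the 'blanket rationale' Laudon describes skips even this step.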

As Laudon noted in relation to the U.S. public sector, "a pattern has emerged ... in which the identification of a social problem [such as tax evasion, welfare fraud, illegal immigration, or child maintenance default] provides a blanket rationale for a new system without serious consideration of the need for a system" (1986, p.385). Such 'blanket rationale' has no place in a genuinely privacy-protective environment.

(c) Failure to Ensure Public Information

It is insufficient for a requirement to be imposed without a control mechanism to ensure its satisfactory implementation. Credible justification for practices is therefore dependent on public availability of the information on which the organisation's justification for the system depends.

In the case of especially privacy-intrusive applications of information technology, a formal Privacy Impact Analysis is needed. Governments have accepted the principle that projects that have significant potential impact on the physical environment require careful and independent assessment prior to a commitment being made to them. In the same way, Governments must not commit to IT projects which have significant privacy impacts, until after those impacts have been submitted to a public assessment process.

In some cases, it may be unreasonable to ask an organisation to make its full strategy publicly available, because of the harm this might do to the system's effectiveness, or possibly because of the costs involved. As a proxy for full public availability, it may be appropriate for the control to exist in the form of availability of the information to an organisation that is independent of the proponent, and that enjoys the public's confidence in relation to such matters, e.g. a privacy protection agency, together with appropriate public reporting by that organisation.

* Failure to Control the Purposes of Personal Data Systems

The lynchpin of data protection within FIP-based regimes is the limitation of data-handling only to those purposes that are:

There are several critical weaknesses that undermine the effectiveness of this nominal protection.

(a) Justification of the Purpose of Personal Data Systems

FIP schemes impose almost no controls on the original purposes of collection. In some regimes, for example, statements of purpose are used that are constructively abstract or vague, and hence all-inclusive. Examples include the 'routine uses' clause much-loved among agencies subject to the U.S. Privacy Act, 'for the purpose of government (or taxation, or welfare) administration', and 'to enable the organisation to perform its statutory functions'.

A genuinely privacy-protective regime must require more than that the purpose or purposes of personal data be stated, and perhaps published. It must:

A conventional defence against the imposition of meaningful privacy-protection measures of this nature is that they represent an imposition on the conduct of business, and are expensive. This need not be so. Contrary to the presumptions made by the framers of some 1970s legislation (notably the U.S., U.K. and Australian statutes), it does not require the maintenance of a register of systems and their purposes. The appropriateness of many data collection, use and disclosure activities will rarely, if ever, be called into serious question, and hence such registers are largely a waste of time, effort and money.

What is needed is for the onus to be set such that all organisations have a responsibility in law to respond to requests for justification of usage. The power to challenge justification may be:

It can be argued that the arbitration of the acceptability of system purposes is best left to conventional, vague social and political processes, including letters to the editor, talk-back radio, and marches in the streets. If so, then it is essential that legal requirements be established relating to the public availability of information, and public consultation on major systems.

If, on the other hand, a legal authority is created whereby an organisation's justification can be rejected, that power may be vested in:

(b) Justification of Changes of Purpose

Some regimes provide for organisations to declare a change in system purposes. This increases the malleability of purpose, and hence further undermines this most fundamental element of FIP-based data protection.

In a genuinely privacy-protective regime, it is essential that controls exist over changes in system purpose. The mechanisms described in the previous section need to extend beyond original purposes, to encompass changes in purpose.

(c) Justification of the Relevance of Data to a Decision

It is common for a requirement to exist that data be relevant to a decision. It is equally common, however, that the decision-maker is under no compulsion to justify its contention that data is relevant.

An effective regime would:

* Failure to Control the Legal Authorisation of Use and Disclosure

A further serious weakness common to many implementations of FIP-style privacy protection is the provision of legal authorisation for disclosures of such scope that privacy protection is undermined. Examples are the blanket authorisation of disclosures that relate to law enforcement activities, and, in the Australian legislation, even to the enforcement of laws imposing "a pecuniary penalty".

It is clear that effective mechanisms are needed whereby agencies of government, including but not limited to law enforcement, and indeed organisations outside the public sector, can gain access to personal data in the performance of their functions. It is essential to privacy protection, however, that:

In particular, a genuinely privacy-protective regime would demand:

* Insufficient Scope of Privacy Laws

Many FIP-based privacy protection regimes are of limited application. Some of the limitations are intentional, whereas others are less obviously so, or were clearly accidental.

(a) Lack of Coverage of the Private Sector

The most apparent limitation is that in some countries the laws were expressly applied only to government agencies, on the grounds that government databases created the greatest dangers, that natural processes would limit the intrusiveness of companies, or that governments should gather experience with the public sector first prior to imposing regulation on the private sector. It is all-too-apparent that privacy threats arise in the private sector that can only be successfully addressed through governmental regulatory initiatives.


(b) Records versus Information

Some laws refer to 'records containing personal information'. This deflects attention away from personal information and towards particular I.T. mechanisms, and creates opportunities for schooldays-debaters-turned-barristers to ply their trade against the public interest. Genuine privacy-protective law applies to all data, and its applicability should not be constrained through the use of qualifiers.

(c) Identification

Some laws apply only where the person's identity is apparent. In the case of the Australian law, the scope is further constrained by the requirement that the identity needs to be able to be apparent or to be reasonably ascertained "from the information or opinion". This leaves unregulated circumstances in which the identity of the person is apparent from context, or where it can be ascertained only by the addition of further information.

Genuine privacy-protective law applies to all data that is capable of being associated, through any means, with an identified person.

(d) People

It is highly desirable that the privacy-protective regime apply to people generally, and not to some sub-set or sub-sets, such as citizens, permanent residents, or people in Australia.

Some FIP-based privacy regimes do not apply to every data subject. For example, a regime may be applicable only to citizens, or only to people resident in the country. More subtly, the Australian Privacy Act precludes investigation of a breach of the Alteration Principle unless the person concerned is either an Australian citizen or has rights of permanent residence.

Genuine privacy-protective law applies to all data that is capable of being associated, through any means, with an identified person, and is subject to no qualifications based on who the person is.

(e) 'Public Registers'

Many FIP-based schemes are qualified in relation to what are sometimes referred to as 'public registers', such as electoral rolls and land titles registers. Some schemes even exempt such data collections. For example, the Australian Privacy Act fully exempts "generally available publications". As is argued in Clarke (1997d), qualifications to privacy laws for such registers are unjustifiable. As with many other personal data collections, however, the law needs to be applied and interpreted in a manner that reflects the particular circumstances.

Genuine privacy-protective law applies to all personal data, including data in so-called public registers.

(f) Limitation to Data Protection

As discussed earlier, privacy has many dimensions. Information privacy is a crucial concern, and is the focus of this paper. It is inadequate, however, to address information privacy alone, and ignore the privacy of the person and the privacy of individual behaviour.

Some privacy-protection regimes provide the privacy watchdog with responsibilities and commensurate powers to research into, and advise government in relation to, such matters. In a few instances, such as the N.S.W. Privacy Committee, the agency also has complaints-investigation powers.

Genuine privacy-protective law encompasses all dimensions of privacy.

* Uncontrolled Exemptions and Exceptions

An FIP-based privacy scheme permits manifold exemptions and exceptions, based on the cornerstone objective of avoiding inconvenience to business and government.

This involves the treatment of privacy principles as being 'generally applicable' rather than 'universally applicable'. By 'universal' is meant that they are applicable in all contexts and do not admit of exceptions. The Macquarie Dictionary defines a principle in this sense as a "fundamental or primary truth, doctrine or tenet, relating to the requirements and obligations of right conduct".

A 'general principle', on the other hand, is a statement that can be avoided by anyone who can claim special privilege. As argued in Clarke (1997d), to permit Principles to be subject to exemptions or exceptions is to permit them to be undermined, both symbolically and practically.

Universal principles need to be phrased very carefully, in order that they have the effect that is intended: balance between privacy and other interests, and indeed among the privacy interests of multiple people, is essential.

A genuinely privacy-protective regime establishes universal Principles, and applies them to all organisations, and to all systems. It does not permit powerful interests to arrange for particular organisations or systems to slide out from beneath their responsibilities.


The general principles of information privacy must be applied to all agencies and all systems, and regulatory regimes to all programs. The widely practised arrangement of exempting whole classes must therefore be rolled back, and, in particular, the favoured status traditionally granted to defence, national security, criminal intelligence, law enforcement and more recently child support agencies and activities must be rolled back. Parliaments must make these agencies understand that they are subject to democratic processes and that their distinctive and challenging functions and operational environments dictate the need for careful formulation and implementation of privacy protections, not exemption.


* Exemptions Instead of Balanced Implementation

The Privacy Act blindly adopted the exemption classes that had been established a decade earlier in relation to the Freedom of Information Act.

Any form of exemption, whether of classes of data, system, organisation or anything else, is a very blunt weapon, because it creates a void within which uncontrolled abuses can occur.

The appropriate approach is careful implementation of universal principles such that all interests are protected.

* Exceptions to Use and Disclosure Protections

The Privacy Act contains an exception to the use and disclosure conventions that is so broad as to cripple the entire statute. In common with the Telecommunications Act, there is an exception for "[use or disclosure for any purpose] reasonably necessary for enforcement of the criminal law or of a law imposing a pecuniary penalty, or for the protection of the public revenue".

This is an invitation to all agencies subject to the Act to use and disclose personal data for virtually any purpose. Such an uncontrolled power has no place in the legal framework of a free nation. To comply with the public's needs, it must be simply deleted, forcing each such use and disclosure to be the subject of explicit legal authorisation.

* General Versus Universal Principles

This matter is vitally important, and a separate brief paper addresses the question of Exemptions from General Principles versus Balanced Implementation of Universal Principles.


Care must be taken to ensure that exemptions do not rob privacy protection legislation of its effectiveness. The general principles of information privacy must be applied to all agencies and all systems, and the regulatory regime for computer matching to all programs. The widely practised arrangement of exempting whole classes must therefore be rolled back.

It is entirely reasonable, on the other hand, for the specific nature of controls to reflect the particular features of an organisation, system or program. This applies particularly to operations whose effectiveness would be nullified in the event of full or even partial public disclosure. In such instances, the privacy protection agency needs to be explicitly nominated as the proxy for the public, authorised in law to have access to sensitive operational data, but precluded in law from disclosing details to the public. Not only government agencies but also government business enterprises and private sector organisations have tenable claims for exceptional treatment along these lines.

Finally, the favoured status traditionally granted to defence, national security, criminal intelligence and law enforcement agencies must be rolled back. Parliaments must make these agencies understand that they are subject to democratic processes and that their distinctive and challenging functions and operational environments dictate the need for careful formulation and implementation of privacy protections, not for exemption.


* Failure to Sustain Anonymity

Failure to stem the tide of conversion of anonymous into identified transaction streams


From Anonymity to Identification

I.T.'s capacity is tempting organisations to convert many hitherto anonymous transactions into identified ones. The web of old, new, and near-future data trails is intensifying around us.

In general, consumer transactions and government programs should permit anonymity, except where clear justification for some degree of identification is demonstrated. Schemes involving direct identification should require very careful justification, which should be published in order to enable public scrutiny.

Moreover, it is commonly assumed that the choice lies between identification and anonymity. This is simply not the case, because indirect identification, or pseudonymity, is available as a means of achieving a trade-off among the various interests. Maximum use should be made of I.T.'s capacity to support pseudonymous transactions and trails.
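One way I.T. can support such indirect identification can be sketched in a few lines. The scheme below is illustrative only, not one proposed in this paper: it assumes a trusted party (say, a privacy protection agency) holds a secret key, so that a person presents a stable pseudonym within one context, while pseudonyms across contexts cannot be linked without that party's cooperation.

```python
# Illustrative sketch of pseudonymous identifiers, assuming a hypothetical
# trusted key-holder. Keyed hashing (HMAC) makes each pseudonym stable within
# one context, but unlinkable across contexts without the secret key.
import hashlib
import hmac

SECRET_KEY = b"held-only-by-the-trusted-party"  # hypothetical key

def pseudonym(person_id: str, context: str) -> str:
    """Derive a context-specific pseudonym from a real identity."""
    msg = f"{context}:{person_id}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()[:16]

# The same person yields a stable pseudonym within one context...
assert pseudonym("jane.citizen", "library") == pseudonym("jane.citizen", "library")
# ...but unrelated pseudonyms in different contexts.
assert pseudonym("jane.citizen", "library") != pseudonym("jane.citizen", "clinic")
```

The design point is that linkage back to the person remains possible, but only through the key-holder, which is where legal controls can be attached.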

Nothing in the Australian privacy-protection regime provides any resistance against the tendencies toward increased data-intensity, and away from anonymous towards identified schemes; and there is little evidence of pressure by privacy watchdogs towards pseudonymous schemes.


Organisations are seeking to exploit the ongoing technological revolutions, and are trying to convert hitherto anonymous transactions into identified ones. Examples include:

Government agencies are frequently in a position to legally impose on individuals the condition that they identify themselves when performing particular kinds of transactions. Corporations may use a combination of inducements and market power to achieve the same end.

* Failure to Negate the Monolithic State


The notion of monolithic government must be expressly denied. Parliaments must make clear that agencies are independent organisations for the purposes of data transfer. The recent tendency towards 'super-ministries' and 'mega-departments', arising from the conventional presumption that economies of scale know no bounds, must be reversed; for example, all welfare appears to be gravitating toward a mega-Social Security Department, and all financial policing roles toward a mega-Taxation Office. The temptation toward power concentration must be resisted; and multi-function agencies must provide bulkheads between the data maintained in relation to their various functions, and recognise that dissemination of data between functions is subject to privacy regulation.


FIP-based regimes have failed to withstand the threat of the monolithic State, in which agencies appear as a monolith, supporting one another against scrutiny from outside the bureaucracy, e.g. by Cabinet, Parliament, the media and the public. Instances of this include the tendency (recently somewhat slowed by the Privacy Act) to exchange data with considerable freedom; the tendency towards cross-notification between agencies of data that may be of value to other parts of the monolith, e.g. of the change of address of a 'client'; and cross-system enforcement, whereby one agency withholds money or services from its client, in order to enforce a debt the client is claimed to owe to another agency.


It was noted earlier that, when it suits their interests, agencies adopt the attitude that the agencies of government form a monolith, and hence all data transfers are internal to government. This is inimical to the information privacy interest, and it is necessary for Parliaments to make clear that agencies are independent organisations for the purposes of data transfer, and that all data transfers are therefore subject to the rules regarding collection and dissemination.

In addition, there is a danger that privacy protections may be subverted by the concentration of functions and their associated data into large, multi-function agencies. Hence boundaries must be drawn not only between agencies but also between functions and systems.

'Virtual centralisation' of data by way of network connection, and data linkage via common identifiers, also embody extremely high information privacy risks. The 'national databank' agenda of the mid-1960s is being resuscitated by government agencies, and with it is coming pressure for a general-purpose identification scheme. These must be strenuously resisted if the existing balance of power between individuals and the State is not to be dramatically altered.

ALSO CFCL.html!!!

* Inadequate Institutional Countervailing Power

Clarke (1993a) identified ways in which government agencies have exercised their dominance over people, subject to only limited interference by FIP legislation:

* Lack of a Watchdog

Privacy protection regimes based on cases being brought by private citizens against the might of large information-rich and worldly-wise agencies have not worked, and are highly unlikely to do so in the future. To achieve appropriate balance between information privacy and administrative efficiency, it is necessary for an organisation to exist which has sufficient powers and resources to effectively represent the information privacy interest (Australian Law Reform Commission 1983; OTA 1986, pp.57-59, 113-122; and Flaherty D.H. 1989, esp. pp.359-407).

It has been argued that, in the United States, "a small agency of perhaps sixty professionals drawn from legal, social science and information systems backgrounds" would provide sufficient resources to address the problem (Laudon K.C. 1986, p.383). From the perspective of Australia and Canada, this would seem parsimonious for such a large country, but there can be little doubt that, given an appropriate set of powers, and sufficient independence from the Executive, it could make a significant difference to the balance of power.

It would be valuable to complement such a body with an effective option for individuals to prosecute and sue organisations which fail to comply with legal requirements. This can only come about if the requirements of organisations are made explicit, and this in turn is only likely to come about if detailed codes, prepared by a privacy protection agency on the basis of research and experience, are given statutory authority. In addition to valuable discussions in Privacy Protection Study Commission (1977), Australian Law Reform Commission (1983), Laudon (1986, pp.382-4), and Flaherty (1989), elements of all of these can be found in Australian and Canadian legislation and practice.

There are two competing models. The conventional one involves the agency being required to balance information privacy against other interests (such as administrative efficiency), and is based on negotiation and conciliation rather than adversary relationships. This risks capture of the privacy protection agency by the other much larger, more powerful, information-rich and diverse agencies, as is occurring in Australia. Laudon argues strongly for the alternative - an agency which is explicitly an advocate for information privacy, and which can represent that interest before the courts and Congress (Laudon 1986, p.384).

* Inadequate Watchdog

FROM PaperScared.html:

The Privacy Commissioner must have sufficient powers and resources to effectively represent the information privacy interest. This requires his involvement in the development of policy by governments and agencies, not just in its implementation. Labour-intensive activities designed to sap his office's energies, such as registration and digest publication, must be devolved to the agencies themselves.

FROM Flaws.html:

Some privacy laws contain requirements about the creation and maintenance of a register of personal data systems. The Commonwealth Privacy Act contains such a provision, although admittedly one less sapping of the Commissioner's resources than some statutes.

Such provisions are unnecessary, wasteful and provide a negligible contribution to privacy protection. They are merely a device for making it appear that something is being done and money is being spent. The cynic interprets them as a means of ensuring that the statutory authority is primarily an administrator rather than a watchdog.

The purpose of the OECD Public Access Principle (in the Commonwealth Act this is currently approximated by IPP 5 (1)) is to impose on all organisations a clear responsibility to provide the kinds of data that a member of the public needs, in order to understand the nature of the organisation's personal data holdings. Whether the information is maintained on a permanent basis is a decision for each organisation; a consolidated register is a worthless exercise.

* Inadequate Consultation with Advocates and Representatives

During the first eight years of the present regime, the Privacy Commissioner has consistently consulted with government agencies, and has consistently failed to involve privacy interest groups in an equivalent manner. The outcomes of consultations with agencies are not made available, on the grounds of confidentiality; and nor even are the results of surveys of agency practices. This dismissiveness appears, at least at times, to have extended beyond advocacy groups, to include State Government bodies, and in particular the N.S.W. Privacy Committee.

Concern has to exist that the low regard for privacy advocates may also be shared with the Commonwealth Government itself. That the privacy lobby was unpopular with the Labor Government of 1983-96 is not particularly surprising, because privacy law was forced upon it. But the current Government also has yet to establish any form of relationship with privacy advocacy groups.

* Inappropriate Handling of Data Sensitivity

FROM Flaws.html:

Data varies greatly in its sensitivity, depending on the data-item, the person concerned, and the circumstances. It is inadequate to assume that particular data-items (e.g. date-of-birth, marital status, number of dependants) have a particular, fixed degree of sensitivity. Australian privacy law does not adequately reflect the variability of data sensitivity.

There are already several ways in which data sensitivity may have to be considered in the Australian legal context, including:

An effective information privacy protective regime would:

* Failure to Preclude Inequitable Access to Services

An effective regime must also address the possibility of services being inaccessible to, or differentially accessible to, people who refuse to participate in particular, privacy-intrusive schemes.

* Failure to Preclude Multiple Use of Identification Schemes

The argument needs to be run that resistance here is the last bulwark against a national personal data system: multiple use of identification schemes lays the foundation for privacy protections to be undermined.

Examples include the national data systems of Denmark and Singapore, which amount to a permanent census in which all data is fully identified.



FROM MatchFrame.html:

Tight limitations must be placed on the multiple use of identification schemes. Agencies should use separate human identification schemes, and resist the temptation to converge on a single scheme. The risks of abuse are simply too high, despite the apparent efficiencies.

FROM IDCards97.html:

Many identification schemes are used by a single organisation, for a single purpose; but there are obvious attractions in sharing the costs across multiple organisations and business functions.

A special case of a multi-purpose id scheme is what is usefully described as an 'inhabitant registration scheme'. This provides everyone in a country with a unique code, and a token (generally a card) containing the code. It is typically used for the administration of taxation, national superannuation and health insurance. In some countries, it is used for additional purposes, such as the administration of social welfare and banking, and to ensure that particular rights are exercised only by the people entitled to them, such as the rights to vote, of residence, to work, of movement across the country's borders, and of movement within the country.

Inhabitant registration schemes are endured, and perhaps even welcomed, by the inhabitants of some countries; but are disliked, actively opposed, and undermined in many others.

Privacy-related protections vary from very little to moderately strong; but the very existence of such a scheme represents a threat against which mere 'data protection' or 'fair information practices' arrangements are almost an irrelevance. The public policy aspects of schemes of this nature are discussed in Clarke (1992) and Clarke (1994c).

To create a surveillance society, three conditions need to be fulfilled:

  1. there needs to be a range of personal data systems, each processing data for specific purposes;
  2. personal data systems must be connected via one or more telecommunications networks; and
  3. the data must be identified consistently.

The first two have been satisfied during the last two decades, as a result of the application of information technology. The third is accordingly the critical technical factor inhibiting the achievement of a surveillance society. Inhabitant registration schemes overcome that hurdle (Clarke 1988).
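The pivotal role of the third condition can be illustrated with a minimal sketch. The agency systems, identifiers and data values below are entirely hypothetical; the point is only that, once systems are networked, a shared identifier makes cross-system linkage a trivial join, whereas agency-specific identifiers (with no mapping between them) frustrate it:

```python
# Two hypothetical agency record systems, keyed by identifier.
tax_records = {"ID-1234": {"income": 52000}}
health_records = {"ID-1234": {"condition": "asthma"}}

def link(system_a, system_b):
    """Join two record systems on their record keys (the identifiers)."""
    return {
        key: {**system_a[key], **system_b[key]}
        for key in system_a.keys() & system_b.keys()
    }

# With a common identifier, a composite personal profile falls out at once.
profile = link(tax_records, health_records)
assert profile == {"ID-1234": {"income": 52000, "condition": "asthma"}}

# With an agency-specific identifier and no cross-mapping, the same join
# yields nothing: the third condition is not met.
health_records_separate = {"HS-9876": {"condition": "asthma"}}
assert link(tax_records, health_records_separate) == {}
```

The asymmetry is the argument of the paragraph above: the first two conditions are already satisfied, so whether records carry a consistent identifier is all that now separates separate systems from a de facto national databank.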

* Dominance of Administrative Efficiency over the Privacy Interest

The most critical aspect of privacy protections based on the FIP notion is that administrative efficiency has been treated as the fundamental need, with privacy protections being permitted only to the extent that they do not conflict with, or undermine, the more important need. The justification for this, advanced by Westin, was that administrative efficiency was the means whereby the privacy interest would be protected.

Rather than supporting individual freedoms, however, administrative efficiency has been shown generally to conflict with them (Rule 1974, Rule et al. 1980). Organisations have perceived their interests to dictate the collection, maintenance and dissemination of ever more data, ever more 'finely grained'. There has been a concomitant trend toward 'scientific management' and 'rational decision-models' - decision-making which is more precise, and based on detailed criteria and a significant amount of data.

Modern practices have little regard for what have been called 'atypical, idiosyncratic or extenuating circumstances' (Marx & Reichman 1984, p.436). Achieving a full understanding of the circumstances generally requires not only additional data which would have seemed too trivial and/or too expensive to collect, but also the application of common sense, and contemporary received wisdom, public opinion and morality.

These developments have been criticised, e.g. "What we confront in the burgeoning surveillance machinery of our society is not a value-neutral technological process ... It is, rather, the social vision of the Utilitarian philosophers at last fully realized in the computer. It yields a world without shadows, secrets or mysteries, where everything has become a naked quantity" [Roszak 1986, pp.186-7]. "Information, [even today], is no more than it has ever been: discrete little bundles of fact, sometimes useful, sometimes trivial, and never the substance of thought [and knowledge] ... The data processing model of thought ... coarsens subtle distinctions in the anatomy of mind ... Experience ... is more like a stew than a filing system ... Every piece of software has some repertory of basic assumptions, values, limitations embedded within it ... [For example], the vice of the spreadsheet is that its neat, mathematical facade, its rigorous logic, its profusion of numbers, may blind its user to the unexamined ideas and omissions that govern the calculations ... garbage in - gospel out" [Roszak 1986, pp.87,95,98,118,120].

FROM MatchFrame.html:

In the United States, the early discussions of protections, and in particular HEW (1973), resulted in the enactment of the Privacy Act 1974. President Ford threatened to veto that statute, and forced the proposed Privacy Commission to be reduced to a short-term Study Commission (Laudon K.C. 1986, p.6). The Commission's Report (Privacy Protection Study Commission 1977) implicitly accepted the need to make surveillance as publicly acceptable as possible, consistent with its expansion and efficiency (Rule 1980, pp.75,110). Agencies had little difficulty subverting the Privacy Act. The President's Council for Integrity and Efficiency (PCIE) and the Office of Management and Budget (OMB) have worked not to limit computer matching, but to legitimise it.

The legitimation process has also been evident in developments in other countries and in international organisations. The OECD's 1980 Guidelines for the Protection of Privacy were quite explicitly motivated by the economic need for freedom of trans-border data flows. In the United Kingdom, the Government stated that its Data Protection Act of 1984 was enacted to ensure that U.K. companies were not disadvantaged with respect to their European competitors. The purpose of the 'EC'92 Directive', which has been under discussion within the European Community for several years, is the application of the limited 'fair information practices' tradition uniformly throughout Europe (European Union, 1992). Because it would make it mandatory for European Community nations to prohibit the export of personal data to countries which do not provide 'an adequate level of protection', it would significantly increase the influence of international instruments such as the OECD's 1980 Guidelines.

There have been almost no personal data systems, or even uses of systems, which have been banned outright. Shattuck (1984, p.540) reported that, during the first five years, the unsympathetic interpretation of the U.S. Privacy Act by the supervising agency, the Office of Management and Budget, resulted in not a single matching program being disapproved. Few sets of Information Privacy Principles appear even to contemplate such an extreme action as disallowing some applications of IT because of their excessive privacy-invasive nature. Exceptions include those of the New South Wales Privacy Committee (NSWPC 1977), which are not legally enforceable, and, with qualifications, Sweden. This contrasts starkly with the conclusions of observers: "At some point ... the repressive potential of even the most humane systems must make them unacceptable" (Rule 1980, p.120; emphasis in original); and "We need to recognize that the potential for harm from certain surveillance systems may be so great that the risks outweigh their benefits" (Marx 1986, p.48).

FIP-based privacy regimes have been described as an 'official response' which legitimated dataveillance measures in return for some limited procedural protections commonly referred to as 'fair information practices' (Rule 1974, Rule et al. 1980).

The first requirement of a control regime for computer matching is comprehensive and universally applicable data protection legislation which achieves a suitable balance between the various economic and social interests, rather than subordinates information privacy concerns to matters of administrative efficiency.

* Conclusion

Increasing 'information-intensity' of administration during the twentieth century has resulted in the collection, maintenance, use and dissemination of ever more data, ever more 'finely grained'.

The 'information-intensity' phenomenon has arisen from the increasing scale of human organisations, making them more remote from their clients, and more dependent on abstract, stored data rather than personal knowledge. Other factors have been an increasing level of education among organisations' employees, the concomitant trend toward 'scientific management' and 'rational decision-models', and, particularly since the middle of the century, the brisk development in IT.

FIP legislation has facilitated these trends, in return for quite limited constraints on organisational behaviour in relation to personal data.

FROM MatchFrame.html:

Laudon concluded that "a second generation of privacy legislation is required" (Laudon 1986, p.400). This second generation has since begun, with the United States somewhat improving control over computer matching with its 1988 Act; with Canada rolling back the uses of the Social Insurance Number and regulating data matching in 1989; and with Australia issuing draft Guidelines and passing its first (admittedly limited) second generation legislation in 1990 (after finally catching up with the first generation only at the beginning of 1989). The challenge is to regulate computer matching in such a way that it clears the path for worthwhile applications of the technique, while preventing unjustifiable information privacy intrusions.

The Changed Context

- technological advances

- ever-increasing 'data intensity' a la Rule (where have I catalogued the tendencies?)

- increasing incentives to convert anonymous transactions into identified transactions

- ever-increasing public nervousness about dataveillance

- international and cultural harmonisation, economically rather than socially motivated

- international and cultural differences, esp. in the perception of the public and private sectors as threats, in the tension between market forces and public-policy interventionism, and in the relative importance of economic progress and quality-of-life


* Non-Adaptiveness to Technological Change

* Need for Design Guidance

* Insufficient Empowerment of Individuals

* Automated Decision-Making About People

* Personal Control Over Biometrics

* Non-Adaptiveness to Technological Change

FROM Flaws.html:

* Developments of the 1970s and 1980s

The previous section noted that data matching continues out of control. The significance of this is that a relatively simple but highly privacy-invasive technology of the mid-1970s is still not subject to effective controls. What chance is there, then, that complex technologies will be understood, let alone mastered?

Since the model on which the existing privacy legislation is based was drafted in the 1970s, dramatic developments have occurred in I.T. Examples of developments with significant privacy implications include:

The Privacy Commissioner has undertaken some research in these areas, and has published a couple of useful reports. But no substantive steps whatsoever have been taken to address the issues.

* The Information Society of January 1990 and Beyond

One particular, and particularly important, development in I.T. has been the emergence of the Information Infrastructure, evidenced primarily by the Internet, but to a limited extent also by the roll-out of cable TV.

Successive Commonwealth Governments have entirely failed to understand the opportunities and threats the Information Infrastructure presents, other than as a means of maximising the sale-value of the national telecommunications carrier.

Moreover, attempts are being made to carry draconian and highly privacy-intrusive data collection powers that exist within the voice communications arena across into the new forms of electronic communications.

In addition to this serious concern, other challenges present themselves, including:

The existing privacy-protective regime is entirely incapable of coping with and responding to these challenges, and legal and administrative staff alone will not be able to understand them, let alone achieve a reasonable balance between the privacy interest and other, more powerfully represented interests.

FROM MatchFrame.html:

Technological developments have rendered some of the early information privacy protections ineffective: "new applications of personal information have undermined the goal of the Privacy Act that individuals be able to control information about themselves" (OTA, 1986). See also Thom & Thorne (1983) and Greenleaf & Clarke (1984). If a proper balance between information privacy and other interests is to be maintained, processes must be instituted whereby technological change is monitored, and appropriate modifications to law, policy and practice brought about. This needs to be specified as a function of the privacy protection agency, and the agency funded accordingly.

Despite the long-standing dominance of the arguments of Westin (1967, 1971, 1974), it is clear that the needs of administrative efficiency are at odds with individual freedoms. The power of computer matching techniques is far greater than it was two decades ago, and refinements and new approaches are emerging which will require further regulatory measures in the near future.

* Need for Design Guidance

e.g. in relation to chip-cards and identification:


So extend the Privacy Commissioner's capability to interact with industry ...

* Insufficient Empowerment of Individuals


no sense of contract or right to decline

FROM Flaws.html:

Rights in Relation to Personal Data

In most jurisdictions, property and other rights in data are entirely unclear. This is not some new challenge: the difficulty has been evident for at least a quarter-century. But it is the subject of renewed interest as a result of the explosion in Internet services.

An effective privacy-protective regime needs to address the question in some constructive manner. There are several approaches that could be adopted.

* Ownership of Personal Data

Australian privacy laws are silent regarding the ownership of data. Ownership of data is separate from (or, reflecting the recent judgement about medical records, perhaps one needs to say 'separable from') ownership of the medium on which the data is stored. For example, it is undisputed that doctors own their medical records; but it is contentious whether anyone owns the data stored in them, and if so, who; and it is incontestable that the data subject has a very substantial interest in the data, which must be formally recognised.

The German Supreme Court has read a right of 'informational self-determination' into the Constitution. Two leading New York sociologists of the information age (Alan Westin and Ken Laudon) have argued for property in personal data as a means of addressing privacy problems.

Establishing ownership rights in data, and vesting them in the person to whom it relates, would address the privacy concern. Some qualifying rights would be needed, however, to ensure practicality, e.g. an implied licence for record-keepers to retain personal data on records, and to use it in ways that are consistent with privacy laws.

Less ambitious measures are also available, including the creation of more restricted rights, such as those discussed below.

* Right to Use Personal Data

Whether or not personal data is owned by anyone, the use of personal data might be precluded in the absence of a right to do so. Such a right could arise under law, or under consent (express or implied). This approach is similar to the notion of an 'opt-in' arrangement.

Such a placement of the onus of proof on the user would require a workably long grace period within which all organisations could assess whether they have such a right, and, if not, seek it, or adapt their practices.

* Right to Rent Out or Charge for Use

Whether or not personal data is owned by anyone, the use of personal data might be subject to payment by the user to the data subject. This is the key element of the Westin-Laudon argument for ownership rights in personal data, because it enables the use of data to be determined in a marketplace, through contractual arrangements in respect of each relationship between a person and an organisation. Organisations would only be likely to enter into negotiations for such a contractual right where they foresaw financial or other advantages to themselves in doing so; and this would represent a considerable protection for information privacy.

There are precedents for such approaches in the form of U.S. supermarkets that offer a discount in return for access to personal data; and the 'Fly-Buys' scheme in Australia, which is widely recognised by consumers as involving a sacrifice of privacy in return for the possibility of a reward.

Such a right would be likely to be circumscribed by express legal authority for specific organisations to use specific personal data.

* Right to Preclude Use

Alternatively, all uses of personal data might be regarded as being legal simply because they are not illegal; but each person would have the opportunity to deny organisations the right to use their data. Such a right would be likely to be circumscribed by express legal authority for specific organisations to use specific personal data.

This is similar to the notion of an 'opt-out' arrangement. As a general solution to personal privacy invasions, it is likely to be inadequate; it may, on the other hand, be of value in specific circumstances.
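The difference between the 'opt-in' construction of the earlier 'Right to Use Personal Data' and the 'opt-out' construction above is, at bottom, a difference in defaults and in where the onus lies. A minimal sketch, with entirely hypothetical names and a deliberately simplified permission model, may make the contrast concrete:

```python
def use_permitted(regime, consents, objections, person):
    """Whether an organisation may use a person's data under each regime.

    opt-in:  use is precluded unless the person has granted consent
             (the onus of proof rests on the would-be user);
    opt-out: use is lawful simply because it is not illegal, unless
             the person has exercised a right to preclude it.
    """
    if regime == "opt-in":
        return person in consents
    if regime == "opt-out":
        return person not in objections
    raise ValueError(f"unknown regime: {regime}")

consents, objections = {"alice"}, {"bob"}

assert use_permitted("opt-in", consents, objections, "alice") is True
assert use_permitted("opt-in", consents, objections, "carol") is False   # silence blocks use
assert use_permitted("opt-out", consents, objections, "carol") is True   # silence permits use
assert use_permitted("opt-out", consents, objections, "bob") is False
```

The treatment of silence ("carol", who has expressed no view) is the whole difference, and is why the opt-out construction is inadequate as a general solution: the default favours the data user.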

* Automated Decision-Making About People

An effective information privacy protection regime would impose responsibility on the operator of a personal data system to ensure that all decisions about human beings (or at least those that might reasonably be expected to have negative consequences for the people concerned) are subject to review by a human being before being communicated or implemented.
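What such a responsibility might mean in system-design terms can be sketched as follows. The outcome categories, the test for 'negative consequences' and the queue mechanism here are all hypothetical illustrations, not drawn from any statute or existing system:

```python
from dataclasses import dataclass, field

@dataclass
class Decision:
    subject: str
    outcome: str          # e.g. "grant" or "deny"
    implemented: bool = False

@dataclass
class DecisionSystem:
    review_queue: list = field(default_factory=list)

    def decide(self, subject, outcome):
        """Automated decision-making, with a human-review gate on adverse outcomes."""
        decision = Decision(subject, outcome)
        if outcome == "deny":
            # Might reasonably be expected to have negative consequences:
            # held for review by a human being, not communicated or implemented.
            self.review_queue.append(decision)
        else:
            decision.implemented = True
        return decision

    def human_review(self, decision, approved):
        """A human being confirms or overturns the queued automated decision."""
        self.review_queue.remove(decision)
        decision.implemented = approved

system = DecisionSystem()
benign = system.decide("person-a", "grant")
adverse = system.decide("person-b", "deny")
assert benign.implemented and not adverse.implemented   # adverse decision is held

system.human_review(adverse, approved=False)            # human overturns it
assert not adverse.implemented and system.review_queue == []
```

The design point is that the gate sits before implementation or communication, so the automated stage can never, on its own, act to a person's detriment.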

* Personal Control Over Biometrics

personal control over biometrics, including genetic data

Kirby papers?




regulatory, self-regulatory and co-regulatory models

build in the Uni Toronto work

wind back ('the great American way', a la Westin)

an enhanced paradigm (the EU Directive)

a new paradigm (privacy protection in the new millennium)

an evolutionary approach, to build upon existing regimes:

- refinements to existing regimes

- strengthening of existing regimes

- extensions to existing regimes

On the one hand, the situation demands a significant break with the limited, and inadequate, notion of fair information practices. On the other, it is important that such progress as has been made not be undermined.

Accordingly, this paper proposes a new paradigm for privacy protection that is evolutionary in nature, and carefully embodies fair information practices as a sub-set of the complete framework.

Elements of the New Paradigm


A supra-set of IPPs?

An extension to the OECD G's?

This segment of the paper presents a set of integrated proposals that together represent a new paradigm of information privacy protection for the 21st century. The first cluster of proposals represents refinements to existing regimes. The next group identifies necessary strengthenings. The final group comprises extensions to the existing framework.

* Refinements to Existing Privacy-Protection Regimes

Justification Flaws.html

OECD.html (small omissions)

Matching.html (the part that states the framework needed)

Flaws.html (the small points)

* Strengthening of Existing Privacy-Protection Regimes

The existing implementations of the fair information practices philosophy mostly fall short of the overall model. This is not surprising, given the complexity and variability of the technologies and of the settings in which they are applied, and the more than 25 years that have elapsed between the first such statute, in Hesse in 1970, and the most recent (presently the Italian legislation of 1997).

Typical areas in which shortfalls occur are:


Flaws.html (the bigger points)

Matching.html (the meta-requirements, like enforceability)

* Extensions to Existing Privacy-Protection Regimes

Important examples of more up-to-date Principles are:

- multi-purpose identification schemes

- roles not persons

- anonymity and pseudonymity

- personal control over data generally

Summary and Implications

A previous paper has argued that privacy has become strategically important for organisations of all kinds (Clarke 1996b). This paper argues that public policy initiatives are urgently required, to mature the framework within which privacy protections are conceived, in order to address flaws in existing regimes, and adapt them to cope with the dramatic advances in information technology of the last quarter-century.

The winding-back of privacy protections is untenable. Merely sustaining the fair information practices model is also untenable, because it results in only two scenarios, both of which are unacceptable:

Mankind's needs at the beginning of the new millennium are for a new paradigm of privacy protection. This paper has argued that the new paradigm must provide very substantial additional features; but that it does not need to be revolutionary to the extent of forcing the rescission of existing laws.


This document draws on a quarter-century of my own research, including many references to prior analyses of particular parts of the whole problem.

It also draws on the last decade's intellectual debates; but it is not as carefully referenced as would be desirable, and some ideas should be attributed to other people, at least in the form of 'personal communications' and 'discussions at conferences', and in some cases in the form of published conference papers and even orthodox journal articles.

I apologise to people whose work I should have cited, and invite them to draw specific instances of missing references to my attention, for inclusion in the living version of this document, at

References to Other People's Works

Australian Privacy Charter (1994) At

Bennett C. (1992) 'Regulating Privacy: Data Protection and Public Policy in Europe and the United States' Cornell University Press, Ithaca NY, 1992

Bentham J. (1791) 'Panopticon; or, the Inspection House', London, 1791

CSA (1995) 'Model Code for the Protection of Personal Information' Canadian Standards Association, CAN/CSA-Q830-1995 (September 1995)

Davies S. (1992) 'Big Brother: Australia's Growing Web of Surveillance' Simon & Schuster, Sydney, 1992

Davies S. (1996) 'Monitor: Extinguishing Privacy on the Information Superhighway', Pan Macmillan Australia, 1996

EPIC (1995-) 'National ID Cards', Electronic Privacy Information Center, Washington DC, at

European Union Directive (1996) 'The Directive on the protection of individuals with regard to the processing of personal data and on the free movement of such data', European Commission, Brussels, 25 July 1995, at

Flaherty D.H. (1989) 'Protecting Privacy in Surveillance Societies' Uni. of North Carolina Press, 1989

Foucault M. (1977) 'Discipline and Punish: The Birth of the Prison' Peregrine, London, 1975, trans. 1977

HREOC (1992-) 'Federal Privacy Handbook: A Guide to Federal Privacy Law and Practice' Redfern Legal Centre Publishing Ltd, 13A Meagher St, Chippendale NSW 2008

HREOC (1995a) 'Community Attitudes to Privacy', Information Paper No. 3, Human Rights Australia - Privacy Commissioner, Sydney (August 1995)

Hughes G. (1991) 'Data Protection Law in Australia', Law Book Company, 1991

IPCO (1995) 'Privacy-Enhancing Technologies: The Path to Anonymity' Information and Privacy Commissioner for Ontario and Registriekammer, The Netherlands (August 1995)

Kim J. (1997) 'Digitized Personal Information and the Crisis of Privacy: The Problems of Electronic National Identification Card Project and Land Registry Project in South Korea', at

Madsen W. (1992) 'Handbook of Personal Data Protection' Macmillan, London, 1992

Morison W.L. (1973) 'Report on the Law of Privacy' Govt. Printer, Sydney 1973

NSWPC (1977) 'Guidelines for the Operation of Personal Data Systems' New South Wales Privacy Committee, Sydney, 1977

NSWPC (1995) 'Smart Cards: Big Brother's Little Helpers', The Privacy Committee of New South Wales, No.66, August 1995, at

OECD (1980) 'Guidelines on the Protection of Privacy and Transborder Flows of Personal Data' OECD, Paris, 1980

OTA (1985) 'Electronic Surveillance and Civil Liberties' OTA-CIT-293, U.S. Govt Printing Office, Washington DC, October 1985

OTA (1986) 'Federal Government Information Technology: Electronic Record Systems and Individual Privacy' OTA-CIT-296, U.S. Govt Printing Office, Washington DC, June 1986

Privacy International (1996) 'Privacy International's FAQ on Identity Cards', at

Roszak T. (1986) 'The Cult of Information' Pantheon 1986

Rule J.B. (1974) 'Private Lives and Public Surveillance: Social Control in the Computer Age' Schocken Books, 1974

Rule J.B., McAdam D., Stearns L. & Uglow D. (1980) 'The Politics of Privacy' New American Library 1980

Smith R.E. (ed.) (1974-) 'Privacy Journal', monthly since November 1974

Tucker G. (1992) 'Information Privacy Law in Australia' Longman Cheshire, Melbourne, 1992

Westin A.F. (1967) 'Privacy and Freedom' Atheneum 1967

Westin A.F. (ed.) (1971) 'Information Technology in a Democracy', Harvard University Press, Cambridge, Mass., 1971

Westin A.F. & Baker M.A. (1974) 'Databanks in a Free Society: Computers, Record-Keeping and Privacy' Quadrangle 1974

The Winds (1997) 'The future has arrived' (June 1997), at

References to the Author's Own Works

Clarke R. (1987) 'Just Another Piece of Plastic for Your Wallet: The Australia Card' Prometheus 5,1 June 1987 Republished in Computers & Society 18,1 (January 1988), with an Addendum in Computers & Society 18,3 (July 1988). At

Clarke R. (1988) 'Information Technology and Dataveillance', Commun. ACM 31,5 (May 1988). Republished in C. Dunlop and R. Kling (Eds.), 'Controversies in Computing', Academic Press, 1991, at

Clarke R. (1989) 'The OECD Data Protection Guidelines: A Template for Evaluating Information Privacy Law and Proposals for Information Privacy Law' Working Paper (25pp.) (October 1989). At

Clarke R. (1992) 'The Resistible Rise of the National Personal Data System' Software Law Journal 5,1 (January 1992), at

Clarke R. (1993a) 'Why the Public Is Scared of the Public Sector', IIR Conference paper, February 1993, at

Clarke R. (1993b) 'Profiling: A Hidden Challenge to the Regulation of Data Surveillance', Journal of Law and Information Science 4,2 (December 1993), at . A shorter version was published as 'Profiling and Its Privacy Implications' Australasian Privacy Law & Policy Reporter 1,6 (November 1994), at

Clarke R. (1994a) 'The Digital Persona and Its Application to Data Surveillance' The Information Society 10,2 (June 1994), at

Clarke R. (1994b) 'Information Technology: Weapon of Authoritarianism or Tool of Democracy?' Proc. World Congress, Int'l Fed. of Info. Processing, Hamburg, September 1994. At

Clarke R. (1994c) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Information Technology & People 7,4 (December 1994) 6-37, at

Clarke R. (1994d) 'Dataveillance by Governments: The Technique of Computer Matching' Information Technology & People 7,2 (December 1994). Abstract at

Clarke R. (1995a) 'Computer Matching by Government Agencies: The Failure of Cost/Benefit Analysis as a Control Mechanism' Informatization and the Public Sector (March 1995). At

Clarke R. (1995b) 'A Normative Regulatory Framework for Computer Matching' Journal of Computer and Information Law XIII,4 (Summer 1995) 585-633, at

Clarke R. (1995c) 'When Do They Need to Know 'Whodunnit?': The Justification for Transaction Identification; The Scope for Transaction Anonymity and Pseudonymity' Proc. Conf. Computers, Freedom & Privacy, San Francisco, 31 March 1995, at . Revised version published as 'Transaction Anonymity and Pseudonymity' Privacy Law & Policy Reporter 2, 5 (June/July 1995) 88-90. Condensed version published as 'Identification, Anonymity and Pseudonymity in Consumer Transactions: A Vital Systems Design and Public Policy Issue', October 1996, at

Clarke R. (1995d) 'Clients First or Clients Last? The Commonwealth Government's IT Review' Privacy Law & Policy Reporter 2, 4 (April/May 1995). At

Clarke R. (1995e) 'Trails in the Sand', at

Clarke R. (1996a) 'Smart move by the smart card industry: The Smart Card Industry's Code of Conduct' Privacy Law & Policy Reporter 2, 10 (January 1996) 189-191, 195. At

Clarke R. (1996b) 'Privacy and Dataveillance, and Organisational Strategy', EDPAC Conference Paper (May 1996), at

Clarke R. (1996c) 'Data Transmission Security, or Cryptography in Plain Text' Privacy Law & Policy Reporter 3, 2 (May 1996), pp. 24-27, at

Clarke R. (1996d) 'Privacy Issues in Smart Card Applications in the Retail Financial Sector', in 'Smart Cards and the Future of Your Money', Australian Commission for the Future, June 1996, pp.157-184. At

Clarke R. (1996e) 'The Information Infrastructure is a Super Eye-Way: Book Review of Simon Davies' 'Monitor'' Privacy Law & Policy Reporter 3, 5 (August 1996), at

Clarke R. (1997a) 'What Do People Really Think? MasterCard's Survey of the Australian Public's Attitudes to Privacy', Privacy Law & Policy Reporter 3,9 (January 1997), at

Clarke R. (1997b) 'Flaws in the Glass; Gashes in the Fabric: Deficiencies in the Australian Privacy-Protective Regime', Invited Address to Symposium on 'The New Privacy Laws', Sydney, 19 February 1997, at

Clarke R. (1997c) 'Smart Cards in Banking and Finance' The Australian Banker 111,2 (April 1997), at

Clarke R. (1997d) 'Privacy and 'Public Registers'', Proc. IIR Conference on Data Protection and Privacy, Sydney, 12-13 May 1997, at

Clarke R. (1997e) 'Chip-Based ID: Promise and Peril', Proc. International Conference on Privacy, Montreal, 23-26 September 1997, at

Greenleaf G.W. & Clarke R. (1997) 'Privacy Implications of Digital Signatures', IBC Conference on Digital Signatures, Sydney, 12 March 1997, at

Created: 27 September 1997 - Last Amended: 1 February 1998 by Roger Clarke