
Approaches to Impact Assessment

Notes of 12 January 2014

Notes for a Panel Presentation at CPDP'14, Brussels, 22 January 2014, on the topic of 'Legal and Non-Legal Technology Impact Assessments'

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2013-14

Available under an AEShareNet Free for Education licence or a Creative Commons 'Some Rights Reserved' licence.

This document is at http://www.rogerclarke.com/SOS/IA-1401.html

The supporting slide-set is at http://www.rogerclarke.com/SOS/IA-1401.pdf


Abstract

This presentation identifies several categories of methods for assessing impacts of technologies, and highlights their key characteristics. This framework is applied to the European Commission's concept of a Data Protection Impact Assessment (DPIA).


1. Introduction

To complement the deeper, intellectual contributions of other panellists, this presentation provides an overview of a number of alternative approaches to Technology Impact Assessment. To demonstrate the usefulness of the framework, it is then applied to the EC's Data Protection Impact Assessment (DPIA) notion.

This presentation acknowledges the contributions to impact assessment that were made during the last third of the 20th century in the context of industrial processes impacting on the physical environment, in the forms of Environmental Impact Statements and Assessments. The scope of the current work is limited to the human impacts of information technologies.

The first section categorises the various forms of assessment according to the activity's focal-point. Several well-known approaches are then described, including Business Case Assessment, Risk Assessment, Technology Assessment and Impact Assessment. The EC's DPIA concept is then considered, within that framework.


2. The Focus of the Assessment

In Exhibit 1, various categories of assessment are identified, grouped according to the focus of the analysis.

Exhibit 1: Assessment Categories, According to Focus

TECHNOLOGY FOCUS

PROJECT FOCUS

SOCIAL IMPACTS FOCUS

COMPLIANCE FOCUS


3. Business Case Assessment

The term 'business case' refers to a variety of means whereby an argument is presented in support of some course of action. It may be more or less formalised. It may be of an independent, investigative nature; but in many cases it is politicised, and designed to provide justification for an intended course of action.

In Exhibit 2, the broadest possible interpretation is provided of the formal methods that can be used to prepare a business case. Most commonly, the perspective adopted is that of the organisation sponsoring the proposal, the data considered is primarily quantitative (although of necessity estimates), and the analysis adopts a deterministic approach rather than considering contingencies in any great depth.

The most frequently-used techniques are the first listed in each of the two upper cells of Exhibit 2: Discounted Cash Flow, and Internal Cost-Benefit Analysis. Less commonly, a business case may reflect the perspectives of additional stakeholders, but this is usually only the case where those stakeholders have sufficient institutional or market power that their collaboration with the primary sponsor is perceived to be crucial to the success of the course of action.

Exhibit 2: Categories of Business Case Methods
Extract from Clarke (2008)
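The Discounted Cash Flow technique mentioned above reduces to a simple net-present-value calculation. The following sketch is purely illustrative, and is not drawn from Clarke (2008); the function name and all figures are hypothetical:

```python
# Illustrative sketch only: a minimal Discounted Cash Flow (net present
# value) calculation of the kind listed in the upper cells of Exhibit 2.
# All cash-flow figures and the discount rate are hypothetical.

def net_present_value(rate, cash_flows):
    """Discount a series of annual cash flows (year 0 first) at `rate`."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# A hypothetical project: an outlay of 100 units now, returning 45 units
# in each of the following three years, discounted at 10% per annum.
npv = net_present_value(0.10, [-100, 45, 45, 45])
# A positive NPV supports the sponsor's business case; a negative one
# argues against proceeding.
```

Note how the calculation exemplifies the characteristics described above: a single organisation's perspective, quantitative estimates, and a deterministic treatment that ignores contingencies.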


4. Risk Assessment

When preparing a business case, some organisations may consider contingencies more carefully, by applying Risk Assessment techniques. This is indicated by the other methods in the upper cells of Exhibit 2. One, usually minor, consideration in Risk Assessment is legal compliance, because initiatives may be harmed and even undermined to the extent that they are in breach of the law, and enforcement action is taken against them. Risk Assessment is a well-established technique, and is the subject of various Standards. On the other hand, it is commonly used relatively late in projects, and is mostly undertaken from the perspective of a single organisation.

Risk Assessment can, however, be performed from the perspective of multiple stakeholders rather than just the project sponsor, as reflected in the techniques in the bottom-right cell of Exhibit 2. For an example of a recommended approach, customised to smart meter projects, see Yesudas & Clarke (2013).


5. Technology Assessment

Technology Assessment (TA) is "a scientific, interactive and communicative process, which aims to contribute to the formation of public and political opinion on societal aspects of science and technology" (EPTA). The scope is generally defined in terms of one or more technologies, viewed as interventions into a particular, existing physical, social and/or economic context. The time-horizon may be short-term / tactical, but is more often long-term / strategic. The broader the scope and the longer the time-horizon, the greater the flexibility needed in the scope definition.

Unlike Business Case Assessment and most Risk Assessment activities, TA does not adopt the limited perspective of the project sponsor, but reflects the needs of all parties involved. In order to relate TA to other forms of assessment, Exhibit 3 provides an overview of some key dimensions of Technology Assessments. A somewhat fuller description is provided in Appendix 1. The notion of a standing Office of Technology Assessment (OTA) is outlined in Appendix 2, including reference to the US OTA 1972-1995, and European OTAs from the early-to-mid 1980s. An early treatment of the US Office's methods is in OTA (1977). See also Garcia (1991).

Exhibit 3: Dimensions of Technology Assessment

1. The Scope of Technologies Considered

2. The Scope of Perspectives Reflected

3. The Scope of Values Impinged Upon


6. Social Impact Assessment

Another form of analysis considers the impacts that a particular intervention will have on its environment. Impact Assessment can be traced back to Environmental Impact Statements (EIS), which originated in the 'green' movements of the 1960s. Mere 'Statements' achieve no more than publicity - in the same way that Data Breach Notification is not a privacy protection, but merely delivers warning signals that privacy is not being protected. The richer notion of an Impact Assessment (IA) then emerged, usefully defined as "the identification of future consequences of a current or proposed action". This lifts the focus beyond the product alone to include the process, and is a more fully articulated concept, including prior publication, public consultation, further publication and review.

The 'environment' that is impacted by the intervention may not be physical or biological but rather social, which may be interpreted narrowly to refer to associations and transactions among people, or more broadly to encompass economic and/or political aspects of human behaviour. A set of criteria may be asserted ad hoc, or derived from some relevant source or sources.

One source is human rights instruments, such as the Universal Declaration of Human Rights (UDHR 1948), the International Covenant on Civil and Political Rights (ICCPR 1966), and the extensions embodied in the International Covenant on Economic, Social and Cultural Rights (ICESCR 1966). Even broader ethical bases are possible, as discussed in Wright (2011) and Wright & Friedewald (2012).

Another approach has been proposed, whose concern is with the impacts of the full range of surveillance technologies on human rights and other social and political factors (Wright & Raab 2012). More narrowly, Privacy Impact Assessment (PIA) prioritises for evaluation the multiple dimensions of privacy (Clarke 2009, Wright & De Hert 2012). Much more narrowly again, a mere Data Privacy Impact Assessment (DPIA Type 1) ignores privacy of the person, of personal behaviour, of personal experience, and even of personal communications, and concerns itself only with the interest that individuals have in exercising control over data about themselves.


7. The EC DPIA

The concept of a 'Data Protection Impact Assessment' (DPIA Type 2) was contained within the European Commission's Draft Data Protection Regulation of 25 January 2012 (EC 2012a). The provision appears in Article 33 of the Draft Regulation, on pp. 62-63. A marked-up version appears in Appendix 3.

Virulent lobbying was conducted against elements of the Draft that would have increased control over foreign corporations. The Draft appears to have been abandoned with effect from the end of 2013, but at the very least consideration of it has been postponed until after the EP elections in mid-2014. In the event that the DPIA concept is persisted with, it is important that it be critically assessed; and in any case it represents a suitable proposal to consider through the lens of the framework outlined above.

A further reason the DPIA is important is that it was referred to in an EC Recommendation in relation to smart metering systems (EC 2012b). That Recommendation included an engrained presumption that "the deployment of smart grids and smart metering systems should allow suppliers and network operators to [have] detailed information on the energy behaviour of individual end-consumers" (at (6)). In short, this is a blatant attempt by the Commission to authorise wholesale leakage of personal data to multiple organisations up the electricity supply chain. The Recommendation was subjected to hefty criticism by the Article 29 Committee of Data Protection Commissioners (Art 29 2013), who referred to their previously published Opinion on smart meters (Art 29 2011) and that of the European Data Protection Supervisor (EDPS 2012). A further factor of relevance is that the term of the current EDPS expires in mid-January 2014, and the Commission has failed to appoint a successor.

Inspection of the Article 33 specification of a 'DPIA', in light of the framework established above, gives rise to the following observations:

The expectations of a Privacy Impact Assessment were articulated in Clarke (2011) and APF (2013). Against both those standards and the various forms of Assessment reviewed earlier in this paper, the EC's DPIA notion is a pale imitation. On the basis of Article 33, and its bold attempt to ride roughshod over the EDPS and the Article 29 Committee, and its failure to assure a smooth transition between old and new EDPS, the European Commission cannot be regarded as being at all friendly to privacy, but rather to be aligned with the US Administration in attempting to ratchet down existing privacy protections.


8. Conclusions

Beware cheap imitations. The EC's DPIA concept is extraordinarily narrow, has inbuilt exemptions, and an inbuilt mechanism for creating exceptions. The proposal has so little merit that it should be rejected, and the Commission pressured to abandon it, and bring forward an alternative proposal that reflects the insights embodied in the Technology Assessment, Risk Assessment, and Privacy Impact Assessment processes outlined above.


Appendix 1: Technology Assessment

Technology Assessment (TA) is "a scientific, interactive and communicative process, which aims to contribute to the formation of public and political opinion on societal aspects of science and technology" (EPTA).

The scope is generally defined in terms of one or more technologies, viewed as interventions into a particular, existing physical, social and/or economic context. The time-horizon may be short-term / tactical, but is more often long-term / strategic. The broader the scope and the longer the time-horizon, the greater the flexibility needed in the scope definition.

The deliverables from a TA process are generally expected to include information about:

The audience is generally defined as a Parliament or a society-at-large.

TA is generally intended to:

It is infeasible for TA to be performed by or for individual government agencies, industry associations or major corporations, or even consortia of such organisations, because this inevitably results in compromise of the requirements that it reflect multiple perspectives and be conducted openly.

The vehicle may be:

The assessment may be performed by employees, secondees or contractors.

TAs were conducted during the 1990s on such topics as:

Aspects of the TA process are:

Because of the diversity of TA projects, it is challenging to specify a generic method, and instead it is normal for a process to be devised for each particular project.


Appendix 2: Offices of Technology Assessment (OTAs)

The US Office of Technology Assessment (OTA) was formed in 1972. Its establishment followed a period in which Rachel Carson's 'Silent Spring' had alerted the public to environmental impacts (in that case, of the insecticide DDT) and Ralph Nader's 'Unsafe at Any Speed' had sounded the alarm bells about the design of cars. It was an organ of the US Congress, and reported to a congressional committee.

During the 1990s, US politics changed. Endeavours to balance apparent economic progress against its consequences gave way to economic dominance over social goals; policy came to be defined substantially in terms of party-politics; corporate strategies became afflicted with short-termism; and corporations' growth in size, in trans-nationalism and in power resulted in government and regulation being downgraded to governance and 'self-regulation'. One example of the change in priorities was the debasement of Environmental Impact Assessment to mere Environmental Impact Statements (EIS). Another was the de-funding and closure of OTA by the US Congress in 1995.

A number of European countries have OTA-like organisations that report more or less directly to the national parliament, e.g. in France since 1983 (OPECST - Office Parlementaire d'Evaluation des Choix Scientifiques et Technologiques) and in Denmark since 1986 (the Danish Board of Technology). A European Parliamentary Technology Assessment (EPTA) network was formed in 1990, and links a dozen such organisations. The EP has a Science and Technology Options Assessment (STOA) panel and bureau.


Appendix 3: The EC Data Protection Impact Assessment (DPIA)

Extract from EC (2012), pp. 62-63
All emphases added by this author

Article 33
Data protection impact assessment

  1. Where processing operations present specific risks to the rights and freedoms of data subjects by virtue of their nature, their scope or their purposes, the controller or the processor acting on the controller's behalf shall carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.
  2. The following processing operations in particular present specific risks referred to in paragraph 1:
  3. The assessment shall contain at least a general description of the envisaged processing operations, an assessment of the risks to the rights and freedoms of data subjects, the measures envisaged to address the risks, safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation, taking into account the rights and legitimate interests of data subjects and other persons concerned.
  4. The controller shall seek the views of data subjects or their representatives on the intended processing, without prejudice to the protection of commercial or public interests or the security of the processing operations.
  5. Where the controller is a public authority or body and where the processing results from a legal obligation pursuant to point (c) of Article 6(1) providing for rules and procedures pertaining to the processing operations and regulated by Union law, paragraphs 1 to 4 shall not apply, unless Member States deem it necessary to carry out such assessment prior to the processing activities.
  6. The Commission shall be empowered to adopt delegated acts in accordance with Article 86 for the purpose of further specifying the criteria and conditions for the processing operations likely to present specific risks referred to in paragraphs 1 and 2 and the requirements for the assessment referred to in paragraph 3, including conditions for scalability, verification and auditability. In doing so, the Commission shall consider specific measures for micro, small and medium-sized enterprises.
  7. The Commission may specify standards and procedures for carrying out and verifying and auditing the assessment referred to in paragraph 3. Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 87(2).

References

APF (2013) '' Australian Privacy Foundation, April 2013, at http://www.privacy.org.au/Papers/PS-PIA.html

Art 29 (2011) 'Opinion 12/2011 on smart metering' Article 29 Data Protection Working Party, 00671/11/EN, WP 183, 4 April 2011, at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2011/wp183_en.pdf

Art 29 (2013) 'Opinion 04/2013 on the Data Protection Impact Assessment Template for Smart Grid and Smart Metering Systems (`DPIA Template')' Expert Group 2 of the Commission's Smart Grid Task Force, Article 29 Data Protection Working Party, 00678/13/EN, WP205, 22 April 2013, at http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp209_en.pdf

Clarke R. (2003) 'Scenario-Based Research' Xamax Consultancy Pty Ltd, June 2003, at http://www.xamax.com.au/Res/Scenarios.html

Clarke R. (2008) 'Business Cases for Privacy-Enhancing Technologies' Chapter 7 in Subramanian R. (Ed.) 'Computer Security, Privacy and Politics: Current Issues, Challenges and Solutions' IDEA Group, 2008, pp. 135-155, at http://www.rogerclarke.com/EC/PETsBusCase.html

Clarke R. (2009) 'Privacy Impact Assessment: Its Origins and Development' Computer Law & Security Review 25, 2 (April 2009) 123-135, at http://www.rogerclarke.com/DV/PIAHist-08.html

Clarke R. (2011) 'An Evaluation of Privacy Impact Assessment Guidance Documents' International Data Privacy Law 1, 2 (March 2011) 111-120, PrePrint at http://www.rogerclarke.com/DV/PIAG-Eval.html

EC (2012a) 'Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation)', European Commission, COM(2012) 11 final, 2012/0011 (COD), Brussels, 25.1.2012, at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do%3Furi%3DCOM:2012:0011:FIN:EN:PDF

EC (2012b) 'Commission Recommendation on preparations for the roll-out of smart metering systems' 2012/148/EU, 9 March 2012, at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2012:073:0009:0022:EN:PDF

EDPS (2012) 'Opinion of the European Data Protection Supervisor on the Commission Recommendation on preparations for the roll-out of smart metering systems' European Data Protection Supervisor, 8 June 2012, at http://www.edps.europa.eu/EDPSWEB/webdav/site/mySite/shared/Documents/Consultation/Opinions/2012/12-06-08_Smart_metering_EN.pdf

EPTA, at http://www.eptanetwork.org/

Garcia L. (1991) 'The U.S. Office of Technology Assessment' Chapter in Berleur J. & Drumm J. (eds.) 'Information Technology Assessment' North-Holland, 1991, at pp.177-180

ICCPR (1966) 'International Covenant on Civil and Political Rights' United Nations, 16 December 1966, at http://www2.ohchr.org/english/law/ccpr.htm

ICESCR (1966) 'International Covenant on Economic, Social and Cultural Rights' United Nations, 16 December 1966, at http://www2.ohchr.org/english/law/cescr.htm

OTA Archive, at http://ota.fas.org/

OTA Legacy, at http://www.princeton.edu/~ota/

OTA (1977) 'Technology Assessment in Business and Government' Office of Technology Assessment, NTIS order #PB-273164', January 1977, at http://www.princeton.edu/~ota/disk3/1977/7711_n.html

UDHR (1948) 'Universal Declaration of Human Rights' United Nations, 10 December 1948, at http://www.un.org/en/documents/udhr/index.shtml

Wright D. (2011) 'A framework for the ethical impact assessment of information technology' Ethics and Information Technology 13 (2011) 199-226

Wright D. & De Hert P. (eds) (2012) 'Privacy Impact Assessments' Springer, 2012

Wright D. & Friedewald M. (2012) 'Integrating privacy and ethical impact assessments' Science and Public Policy 40, 6 (December 2013) 755-766, at http://spp.oxfordjournals.org/content/40/6/755.full

Wright D. & Raab C.D. (2012) 'Constructing a surveillance impact assessment' Computer Law & Security Review 28, 6 (December 2012) 613-626

Yesudas R. & Clarke R. (2013) 'Framework for Risk Analysis in Smart Grid: Perspective Based Approach' Proc. 8th Int'l Conf. on Critical Information Infrastructures Security (CRITIS 2013), Amsterdam, 16-18 September 2013, at http://www.rogerclarke.com/EC/SG-FRA.html


Acknowledgements

These Notes benefited from the stimulus provided by Panel Chair, Mireille Hildebrandt of the Erasmus School of Law in Rotterdam, and conversations with David Wright of Trilateral Research. But the opinions are mine alone.


Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in the Cyberspace Law & Policy Centre at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University. He has been a Director of the Australian Privacy Foundation (APF) since its formation in 1987, and its Chair 2006-14. He has also been a member of the International Advisory Board of Privacy International since its inception in 2000.




Created: 31 December 2013 - Last Amended: 12 January 2014 by Roger Clarke