Roger Clarke's Web-Site
© Xamax Consultancy Pty Ltd, 1995-2024
Version of 21 August 2021
For a Panel Session on 'Geolocation and Dataveillance, Sousveillance and Uberveillance', AAG Geoethics Series Webinar, American Association of Geographers, 11 August 2021
© Xamax Consultancy Pty Ltd, 2021
Available under an AEShareNet licence or a Creative Commons licence.
This document is at http://rogerclarke.com/DV/DV21.html
The slide-set is at http://rogerclarke.com/DV/DV21.pdf
The pre-recorded video (6 mins) is at http://rogerclarke.com/DV/DV21.mp4
Here is the video of the Panel Session
And here is the index to the complete set of session videos
This paper offers a brief review of the history of the dataveillance notion. It was prepared in support of an invited panel presentation that opened a Webinar on Geosurveillance and its impacts, in August 2021.
The origins of the term 'surveillance' are inherently physical, spatial and visual. The French verb 'veiller', from which it was derived, is translatable as observe or watch out for. The 'sur' prefix adds weight to the sense of 'watching over' rather than merely 'watching'.
French usage dates to the terrors of the Revolution at the end of the 18th century. Bentham is reputed to have promptly adopted it into English, to refer to his 'panopticon' or 'all-watching' prison tower, which was self-evidently so much more humane and efficient than transporting convicts to the colonies. In the latter half of the twentieth century, Foucault reinforced the visual metaphor.
Extensions to the idea were readily apparent. One was aural monitoring of voice and other sounds, another the observation of vision and sound at a distance. Retrospective surveillance became increasingly feasible, initially by means of hand-recording of textual transcriptions, descriptions and depictions, then by photographic means from the mid-19th century, augmented by sound recording from the turn of the 20th century. These were all geo-spatial in nature. As early as the 1840s, however, the telegraph gave rise to electronic surveillance, which extended to telephone wiretapping from the 1890s. That established a new kind of space, which we now commonly refer to as 'virtual space'.
So definitions need to be generic, broader than just the visual, across many kinds of space:
Surveillance is the systematic investigation or monitoring of the actions or communications of one person (personal surveillance) or multiple people (mass surveillance)
Given my background in information systems, it was natural for me to perceive data as yet another kind of space. I coined dataveillance by joining 'data' with '{sur}veillance', and first aired it in a Symposium presentation (Clarke 1986). I prepared the slide-set almost precisely 35 years before this Panel took place (on celluloid, hand-written or with a dot-matrix printer, I can't remember which). I explained the concept informally as follows:
Dataveillance involves the consolidation of personal data from multiple sources, and the use of information technologies to exploit that data
Techniques already in use included identification schemes, front-end verification, cross-system enforcement, profiling and data matching (Clarke 1988). When the OED added the word to its collection, it identified, prior to my 1988 publication, a single occurrence in a 40-page law review article (Davis 1972), and another in a 1979 newspaper article.
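The data-matching technique named above can be conveyed with a small sketch. This is a toy illustration only, not a description of any actual agency's system: all field names, identifiers and values are invented.

```python
# A toy sketch of computerised data matching: records from two
# hypothetical agencies' files are joined on a shared identifier, and
# discrepancies are flagged for follow-up. All field names, identifiers
# and values here are invented for illustration.

tax_records = [
    {"id": "A123", "name": "J. Citizen", "declared_income": 30000},
    {"id": "B456", "name": "M. Resident", "declared_income": 25000},
]
benefit_records = [
    {"id": "B456", "benefit": "unemployment", "amount": 12000},
]

def match(tax, benefits):
    """Flag anyone who declared income while also drawing a benefit --
    the classic cross-system enforcement pattern."""
    by_id = {r["id"]: r for r in tax}
    hits = []
    for b in benefits:
        t = by_id.get(b["id"])
        if t is not None and t["declared_income"] > 0:
            hits.append((b["id"], t["declared_income"], b["benefit"]))
    return hits

print(match(tax_records, benefit_records))
# → [('B456', 25000, 'unemployment')]
```

The economic point of the surrounding paragraphs is visible even in this toy: one cheap join over existing data files substitutes for investigating each person individually.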
The reason attention needed to be drawn to dataveillance was that it greatly changed the economics of monitoring people. Watching data was potentially far less expensive than watching the people themselves, and, even at that early stage, potentially nearly as good. So organisations could watch over many more people through their data than through their real-world presence. And that laid the groundwork for far more effective influence over whatever human behaviour those organisations were concerned with.
To conduct personal dataveillance of a particular individual, that person needs to be distinguishable from everyone else. Initially, an identifier was a name, whether the one that people had been given by their parents, or one that they used in a particular context. This involves ambiguities and awkwardnesses, so organisations found it more efficient to assign codes or account-numbers (in most cases, comprising a series of digits) to the individuals with whom they dealt (Clarke 1994b).
As it became feasible to capture impressions of people's bodies or behaviour, biometric measures emerged, and later proliferated beyond application in criminal investigation (Clarke 2001a). A biometric does not relate to a particular presentation of an individual, as an identifier does. Instead it is an entifier of the underlying human entity (Clarke 2003b, Clarke 2009). Use of a biometric aids and abets the consolidation of all data about an individual's many identities. Added to that, the design of the various forms of (id)entifiers has facilitated their digitisation for use in computer-based systems.
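The identifier/entifier distinction, and the way a biometric facilitates consolidation of an individual's many identities, can be sketched in a few lines. The record structures, keys and contexts below are entirely hypothetical.

```python
# Sketch of the distinction between identifiers (attached to a
# particular presentation, or identity) and an entifier (attached to
# the underlying human entity). A shared biometric key lets separate
# identities, each with its own organisational identifier, be
# consolidated. All names, keys and values are hypothetical.

from collections import defaultdict

# Each record: (organisation's identifier, biometric entifier, context)
identity_records = [
    ("ACC-0071", "fp:9e2c", "bank customer"),
    ("MRN-5530", "fp:9e2c", "hospital patient"),
    ("USR-ax99", "fp:4b81", "forum pseudonym"),
]

def consolidate(records):
    """Group distinct identities by the shared biometric entifier."""
    entities = defaultdict(list)
    for identifier, entifier, context in records:
        entities[entifier].append((identifier, context))
    return dict(entities)

merged = consolidate(identity_records)
# 'fp:9e2c' now links the bank and hospital identities of one entity
```

The privacy significance lies in the grouping step: identities that were deliberately or incidentally separate become joined the moment both carry the same entifier.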
Dataveillance builds a model of some aspects of an individual's personality, based on data, and uses that model as a proxy for the individual, commonly with the aim of achieving some form of control over the person's behaviour. Various terms have been used for the resulting model. My coinage, digital persona, has achieved some currency (Clarke 1994a, 2014). Eric Schmidt applied the term at Novell, even before he became CEO of Google in 2001 (Nutter 1999). Google turned it into an art form, and a massively profitable one.
The organisation that constructs a digital persona may intend to capture a comprehensive and sufficiently deep impression of a particular human entity (e.g. in criminal intelligence work). On the other hand, most digital personae either provide a narrow and shallow model of a particular facet or identity, or are based on opportunistic data-gathering and hence are likely to be partial, substantially erroneous, and misleading.
The original conception of dataveillance was serviceable for quite some years. During the closing decades of the 20th century, however, large numbers of new systems and associated data-trails were created, specifically to advantage organisations (Clarke 2003a). This necessitated adaptation to my original definition, resulting in this current formulation (Clarke & Greenleaf 2018):
Dataveillance is the systematic creation and/or use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons
Another major development was the leap in diversity, sophistication and effectiveness of means of people location and tracking (Clarke 2001b). The primary playing-field is geo-space, but there are also significant impacts on network, intellectual and emotional space. This has provided organisations with far greater scope to draw inferences about individuals' behaviour and interests, and to interfere with their behaviour.
Location and tracking can be retrospective, contemporaneous, real-time, and even prospective in nature. Individuals are under threat from correct inferences drawn about their actions, attitudes, intentions and associates. They are also subject to unreasonable and downright incorrect inferences, and resulting actions by organisations that are against their interests. Those actions extend beyond surveillance to interception, interference and incarceration. Dobson & Fisher (2003, 2007) coined the term 'geoslavery' to convey the degree of threat inherent in always-on tracking. See also Clarke & Wigan (2011) and (Michael & Clarke 2013).
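One way inferences about associates can be drawn from location trails is by looking for repeated co-occurrence in place and time. The following is a crude illustrative sketch under invented assumptions; the place labels, timestamps and threshold are all hypothetical.

```python
# Crude sketch of associate-inference from location trails: two trails
# that repeatedly coincide in place and time suggest an association.
# All data, names and the threshold are invented for illustration.

def co_locations(trail_a, trail_b, max_gap=600):
    """Count events where the same place is visited by both people
    within max_gap seconds of one another."""
    count = 0
    for place_a, t_a in trail_a:
        for place_b, t_b in trail_b:
            if place_a == place_b and abs(t_a - t_b) <= max_gap:
                count += 1
    return count

# Hypothetical trails of (cell-tower, timestamp-in-seconds) pairs
alice = [("cell-17", 1000), ("cell-22", 4000), ("cell-17", 9000)]
bob   = [("cell-17", 1200), ("cell-03", 4100), ("cell-17", 9300)]

print(co_locations(alice, bob))
# → 2
```

Even so naive a rule shows how easily incorrect inferences arise: two strangers who share a commute would score as 'associates' just as readily as two conspirators.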
During the last decade, data mining has been re-badged and furiously spruiked as big data, open data, data analytics and data science; and algorithmic techniques have been supplemented with machine-learning (AI/ML) empirical approaches. In such forms, digital data analytics threatens a-rational decision-making, and inexplicable, unappealable actions (Clarke 2017, 2019b, 2019c).
The digitisation revolution was roughly completed during the period 1980 to 2010, resulting in almost all data being created digital, or being readily convertible from analogue to digital form. This laid the foundation for 'digitalisation' to flourish: the adaptation of organisational processes to take full advantage of digitisation. Individual people, and the many identities that each presents to the world, are largely ignored in favour of dealing with digital personae.
The benefit of digitalisation to organisations is that it saves them both the costs of directly interacting with people and the awkwardness and risk of having to take into account wet-chemistry humanness, its inherent emotions and ambiguity, and such consequential virtues as tolerance and compassion. Even the 1990s hope of improved linkages between local governments and their publics has not eventuated, with institutional distance maintained by means of 'engagement' strategies that successfully keep people at electronic arm's length.
One critical step late in that process was the subversion of the consumer-oriented Internet by means of Web 2.0 technologies (Clarke 2008). Another area of depredation arose because almost all forms of reading and access to live events, text, data, image and video migrated from mostly-anonymous analogue to mostly-identified digital forms during a remarkably short time: the first decade of the 21st century. As a result, the 'privacy of personal experience' has been massively compromised (Clarke 2013).
The maturation of Web 2.0 gave rise to so-called 'platforms', which use a new form of business model whose widespread use constitutes the 'digital surveillance economy' (Clarke 2019a). Data is accessed from a wide variety of sources, and consolidated into a digital persona. This is used for individual decision-making - with high error-rates inevitable because of the inadequacies of the data that have given rise to the persona. It is also well-known to be used to target ads, and to manipulate behaviour; whereas its use for micro-pricing still seems to be flying 'under the radar'.
The broader impacts of the business model used by 'bigtech platforms' are pilloried as 'surveillance capitalism' (Zuboff 2015). Zuboff considers the cause to be the capacity of Google, Facebook, et al. to acquire vast quantities of data and use it to manipulate consumer behaviour. Doctorow (2020), on the other hand, argues that the real issue is the failure of governments to detect emergent monopolies and regulate their behaviour for the public good.
At the end of this paper, I provide a brief explanation of the drivers of my work in this area during the last 35 years. But first, I want to summarise the main body of the analysis, and pass the baton to my two fellow-panellists.
A sceptical position, expressed about the same time as I coined 'dataveillance', is that "Information, [even today], is no more than it has ever been: discrete little bundles of fact, sometimes useful, sometimes trivial, and never the substance of thought [and knowledge] ... The data processing model of thought ... coarsens subtle distinctions in the anatomy of mind ... Experience ... is more like a stew than a filing system ... Every piece of software has some repertory of basic assumptions, values, limitations embedded within it ... [For example], the vice of the spreadsheet is that its neat, mathematical facade, its rigorous logic, its profusion of numbers, may blind its user to the unexamined ideas and omissions that govern the calculations ... garbage in - gospel out. What we confront in the burgeoning surveillance machinery of our society is not a value-neutral technological process ... It is, rather, the social vision of the Utilitarian philosophers at last fully realized in the computer. It yields a world without shadows, secrets or mysteries, where everything has become a naked quantity" (Roszak 1986, pp. 87, 95, 98, 118, 120, 186-7).
Exploiting the 35 years of development in dataveillance technologies and practices since then, organisations now distance themselves from the individuals they deal with, resulting in the dehumanisation of organisational decision-making. The politicians that dominate legislatures are beguiled by lobbyists' economics-, innovation- and progress-laced imagery. Government agencies, meanwhile, grease the wheels of business, and themselves apply business-developed data management and decision-process tools. Social control, supported by dataveillance, dominates social values.
From the viewpoint of the public, as both citizens and consumers, across all of the political, social and economic dimensions, the situation is not good. However, it is also not hopeless. We can resist, and we can re-form the processes in both the public and the private spheres. And that's as good a segue as any across to the next panellist, Steve Mann.
Steve Mann originated a number of important technologies and concepts, including the wearcam. In 1995, he inverted 'surveillance' (from above, by the powerful of the weak) and conceived 'sousveillance' (from below, by the weak of the powerful). See Mann et al. (2003). A related idea is that of ubiquitous transparency (Brin 1998).
To achieve balance, it is desirable that sur- and sous-veillance be conducted at a similar level of intensity. This led to the ideas of equiveillance and inequiveillance, to reflect the degree of balance between them (Mann 2005). Steve has a lot more ideas in this area than just those few. See also 'suicurity' (Mann 2014), 'humanistic intelligence' (Mann et al. 2018) and 'priveillance' (Mann et al. 2020).
My third fellow-panellist, Michael G. Michael, originated the notion of Uberveillance (Michael 2006). He uses it to refer to embedded, omnipresent electronic surveillance: artefacts implanted into, or affixed onto, the human body, perhaps to replace or augment natural functions or to add new ones, but also providing an avenue for monitoring and control by others. The ideas are further developed in Michael & Michael (2007), Michael et al. (2008), Michael & Michael (2014), Michael et al. (2015), and Michael et al. (2021). See also Clarke (2010).
Uberveillance is a rich construct, which has several characteristics and hence is capable of multiple interpretations. For a brief summary, see the presentation at Michael (2020).
It may be worth recording here what drove me to commit so much time to this area for such an extended period.
During my formative years (roughly 1965-70), while working full-time, studying part-time, and doing officer training in the Army Reserve, I found time to be influenced by the dystopian literature, in particular Zamyatin's 'We', Orwell's '1984' and Koestler's 'Darkness at Noon'. The period was pervaded by the bleakness of the Cold War, Russian repression of the Soviet bloc, and existentialist novelists. In 1968-69, both the unrest in France and Woodstock were reactions against the greyness of the time.
I can't claim to have been particularly concerned about social justice or differential treatment of minorities. I saw mechanisms of repression as the fundamental issue. These prevented control being exercised over the power of corporations to manipulate consumers' behaviour, and the power of governments to constrain the expression of political views. Social stasis and psychological gloom needed to be addressed, but the path to those important improvements was via political debate - in my conservative view, debate that was vocal and firm, but not violent.
From about 1970, I was active in the emergent information systems profession and discipline, and it was natural for me to focus not so much on political processes themselves, but on the nature of information flows that enabled repression. Rather than suppression of information flows out to the public, my main focus was on the ways in which governments could use information collection and management as enablers for social control. From the early 1970s, I argued from within the computing profession for appropriate forms of regulation, and in particular for privacy protections. My publications during that period were informal, but see (NSWPC 1977).
After a decade's work-experience, completing degrees and then working and travelling in Europe for over 5 years, I returned to Australia, and rapidly became embroiled in analysing and arguing about inappropriate behaviour by organisations, and particularly by governments. The 2-1/2-year Australia Card campaign, 1985-87 (Clarke 1987), had a great deal of influence on the conception and articulation of the many elements within the dataveillance constellation of ideas. Since then, it's been natural to investigate and articulate the many facets of dataveillance and the technologies that enable it.
Brin D. (1998) 'The Transparent Society' Addison-Wesley, 1998
Clarke R. (1986) 'Information Technology and 'Dataveillance'' Proc. Symp. on Comp. & Social Responsibility, Macquarie Uni. Dept of Comp. Sci., September 1986
Clarke R. (1987) ''Just Another Piece of Plastic for Your Wallet': The Australia Card' Prometheus 5,1 June 1987 Republished in Computers & Society 18,1 (January 1988), with an Addendum in Computers & Society 18,3 (July 1988). At http://www.rogerclarke.com/DV/OzCard.html
Clarke R. (1988) 'Information Technology and Dataveillance' Comm. ACM 31,5 (May 1988) Re-published in C. Dunlop and R. Kling (Eds.), 'Controversies in Computing', Academic Press, 1991, PrePrint at http://www.rogerclarke.com/DV/CACM88.html
Clarke R. (1994a) 'The Digital Persona and Its Application to Data Surveillance' The Information Society 10,2 (June 1994), PrePrint at http://www.rogerclarke.com/DV/DigPersona.html
Clarke R. (1994b) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Information Technology & People 7,4 (December 1994) 6-37, PrePrint at http://www.rogerclarke.com/DV/HumanID.html
Clarke R. (2001a) 'Biometrics and Privacy' Xamax Consultancy Pty Ltd, Canberra, April 2001, at http://www.rogerclarke.com/DV/Biometrics.html
Clarke R. (2001b) 'Person-Location and Person-Tracking: Technologies, Risks and Policy Implications' Proc. 21st International Conf. Privacy and Personal Data Protection, Hong Kong, September 1999. Revised version published in Info. Techno. & People 14, 1 (2001) 206-231, PrePrint at http://www.rogerclarke.com/DV/PLT.html
Clarke R. (2003a) 'Dataveillance - 15 Years On' Proc. Privacy Issues Forum, NZ Privacy Commissioner, Wellington, 28 March 2003, PrePrint at http://www.rogerclarke.com/DV/DVNZ03.html
Clarke R. (2003b) 'Authentication Re-visited: How Public Key Infrastructure Could Yet Prosper' Proc. 16th Int'l eCommerce Conf, Bled, Slovenia, 9-11 June 2003, PrePrint at http://www.rogerclarke.com/EC/Bled03.html
Clarke R. (2008) 'Web 2.0 as Syndication' Journal of Theoretical and Applied Electronic Commerce Research 3,2 (August 2008) 30-43, at http://www.jtaer.com/portada.php?agno=2008&numero=2#, PrePrint at http://www.rogerclarke.com/EC/Web2C.html
Clarke R. (2009) 'A Sufficiently Rich Model of (Id)entity, Authentication and Authorisation' Proc. IDIS 2009 - The 2nd Multidisciplinary Workshop on Identity in the Information Society, LSE, London, 5 June 2009, Revised Version at http://www.rogerclarke.com/ID/IdModel-1002.html
Clarke R. (2010) 'What is Uberveillance? (And What Should Be Done About It?)' IEEE Technology and Society 29, 2 (Summer 2010) 17-25, PrePrint at http://www.rogerclarke.com/DV/RNSA07.html
Clarke R. (2013) 'Privacy and Social Media: An Analytical Framework' Journal of Law, Information and Science 23,1 (April 2014) 1-23, PrePrint at http://www.rogerclarke.com/DV/SMTD.html
Clarke R. (2014) 'Promise Unfulfilled: The Digital Persona Concept, Two Decades Later' Information Technology & People 27, 2 (Jun 2014) 182 - 207, PrePrint at http://www.rogerclarke.com/ID/DP12.html
Clarke R. (2017) 'Guidelines for the Responsible Application of Data Analytics' Computer Law & Security Review 34, 3 (May-Jun 2018) 467- 476, PrePrint at http://www.rogerclarke.com/EC/GDA.html
Clarke R. (2019a) 'Risks Inherent in the Digital Surveillance Economy: A Research Agenda' Journal of Information Technology 34,1 (Mar 2019) 59-80, PrePrint at http://www.rogerclarke.com/EC/DSE.html
Clarke R. (2019b) 'Why the World Wants Controls over Artificial Intelligence' Computer Law & Security Review 35, 4 (Jul-Aug 2019) 423-433, PrePrint at http://www.rogerclarke.com/EC/AII.html
Clarke R. (2019c) 'Principles and Business Processes for Responsible AI' Computer Law & Security Review 35, 4 (Jul-Aug 2019) 410-422, PrePrint at http://www.rogerclarke.com/EC/AIP.html
Clarke R. & Greenleaf G.W. (2018) 'Dataveillance Regulation: A Research Framework' Journal of Law and Information Science 25, 1 (2018), PrePrint at http://www.rogerclarke.com/DV/DVR.html
Clarke R. & Wigan M.R. (2011) 'You Are Where You've Been: The Privacy Implications of Location and Tracking Technologies' Journal of Location Based Services 5, 3-4 (December 2011) 138-155, PrePrint at http://www.rogerclarke.com/DV/YAWYB-CWP.html
Davis D.R. (1972) 'Police surveillance of political dissidents' Colum. Hum. Rts. L. Rev. 4, 1 (1972) 101-141
Dobson J.E. & Fisher P.F. (2003) 'Geoslavery' IEEE Technology and Society Magazine 22,1 (Spring 2003) 47-52, at https://www.usna.edu/EE/ee354/Homework/Geoslavery.pdf
Dobson J.E. & Fisher P.F. (2007) 'The Panopticon's Changing Geography' The Geographical Review 97,3 (July 2007) 307-323, at https://www.researchgate.net/profile/Jerome-Dobson/publication/229783632_The_Panopticon's_Changing_Geography/links/5cdd72a892851c4eaba4e6cd/The-Panopticons-Changing-Geography.pdf
Doctorow C. (2020) 'How to Destroy Surveillance Capitalism' OneZero, 2020, at https://onezero.medium.com/how-to-destroy-surveillance-capitalism-8135e6744d59
Mann S. (2005) 'Equiveillance: The equilibrium between Sur-veillance and Sous-veillance' Opening Address, Computers, Freedom and Privacy, 2005, at http://wearcam.org/anonequity.htm
Mann S. (2014) 'Personal Safety Devices Enable 'Suicurity'' IEEE Technology & Society Magazine 33, 2 (June 2014) 14-22, at https://ieeexplore.ieee.org/abstract/document/6824309
Mann S., Havens J.C., Cowan G., Richardson A. & Ouellette R. (2018) 'Sousveillant Cities and Media' Mesh Cities, 26 September 2018, at https://meshcities.com/sousveillant-cities-and-media/
Mann S., McEwen R., Naylor D., Griffiths J., Bos K., Ali A.A. & Coleman B. (2020) 'Priveillance: Equiveillance, Covidized Surveillance and Dark Sousveillance' McLuhan Centre Working Group, University of Toronto, apparently of 2020, at http://priveillance.com
Mann S., Nolan J. & Wellman B. (2003) ' Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments' Surveillance & Society 1, 3 (2003) 331-355, at https://ojs.library.queensu.ca/index.php/surveillance-and-society/article/view/3344/3306
Michael K. & Clarke R. (2013) 'Location and Tracking of Mobile Devices: Überveillance Stalks the Streets' Computer Law & Security Review 29, 3 (June 2013) 216-228, PrePrint at http://www.rogerclarke.com/DV/LTMD.html
Michael K. & Michael M.G. (eds.) (2007) 'From Dataveillance to (Uberveillance) and the Realpolitik of the Transparent Society' Proc. 2nd Workshop on Social Implications of National Security, Uni. of Wollongong, October 2007, at http://works.bepress.com/kmichael/51/
Michael M.G. (2006) 'Consequences of Innovation' Unpublished Lecture Notes No. 13 for IACT405/905 - Information Technology and Innovation, School of Information Technology and Computer Science, University of Wollongong, Australia, 2006
Michael M.G. (2020) 'Defining Uberveillance' Video of a Presentation for the Interdisciplinary Center for Bioethics at Yale University, 23 November 2020, at https://www.youtube.com/watch?v=zkOIX_nD1rI
Michael M.G., Fusco S.J. & Michael K. (2008) 'A research note on ethics in the emerging age of Überveillance' Computer Communications 31, 6 (2008) 1192-1199, at https://works.bepress.com/kmichael/32/
Michael M.G. & Michael K. (2007) 'A Note on Uberveillance' Chapter in Michael K. & Michael M.G. (2007), at https://works.bepress.com/kmichael/48/
Michael M.G. & Michael K. (2014) 'Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies' IGI Global, 2014, at http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.643.3519&rep=rep1&type=pdf
Michael M.G., Michael K. & Bookman T. (2021) 'Wim Wenders' Wings of Desire: Anticipating Uberveillance' Journal of Asia-Pacific Pop Culture 6, 1 (August 2021) 109-129, PrePrint at https://www.katinamichael.com/research/tag/surveillance
Michael M.G., Michael K. & Perakslis C. (2015) 'Überveillance, the Web of Things, and People' Proc. Int'l Conf Contemporary Computing and Informatics (IC3I), republished in IEEE Consumer Electronics Magazine (April 2015) 107-113
NSWPC (1977) 'Guidelines for the Operation of Personal Data Systems' N.S.W. Privacy Committee, April 1977, at http://www.worldlii.org/int/journals/IntPrivDPLawMon/1977/1.html
Nutter R. (1999) 'Technology first look: Novell's DigitalMe' TechRepublic, 23 November 1999, at https://www.techrepublic.com/article/technology-first-look-novells-digitalme/
Roszak T. (1986) 'The Cult of Information' Pantheon, 1986
Zuboff S. (2015) 'Big other: Surveillance capitalism and the prospects of an information civilization' Journal of Information Technology 30, 1 (2015) 75-89, at https://cryptome.org/2015/07/big-other.pdf
My thanks to Katina Michael for her leadership in the area, and the enormous effort she continues to put into achieving broader public understanding of the march of surveillance techniques and technologies, and to Steve Mann and Michael G. Michael for their very substantial contributions to our knowledge and capacity to resist the many excessive and sociopathic forms and applications of surveillance.
Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor associated with the Allens Hub for Technology, Law and Innovation in UNSW Law, and a Visiting Professor in the Research School of Computer Science at the Australian National University.
Created: 27 July 2021 - Last Amended: 21 August 2021 by Roger Clarke - Site Last Verified: 15 February 2009