Roger Clarke
Principal, Xamax Consultancy Pty Ltd, Canberra
Visiting Fellow, Department of Computer Science, Australian National University
Version of July 1988
© Xamax Consultancy Pty Ltd, 1998
This paper was published in MIS Quarterly 12,4 (December 1988), pp. 517-519
This document is at http://www.rogerclarke.com/DV/ELSIC.html
In the March 1988 issue of MIS Quarterly, the Editor regretted his "inability to stimulate a flow of articles and manuscripts relating to the broader societal issues of the Information Age". This note is a response to his comments. It considers why IT researchers and practitioners have not contributed sufficiently to the social implications debates, and how this deficiency might be rectified.
Many years ago, MIT's Joe Weizenbaum, frightened by the simple-mindedness with which his Eliza experiments had been misunderstood, expressed serious reservations about artificial intelligence's implicit assumptions and directions. He came to be seen as an adversary of those who had been his colleagues, as a whistle-blower, and as that particularly dangerous opponent, a convert. Whether or not he intended to, he went out on an academic, intellectual, even spiritual limb. His work has had a significant effect on people outside the computing-related disciplines, but is for the most part rejected or ignored by those inside them. The adversarial approach to the social implications of I.T. is not to be recommended, because of the intellectual isolationism it can lead to, and its inherently limited effectiveness.
Rather than becoming converts to the 'computing is dangerous' school of thought, some I.T. researchers and professionals have developed a double-life. They have continued to practise their main disciplines, but have undertaken parallel activities on unrelated social implications matters. This is akin to the international arms-dealer who commits some of his spare time to the local Boy Scouts troop (an activity which I am not in any way intending to belittle). This 'independent roles' approach to the social implications of I.T. has been apparent amongst scientists and professionals of all kinds in relation to nuclear arms, and more recently SDI.
It is inherent in the adversary model that the individual abandons his discipline, and/or his discipline abandons him. The second model, on the other hand, enables the person to retain his credibility in his own area of expertise. However, this is achieved by ensuring that one's professional and 'social conscience' realms are not too tightly intertwined, and it requires considerable feats of ingenuity, persuasion, even sophistry, to inculcate an aura of authority in areas in which one has limited expertise.
A development on this theme is the dual-specialist approach. Stanford's Terry Winograd and NYU's Ken Laudon come to mind, both of whom have undertaken fairly formal studies of particular social implications matters, while maintaining their other, more mainstream activities in AI and MIS respectively. As long as you maintain respectability in the mainstream, your proclivities for soft, socially responsible interests will not do you too much harm, and you can speak with authority in both areas.
Another approach is to confine your comments on social implications to areas in which your authority is well-recognised. Well-known computer scientists David Parnas of Queen's University, Kingston, Ontario and Brian Cantwell Smith of Stanford have not only undertaken research, written and spoken on SDI, but have also refused to use their particular knowledge and skills in its support. This approach has the benefit of considerable impact. It loses the individual some of his research grants, but protects his personal integrity and public credibility. But relatively few people have sufficient standing to use this authoritative renegade approach, and for the vast majority of us it would just be an excuse for inaction.
It is a human characteristic to seek out the company of others of like mind. As a result special-purpose organisations and pressure-groups exist, including the major players listed at the end of this note. Valuable though the efforts of these organisations are, their existence underlines the manner in which thinking about implications is being divorced from research in and practice of the technology.
Of course, an approach is available whereby we may excuse ourselves entirely. All that we have to do is define the observation and criticism of the IT disciplines as someone else's problem. After all, we have Maggie Boden of the University of Sussex (a psychologist/philosopher) to comment on the artificiality of artificial intelligence, and MIT's Sherry Turkle (the sociologist who wrote 'The Second Self') and David Bolter (the philosopher who wrote 'Turing's Man') on humans' self-conception in the computer age. Sociologists Jim Rule of SUNY at Stony Brook and Gary Marx of MIT have studied IT as a weapon of social control, and Rob Kling of the University of California at Irvine has researched and written on the whole gamut of social implications.
[Correction: This accidentally mis-classified Rob Kling - who at that stage I hadn't met - whose background is in computer science, and whose post at that time was a Chair in Computer Science].
The avoidance approach claims that self-criticism by the IT disciplines is either unnecessary, or too hard, and should be left to outsiders. Apart from being a moral 'cop-out', the weakness of this approach is that outsiders can never fully appreciate the developments in specialised fields unless they have the assistance of insiders. External criticism would therefore be poorly aimed and ineffective unless at least some IT specialists collaborated with critics of their discipline.
I don't believe that the adversary, independent-roles, dual-specialist, authoritative renegade and avoidance approaches represent a sufficient set of models. I believe that all researchers and professionals must regard the implications of their work as part and parcel of their research in and application of IT. Consideration of implications needs to be integrated, not segregated.
A fundamental barrier is that we continue to subscribe to a vague image of scientist as saint, or as virgin. We pretend that Copernicus didn't hide his discovery, that Kepler wasn't a mystic, that Galileo wasn't a trickster and an intellectual thief, and that Benjamin Franklin wasn't (among many other things) a politician. We represent our work as being value-free. Or, at the very least, we go to excruciating lengths to convince ourselves that we have made best efforts to rid our work of the very values that might make it relevant.
Naturally I am not arguing against care, precision, external standards, and controls, above all in the design of research and in the bodies of papers. What I am attacking is the pretence of pristine purpose. The introductions to our papers should recognise the full-blooded relevance of the topic, and our implications sections and our conclusions should shed the veil of scientific disinterestedness and admit to motivations and to concerns.
Another barrier preventing the integration of implications with applications thinking is that the social implications area has the smell of altruism and intellectual softness about it. 'Real programmers don't worry about social implications' is an easy and realistic catch-cry. It's a little like the Moral Rearmament movement - it's a good thing of course, like motherhood and apple pie, but you have to be a bit of a fanatic, and maybe even a bit of a wimp, to pursue it.
Some three years ago, I assumed the Chairmanship of the Australian Computer Society's Social Implications Committee. I was concerned at the narrowness of the Committee's Terms of Reference, because it seemed that the Committee must inevitably comprise 'do-gooders', putting their fingers in dykes, and acting as the moral conscience of the IT professions. To address the wimpy image, I proposed that the various dimensions of IT impact should be considered together, in an integrated manner.
In 1986, the Society agreed to enlarge the Committee's Terms of Reference and it became the Economic, Legal and Social Implications Committee (ELSIC). During the next two years, the so-called 'Australia Card' scheme consumed a considerable proportion of the Committee's time. The Australian Government proposed to introduce a national identification scheme, as a basis for a local version of the often-considered but always-rejected American idea of a 'national data center'. Civil libertarians from all segments of society attacked the scheme because of its in part describable, and in part imagined, social implications [1].
ELSIC expressed concern about the social implications, but only to the extent that was necessary to argue for careful consideration of the proposal. The focus of submissions to Government and the various review committees was on assessing the scheme's technical base and its economic merit. ELSIC was therefore able to skirt the more subjective aspects of the debate, and speak with authority on matters susceptible to more precise analysis. The conventional topic of 'Social Implications of Computing' is too narrow. Economic, legal and social implications of information technology must all be considered together, to enable the various factors to be seen in perspective.
To what degree does a technologist have moral responsibility for the uses to which his technology is put? Professional Codes of Ethics frequently require a member to place the public's interests above his own, and to avoid nasty uses of technology. But the meaning of the inevitable judgmental words (like 'nasty') is dependent on context. Agreement on the standards applicable is difficult in time of peace, and seriously problematical during periods when threats exist of terrorism or invasion. Such threats may be real or imagined - but at the time, with propaganda emanating from the 'other side', and managed information flows on 'our side', it is by definition impossible to tell the difference between reality and imagination. And anyway, Codes of Ethics are seldom enforced, or even remembered.
Researchers and practitioners in I.T. are dealing with a powerful tool. We have clearly before us, as archetypal anti-heroes, the nuclear physicists of the 1920s and '30s, who hid their heads in the sand, comforting one another that the implications of their work were unforeseeable. Or that if they were foreseeable, then their work was amoral, not immoral, because it was other people who would decide on the uses to which nuclear energy would be put. Should we indulge ourselves in our own modern versions of this avoidance approach, or risk our scientific detachment by becoming involved? To co-opt Kennedy's inversion of the question, if we believe our own hype that IT will have enormous impact, can we afford not to risk our scientific detachment and become involved?
I believe that the moral responsibility of any professional must at least extend to an honest attempt to ensure that public debate is informed. Further, since the subject-matter is often obscure, the professional's role extends to ensuring that debate takes place. At times, that step will drag the person into the role of a protagonist or antagonist. Clearly in those circumstances we are no longer entitled to claim the mantle of technologist-saint, but then aren't we all parents, voters, patriots, and even civil libertarians and ideologues as well?
If MIS academics and thinking practitioners are to stimulate and support public debate, we must make some changes in our conventions of behaviour. We do not need to compromise the precise, careful, scientific manner in which we undertake and report on our research. But we must stop sanitising our introductory remarks, and instead draw attention to the real importance of the topic we are dealing with. And the closing sections of our papers must not be confined to 'implications for further research', but must also directly address 'implications for people'.
It is also important that we researchers not restrict our writing to the serious academic journals on which our career prospects and peer approval depend. We must also report the nature and implications of our work in more populist publications, in order to reach the great majority of practitioners whose professional reading does not reach the heights of MISQ.
Like many others, I missed the Editor's 'call to arms' in issue 10,1 of March 1986. As a result, although I have regarded this as one of the most important journals in which to read and to publish technical and managerial articles, I have never even considered looking in it for papers on technological implications. I therefore very much welcome the Editor's intentions.
However, I suggest that, by broadening the topic from mere 'social implications' to 'economic, legal and social implications', we may be able to overcome our reticence about thinking, talking and writing about the consequences of our work, and to integrate the consideration of implications with applications.
[1] Clarke R.A. (1987) 'Just Another Piece of Plastic for Your Wallet: The Australia Card', Prometheus 5,1 (June 1987). Republished in Computers & Society 18,1 (January 1988)
IFIP Technical Committee TC9 (Relationship Between Computers and Society) of the International Federation for Information Processing - Prof. Hal Sackman, 13609 Bayliss Rd, Los Angeles CA 90049
IFIP TC9 Working Group WG9.1 (Computers and Work) - Prof. Klaus Fuchs-Kittowski, Humboldt Universität Berlin, Sektion Wissenschaftstheorie und -organisation, Unter den Linden 6, PSF 1287, DDR-1086 Berlin, G.D.R.
IFIP TC9 Working Group WG9.2 (Social Accountability) - Mr Dick Sizer, 26 Avenue Rd, Farnborough Hants GU14 7BL U.K.
ACM Special Interest Group on Computers and Society (SIGCAS), of the (charmingly named) Association for Computing Machinery - Ron Anderson, 13221 Lake Point Blvd, Belleville MI 48111
Computer Professionals for Social Responsibility (CPSR) - P.O. Box 717, Palo Alto CA 94302
Social Implications Committee of the British Computer Society (BCS SIC) - 13 Mansfield St, London W1M 0BP
Economic, Legal and Social Implications Committee of the Australian Computer Society Inc. (ACS ELSIC) - Roger Clarke, Department of Commerce, Australian National University, Canberra ACT 2611