Using IT effectively: a guide to technology in the social sciences

Edited by Millsom Henry
University of Stirling
Routledge
Taylor & Francis Group
LONDON AND NEW YORK
© Millsom Henry 1998 the collection and introductory material; © the contributors for individual chapters 1998

This book is copyright under the Berne Convention. No reproduction without permission. All rights reserved.

First published in 1998 by UCL Press
Reprinted 2003 by Routledge
11 New Fetter Lane
London EC4P 4EE

Routledge is an imprint of the Taylor & Francis Group, an informa business

British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library.

Library of Congress Cataloguing in Publication Data are available.

ISBN-13: 978-1-138-46810-8 (hbk)
ISBN-13: 978-1-857-28795-0 (pbk)

Cover design by Amanda Barragry
Printed and bound in Great Britain by Antony Rowe Ltd, Eastbourne.

DOI: 10.4324/9780429332289
CONTENTS

FOREWORD Professor Howard Newby ix
LIST OF FIGURES AND TABLES xi
NOTES ON CONTRIBUTORS xiii
EDITOR'S INTRODUCTION Millsom Henry xix

SECTION ONE: NEW CHALLENGES FOR TEACHING AND LEARNING
1 EXPONENTIAL EDUCATION Peter Cochrane 3
2 PEDAGOGY, PROGRESS, POLITICS AND POWER IN THE INFORMATION AGE Stephen Heppell 17
3 TECHNOLOGY AND SOCIETY: AN MP'S VIEW Anne Campbell 21
4 INFORMATION TECHNOLOGY: A CASE FOR SOCIAL SCIENTIFIC ENQUIRY Adrian Kirkwood 27

SECTION TWO: DEVELOPING COURSEWARE FOR THE SOCIAL SCIENCES
5 EXPECTATIONS AND REALITIES IN DEVELOPING COMPUTER-ASSISTED LEARNING: THE EXAMPLE OF GraphIT! Ruth Madigan, Sue Tickner and Margaret Milner 41
6 THE DATA GAME: LEARNING STATISTICS Stephen Morris and Jill Szuscikiewicz 51
7 CONVERSION OF THE IDEOLOGIES OF WELFARE TO A MULTIMEDIA TEACHING AND LEARNING FORMAT David Gerrett 65
8 DESIGNNET: TRANSNATIONAL DESIGN PROJECT WORK AT A DISTANCE Stephen Scrivener and Susan Vernon 75

SECTION THREE: IMPLEMENTING COMPUTER-ASSISTED LEARNING IN THE SOCIAL SCIENCES
9 COMPUTER-AIDED LEARNING AS A LEARNING TOOL: LESSONS FROM EDUCATIONAL THEORY Graham R. Gibbs and David Robinson 87
10 ANORAKS AND TECHIES: A CALL FOR THE INCORPORATION OF NON-TECHNICAL KNOWLEDGE IN TECHNOLOGICAL DEVELOPMENTS Vernon Gayle 103
11 EVANGELISM AND AGNOSTICISM IN THE TAKE-UP OF INFORMATION TECHNOLOGY Danny Lawrence, Ken Levine and Nick Manning 113
12 STANDARDS FOR THE NON-STANDARD: THE IMPACT OF NEW TECHNOLOGY ON THE NON-STANDARD STUDENT Ann Wilkinson 129

SECTION FOUR: THE EFFECTIVENESS OF THE NEW TECHNOLOGIES IN TEACHING AND LEARNING ENVIRONMENTS
13 INFORMATION TECHNOLOGY AND TEACHING QUALITY ASSESSMENT: REFLECTIONS OF A SOCIOLOGIST Chris Turner 139
14 WHY COSTS ARE IMPORTANT IN THE ADOPTION AND ASSESSMENT OF NEW EDUCATIONAL TECHNOLOGIES David Newlands, Alasdair McLean and Fraser Lovie 151
15 USING MULTIMEDIA TECHNOLOGY FOR TEACHING: A CASE STUDY APPROACH David Crowther, Neil Barnett and Matt Davies 165
16 INFORMATION TECHNOLOGY AND TEACHING THE SOCIAL SCIENCES: OBSTACLES AND OPPORTUNITIES Duncan Timms 179

GLOSSARY 195
INDEX 197
FOREWORD Professor Howard Newby Vice-Chancellor, University of Southampton
This book could not be more timely. The shape, structure and content of higher education in the UK are once more under intense public scrutiny, and the role of information technology (IT) in teaching and learning remains a key issue. "Using IT effectively" is central to this.

There are those with influence over policy in Brussels, Whitehall and Westminster who still regard IT as some sort of panacea: put students in front of VDU screens, the argument runs, and mass higher education can be provided on the cheap. Such "technological fix" arguments are simply false. But so is their contrary, that new technology has no relevance to the future provision of higher education in this country. Clearly the nature of teaching and learning will be affected - sometimes in quite profound ways - by new developments in IT.

The key words in the title of this book are not so much "IT" (most developments over the next decade are known or knowable), but "using" and "effectively". The technology will stand or fall by its use. Education is - and will, for the foreseeable future, remain - at heart a social process. New technology can assist in raising the quality of this process, but it must go with the grain of conventional pedagogy. Without this the sheer quantity of information available via modern information technology will disable, rather than enable, participation in a genuinely educational process.

In the social sciences these issues are particularly pertinent. The social sciences thrive on debate. While the acquisition of empirical data is an important component of the understanding of society, the facts are never self-evident: they require interpretation. IT is, therefore, an essential tool in the social sciences and brings on to the desktop the capacity to assimilate and analyze information at a speed and a cost undreamt of less than a generation ago. But it is not a substitute for the informed understanding that also comes from debate and discussion.
Social science is, in many respects, the epitome of education as a social process.
The contributions to the collection are not, therefore, narrowly concerned with the technology per se, startling though the advances in this continue to be. Most of the contributors are concerned with the organizational, social and pedagogical use of the new teaching and learning technologies. Relating these technologies to those contexts is crucial if the promise evinced by the new technologies is to be fulfilled. As we look forward to developing a higher education system attuned to the needs of a new century, these chapters remind us of the great care with which new technology must be handled if, indeed, it is to be used effectively.
LIST OF FIGURES AND TABLES

Figures

Figure 1.1 Children using computers: an on-line school session. 7
Figure 1.2 The surrogate head surgeon. 8
Figure 1.3 The virtual university. 10
Figure 4.1 Households in the UK with video recorder, home computer and audio CD player, 1985-94. 29
Figure 4.2 Proportion of UK households with selected media technologies, 1994. 31
Figure 4.3 Access to information and communications technologies in UK households, 1994. 33
Figure 6.1 A painless way to absorb basic information. 56
Figure 6.2 Repeated, varied and purposeful experimentation. 57
Figure 6.3 Explaining the pattern recognition principle of choosing a test. 58
Figure 6.4 Interactivity allows the student to follow their own line of interest. 59
Figure 6.5 Knowing which tests are appropriate. 60
Figure 6.6 More information on each test is available. 60
Figure 6.7 Self-testing enables students to monitor their learning in an exploratory environment. 61
Figure 7.1 Steps in the authoring process (A-M). 67
Figure 7.2 Example of a lesson in Ideologies of Welfare. 69
Figure 9.1 Correlation Explorer. 97
Figure 9.2 A screen from Inspiration. 98
Figure 9.3 The Polygraph from MacLaboratory. 99

Tables

Table 6.1 Areas covered by the software. 54
Table 6.2 Confidence and understanding before and after using the software. 62
Table 7.1 Steps in the authoring process (A-M). 68
NOTES ON CONTRIBUTORS
Neil Barnett is Lecturer in Public Sector Management at Leeds Metropolitan University. Research interests include local government structure and decentralization. He is interested in developing multimedia teaching/learning material for public sector managers and social science students in an interdisciplinary environment.

Anne Campbell is the first woman MP for Cambridge and the city's third Labour MP. Since her election in April 1992, she has taken a special interest in science and technology, education and economic affairs. She was a member of the House of Commons Select Committee on Science and Technology from 1992 to 1997 and was Vice-Chair of the Parliamentary Office of Science and Technology. Anne Campbell worked with David Blunkett, Secretary of State for Education, on the future of information technology (IT) in education and research from 1995 to 1997. She is currently Parliamentary Private Secretary to John Battle, Minister for Science, Energy and Industry. She has also chaired a sub-group of Labour's Policy Commission on the Information Superhighway. Educated at Newnham College, Cambridge, she taught mathematics at Cambridge secondary schools before becoming a Senior Lecturer in Statistics at Cambridgeshire College of Arts and Technology (now Anglia Polytechnic University). She was Head of the Statistics and Data Processing Department at the National Institute of Agricultural Botany from 1983 to 1992. She is a Fellow of the Institute of Statisticians, the Royal Statistical Society and the Royal Society of Arts.

Peter Cochrane is Head of Research at BT Laboratories. A graduate of Trent Polytechnic and Essex University, he is also a visiting professor to UCL, Essex, and Kent universities. Peter Cochrane has published and lectured widely on technology and the implications for society.
He received the IEE Electronics Division Premium in 1986; the Queen's Award in 1990; and the Martlesham Medal for contributions to fibre optic technology, the Computing and Control Premium, and the IERE Benefactors Prize in 1994.

David Crowther is Lecturer in Management Accounting at Aston
Business School, Aston University. His main area of research is corporate performance measurement and behavioural accounting. He is interested in teaching the use of technology and in particular multimedia as a teaching/learning tool.

Matt Davies, a qualified chartered accountant, is Lecturer in Financial and Management Accounting at Aston University. His main research interest is in the use of shareholder value performance measures and the use of technology in the teaching process.

Vernon Gayle is Lecturer in Sociology at the University of Stirling and responsible for teaching research methods and data analysis to both undergraduates and postgraduates. His own research is mainly concerned with analyzing social and economic data using generalized linear models. He is a committed GLIM4 user and has published a paper that reflects his interests in ordered categorical data analysis. He has also conducted research into a complementary treatment regimen for cancer patients.

David Gerrett received his Bachelor of Pharmacy degree from Queensland University in 1977. After professional registration and working as a community and hospital pharmacist, he returned eight years later to full-time academia and read for a Masters in Hospital Pharmacy. Continuing part-time, he received his doctorate in 1995. This considered the role of community pharmacists as advisers on prescribed medication. In researching a role he was led towards a greater understanding of the literature on social policy and professionalism. An outlet for this understanding was provided by his role as course leader and primary author for the Postgraduate Programme in Social and Administrative Pharmacy, which uses multimedia as its sole teaching and learning method. The conversion of the Ideologies of Welfare was found to have generic appeal and is currently in use on three further postgraduate programmes.

Graham R. Gibbs is Principal Lecturer in Sociology at the University of Huddersfield.
He has a wide experience of teaching research methods using IT at undergraduate and postgraduate levels. His most recent research has focused on the use of CAL in teaching the social sciences, and especially the use of computer-aided co-operative learning in teaching theoretical subjects.

Millsom Henry is the Deputy Director of Socinfo (see Glossary). She graduated from the universities of Durham and Stirling as a social scientist with specific teaching and research interests in the sociology of ethnicity and gender, especially in relation to culture/media and the social implications of the new technologies. She has presented numerous papers at international conferences, published over 12 articles and is editor of three regular publications. Millsom Henry was commissioned to edit three books and to write two chapters for publications
due out in 1997. Finally, but not least, she somehow finds time to complete a part-time PhD on the identities of Caribbean women in Britain at the University of Stirling.

Stephen Heppell is Professor of Information Technology in the Learning Environment at Anglia Polytechnic University and Director of ULTRALAB. Stephen Heppell is a member of a number of public committees and acts as a consultant in both the public and private sectors; he also has a long list of television appearances and writes regularly for the popular press. He is on the editorial board of the Journal of Information Technology for Teacher Education and the Journal of Multimedia and has contributed many chapters in books and journals; a full list can be viewed from: http://www.ultralab.anglia.ac.uk/pages/ultralab/team/stephen/contents.html

Adrian Kirkwood is Head of the Programme on Learner Use of Media within the Open University's Institute of Educational Technology. He undertakes research and evaluation studies related to access and applications of media and information technologies, both within and outside the Open University. He has been a consultant to organizations including British Telecom and the National Council for Educational Technology and has made invited contributions to international conferences. He co-authored Personal computers for distance education (Paul Chapman, 1992) and has published widely on the subject of using media in education and training.

Danny Lawrence is Senior Lecturer in Sociology in the School of Social Studies at the University of Nottingham. His early research in "race" and ethnic group relations led to Black migrants: white natives (Cambridge University Press, 1974; reprinted and reissued by Gregg, 1992) and he has subsequently published many articles in this field.
He has since conducted research and published on transmitted deprivation; youth unemployment; the professional aspiration and changing circumstances of the occupational groups responsible for the delivery of careers guidance; the disestablishment of teacher careers and, most recently, higher education and the labour market.

Ken Levine lectures at the School of Social Studies at Nottingham University. His main sociological research interests are adult literacy and architects as a professional group. He has gained considerable experience using computers with both undergraduate and postgraduate students in a variety of contexts, including courses on statistics and survey design and analysis, as well as introductions to networks, word processing, email and databases. He has collaborated with colleagues (including Danny Lawrence) on the production of CAL courseware (using Authorware Professional) designed for a module on "official" statistics. A long spell as the departmental Computing Officer attempting to meet the needs of staff and student users has taught him that despite
massive advances in technology, the gap between expectations and reality in educational IT is more or less unchanging.

Fraser Lovie is Research Assistant in the Department of Politics and International Relations at the University of Aberdeen. He is also working on the research project on the Internet delivery of International Relations courses with Alasdair McLean.

Ruth Madigan is a Senior Lecturer in Sociology at the University of Glasgow. She has taught urban sociology and methods of social research, in particular data analysis using SPSS, for many years. Her article "Gender issues: teaching with computers in sociology" (Socinfo Journal I, 1995) arose directly out of this experience. The TLTP-TILT project (University of Glasgow) offered an opportunity to explore new approaches to computer-aided learning in the area of basic research statistics.

Alasdair McLean is Lecturer in the Department of Politics and International Relations at the University of Aberdeen and is also the Convenor of the Faculty IT User Group. He has considerable experience of distance education through audio-conferencing and leads a research project on the Internet delivery of international relations courses.

Nick Manning is Professor of Social Policy and Sociology, University of Nottingham. He has been interested in introducing IT into the social science curriculum since the late 1980s, both as a general environment for student learning and for specific courses. This has included both data analysis and the construction of lectures in hypertext. He has also developed postgraduate degrees combining social policy and IT, with European Union (EU) funding. His research work is mainly on eastern Europe. This started in the 1980s on social policy, changed to environmental and housing movements in the early 1990s, and is now on (un)employment policy and household experience in Russia.
His other areas of work have included medical sociology and health policy, comparative social policy among various OECD countries, and general theories of social problems and social policy.

Margaret Milner is a Lecturer in Quantitative Methods in the Department of Accounting and Finance at the University of Glasgow. She has had a keen interest in developing computer applications for the teaching of statistics and quantitative methods to accountancy students and also teaches MBA students and students interested in IT. Her research topics include investigating the distributional properties of accounting ratios and decision-making and report format choices. As a member of the team developing GraphIT!, a TLTP-TILT project, the strategic use of graphs and graphical analysis is also an important teaching and research interest.

Stephen Morris has been producing CAL for many years and was an original ITTI project holder for the production of CAL in medical education, for which the software Statistics for the Terrified was produced. He is one of the prime movers of the MIDRIB project, which will bring together a comprehensive database of peer-reviewed medical images to UK higher education over the Internet. He has also been the head of successful higher education computer units at St Bartholomew's Hospital (1984-94) and, currently, St George's Hospital Medical School.

Professor Howard Newby took office as Vice-Chancellor of the University of Southampton on 1 September 1994, moving from the Economic and Social Research Council, where he was first Chairman (1988-94) and then Chief Executive (1994). Professor Newby has a background of research and writing in rural affairs, including many books and articles on social change in rural England, and is a Rural Development Commissioner. He is also a member of a number of government bodies concerned with the funding of research in the UK, including the Dearing Committee Research Working Group; Chairman of the Centre for the Exploitation of Science and Technology; and a member of the executive council of the European Science Foundation. He is Vice-Chair of the Committee of Vice-Chancellors and Principals and serves on a number of its steering and sector groups.

David Newlands is Senior Lecturer in Economics at the University of Aberdeen. His principal research interests in economics include regional economics and the economics of the welfare state. He has also conducted research on new educational technologies and is directing a major project to examine the impact of such technologies on educational provision in Scotland.

David Robinson is Senior Lecturer in Psychology at the University of Huddersfield. His work has included involvement in the development and evaluation of a software system to support general practitioners. His current research interest is the relationship between autobiographical memory and fantasy.
Stephen Scrivener is Professor and Director of the Design Research Centre at the University of Derby. He has published four books and over 50 papers in learned journals, has refereed conferences and has made numerous presentations.

Jill Szuscikiewicz has worked on CAL with Stephen Morris since the original ITTI grant was obtained and, more recently, at St George's Hospital Medical School has worked on projects with a more medical base, such as Immunology Explained and Heartbeat. She is now project manager of MIDRIB, funded under the eLib programme as a joint project with Bristol University.

Sue Tickner is a Teaching and Learning Support Consultant at the University of Glasgow. Originally an English graduate, she worked as a teacher in Britain and Spain before returning to the UK for an MSc in IT and learning (by distance learning). After working as an independent
developer/trainer and as an Open University tutor, Sue Tickner joined the University of Glasgow's TLTP project as designer/co-ordinator for the Numerical Data Group. She has recently been conducting research into distance education and remote learning technologies.

Duncan Timms is Director of Socinfo (see Glossary) and Professor of Sociology in the Department of Applied Social Science in the University of Stirling. He is also Director of Project VARESTILE (the Value-Added Reuse at Stirling of Existing Technology in the Learning Experience), an institutional project funded under the Teaching and Learning Technology Programme. His teaching and research interests encompass two main areas: the social correlates of health and illness, and the social implications of information and communications technologies. He is also the Director of a series of Scottish-Nordic Winter Schools on Comparative Social Research funded by the EU under the Training and Mobility of Researchers Programme.

Chris Turner is Professor of Sociology, University of Stirling. His teaching and research interests include the social constructions of childhood; processes of transition from childhood to adulthood; the impact of state intervention in children's lives; and state policies on children since 1945.

Susan Vernon is Director of Applied Arts at the University of Derby. Having studied at Gray's School of Art in Aberdeen (jewellery and printmaking) and at the University of Central England, Birmingham (MA in Industrial Design), Susan regularly exhibits her work at an international level.

Ann Wilkinson is a Senior Research Fellow working as the co-ordinator of the CTI Centre for Human Services in the Department of Social Work Studies, University of Southampton. The Centre is also funded to provide information and advice to CTI Centres on anti-discriminatory practice.
Her previous research includes a European-funded project to provide information on access to higher education for disabled students. This was published as an information system and forms part of her MPhil thesis.
EDITOR'S INTRODUCTION Millsom Henry
This book represents a selection of edited papers, most of which were presented at the first Socinfo (see Glossary) International Conference on Technology and Education in the Social Sciences in September 1995 at the University of Stirling. This event provided a unique opportunity for social scientists to share ideas and experiences around the issues of innovation in teaching and learning. The proliferation of technology initiatives, the growth of more generic computer-assisted courseware for higher education, the rapid expansion of higher education, the emergence of the Internet and the Teaching Quality Assessment (TQA) exercises in the UK all provided the backdrop for many of the issues raised at the conference. As a result, the papers in this collection represent an attempt to document as well as to stimulate some critical debate on the impact of these technologies on teaching and learning.

Although studies are emerging in the social sciences on the varying forms of technology and its related sub-cultures, more work is needed on how technology affects everyday life. The social, political and economic implications of the new technologies should be a central concern for the social sciences. Interestingly, this was recognized by the Economic and Social Research Council (ESRC) with the launch in spring 1997 of a new research programme on the role of virtual technologies in society.1 This programme will pay attention to the wide-ranging implications of new technology, an attention which to date has been missing in social scientific research. This must be good news. It is heartening to note that the themes surrounding the effective use of technology in education were also highlighted as key areas in the new ESRC programme.

Using IT effectively: a guide to technology in the social sciences examines in detail some of the major issues associated
1. ESRC Virtual Societies Programme, directed by Professor Steve Woolgar at Brunel University. For more details refer to: http://www.esrc.ac.uk/programmes
with the development, impact, implementation and assessment of technology, particularly within the social sciences. In UK higher educational institutions, the teaching and learning process is currently undergoing a major revolution, and a more sustained examination of the development, implementation, assessment and impact of technology within that process is required. This book should be read as a basis for further investigations into both the positive and negative consequences of technology.

Section One contains four key contributions from business, politics, academia and the technological field on the changing effect of technology on the teaching and learning process. In Chapter 1 Peter Cochrane points out the significance of expanding the traditional way of looking at teaching and learning. As the Head of BT Research Laboratories, Cochrane contends that while the younger generation "are embracing IT and rapidly gaining skills, the teaching profession remains dominated by a population resisting, or unable to see the need for, change". This has led to a gap between students and staff and is, according to Cochrane, a cause for concern. In addition, there is a failure to appreciate the significance of IT as an effective tool for teaching and learning. Cochrane insists that "IT is not an 'instead of' but an 'as well as' technology. It is unlikely to replace the teacher or the institution but it will change their nature. In future, education will have to be more available, just-in-time, and on-line as it becomes a continuous life-long process." The key point of this chapter is that there is a dramatic shift in the way formal education is viewed, which has implications for staff, students and the nature of higher education disciplines.
Social scientists should be interested in these shifts not only as an area for external enquiry, but also in order to investigate ways in which IT can be used more effectively as a tool within their own discipline areas.

In Chapter 2 Stephen Heppell, the Director of ULTRALAB, points out how developments in educational technology contrast sharply "between rapidly expanding advancing technological potential and slower pedagogical, social and political development". This has led to a number of tensions "between the emergent capabilities both of the technology itself and of the 'children of the information age'; the challenge that these capabilities pose for existing models of education and assessment; the challenges posed for public policy and the social implications of technology for work, gender, family and education". These tensions are of central concern to the social science disciplines, yet to date we have failed to engage critically in the debate.

Chapter 3 focuses on the need to use the technologies effectively to encourage innovative research and to assist in the development of good teaching and learning skills. Anne Campbell, the Labour MP for Cambridge, demonstrates how technology can be used to improve access to the political as well as the
learning process by describing the design and management of her on-line political surgery, one of the first in the UK. Campbell highlights the issues of access and empowerment which are central to the debate about the implications of the new technologies, and insists that the information revolution should "not worsen the divisions in society, but is used to enable opportunity, equality and democracy".

In the final chapter of this section, Adrian Kirkwood also picks up on some of the negative consequences of technological development and considers whether it will exacerbate social differences. The evidence to date suggests that, whether in the home or at work, technology has tended to reinforce existing social differences in relation to gender and class. In this regard, Kirkwood argues that these concerns represent a valid subject for social scientific enquiry.

In Section Two, a few examples of the development of courseware in the social sciences are highlighted. In Chapter 5, Ruth Madigan, Sue Tickner and Margaret Milner describe their work as part of an interdisciplinary team to produce a basic statistics CAL program. As self-confessed "relative novices in courseware development", the authors describe how the design and implementation of their program forced a fundamental re-evaluation of their own teaching methods. Their case study provides some salutary lessons about collaboration in courseware development and raises issues about the effective use of technology for teaching and learning.

In Chapter 6, Stephen Morris and Jill Szuscikiewicz also outline their attempts to develop, assess and implement a statistical program for students. The chapter demonstrates how to exploit the technology and introduce graphics and simulations to encourage practical experimentation. By employing these methods, Morris and Szuscikiewicz assert, students will gain a deeper appreciation of statistics.
In Chapter 7, David Gerrett describes how he developed a programme based on an existing core course. Gerrett utilized the literature on the ideologies of welfare in social and public policy to create what he called "one-to-one, non-judgemental tuition".

Chapter 8 documents Stephen Scrivener's and Susan Vernon's vision of a future in which collaborative group work by designers, as part of international teams supported by computer- and electronically-mediated communication and CSCW tools, will predominate. This form of learning has implications for all discipline areas and may be of particular use to the social science community, who should be able to exploit fully the capability of computer-mediated communications.

The contributions in Section Three focus more directly on the implementation of CAL programs in the social sciences. In Chapter 9, Graham R. Gibbs and David Robinson argue that attempts to replace the teacher with technology are unhelpful and relate to a much wider societal
process of deskilling. Developments in CAL, they argue, should be used to enhance teaching skills by providing flexible learning tools rather than seeking to replace or deskill teachers.

In Chapter 10, Vernon Gayle points out how the influx of IT into higher education has been poorly conceived and ineffectively implemented. Gayle argues that the inclusion of a sociological account of the teaching and learning environment serves to provide "an empirical account of the sociality of the teaching and learning environment and incorporates the knowledgeability held within the non-technical perspectives".

In Chapter 11, Nick Manning, Danny Lawrence and Ken Levine examine some of the reasons why academics have been reluctant to embrace technology in their teaching and research areas. Paying particular attention to the attitudes, organizational ethos and context of academia, the authors maintain that a number of factors have hindered the successful development and use of IT. In response to Brackenbury and Hague's (1995) article, Manning et al. argue that, rather than being dismissed as irrational, the actions of academics who do not embrace technology may actually be based on calculation.

Ann Wilkinson's contribution in Chapter 12 addresses how the implementation of technology affects groups defined as "non-standard" students. Wilkinson points out that educational technologies should be providing the opportunity to look at different approaches to teaching and learning that benefit all.

In Section Four, Chris Turner reflects on his recent experience of TQA in Scotland by exploring both the current and potential uses of IT in the learning and teaching of sociology in higher education. Focusing in more detail on cost, David Newlands, Alasdair McLean and Fraser Lovie in Chapter 14 stress the importance of comparing the costs of different technologies as well as looking at the evidence for the learning achievements and experiences of students.
David Crowther, Neil Barnett and Matt Davies assess how the introduction of computer-based learning programs has been driven by the desire to achieve efficiency savings. However, as the authors point out, "efficient teaching may not represent efficient learning". Crowther, Barnett and Davies proceed to outline the general failure to exploit multimedia technology, particularly in the social sciences, before going on to describe how to maximize both effective learning and efficient teaching. Finally, in Chapter 16, Duncan Timms examines the obstacles and the opportunities of IT teaching in the social sciences. According to Timms, the development of computer-based learning in the social sciences has been slow, with the exception of its use in data collection and analysis. Consequently, teaching IT in the social sciences has remained largely unchanged. The reasons for this are both general and specific, and as a result the pressures are also both positive and negative. The chapter ends with a brief look ahead to the direction that the social science community should take in relation to the pervading role of technology. The issues surrounding developments in technology are undoubtedly wide-ranging, as these chapters have shown. There are issues which still need to be identified as well as resolved. This book should be seen as an attempt to provide a working document on some of the issues raised. It is clear that, as a research area, the social, political and economic implications of technology will expand over the next few years; and in the area of education it is evident that, with the continuing expansion of higher education, issues of access, quality, effectiveness and choice will be central. Consequently, the pedagogic nature of disciplines, the structure of universities, the teaching and learning styles of both staff and students, and any partnership with industry, commerce and public policy must be strategically reviewed. This will undoubtedly involve a huge investment of time as well as financial and other resources. Nevertheless, for the first time, the social science community is well placed to take the lead and to shape policy in this area. The role of technology, then, must be an issue that is placed high on the social science agenda.
Section One
NEW CHALLENGES FOR TEACHING AND LEARNING
1
EXPONENTIAL EDUCATION Peter Cochrane
We are living at a time of unprecedented change, with technology advancing faster, and producing more new opportunities, than ever before (Lilley 1995). IT has created not only the mechanisms to do more with less, but also the means of storing, accessing and transporting information on a scale inconceivable just ten years ago (Emmot 1995). Technology feeding technology, with machines used to design better machines, is the evolutionary process responsible for the exponential capability growth now driving society. In contrast, our wetware (the brain between our ears) has seen no significant change during the past 150,000 years, and in evolutionary terms mankind is in stasis (Calvin 1991). So if we are to survive in a technologically driven world that is changing faster than we can biologically accommodate, we have to use the very technology that engendered our predicament to help us cope; it is our only course of action. Going back to earlier, and in many respects simpler, times is not an option - no matter how distortedly attractive it may appear (Bronowski 1973). The progress of our species has always been, and remains, irrevocably linked to innovation and technology - and it is one way only! We just could not support the world's population of over 5 billion without the technology we have come to take for granted (Toffler 1971).
Human-technology perspective
Only 2,000 years ago most of humankind lived in tribal communities of just a few hundred individuals, meeting and knowing fewer than 1,000 people in a lifetime. For this life, we were well equipped, with all of the tribe's knowledge contained in the human brain, and passed on from father to son, mother to daughter. For most, all the information they ever required was within the tribe. Civilization, cities and trade changed all this, and in a period of less
than 200 years the transition from the farming and rural existence to the Industrial Age was completed (Bronowski 1973). During this transition, the ability to transport large quantities of goods and people across the planet emerged, creating a demand for good telecommunications. It is interesting to reflect that colonization and supremacy in war were the primary motives for the development of much of our industry, and have led directly to today's revolution in IT. More impressively, we have created a new era in much less than 100 years. When De Forest invented the thermionic valve in 1915 he could never have guessed the revolution that he was starting. The next major step was the invention of the transistor in 1946 by Shockley, Bardeen and Brattain, to be followed by the integrated circuit in 1958, the laser in 1960, and optical fibre in 1966. In the last 50 years we have seen the world become dominated by electronics (chips) and optical fibre. As a result, computers and communication are now ubiquitous, and we have created more information, achieved and understood more than all of the past generations since we first discovered fire. This pace of change will not only continue, but accelerate: and the trajectory is now clear - it is exponential. Every year (or thereabouts) sees optical fibre transporting twice as much traffic (Cochrane & Heatley 1995), memory chips storing twice as much data, and computers twice as fast (Emmot 1995). Many people consider English to be the planet's primary language, and speech to be the most sophisticated and dominant form of communication. Well, they are wrong. The dominant form is now binary, and it is between machines having more conversations per day than mankind has had in its entire existence (Drexler 1990). We can now wear more computing power in a wristwatch than was provided by a commercial computer the size of a domestic washing machine 30 years ago.
In 10 years the PC will be around 1,000 times more powerful than today, and in 20 years near 1,000,000 times more. By about 2015 super computers will have reached human equivalence in terms of information storage and processing, and by 2025 that power will be available on our desks. About five years later (Calvin 1991, Regis 1991) computers will be wearing us! If we are to maintain a primary role on this planet, we must understand technology and use it to advance our own limited brain capacity. It is not possible to ignore these changes for they are inexorable, and will promote even more change (Cochrane 1995b, Kennedy 1993). In short, you can opt out, but you cannot escape.
Antagonistic technology
There is absolutely no doubt that most IT interfaces seem to have been designed by people who feel we should all be computer scientists (Norman 1988). This is definitely the wrong approach. Most people have great difficulty driving a VHS video recorder, let alone a PC. Unless we humanize machines (make devices extremely user-friendly and easy to use), a society divided by its abilities with machines will be born. This would be a disastrous society of IT "haves and have-nots", full of tension, and sub-optimal for our own productivity, progress and survival. It is vital, therefore, that technology is bent into people and people are not bent further into technology (Emmot 1995). Today the primary interface is the button, switch, knob, mouse, keyboard and screen. This can only be viewed as archaic, and something that should not survive. Fortunately, technology is now reaching a point where voice control and command, and even limited conversations between people and machines, are possible (Cochrane & Westall 1995). This Star Trek vision is the first step in the journey to a symbiotic relationship between carbon (us) and silicon (chips) life forms. It is also the first example of directly linking the nervous systems of two different entities. The next extension will be our sense of touch, as fingertips and other sensory areas are coupled directly into machines (Drexler 1990). In the meantime we have to make do with sight and sound, head-mounted screens, cameras, microphones and earphones. But even with this limited technology we can achieve a tremendous expansion and change in our abilities and society as we increase the access to, and throughput of, information and experience (Earnshaw & Vince 1995).
Education
In the slow-moving world of the ancients, who wrote and drew in the sand, on clay and parchment, education followed the master-disciple model, whereby only a select few were chosen to be educated by a very few teachers. The world was a slow-moving place where innovation and technology were alternately promoted and constrained by war and religion (Bronowski 1973). With the invention of the printing press major changes evolved: this was a new means of propagating the written word and, more importantly, ideas. Mass education started to take off; formal systems, teachers and classes grew in size and number throughout the developing world. Up to, and throughout, the industrial revolution this "Sage-on-the-Stage" system of imparting knowledge was very effective. Regimented classes of 30-50 children, drilled by a
single teacher, proved an efficient and essential means of educating the armies of people required to fuel the transition of society from agriculture and cottage industry to mass production and industrialization. Up to the end of this era, change was still relatively modest and within the grasp of the individual, and so was education. However, at the dawn of the information age, the system and individuals were beginning to creak under the pace of change and demand for more diversity (Toffler 1971). Long-held wisdoms of science, technology, economics and commerce started to shift or became increasingly challenged. In contrast, other topics such as mathematics, history, sociology and law remained relatively stable for a further 30 years. Today, nothing is stable, nothing goes unchallenged, and certainly our accepted modes of education and training are under threat as the world accelerates into the information age (Emmot 1995). Let us examine this change in more detail. Thirty years ago the vast majority of children came from homes with few books and went to school for education. Today, unfortunately, the reverse is often true. Many children with top-end computers and access to CDs and networks at home see school as having little to offer. Interestingly, in numerous programmes with children, it has become abundantly clear that the primary impediment to progress is not the young people. It is the older generation, who are trying to impart their experience and knowledge, who present the key limitation (Cochrane 1995a). For the most part our society appears divided at about the age of 29, with those older computer-illiterate, and those younger fully able. So it is not unusual to find a class dominated by a teacher who is IT-illiterate, and feels threatened by a class full of capability.
This problem is compounded by the lifestyle of children, which is now partly governed by the games environment (Martyn, Vickers, Feeney 1990) of intuitive learning, and a "crash and burn" culture. They feel no inhibition in discovering by doing, and coming to grief in full public gaze, while the cultural background of their elders is the converse. It is perhaps not surprising to find that many youngsters view university, college and school as boring, where the teaching methods have not changed in aeons. These young people happen to come from a world of instant gratification, of IT, and rapid access and experience, of new and dynamic skills learnt in new ways (Cochrane (ed.) 1994a).
Examples of new ways
Just two decades ago a young child would have learnt to tell the time on an analogue clock-face, and the digital form would have been unusual. If they later developed an interest in science, engineering or flying, they would come to grips with the vernier scale and altimeter by a single-step analogy with telling the time. Today, the converse is often the case. They learn about flying very early, and you cannot fly an F16 simulator if you do not learn about cockpit instrumentation. So telling the time on an analogue display involves analogical reasoning in the reverse direction of 20 years ago. Finding information has always been a social activity. The Dickensian library offered a degree of order and mapping that allowed a fair degree of success by the individual. However, much of the information retrieval process of this old world involved finding knowledgeable people; teachers, friends and colleagues could usually help steer us in the right direction. In the IT world we now have search engines and Gophers or Agents that serve the same purpose (Milne & Montgomery 1994). We can also communicate electronically with vastly more people to gain their assistance and steer. With such devices, students can search, find, sort and assemble information hundreds of times faster than previous generations. Curiously, older people, especially teachers, often consider this as cheating. There are now over 24,000 CD titles available for use with PCs, containing everything from classic books and whole-body interactive encyclopaedias to scientific experiments and university degree courses.

Figure 1.1 Children using computers: an on-line school session.

The teaching of some difficult topics in science, statistics and engineering can now be enhanced significantly through computer animation and visual representation. Instead of static words and two-dimensional pictures on paper, students can interact with three-dimensional entities on the screen to experience cause and effect first hand. There is a growing library of standard experiments and situations available, along with medical operations, Shakespearean plays and legal cases. In this regard, interactive multimedia is providing an often superior alternative (Martyn, Vickers, Feeney 1990) to individual teachers and books for large tranches of education. It is now possible to illustrate and explain immensely complex systems and situations with the technology of visualization and virtual reality (MacDonald & Vince 1994). Unlike Crick & Watson, students should not have to construct a model of DNA using cardboard and coathangers (Crick 1994). Access to mathematical representations of a visual form that is exciting, stimulating and edifying is now a given in modern industry. Leading manufacturers no longer construct prototypes, but the real thing in virtual space, and then go straight to the production line with the finished product (Earnshaw & Vince 1995; Yates 1992). Education needs this technology too.
Shared experiences
With telepresence technology it is now possible for a one-to-many or one-to-one experience to be realized efficiently on a massive scale.

Figure 1.2 The surrogate head surgeon.

The surrogate head is just one development where miniature television cameras above the eyes, and microphones above the ears, collect information in real space and time. This can then be transmitted and displayed on screen, or on a VR headset, to one or more people in any location on the planet. So a surgeon can perform an operation with a thousand students standing inside his or her head looking out. Conversely, when a protégé performs the same operation, the surgeon can stand inside and advise in the closest possible sense (Cochrane 1994). Within the next 15 years the addition of touch to such systems will make this human experience almost complete. This might sound far-fetched, but it exists in the laboratory today, and has been used for real operations on humans over standard dial-up ISDN circuits (Cochrane, Heatley, Pearson 1995). This technology is applicable to a wide range of disciplines, and has the potential to completely change the education and training paradigm to just-in-time.
Half-life education
In fast-moving areas of technology many degrees now have a half-life of less than five years. Moreover, the time when a single-discipline degree was sufficient for a lifetime of work has long gone (Gell & Cochrane 1995). For example, it is not unusual to find electrical engineers now concerned with biology, sociology and genetics. So it seems time to create a new form of degree that is much lower, broader, more generic, and able to equip people for a world that will change rapidly over a working lifetime. In addition, a series of higher degrees are necessary that can be rapidly acquired as technology and work practices change. However, as business life and industry also accelerate and demand increases, then so does the pressure to hang on to the scarce well-trained resource that is key to the success of the very enterprise itself (Hague 1991).
Virtual university
It is partly in response to the above paradox that five years ago BT created a series of internal degree courses. Their organization and running were under the auspices of several conventional universities banded together to create the desired profile and course content (Cochrane 1995b). Interestingly, this content is increasingly dynamic, as each year sees the course material change to meet the needs of a fast-moving business. Everyone wins: the students who become empowered and capable, the company that has the workforce it requires, and
the universities that gain access to key people and activities in industry. At first the courses were conventional, with students and teacher gathered in a lecture theatre for a few hours a week, followed by tutorials and assignments. More recently a new format began to unfurl, whereby lecturers from North America and other regions were teleported into the lecture theatre by suitably mounted cameras and ISDN dial-up lines. They appear on a three-metre-square back-projected screen to give their lectures eye-to-eye. Only two years ago such a lecture was costing £60 for the communication connection; today it is only £40, much less than the hotel charges for a real lecturer in a real hotel. There are those who would argue that this is not a real experience, and it is not as good as the real thing. While this may be true, the choice is rather more stark: either you have the electronic experience, or none at all! On that basis, the students would sooner have world experts in front of them electronically than never have their presence. More recently the next step has been taken: teleporting the event to the desks of individual students so that they no longer have to break away from work, and they do not have to crowd into a lecture theatre. They can now attend courses or tutorials, and interact with each other, directly on the screen.
Figure 1.3 The virtual university.
Within BT this experiment has now been ratified as the primary model for future company education and training. The key discovery has been that the downside of apparent isolation at the desk can be overcome by a series of short communal periods where everyone on the course meets and works together. To date technology presents a poor meeting environment for people. The images are small and distorted, often with sound and vision slightly disconnected. After a first real face-to-face meeting, however, these deficiencies tend to be overlooked and the participants just get on with working together. In the not-too-distant future new display and audio technology will provide life-size images of near-zero distortion and daylight brightness. This is expected to extend this education and training regime significantly, and may totally remove the current need for real interaction. Experiments on breaking down the social barriers (Cooper 1994) and establishing trust and relationships will thus form a primary target in the next phase of development.
The critics
There are very few of us who look forward to, or enjoy, change on a large scale (Toffler 1971). This is certainly true of the education establishment and many who are involved indirectly. The primary direction of criticism always seems to be, "That's not the way they did it in my day!" I suppose if we went back to the time of Archimedes and Aristotle, people were saying much the same thing about their methods of teaching. The reality is that just 50 years ago in British universities the lecturing and teaching practice was totally different. Today you can still see the benches at the front of lecture theatres where experiments on a grand scale would be conducted in front of an enthralled class. This was real experience, and teaching in a manner that is now long lost. Why? Because education has been squeezed and changed continually. This has resulted in small universities with very small departments trying to do far too much in too short a time. Students are being asked to take in more and more information and experience in less time, while staff-student contact time continues to decline (Gell & Cochrane 1994). Ultimately, education is becoming impossible relative to the rate and breadth of change in a world of technologically driven progress (Ravitch 1995). Most active university staff have far too many research students, and far too many classes to teach. To exacerbate the problem, most university departments seriously lack the necessary number of people with the right abilities. No doubt all of the abilities required to create a suitably well-qualified, skilled and able department are available in the country.
However, they are seldom, if ever, available in one location - a university (Hague 1991). The virtual university, an ethereal space in the information world, overcomes this problem, and allows groups of people with the right interests and skills to come together to work and be proactive. The problem is that it does mean a different mind-set, and a different way of doing things (Lyons & Gell 1994). Unfortunately for the traditionalists, there is no other solution that will allow us to meet the challenge of technologically driven change in our society. It is, therefore, imperative that we embrace the technology, and experiment to find out what works and what doesn't (Handy 1990).
The virtual world today
On a Saturday morning I can struggle into Ipswich, park my car, walk across town, buy some software at a high price and pay VAT. Alternatively I can go onto the Net, access the software directly from the supplier in the USA, pull it down the network, and pay for it without even leaving my machine. The advantage is not only in the time and inconvenience saved but in the lower cost of a product that no longer requires a wholesaler, distributor, retail outlet or VAT. The same is true for the library, the bookstall and potentially for all forms of "soft products". Such thoughts alarm many people when they ought to make them feel relaxed. For this virtual world is not an instead-of but an as-well-as technology. It opens opportunities for new ways of doing things, new forms of trading and enterprise, and new dynamic markets. Shopping, entertainment, education and training, from your desktop at home, at work, or wherever you happen to be, are now very real options (Heldman 1988).
Society and change
While technology changes our world irrevocably, there are some features of it that will remain for many decades to come - but not many. For example, consider such stable institutions as government, banking and the City of London. We currently have a governing mechanism that involves people sitting two sword-lengths apart, acting like demented schoolchildren in lengthy debates, eyeball to eyeball. The decision-making processes of this system are orders of magnitude slower than their counterparts in the virtual (electronic) world. Similarly, financial institutions are being touched by technology in ways that are changing them, and impacting on our society. The vast majority of bank branches
are no longer required. It is possible to run the entire operation from one location, or even no location at all. The same is true of the City, as it now deals primarily with information rather than money, for gold has become an abstract concept, as are the pound and other currencies. If such solid institutions are being challenged (Drucker 1993) by technologically driven change, then the role of an education system is to prepare the population for the new world that will result. It is vital that the education and training sector produces the right people with the right skills. This will not happen by following the market; education (Gell & Cochrane 1994) has to get ahead. The difference between the old world and the new is exemplified by the typing pool. Only a decade ago most large organizations had such a resource, staffed by young women, whose sole purpose was to take handwritten or spoken text and transcribe it onto the typed page. The process could take several days, depending on the queue length. The very thought is inconceivable today; who would operate in such a way? Things are now turned around in a matter of minutes, not days. Modern companies operate with telephone, fax, e-mail and video conferencing; they have very low, flat structures (Lyons & Gell 1994), with people empowered to make local decisions and get on with the job fast. Any form of delay is just inviting the competition to take away your business and markets. The same is increasingly true in education and training - any school, college, university or training establishment that sits back and continues to use exclusively the old chalk-and-talk methods is destined for extinction (Gell & Cochrane 1995; Ravitch 1995). We have to move forward with the technology if we are going to keep up with a world that is changing ever faster.
The future
My father had a working life of 100,000 hours; I can now do his work in 10,000 hours; my son will be able to do it in 1,000 hours, and so on (Handy 1990). The work that took me a whole morning as a young engineer is now completed in less than 15 seconds by the power of computer-based automation. This level of progress is assured for at least another decade, as we can see all of the techniques and all of the technologies on the laboratory bench today. It is likely that this progress will continue for at least another two decades, and probably three, but after that we reach the ultimate limit of using sub-atomic particles as components (Drexler 1990). There is little doubt, as history shows, that our innate curiosity, creativity and inventiveness will generate even more technology (Bronowski 1973) and cause more change beyond silicon and silica.
However, we are at a unique epoch, and there is a new proviso; for the first time in our entire history, we have to keep up with the technology. We have to stay ahead, stay educated and trained, and somehow understand things that currently defy our limited wetware. Tapping the exponential power of the technology itself (Pagels 1988) appears the only option if we are to live and prosper as individuals and a society (Cochrane (ed.) 1994b).
References
J. Bronowski, "The Ascent of Man" (television series) (London: BBC, 1973).
W. H. Calvin, The ascent of mind (New York: Bantam, 1991).
P. Cochrane, "Communications, care and cure", Telemed '94 conference, Hammersmith Hospital, London, 1-4 September 1994.
P. Cochrane (ed.), "The potential for multimedia, information technology and public policy", The Journal of the Parliamentary Information Technology Committee 13, 3, Summer 1994a.
P. Cochrane (ed.), Special Series on "The 21st Century", British Telecommunications Engineering 13, Pt 1, April 1994, continuing into 1996. Features a wide range of articles on technologies and applications concerned with education and training.
P. Cochrane, "Desperate race to keep up with children", The Times Educational Supplement, 23 June 1995a, p. 25.
P. Cochrane, "The virtual university", Business of Education 5, March 1995b.
P. Cochrane & D. J. T. Heatley, "Aspects of optical transparency", British Telecom Engineering 14, 1, April 1995, pp. 33-7.
P. Cochrane, D. J. T. Heatley, I. D. Pearson, "Who cares?", British Telecom Engineering 14, 3, October 1995, pp. 225-32.
P. Cochrane & F. Westall, "It would be good to talk!", paper presented at Second Language Engineering Convention, London, October 1995.
M. Cooper, "Human Factors in Telecommunications Engineering", special issue of British Telecom Engineering Journal 13, 1994.
F. Crick, The astonishing hypothesis: the scientific search for the soul (London: Simon & Schuster, 1994).
K. E. Drexler, Engines of creation: the coming era of nanotechnology (Oxford: Oxford University Press, 1990).
P. Drucker, Post-capitalist society (Oxford: Butterworth-Heinemann, 1993).
R. A. Earnshaw & J. A. Vince, Computer graphics: developments in virtual environments (London: Academic Press, 1995).
S. Emmot, Information superhighways: multimedia users and futures (London: Academic Press, 1995).
M. Gell & P. Cochrane, "Education and the birth of the experience industry", paper presented at European Technology in Learning Conference, Birmingham, 16-18 November 1994.
M. Gell & P. Cochrane, "Turbulence signals a lucrative experience", The Times Higher Education Supplement, 10 March 1995, p. 11.
D. Hague, "Beyond universities: a new republic of the intellect", Hobart Paper, Institute of Economic Affairs, London, 1991.
C. Handy, The age of unreason (London: Arrow, 1990).
R. K. Heldman, ISDN in the information marketplace (Blue Ridge Summit, PA: TAB Books, 1988).
P. Kennedy, Preparing for the 21st century (London: HarperCollins, 1993).
R. Lilley, Future proofing (London: Radcliffe Press, 1995).
M. Lyons & M. Gell, "Companies fu'ld communications in the next century", British Telecommunications Engineering Journal 13, 2, 1994, p. 112.
L. MacDonald & J. Vince, Interacting with virtual environments (Chichester: Wiley, 1994).
J. Martyn, P. Vickers, M. Feeney, "Information UK 2000", British Library Research (London: Bowker-Saur, 1990).
R. Milne & A. Montgomery, "Proceedings of Expert Systems 94", British Computer Society (Oxford: Information Press, 1994).
D. A. Norman, The psychology of everyday things (New York: HarperCollins, 1988).
H. R. Pagels, The dreams of reason: the computer and the rise of the sciences of complexity (London: Bantam New Age Books, 1988).
D. Ravitch, "When school comes to you: the coming transformation of education and its underside", The Economist, 11 September 1995, pp. 53-5.
E. Regis, Great mambo chicken and the transhuman condition (London: Penguin Books, 1991).
A. Toffler, Future shock (London: Pan Books, 1971).
I. Yates, Innovation investment and survival (London: The Royal Academy of Engineering, 1992).
2
PEDAGOGY, PROGRESS, POLITICS AND POWER IN THE INFORMATION AGE Stephen Heppell
The development of educational technology from Skinnerian teaching machines onwards has offered a contrast between rapidly advancing technological potential and slower pedagogical, social and political development. Initially, this had not necessarily been disadvantageous; in many cases technology failed to deliver on its potential, often embodying models of learning that owed more to convenience than cognition. Technology's contribution was often then touted with a triumph of hype over hope and institutional learning's conservatism provided a pragmatic, and welcome, buffer against the tide of misplaced optimism. However, it is wrong to assume blandly that this will continue to describe the state of affairs in learning technology. Technology continues to advance its potential exponentially and, inexorably, we reach a point where rhetoric is eclipsed by reality and learning technology holds out the hope of simply better learning, whatever we mean by that. Unfortunately it is not as simple as "sitting and waiting" for progress to take root. Many systemic barriers to progress must be surmounted and many confusions result from the mismatch of technological and pedagogical progress. Far from a model of irredeemable technological determinism, the choices thrown up by these confusions are the stuff of political and social debate with real choices to be made: for example, it could be suggested that increasingly affordable micro-technology liberates individuals from old forms of capital. In publishing and communications, for example, we have seen the economies-of-scale barriers to entry of new competition fall away rapidly as everyone with a desktop micro and a laser printer becomes a publishing house, and the Internet has offered access to vast audiences for minimal capital outlays. But
equally we could argue that, although the barriers are lower, lack of access to new communication technology has created a further disenfranchised techno-poor minority. Similar political and social debate should surround concepts of what public service looks like in cyberspace, whether information is a new factor of production or a new form of capital, whether teleworking liberates or imprisons ... and so on. Technological progress in this way is posing some fundamental questions for nations, questions which are a long way from the simple grasping of "the white heat of technology". Indeed, as telecommunications reduce our reliance on geographical proximity, the concept of the nation state itself becomes challenged: will I vote and pay taxes with my geographical neighbour, or with the electronic community that I work, shop and socialize with? However, these are broad and general issues for future debate. This paper will reflect on more pressing concerns: firstly, the emergent capabilities both of technology itself and of the "children of the information age", reflecting on the challenge that these capabilities pose for existing models of education and assessment. As I demonstrated in my conference plenary address,1 technology allows us to offer powerful support for small cultures, whether they are linguistically determined (for example Catalan) or (like Deaf culture) based on some other common circumstances. We can dictate to our computers; they can reward our engagement with multiple media types - speech, text, graphics, aural ambience, video - a tapestry of cues, clues and primary information.
This should not mean that we require our school students to be media eclectic, strong in every media type that technology can support; already we disadvantage those not fluent in textual notation (for example dyslexics) by an insistence that we filter much of our children's learning through the ability to represent it textually. Requiring them to be strong in all other media too would further narrow the corridor of success. What we are seeking is media redundancy, where children can derive and represent meaning from a menu of media types, and this of course has profound implications for the assessment and examination system. Children too pose challenges to that assessment system. It is clear from research at ULTRALAB that children are adept performers with (and through) technology. Faced with new tasks and problems they adopt strategies (for example Observe, Question, Hypothesize, Test and Reflect), they represent process to each other ("look at how I did this" rather than "look at what I did"), they work collaboratively and they multi-task. When asked, for example, to watch multiple television programmes simultaneously, they adopted a strategy which reflected their
1. At the First International SocInfo Conference: Technology and Education in the Social Sciences (TESS), 5-7 September 1995. For details refer to: http://www.stir.ac.uk/socinfo
own understanding of media (for example, they used their knowledge of genre and of the role of aural information), which allowed them to answer detailed questions afterwards about minutiae ("what colour was the ...") and also successfully to tackle meta-level questions about character development and production decisions. They showed themselves to be highly media literate, and yet much of our pedagogy and assessment fails to allow them to reflect this capability. Worse still, as we abandon (for good reasons) our reliance on norm-referenced testing in favour of criteria referencing, we find that technology moves the criteria faster than we can pin them down, with the result that either the assessment model becomes an unacceptable drag on progress or we are uncertain about the quality of our assessment procedures. Ten years ago I could have gained a recognized qualification by typing at n words per minute on a manual Remington typewriter. Now "speech to text" technology lets me dictate to a portable computer faster and with fewer errors; do I still qualify for the certificate? Our constant problem with technology has been to look at its impact on an existing model of behaviour. Too often we make judgements from a deficiency model of both people and technology ("they can't use it and it doesn't work"). In 1939 the New York Times commented that "The problem with television is that people must sit and keep their eyes glued to the screen. The average American family doesn't have the time for it", which undervalued the ability of technology and of individuals to modify behaviours. In 1967 Chu & Schramm looked back on half a decade of research into the impact of colour television on learning.
They concluded from the research data that "Where learning is concerned colour television has no distinct advantage over monochrome", which was true in retrospect because at that point television companies had failed to grasp the new ways that colour might allow them to represent knowledge and entertainment. One of the first TV entertainment programmes to be converted into colour was The Black and White Minstrel Show, which shows how easy it is to miss both social and technological change. Similarly, today much of the output of publishers on CD-ROM is simply in the form of electronic books, and the resultant multi-mediocre both misrepresents the potential of the technology and ignores the capability and new literacy of its users. A second important area for current debate centres around the shape of our media services and the institutional or public policy that attempts to keep pace with them. As computers put communication tools into more and more hands, the national debate about preserving standards and about what is appropriate or inappropriate is reminiscent of the church's rearguard action to preserve literacy for itself as printing began to impact on our social lives. From pirate radio onwards, the democratization of communications
has been characterized by stout defences of position by existing institutions. In the context of learning, schools and universities as institutions have also worked to preserve and strengthen their role in formal learning. Schools even encourage parents to create little institutional microcosms in the home by sending students back with homework, while parents respond in some cases by setting up little school desks and trying to recreate the classroom in the bedroom ("you wouldn't have the radio on in the classroom, would you?"). Suddenly, however, the learning industry looks a lot bigger than schools and universities, and high-quality learning will be occurring through other channels, like the digital annotative side channels offering parallel commentary to TV programmes. A huge challenge for educational institutions will be the way in which they respond to these new learning environments. There are already popular "project collaboration and exchange" areas available on the Internet, and these can be seen either as an appropriate and imaginative use of technology or as cheating. Education's response (and the way it addresses the issues of social equity raised) will determine its future significance in the learning industry, just as the church's response to mass literacy was crucial in shaping its own destiny. For politicians looking to build policy in the information age, it should be clear that alternative scenarios can and will result, and the extrapolation of policies to build those scenarios will become a key differentiator of political perspectives. Not so long ago in US politics it was a universal tenet that "motherhood and apple pie" would always be a Good Thing, but now our understanding of the changing dynamics of the family, and of nutrition and diet, leaves many credible shades of opinion about just how "good".
Similarly, our view of technology as a Good Thing needs to develop comparable levels of sophistication; the critical awareness that the social sciences bring will be crucial to this process, and education needs to be at the heart of the debate if it is not to be excluded.
References
G. C. Chu & W. Schramm, Learning from television: what the research says (Stanford, California: Institute for Communication Research, 1967).
A. M. Guillaume & G. L. Rudney, "Student teachers' growth towards independence: an analysis of their changing concerns", Teaching and Teacher Education 9, 1993, pp. 65-80.
M. C. Heck, "The ideological dimension of media messages", in Culture, Media, Language, S. Hall et al. (eds) (London: Hutchinson, 1980).
S. Papert, "Literacy and letteracy in the media ages", Wired 1(2), May/June 1993, pp. 50-52.
3
TECHNOLOGY AND SOCIETY: AN MP'S VIEW Anne Campbell
We are living through a period of intense technological change. The number of people employed in manufacturing industry is down from around 7 million in 1979 to 4 million today. These changes have left many scars and caused insecurity and unease among those who are still employed as well as those who have given up hope. Much of the population has been left with a feeling of deep suspicion. In many quarters an "anti-science" culture is developing, particularly among the young. The free market approach to technological development has left many people without access to the new technologies and consequently the gap between the "haves" and the "have-nots" has grown. What is required is more vision. Technology could be employed to open up the opportunities for education and research for millions of people. The government's role must be to ensure that such access is available to everyone and that the information revolution is used to encourage opportunity, equality and democracy. Social scientists need to engage in this process and find ways to understand the implications of these technological advances. Using the example of my own constituency in Cambridge, this paper illustrates how, through free public access points across the city, citizens can access socially useful information about the city, council services and information from government agencies.
Science, technology and government policy
In May 1993, the Conservative Government produced a White Paper called "Realizing our potential: a strategy for science, engineering and technology". Their strategy was to improve the nation's competitiveness and quality of life by maintaining the excellence of science, engineering and technology. However, this was accompanied by a sharp reduction in the funds available for science and technology across government departments. The Government also expressed its concern that public money spent on science and technology might not always be directed in a way that best satisfied industrial needs. A Technology Foresight exercise was established in order to predict the future needs of society. Few would argue that this exercise has no merit. It is helpful for the Government, academics and the business community to sit down together and discuss issues of common interest. Nevertheless, there are many concerns that focus on the way the results of this exercise might be used. Some parts of the White Paper certainly sent a chill through the scientific community. Many scientists believed that the Government would try to restrict its spending on science and technology to those areas most useful for industrial application. In fact, it has been proved on numerous occasions that attempts to pick industrial winners often meet with failure. The pressure group Save British Science sent out a Christmas card to MPs in 1993 which cited examples of research projects that had been turned down for funding in the past because they did not appear to have any industrial application. Liquid crystal display technology, which was invented in the UK, was developed abroad because nobody in the grant-awarding bodies believed that it was an idea with any commercial future. Many scientists have argued that many of the best commercial ideas come from "blue skies" research. We risk missing the more innovative and exciting developments if the direction of our scientific effort is determined by society's existing needs. It is important that scientists are given the freedom to continue with blue skies research to ensure that we do not extinguish future opportunities to improve our lifestyles.
It is not too difficult to predict some of the areas in which scientists and engineers will develop the technology in the five or ten years which lie ahead, but it is much harder to imagine how people will adapt to the changes which it can bring. The role that social scientists can play here is therefore quite important. We need to understand the new social processes that are emerging with the developments in technology. Since technology can create new needs, we should not limit ourselves to merely doing more efficiently or more cheaply those things which we can do already. Word processors are not simply about typing letters more quickly; they allow people to think in a different and more flexible way. Mobile telephones are not just different ways of using the telephone; they enable people to be in touch anywhere at any time. The Internet is not only a means of downloading information; it encourages
communication and interaction across national boundaries as well. All these changes have been supported and welcomed by the people who could afford to pay for them. Successful technological development is often a leap of faith. It depends on being able to predict the new needs which are generated by the scientific progress which has been made. Thirty years ago it was not anticipated that computers would be used to do anything other than high-speed mathematical calculations. Now we see them being used to organize information in a way that has revolutionized our lives. The ways in which the national communication networks are used in future will depend very much on human ingenuity and imagination.
The information gap
We must ask whether these advantages will benefit everyone or will leave us with an underclass of information poor. Will they increase employment so that everyone can afford to have shopping, entertainment, education, business, and so on, all available from home? From present trends that seems doubtful. Without intervention, the free market will force open the divisions in society even wider than they are at present. People who are employed will acquire the experience and up-to-date technological skills to flourish in the new age. The "haves" will rapidly accumulate more and the "have-nots" will have no relevant skills to pull themselves out of the poverty trap. When 80 per cent of the population are using electronic mail, what happens to those dependent on the daily mail deliveries when the postman disappears? Does electronic surveillance drive the homeless even further from the centres of civilization, to dark hidden corners where they become invisible? Will the corner shop disappear completely with the advent of teleshopping? How will the "have-nots" manage in those circumstances? There are some serious and difficult issues to do with access and equality. It may be easy if you have the necessary computer and modem and can afford to pay the subscription to an Internet provider, as well as the expensive phone bills which arrive when you have been surfing and forgotten the time. If you have never been able to afford a telephone, which is the situation for up to 75 per cent of households on some housing estates in Britain, then life is bound to be much more difficult. These issues are the cornerstone of social scientific research and more work should be done in this area.
The Cambridge experience
About a year ago, I decided to make use of the new technologies by giving my constituents the facility to contact me by e-mail. It is probably a more viable prospect in Cambridge than in most other constituencies, since about 30,000 of my 70,000 constituents already have access to e-mail. About 25 per cent of my constituency mail arrives this way. I also give my constituents the chance to contact me at an e-mail advice surgery. This specifies an exact time when I shall be sitting at a terminal ready to receive messages, and I try to respond to them immediately. But for many - the remaining 40,000 of my 70,000 constituents - there is no access and probably little inclination. What is the point of connecting when you have never used a computer anyway and you just do not believe that there is anything on the Internet which would be of any conceivable use to you? In Cambridge we have attempted to tackle these problems by launching the Cambridge On-line City Project, with the aim of providing socially useful information and free network access for people to whom it would not normally be available. In its first phase, six public access points were provided in public buildings such as libraries, community centres and council offices around the city. This has now increased to 17. The information is provided via a Website.1 This contains an A-Z of council services, an index of voluntary groups, advertisements of council leisure facilities, information on where to get benefits advice, and some links to job vacancy databases. We hope to add doctors' lists, NHS dentists, council house exchange lists, chemists which are open late and many others. Many public service organizations have been consulted, and are enthusiastic about having their information provided over the Web. In these early stages, the project has relied on the generosity and goodwill of local companies and local councils.
Cambridgeshire County Council and Cambridge City Council have contributed officer time, Cambridge Cable have provided telephone lines, and Uunet have given server space and technical support, with further support from CMS and Software AG. In order to progress, the project will need funds to employ a manager and to expand the system. The success of the recent lottery bid will ensure that this happens. It is also hoped to have an IT learning centre so that people can pick up the skills required in order to be comfortable with the technology. In the future, this facility could also be used to provide a feedback mechanism so that users can comment on council services and on other public services as well. In a
1. http://www.worldserver.pipex.com/cambridge/
properly developed framework, it could greatly increase the accountability of councillors, MPs and other elected public representatives, particularly if the comments were accessible in an open and public way. Another project that will link in to the Cambridge On-line City is the Cambridge Childcare Information Project, now called Opportunity Links. The purpose of this venture is to help parents get back to work by providing most of the information that they need in one location. There has been a very generous response from the organizations and firms which we approached about sponsoring this project. Initially, £10,000 was raised from commercial and public sources, which has enabled us to employ a part-time project worker to collect the information. A Website was launched in 1996 to give parents some basic advice on childcare: the different kinds available, their relative costs, local nurseries and playgroups. Other information about ways in which to look for a job, how to find appropriate training, and benefits advice, together with "better-off" calculations, are also supplied. The continuation of this project will rely on local government and businesses donating sufficient funds to continue to employ project workers. The need for such a service is clearly there. In the future, these information services will revolutionize libraries and welfare information provision. It will be cost-effective to spend some public money, but it is difficult to envisage government, local or central, being able to afford the expenditure to assure completely open access. There are commercial advantages for the private sector wishing to provide additional entertainment and leisure facilities. This could stimulate the development of public-private partnerships that will provide the systems and access facilities, giving benefit to both community and business.
Access and empowerment: some lessons
There are many issues that I have had to consider in my own personal use of the Net. A fundamental belief is that participation in the information society should be available to all, and not just the privileged few.2 There must be equality of access and we must seek to empower citizens both as participants and consumers, as well as providing equal access for the providers of services. The new networks must help to increase citizen participation in decision-making and contribute to the
2. The Labour Party held a Superhighway Policy Forum in 1995, a wide-ranging investigation into the effects of the new networks and how government can manage them to benefit the many, not just the few. Its findings (Labour Party 1995) were adopted by the Party conference in October 1995. Labour Party, Communicating Britain's future (London: The Labour Party, 1995).
Technology and society: an MP's view development of a more open society. At the same time, legislation should be framed which enables privacy to be respected and legitimate rights to the ownership of information to be acknowledged. Government itself can become more open and accessible through this process. The implications for education and research are central to issues of access. 3 The opportunities to learn will undergo the same massive expansion as occurred when the first public libraries opened. But this is a new kind of active learning, since it will involve interaction with individuals and not just passive absorption of information. How much more then will learners need guidance through the maze of information, learning packages, electronic courses and offers of tuition. The role of teachers and lecturers will change for the better: there will be more individual direction and guidance, less bureaucratic recordkeeping and fact-giving. Programmes can be tailored to the needs of individuals, but that individual will still want human contact and human input to learn in the most effective way. The national communication networks have the potential to open up learning channels for very many more people than those who benefit from further and higher education at present. It is through this new technology that we see "The Learning Society" within our grasp. It is vitally important that we take hold of these chances and use them to improve the quality of life for all our people. But that takes more than pure commercial development. This is not an area that we can safely leave to the scientists and the business community. It is one that requires government intervention and the social scientific community to ensure that the benefits are available to everyone. Let us make sure that the information revolution does not worsen the divisions in society, but is used to enable opportunity, equality and democracy.
References
Office of Science and Technology, Realizing our potential: a strategy for science, engineering and technology (London: HMSO, 1993).
Office of Science and Technology, Progress through partnership, 1-15 (London: HMSO, 1995).
3. For papers and information on the development and effects of IT in education see ULTRALAB's Website at: http://www.ultralab.anglia.ac.uk/pages/ultralab
4
INFORMATION TECHNOLOGY: A CASE FOR SOCIAL SCIENTIFIC ENQUIRY Adrian Kirkwood
Although there has recently been a significant growth in the use of information technologies in the workplace, in education and in the home, there is little evidence to support the technologically deterministic predictions about IT becoming ubiquitous throughout society and about the radical social changes that would follow. In Western countries, the impact of IT upon different groups in society has been varied, tending to reinforce rather than ameliorate existing inequalities. This chapter will examine some of the differences that exist between groups (primarily within the UK) in the extent of access to and use of IT in the home and in education. In particular, it will consider variations that exist in terms of gender, age and socio-economic group. As well as presenting the quantitative evidence for the existence of these differences, the chapter will consider whether IT is likely to exacerbate social differences.
Introduction
For at least two decades predictions have been made about the imminent ubiquity of IT throughout society and the radical social changes that would follow. Alvin Toffler's view of a future society (1980) had at its centre the home; an "electronic cottage" in which not only paid work, but also leisure and service consumption, would be mediated through information and communication technologies. Although there has been a significant growth in the use of information technologies in the workplace, in education and even in the home, the technologically
deterministic prediction of IT bringing about fundamental social change has failed to materialize. In Western countries, the impact of IT upon different groups in society has been varied, tending to reinforce existing inequalities (e.g. Forester 1988; Miles 1988). Those people with good access to IT and familiarity with its use often assume that their situation is typical. For example, Eliot Soloway (a professor at the University of Michigan, USA) introduced his keynote speech at an international conference in 1994 with these words: "There is no longer a problem of access to computers." Perhaps access to IT is not a problem if, like Soloway, you are a male, white American in a middle-class professional occupation; if not, the situation might be viewed differently. In fact, the pattern of ownership and use of IT varies considerably between social groups. This chapter will examine some of the differences that exist between groups (primarily within the UK) in the extent of access to and use of IT in the home and in education. As well as presenting the quantitative evidence for the existence of these differences, the chapter will consider whether IT is likely to ameliorate or exacerbate social differences. It will also examine some of the social factors that tend to have been overlooked (or dismissed) by those who expound technologically determinist predictions.
Access to information technology in the home
Computers are not accessible to all, even in the richer industrialized Western countries. Many of the claims made by computer manufacturing and marketing companies about the numbers of machines available in particular countries are based upon their measures of output, usually "deliveries to the trade", i.e. machines shipped out from their own factories, assembly plants or warehouses to retailers or other distributors. Even the consumer sales figures of computer retailers or other distributors provide little or no indication of who the purchasers are and whether the machines are being sold into existing markets (i.e. additional or replacement equipment) or penetrating new markets (i.e. first-time buyers). National social surveys can provide independent information about the extent to which households have computers and other media technologies. The US Bureau of the Census (1993) reported that there was a computer available in about 45 per cent of households in the USA, but only about 10 per cent of the homes of blacks or Hispanics contained a computer. In the UK, data from the General Household Survey for 1994 (OPCS 1996) indicates that less than a quarter of households (24 per cent) contained a computer, compared with 77 per cent having a video recorder and 47 per cent an audio CD player. (This
figure for computer access is in line with data from commercial market research.) Even more revealing is the extent to which access in the UK has changed over the last decade. Figure 4.1, below, uses data from successive General Household Surveys from 1985 to 1994 to reveal trends in access to computers and media technologies. The growth in access to these three domestic technologies exhibits strikingly different patterns over this period. Home access to a video recorder steadily rose to over three-quarters of households (increasing by almost two and a half times, from 31 per cent to 77 per cent). There was a similar (but slightly more rapid) rate of growth in access to an audio CD player over a shorter period; more than tripling, from 15 per cent in 1989 to 47 per cent in 1994. Over the whole period, access to a home computer increased, but only very gradually (from 13 per cent to 24 per cent, often increasing by only 1 per cent per year). But why has the computer failed to penetrate more than three-quarters of UK households despite the high-profile marketing campaigns of the 1980s and 1990s? One of the reasons why computers have not become as ubiquitous as video recorders in the home (a forecast that was commonly made throughout the 1980s) is that many people are uncertain about what the multi-function computer could usefully do
[Figure 4.1 Households in the UK with video recorder, home computer and audio CD player, 1985-94 (percentage of households by year). Sources: OPCS, 1989 and 1996.]
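The growth multiples quoted in the discussion of Figure 4.1 can be checked directly from the survey percentages cited in the text. The following is a quick illustrative calculation only, not part of the original analysis:

```python
# Household access figures cited in the text (per cent of UK households),
# from the General Household Surveys summarized in Figure 4.1.
video_1985, video_1994 = 31, 77
cd_1989, cd_1994 = 15, 47
computer_1985, computer_1994 = 13, 24

# Growth multiples over each period.
video_growth = video_1994 / video_1985           # "almost two and a half times"
cd_growth = cd_1994 / cd_1989                    # "more than tripling"
computer_growth = computer_1994 / computer_1985  # much slower growth over nine years

print(round(video_growth, 2), round(cd_growth, 2), round(computer_growth, 2))
# → 2.48 3.13 1.85
```

The figures bear out the text's characterization: video recorder access grew nearly 2.5-fold, CD player access more than tripled in half the time, while home computer access did not even double.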
for them in the domestic setting. A video recorder and an audio CD player have clearly defined and easily understood functions within a household. Both offer increased convenience to users (extending control over when and what TV programmes and films can be watched, or the quality of music reproduction) and also a degree of continuity: unlike home computers, they have not been subject to a rapid succession of changes that give rise to problems of technical incompatibility and obsolescence. Another factor must surely be the marketing and pricing policies of the hardware manufacturers, who prefer to increase the technical specification of computers on a regular basis rather than reduce the base price. Increasingly powerful machines are being marketed as "entry level" computers, but these still require a large financial outlay for many people. Software developers reinforce this strategy by frequently producing enhanced programs that require ever more computer memory to operate.
Information technology at home: differences between social groups
In the UK, domestic access to computing equipment is clearly not universal and the penetration of new households has been very slow. So who does have computing equipment at home? For more than a decade, much of the computer companies' marketing effort has targeted families with children of school age: a computer at home was desirable, if not essential, as it would extend the educational opportunities for children and allow them to practise and consolidate the skills they learn in the classroom. Has this strategy had any effect?
Families with children of school age
There is evidence from the annual survey undertaken by the Independent Television Commission (ITC) that homes with children are more likely than others to contain domestic media technologies, including a computer (e.g. ITC 1995). Figure 4.2 shows differential rates of access to a range of technologies. Data from the General Household Survey 1994 (Central Statistical Office 1996) provides confirmation that households containing dependent children are more likely to possess a video recorder, audio CD player and home computer. So, households that include children are more likely than others to contain domestic media technologies. What other differences can be identified between groups in society?
[Figure 4.2: Proportion of UK households with selected media technologies (video recorder, satellite TV dish, cable TV, home computer, audio CD player), 1994, shown for all homes and for homes with children. Source: ITC, 1995.]
Gender differences

Survey data on home access to technologies often fails to identify which members of the household make (or control) use of particular items of equipment. Even if there is a computer at home, it may not be equally accessible to all members of a household. Or, looked at another way, not all household members might choose to make use of a computer for leisure or entertainment, for educational purposes or for other domestic or business purposes. Market research surveys and studies of adult students, teenagers and children have consistently found that males are more likely than females to have access to a computer and to spend more time using a computer at home. For example, a study of 12-year-old children in England (Kirkman 1993) revealed that 55 per cent of the students in the sample used a computer at home - 70 per cent of the boys compared with 38 per cent of the girls. The amount of time spent using a computer at home averaged 7.1 hours per week for the boys, but only 4.2 hours per week for girls. Another study of English secondary school students (Robertson et al. 1995) found that 47 per cent of students had access to a computer at home, but that ten times as many boys as girls had sole access to a home computer and that there was a significant difference in
the extent to which they made use of them. In the USA, a study of high school students (Shashaani 1994) found 68 per cent of boys and 56 per cent of girls reporting the presence of a computer at home. When asked who used the home computer, two-thirds of the primary users identified were male. In a study of Norwegian undergraduate students, Busch (1995) found that more male students than female students had had a home computer before entering higher education (41 per cent compared with 24 per cent), and as college students the difference persisted, although not to the same extent. In many of these studies it has been found that the extent of use correlates with attitudes and perceptions about the potential value of computer-related activities and with performance on learning tasks involving the use of a computer. Home computers are more likely to be bought for the use of men and boys, and even when a machine is acquired as a family resource, the main users are very infrequently reported to be female. This might reflect the fact that computers were initially marketed for the male leisure industry (Haddon 1988). It is also associated with the greater control that tends to be exerted over domestic finances by men. Research undertaken with large numbers of adults studying with the Open University has consistently indicated that men are more likely than women to have access to a computer, either at home or at their place of work (Kirkwood & Kirkup 1991; Kirkwood et al. 1994; Taylor & Jelfs 1995; etc.). Furthermore, men were much more likely than women to have made the decision to acquire or upgrade home computing equipment and to make use of such equipment in the home. For example, in a large-scale survey conducted in 1995 (Taylor & Jelfs 1995), over 40 per cent of female students had no access to a computer (either at home or at work) compared with only 25 per cent of male students.
When students with a computer at home were asked who in the household provided the main impetus to acquire computing equipment, 77 per cent of men - but only 41 per cent of women - answered "self"; 26 per cent of women indicated that their spouse or partner had been the main decision-maker, compared with only 4 per cent of men who answered that way. Patterns of use also favoured men. Half of the females with access to a computer at home reported that their spouse or partner made frequent use of the equipment, compared with only 26 per cent of male students.
Occupation and social class differences

The basic data on domestic access to media technologies also conceals social differences. Where the head of a household has a high-status occupation (i.e. classified as being in the categories "Professional" or "Employers and managers") there is greater likelihood that the home will contain a telephone, video recorder, audio CD player and home computer than if the occupation is classified as "Semi-skilled manual", "Unskilled manual" or "Economically inactive" (OPCS 1996). Data from the 1994 General Household Survey is presented in Figure 4.3.

[Figure 4.3: Access to information and communication technologies (telephone, video recorder, CD player, home computer) in UK households, 1994, by economic status of head of household, from "Professional" and "Employers and managers" through "Intermediate non-manual", "Junior non-manual", "Skilled manual", "Semi-skilled manual" and "Unskilled manual" to "Economically inactive". Source: OPCS, 1996.]

It is not just a matter of those using stand-alone computers: those in the higher socio-economic groups are more likely than others to participate in computer-mediated communication and have access to networks. A survey conducted by Continental Research in September 1995 (quoted in ITC 1996) found that less than 7 per cent of the UK population had access to the Internet, which was mainly available at the workplace. The user profile is biased towards younger men in the higher occupational categories. However, an earlier survey by the same organization (quoted in Screen Digest) indicated that 23 per cent of UK company executives had access (either at home or at work) in June 1995.
Is information technology bringing about fundamental social change?

So the evidence does not support the predictions of IT becoming ubiquitous in Western countries in the near future, particularly in terms of home access. But what of the fundamental social changes that were expected to be brought about through the widespread use of IT? Has IT made any contribution to changes in society and, if so, have these tended to ameliorate or exacerbate social differences? A number of aspects will be considered, paying particular attention to home-based activities.
Changing employment patterns

The overall pattern of employment in the UK has changed in recent years. Since the mid-1980s part-time working has become more common for both men and women, which has led to a rise in the number of women in paid employment (Central Statistical Office 1996). However, while the number of women in full-time work has also increased, full-time employment for men has declined. Information and communication technologies have had both positive and negative effects upon the level of employment in the UK, as they have in most other developed countries. The impact of new technologies can be seen in the creation of new employment opportunities as well as the destruction of jobs in certain industries and services (Freeman 1995). Many of the new jobs made possible by greater use of information and communication technologies have involved changes in the geographical location of companies, particularly in the service sector. For example, the organization of banking, insurance and other financial services has been transformed in recent years with a much greater emphasis on access to "remote" rather than "High Street" provision. But while the use of IT in the workplace has permeated a large proportion of companies and organizations, there is little evidence of significant changes in the practice of homeworking - an essential element of the predictions for a new electronic society. Homeworking encompasses many categories of activity, including farmers, self-employed building and maintenance workers, those in creative fields (writers, designers, artists, etc.) as well as people undertaking unskilled or semi-skilled assembly jobs or other forms of piece-work. Few of these activities lend themselves to being IT-based. Although some people are engaged in "teleworking" (i.e. working from home with information and communication technologies rather than travelling to a place of work located elsewhere), much of this appears to be done as only part of the normal work pattern or by people engaged in professional and creative occupations. Many companies would be reluctant to encourage or facilitate homeworking because it would necessitate a loss of control over employees' time and the tasks they undertake. Furthermore, many homes are not suitable for teleworking. Using IT for homeworking requires not only appropriate facilities, but also space and arrangements that allow work to proceed without too much disruption being caused (both to the homeworker and to other members of the household). Much of the growth in professional homeworking arises not so much from developments in IT as from economic changes that have brought about an increase in self-employment and home-based consultancy work. In 1995, more than three-quarters of all UK homeworkers owned their own business or worked on their own account (Central Statistical Office 1996).
Leisure and service consumption at home using information and communication technologies

It was predicted that information and communication technologies would bring about significant changes in the patterns of leisure and service consumption. IT would make it unnecessary for people to leave their homes for many forms of entertainment or to undertake activities such as shopping, banking or gaining access to information and advice on a wide range of topics. The convergence of computing and digitized telecommunications services has made possible the development of an infrastructure that is often referred to as an Information Highway (or even Superhighway). This would comprise linked networks of high-capacity fibre optic (broadband) cables capable of conveying at high speed very large volumes of data (audio, text, video, etc.) to and from a very high proportion of business and domestic properties and institutions such as schools, libraries, hospitals, etc. A high level of investment has already been made in installing the necessary infrastructure and this will continue for at least the next decade. The principal actors involved are the telecommunications providers (BT, Mercury, etc.) and the cable television companies. In terms of the domestic market, cable television has not achieved a high degree of penetration of UK homes since it was established in 1984. Figure 4.2, above, showed that in 1994 only 7 per cent of UK households were connected to cable TV services (ITC 1995). These companies are seeking to achieve a target of 75 per cent of households being capable of being connected by the year 2000. It is frequently claimed that there is an enormous demand for consumer services using information and communication technologies, but to what extent are the services currently provided being used? To date, the limited number of services that have been offered have achieved only a modest amount of success.
In recent decades, leisure and recreation time has increasingly been spent in the home rather than in the public sphere. In Western societies attendance at public performances (e.g. cinema, concerts, theatre and
attending sports events as spectators) has declined in favour of home consumption using audio-visual means (television, video, etc.). However, there are other activities which involve people going outside the home, for example to restaurants, shopping expeditions, day-trips, etc. Increased leisure services using information and communication technologies are unlikely to replace the "outside the home" activities to any great extent - social contacts are important and people do not want to remain at home unless they have no alternative; the new services are more likely to be in competition with other home-based leisure activities. "A new supply of information, communication and entertainment services is more likely to result in increased competition to win round the consumer. The idea that as a result of new services, new markets will open up is a distortion of the facts" (Punie 1995, p. 33).
Economic capacity as a limiting factor

Those who have predicted the ubiquity of home computers have tended to adopt a diffusion model, seeking to explain patterns of adoption and use by relating the characteristics of computers and the needs and attitudes of potential users. There has been a tendency to assume that people would perceive the benefits to be gained from the use of IT applications and acquire equipment for home use. If existing uses and applications were unable to convince the reluctant to become involved, then efforts were needed to develop a "killer application", i.e. a service or use for IT that met so many needs for so many people that it was impossible to resist. The economic capacity of a household was largely overlooked, because such models "took it for granted that everybody was a potential computer owner and that the diffusion curve would follow other major innovations in domestic electronics, such as the television set, with adoption trickling steadily down the income scale" (Murdock et al. 1994, p. 271). Despite enormous promotional activities, home computer ownership remains concentrated within the professional and managerial groups, often increasing opportunities for those who already have them.
Who's using the Net?

The Internet enables communication between computers to be established for the purpose of data transfer, e-mail, access to remote databases and information sources, etc. In a newspaper article, Bowen (1996) sought to draw the attention of the business community to the commercial possibilities offered by the Internet: "It is not fanciful to compare the potential of the Internet with that of the motor car." He used a "comparative history" of the motor car and the Internet to draw an analogy with early scepticism about the potential of the car. However, the historical "facts" about the motor car are very selective, with no mention whatsoever of the negative and detrimental effects brought about by the dominance of the motor trade in Western countries. For example, although one-third (32 per cent) of UK households were without a car in 1994 (OPCS 1996), transport policies make travel in rural and remote areas very difficult, while retail and leisure activities in very many town centres have declined as a result of the growth in out-of-town shopping and entertainment developments. It also ignores the economic and environmental effects of traffic congestion that renders many journeys very time-consuming.
Conclusions

A number of sources have been used to provide evidence that significant differences exist between social groups in terms of access to and use of IT, particularly in the home. Some of those differences have been examined with a view to assessing the likely impact of IT. Little evidence has been found to support the idea that IT is bringing about fundamental changes to the existing social structures.
References

D. Bowen, "Is anybody out there?", Independent on Sunday, 10 March 1996.
T. Busch, "Gender differences in self-efficacy and attitudes towards computers", Journal of Educational Computing Research 12, 1995, pp. 147-58.
Central Statistical Office, Social Trends 26 (London: HMSO, 1996).
T. Forester, "The myth of the electronic cottage", Futures, June 1988, pp. 227-40.
C. Freeman, "Unemployment and the diffusion of information technologies: the two-edged nature of technical change", PICT Policy Research Paper no. 32, Programme on Information and Communication Technologies, Economic and Social Research Council, 1995.
L. Haddon, "The home computer: the making of a consumer electronic", Science as Culture, no. 2, 1988, pp. 7-51.
Independent Television Commission, Television: the public's view 1994 (London: Independent Television Commission, 1995).
Independent Television Commission, "Surfin' UK", Spectrum, Issue 19, 1996.
C. Kirkman, "Computer experience and attitudes of 12-year old students: implications for the UK national curriculum", Journal of Computer Assisted Learning 9, 1993, pp. 51-62.
A. Kirkwood & G. Kirkup, "Access to computing for home-based students", Studies in Higher Education 16, no. 2, 1991, pp. 199-208.
A. Kirkwood, A. Jelfs & A. Jones, "Computing access survey 1994: foundation level students", Paper no. 51, Programme on Learner Use of Media, Institute of Educational Technology, The Open University, 1994.
I. Miles, "The electronic cottage: myth or near-myth?", Futures, August 1988, pp. 355-66.
G. Murdock, P. Hartmann & P. Gray, "Contextualizing home computing: resources and practices", in Information technology and society, N. Heap et al. (eds) (London: Sage, 1994).
Office of Population Censuses and Surveys, General Household Survey 1987 (London: HMSO, 1989).
Office of Population Censuses and Surveys, Living in Britain: results from the 1994 General Household Survey (London: HMSO, 1996).
Y. Punie, "Media use on the information highway: towards a new consumer market or towards increased competition to win round the consumer?", paper presented at the PICT International Conference on the Social and Economic Implications of Information and Communication Technologies, London, 10-12 May 1995.
S. I. Robertson, J. Calder, P. Fung, A. Jones & T. O'Shea, "Attitudes to computers in an English secondary school", Computers and Education 24, 1995, pp. 73-81.
L. Shashaani, "Gender-differences in computer experience and its influence on computer attitudes", Journal of Educational Computing Research 11, 1994, pp. 347-67.
E. Soloway, "Reading and writing in the 21st century", keynote address to ED-MEDIA 94, World Conference on Educational Multimedia and Hypermedia, Vancouver, Canada, 1994.
J. Taylor & A. Jelfs, "Access to new technologies survey (ANTS) 1995", Report no. 62, Programme on Learner Use of Media, Institute of Educational Technology, The Open University, 1995.
A. Toffler, The Third Wave (London: Pan, 1980).
US Bureau of the Census, "Current Population Reports: Computer use in the United States, 1993", Washington. The data is available at the following URL: http://www.census.gov/population/www/socdemo/computer.html
Section Two
DEVELOPING COURSEWARE FOR THE SOCIAL SCIENCES
5
EXPECTATIONS AND REALITIES IN DEVELOPING COMPUTER-ASSISTED LEARNING: THE EXAMPLE OF GraphIT! Ruth Madigan, Sue Tickner and Margaret Milner
Working with an interdisciplinary team to produce CAL courseware (a tutorial package introducing basic statistics) proved more difficult than anticipated. More training, organization and sustained teamwork were needed to establish a common language and mode of operation. The design and the use of CAL forced a fundamental re-evaluation of teaching methods. Defining how students learn may be as important as defining what they learn. This is necessary in order that academics (and other teachers) can come to terms with a new medium and its integration into the curriculum. The aim of this paper is to pass on a few honest reflections on the problems encountered in developing a piece of interdisciplinary courseware under the umbrella of TLTP. We are taking a risk here, since we are focusing on our mistakes rather than our successes, but we are doing this in order to clarify our own thoughts and in the hope that others can learn from us. Despite our mistakes, we do believe we have produced a tutorial program which others will find useful.¹ Our particular objective was to create an independent learning package (GraphIT!) which could serve as an introduction to basic

1. For those who are interested, a copy can be found at: http://www.elec.gla.ac.uk/nLT/cat-of-software/downloadGraphIT.html
DOI: 10.4324/9780429332289-7
statistics across a number of university departments; accounting and finance, sociology and statistics were represented on the development team.² Introductory statistics appeared to be an appropriate area in which to make use of such a program. Many of the social sciences are essentially discursive and evaluative subjects, which do not generally lend themselves to a simple rehearsal of factual knowledge or established routines. Basic statistics, on the other hand, is an area which requires a certain amount of repetitive exercise to grasp its application and does produce some right and wrong answers. Moreover it is an area of study which many students (and staff!) find difficult, so any additional aid to learning would be welcomed. The computer has the obvious advantage of an interactive dimension and the capacity for rapid calculation, so the drudgery is removed and the student can concentrate on the application. Moreover, the interactive, dynamic aspect of CAL can introduce an element of fun or play which is often welcome in a subject which many experience as rather dry; a means to an end perhaps rather than interesting in its own right (apologies to all those statisticians who evidently love their subject, but many teachers will recognize the problem). Introductory statistics therefore seemed an appropriate area in which to develop a CAL package:
• At this introductory level at least, it rests on a well-defined paradigm.
• It can be presented as exercises which are susceptible to right and wrong answers.
• It is an area in which students are likely to find repetitive practice helpful.
• The fun/play element of CAL helps in an area of learning which many regard as necessary rather than popular.
Tutorial programs

Only some of our team had any experience of authoring systems and, as a consequence perhaps, some of us at the start had very little understanding or "feel" for what could be achieved with CAL. It was only later, when we discussed and read some of the literature about the role of IT in education, that we came to understand that the "drill-and-practice" tutorial has attracted a lot of criticism from CAL professionals, because
2. The courseware development was carried out within the context of a wider initiative within the University of Glasgow: Teaching with Independent Learning Technologies (TILT).
it appears to rely on a rather old-fashioned approach to learning with built-in assumptions about a fixed body of knowledge and narrowly prescribed learning objectives. This is seen, understandably, as a rigid and non-exploratory approach to learning. "It is judged to offer poor approximations to what is itself a rather poor model of the teaching process in the first place (didactic encounters guided by the IRE 3 pattern of dialogue)" (Crook 1994, p. 13). In planning our own tutorial package we were happy to include some elements of "drill-and-practice" routines. None the less, our original conception aimed to be rather more discursive and adaptive (Laurillard 1993, pp. 94-5) than we finally achieved. We had hoped that GraphIT! would provide in effect a front end for Minitab (a commercially available statistical package) so that the student could access and analyze the data sets in a rather more creative, flexible way than has actually proved possible. A series of technical problems and the consequent pressure of time meant we had to abandon the direct use of Minitab and as a consequence we lost the more exploratory dimension. We were also keen to retain a more interpretational dimension, to encourage students to realize, for example, that there is not always agreement about the best way of presenting data or indeed interpreting data. The problem here is not just technical, but may also reflect the inexperience of the academics as scriptwriters, who found it difficult to think themselves into a new medium (see below). What we have produced, then, is a series of modules arranged in a hierarchy of learning (from the simple to the more complex). It is possible for the student to go back and forth at will, but it is essentially a linear tutorial program following a fairly traditional, didactic model of learning. This is less than we had originally envisaged, but still has a useful role in many courses.
As Crook (1994) points out, this sort of tutorial is popular with teachers because it is easy to assimilate into prevailing patterns of teaching practice, and because the drill-and-practice approach is appropriate for some types of material and some forms of learning. We can recognize the value of this type of teaching technique in certain situations: "it need not presume a wholesale reduction of educational activity to the rehearsal of discrete subskills.... It needs to be made sense of rather than automatically disparaged" (Crook 1994, p. 14). Its value must depend to a great extent on how successfully the tutorial is integrated into the rest of the course and other complementary teaching methods.
3. I-R-E "verbal exchanges taking the form of a (teacher) Initiation, a (pupil) Response and a (teacher) Evaluation" (Crook 1994, p. 11).
Interdisciplinary work

At a purely practical level, we found an interdisciplinary project more difficult than anticipated. The academics in our group were originally located in four departments spread across the campus, between five and fifteen minutes' walk from each other. All the academics involved had heavy teaching timetables and other commitments, so even when the number of participating departments was reduced to three (because one member moved department) it remained very difficult to get the whole group together on a regular basis. The RAs (research assistants) were located at some distance from the Chair of the group, who was responsible for administration and liaison with the centre (i.e. the TILT Steering Group overseeing all the subgroups). At the beginning of the project not everyone had access to e-mail. It is easy to say we should have given more thought to these practical issues, but they arose as a result of the resources available, the fact that the central project administration was not yet established and the interdisciplinary nature of the group. They were, however, very serious for the operation of the group. The fact that we found it so difficult to meet regularly as a group meant that we also had difficulty in developing a common language and were slow to pick up divergent views. Part of our problem was also an intellectual one. We used the same words, the same statistical terms for instance, but we did not necessarily speak the same language; the relative importance of categorical versus interval data for different disciplines, for example, or what constitutes an attractive data set. At one level we knew these differences existed before we began, which is indeed why we have included interchangeable data sets so that to some extent the package can be customized to suit each subject area, but one can know these things without realizing their full implications.
With hindsight we should have spent more time at the planning stage (though in our defence, we worked through all the recommended stages of defining objectives, distinguishing our project from comparable software, creating a common framework, agreeing on key design features, etc.). The pressure of time, the fact that the RAs had already been recruited and the difficulty of getting together as a group encouraged us to subdivide the task of scriptwriting as soon as we had an agreed framework. This seemed a practical way to progress in the circumstances, but had the unintended effect of reinforcing a disciplinary divide, the statisticians on the one hand and the social sciences/accounting on the other, and allowed two sets of interests to develop in isolation. This slowed down the development process considerably as material then had to be rewritten and reshaped at a later stage. It left the RAs in a difficult situation trying to reconcile the two groups.
44
Authoring and a new medium for academics
As well as interdisciplinary problems there were also problems of communication between those with experience of authoring software (in particular, though not exclusively, the RAs) and those without. Again this was a gap which tended to be reinforced, rather than reduced, by organizational arrangements. The RAs, across all the subgroups, not just ours, were appointed at an early stage in the project, before the central organization had really been established. This had advantages and disadvantages: they were in at the beginning and consequently able to make their own contribution to developments, but at the same time they were newcomers to an organization which had not yet established its own lines of communication and administration. Thrown on their own resources, the RAs developed a camaraderie and a lively network of working relationships right across the university. This has been enormously beneficial in that RAs have been able to swap technical knowledge, offer each other support and spontaneously advance one of the aims of such a project, that is, to evaluate the role of IT across a diversity of disciplines and teaching situations. The academic teaching staff were often marginal to this process and slow to benefit. Unlike the RAs they were not newcomers: they already had an established niche in the university and they worked to a different timetable and a different set of imperatives. Many of the academics were ignorant about authoring software, had never attended a CAL conference or read any of the debates about teaching with IT. They were experienced in teaching in a verbal medium (written and spoken) but had difficulty in envisaging the possibilities of the new medium of the computer. As Bunderson et al. suggest: instruction has been trapped in a "lexical loop" perpetuated by print based media and methodology ...
the skill/knowledge of the expert [is translated] into a list of verbal abstractions descriptive of the critical tasks [and] given to students. The student is expected to translate the verbal abstraction back into the skills/knowledge of the expert. They are expected to create a model of the performance of the expert from the verbal abstraction. This then is the lexical loop (1981, p. 206).

The alternative approach suggested by Bunderson et al. is to provide working models in which the learner can perform. Computers are valuable because they can provide elements of simulation, but it requires imagination and experience to be able to take advantage of these possibilities. Yet at the outset of the project it was the RAs, not the academics, who were offered training (on the grounds presumably that they were the people entering a new situation). It seems obvious now that it was the academics who required the training and who needed the introduction to CAL philosophy and educational debates. It was they who were having to shift to a new medium of presentation and a new pedagogy. We learnt the hard way, through our own mistakes, and in the end that may be the only way to learn, but a bit of basic training would have speeded up the process and made for easier communication between the academics with mostly teaching experience and the RAs with mostly development experience.
Editorial function

Again with hindsight, we should have been much more specific about defining the working relationships within our subgroup. We had a good range of skills for a courseware development team (as defined by Laurillard 1993, p. 237), either within the subgroup or the wider TILT project. After one or two false starts, we established satisfactory procedures for dealing with accounts and routine administration. What we failed to do was to establish an appropriate editorial structure. Academics are used to working within a broadly collegiate environment where, at least in theory, everyone contributes as equals. As any social scientist will recognize, this is a rather naïve view of academic life, but it allowed us to believe that working relationships would develop organically. We had assumed that we could work with a division of labour (referred to above) in which different individuals and combinations of individuals went off and were responsible for writing different parts of the initial script, and then came together on a series of "Design Days" for collective approval, editorial decision and so on. This system failed, or at least worked only intermittently, for a variety of reasons already alluded to: the group found it difficult to come together on a regular basis and the distance between the disciplines was greater than we had anticipated. As a consequence we were left with a very weak editorial decision-making structure. The RAs were particularly affected by this. They would produce alternative designs and receive a range of comments and preferences, when what they needed was a decision. The irony is of course that the RAs, who had most direct experience of the working practices involved in a project of this sort, were the least able to dictate or change the group structure; they were the newcomers to the institution, part-time, less well paid, and the academics were the original instigators of the project.
The organic collegiate model tends to ignore these differences and pretend they do not exist. This can have its plus side if it embraces a genuine respect for people's expertise, but a more
formal structure of decision-making is needed and can also, we think, be empowering. In our group there was sufficient goodwill that we did evolve ways of working together, but we would have done so more efficiently and with less frustration had we recognized at an earlier stage that our existing model of editorial control was not working as intended and needed to be replaced.
Evaluation

The university-wide TILT project was initially set up by inviting groups of academics throughout the university to submit ideas for projects in their area of work. These proposals were then combined into cognate areas (the subgroups) which in turn were combined into the single TILT programme. At the outset the academics, in our group at least, tended to be focused on their own project and cognate areas rather than the TILT project as a whole, and tended to resent the demands made by the centre for information and participation in activities which appeared to have more to do with the central project than their own subgroup. In particular the role of evaluation, which was crucial to the overall project, caused a great deal of initial misunderstanding at the subgroup level. In fact the RAs, who were better integrated as a group across the university and closer to the centre than the academics, had a better understanding of the role of evaluation in the project as a whole. This changed as the project progressed and it became clear that the subgroup with special responsibility for evaluation could offer something of value to the other subgroups, rather than seeming to intrude and demand more paperwork, more reports and so on.

In the end we all came to appreciate the value of having a group of independent evaluators who had the time and expertise to design instruments (before-and-after questionnaires, observation schedules, video recordings) with which to evaluate the effectiveness of the software we were producing. They carried out their evaluations in laboratory conditions with selected groups and in genuine classroom situations. Quite apart from anything else, positive feedback from such thorough external evaluation has done a great deal to restore confidence in moments of self-doubt. It is important, though, to recognize that our own subgroup also carried out evaluation exercises which were crucial at a formative stage.
These tended to be smaller in scale and more informal, but allowed us to try out an unfinished piece of software which was still rough around the edges and could not therefore be used in a fully fledged teaching situation. The classroom evaluations were extremely valuable in focusing our attention on the importance of locating such a package within the
course structure (Laurillard 1993, p. 213). Different teachers will want to use the package in different ways, but it is extremely important that it is properly introduced to students and that it is clear what they are expected to do with it. There is a temptation for teachers everywhere, when given an independent learning package (equally video or film), to treat it as a "child-minder", something just to occupy a classroom hour or so. Courseware is only of interest if it promotes learning. However, to the extent that it does, it only does so in conjunction with the wider teaching context in which it is used: how it is supported by handouts, books and compulsory assessment, whether the teacher seems enthusiastic about it, support among learners as a peer group, and many other factors (Draper 1995). Both formal and informal evaluation were found to be important as part of the formative development process and as part of the transition to the classroom. Summative evaluation is more difficult to accomplish. Our students, for example, were questioned and "tested" before and after classroom sessions using GraphIT!. For the most part they reported that they had enjoyed using the tutorial package, they believed it to be useful, and the "tests" showed that they had acquired new knowledge or information (Henderson et al. 1995). At one level, then, this example of CAL courseware appears to be effective, but we cannot say whether it is more effective than other methods because we did not compare our CAL tutorial with alternative, more conventional teaching and learning methods. We had always intended that the use of this courseware should be integrated with other coursework to supplement or reinforce, not to replace, conventional teaching, though it might obviously reduce the time spent on certain topics. In these circumstances it is very difficult to "pinpoint the precise variables that determine the superiority of a particular approach" (Booth et al. 1993, p. 83).
We believe that CAL is attractive because it adds to the diversity of teaching methods available and it offers the student an additional source of independent learning. Whether it is cost-effective can be substantiated only in the longer term. GraphIT! has been very expensive to produce (two part-time staff working for three years) and, if we ignore the research and learning experience involved, could be justified only if it is widely adopted. So far we have received many expressions of interest, but only time will tell if it is widely used in practice. Generally CAL has not been taken up as enthusiastically as its developers would like (Booth et al. 1993, p. 83). One of the problems with the evaluation of CAL is that it is often done by CAL professionals and enthusiasts, who are already committed to developing and expanding
its use. They are faced with the issue of overcoming the conservatism of course teachers and ensuring that genuine opportunities for the constructive use of CAL are created. But the real evaluation must in the end come from the long-term patterns of usage which emerge, and we must allow for the possibility that in many areas of education these evaluations may be negative and the use of CAL will be rejected. This is not an easy finding for a project like TLTP, which is committed to expanding the use of CAL, to contemplate objectively.
Conclusion

We have produced what we believe to be an attractive tutorial introduction to basic statistics and graphical presentation. The evaluation, from within our own institution, where it has been piloted in classroom situations, and from other institutions, where we have received teacher evaluation, has been most encouraging. We hope the final product, which has a teacher's editing facility so that data sets with particular relevance to individual courses can be included as part of the exercise set, will also be well received. We have not achieved everything we set out to achieve: we were over-ambitious given our resources. The whole courseware development took much longer than we had anticipated; as a consequence the tutorial package is shorter than intended. We had planned a number of additional modules which would have taken the student on to a slightly more advanced level.

What we have learned:
• Keep talking to each other! It is not enough to identify learning objectives at the outset: the same words may mean different things to different people. This is particularly true where people are coming from different intellectual backgrounds.
• Defining how you want students to learn may be more important than exactly what you want them to learn.
• Although there is bound to be a division of labour and of expertise within the group, it helps to identify a minimum training scheme and/or literature review with which you expect everyone to be familiar.
• Do not fall into the trap of thinking that the practical day-to-day arrangements and working relationships will take care of themselves: they need regular review.
• No package is a "stand-alone", even if it is designed for independent learning. It has to be integrated into the rest of the course and its success or failure will depend in part on how it is used.
References

J. Booth, J. Foster, D. Wilkie, K. Silber, "Evaluating CAL", Psychology Teaching Review 2, 1993, 2.
C. V. Bunderson, A. S. Gibbons, J. B. Olsen, G. P. Kearsley, "Work models: beyond instructional objectives", Instructional Science 10, 1981, pp. 205-15.
C. Crook, Computers and the collaborative experience of learning (London: Routledge, 1994).
S. W. Draper, "Two notes on evaluating CAL in HE", University of Glasgow, 1995. WWW URL: http://psy.gla.ac.uk/steve
F. P. Henderson, C. Duffy, L. Creanor, S. Tickner, "Teaching with Independent Learning Technology Project: University of Glasgow", paper presented at CAL Conference, Cambridge, 10-13 April 1995.
D. Laurillard, Rethinking university teaching: a framework for the effective use of educational technology (London: Routledge, 1993).
6

THE DATA GAME: LEARNING STATISTICS

Stephen Morris and Jill Szuscikiewicz
Learning statistics is a perennial problem for students and research workers from non-mathematical backgrounds. The social sciences and medicine in particular rely on high quality analysis and interpretation of data. However, the teaching of statistics throughout higher education assumes a high degree of mathematical competence even when the students are from non-mathematical disciplines. This is without doubt a major reason for students' perceived lack of statistical judgement (Jamart 1992). Clearly a different approach is called for. With imagination it is possible to convert difficult statistical problems into simpler problems of pattern recognition. Furthermore, with computerization the extra ingredient of interactivity can be added, enabling the student to manipulate the raw data while observing the changing patterns, and thereby build up an intuitive understanding of how statistics works. In effect statistics is turned into a game, the data game. In this paper we describe this approach to the teaching of statistics.
The problem

Imagine learning to play chess, with a set of principles and examples in a book but without the board and pieces. Or learning music composition without sound. Statistics learnt solely from the pages of a book (or lectures) suffers the same problem, and unfortunately most students and researchers are expected to gain a practical grasp of the subject in exactly this way. A further problem is that much statistics teaching is based on mathematics, which is beyond the easy reach of most students. This makes it more like learning chess without a board in a foreign language. Finally, most statistics textbooks and courses take a few sets of data and work through them; which sounds acceptable, but
does not actually teach statistics. Students learn a handful of analyses instead; examples of statistics rather than statistics itself. Statistics is particularly in need of a new approach. Its purpose is to present a large and probably complex body of raw data in a meaningful, summarized form. Everyone understands what raw data is, because they collect it and it is largely self-explanatory. Students accept the concept that you come to a conclusion, and that that is the end of the process - however, they do not understand what goes on in between. Statisticians work with a series of steps culminating in the table of test statistics, each step condensing the data and making it more manageable. This forms a kind of "Information Funnel", large and raw at one end, and clear and informative at the other. Statistical analysis programs used by those in education and research every day emphasize the two ends of the funnel, and hide the intermediate steps. How is a student to understand what is happening when they enter a vast array of data and a moment later are presented with a few probabilities? Although pride of place appears to go to the raw data and the final statistics, educationally the in-between steps are of primary importance. Most of what we thoroughly know has been learnt by observation, trial and error. Statistics cannot be taught this way within the current mathematical framework using a selection of data sets, since the time required (and the number of prepared analyses) would be far too great. However, most students of statistics are not interested in it as an academic discipline, and by approaching the subject instead as a tool to be used, a higher degree of teaching flexibility can be achieved.
Mathematical proofs become irrelevant; when learning to ride a bicycle, a child needs no knowledge of angular velocity, frictional forces or gravitational pull; an intuitive understanding of all of them will be impressed on him/her more or less painfully. While deep mathematics can prove or disprove assertions, proof does not necessarily lead to enlightenment (Jamart 1992), and much of what we truly understand requires no proof at all but repeated, varied and directed observation. A deep appreciation of almost any subject comes after practical experimentation. This can be achieved in statistics teaching by making it into a game, after which it can be at least as interesting as chess, and certainly a lot more useful. Now that the IT revolution has made faster, more powerful PCs available to university education at reasonable cost, their advantages to both teachers and students are widely recognized. When used with imagination, the increased interactivity of PC software is a powerful ally in the move away from didactic teaching; and the potential availability of networked software 24 hours a day enables students to work with
complex concepts, at their own pace, whenever it suits them (Simpson 1995).
Our solution

With this in mind we have approached the problem of teaching statistics by creating and computerizing a series of challenges and games which the user plays by changing the data. This is a radical departure from traditional statistics teaching, where correct statistical practice is mirrored unnecessarily closely in making the data sacred and immutable. It is still possible to impress on students that real-life data cannot be changed; and by giving them the opportunity in the classroom to experiment in a way impossible in real life, they become experienced in recognizing patterns and exploring strategies without danger. They are exposed to a wide variety of situations which might take a decade or more to accumulate through real research. Although traditional teaching styles may be able to show some variety to students, the interactivity of our approach involves them directly, making it more successful than a strictly didactic method. However, if the results of the interactivity remain complicated, the students will simply have a deeper understanding of their confusion.

By transposing the data into a simple graphical representation, whether a fitted line or a set of normal plots, the results of the interactivity become clear and the student gains a genuine understanding. This gives an entirely natural representation of the middle stage of the Information Funnel, connecting the original data in a clear way to the otherwise slightly mysterious test results and conclusions. The data points are the game pieces, which may be moved in any direction. The student can be guided through a number of scenarios within which they are encouraged to experiment and observe changes in the resulting test statistics, finally being challenged to generate particular outcomes. Through these exercises, the students recognize that the processes directly connect the data to the results, and that they can be understood.
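The essence of this manipulation can be sketched in a few lines of modern code. This is purely an illustration of the idea, not the package's own implementation (which predates such tooling); the two groups of measurements below are fabricated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Two fabricated groups of measurements -- the "game pieces".
group_a = rng.normal(loc=50.0, scale=8.0, size=12)
group_b = rng.normal(loc=50.0, scale=8.0, size=12)

# Slide group B upwards in steps, recomputing the two-sample t-test each
# time, and watch how the p-value responds as the groups separate.
for shift in (0, 2, 4, 8, 12):
    t, p = stats.ttest_ind(group_a, group_b + shift)
    print(f"shift={shift:2d}  t={t:6.2f}  p={p:.4f}")
```

Moving a "game piece" (here, a whole group) and immediately seeing the test statistic react is exactly the feedback loop the software builds on screen.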
Although they may not understand the processes at first, by the end a clear intuitive understanding will be established. We tested this teaching approach by producing a suite of PC-based game-playing scenarios designed to demystify a wide range of statistical concepts. These were incorporated into a comprehensive teaching package called Statistics for the Terrified. Although the software does cover some quite advanced topics, it approaches everything in a basic, commonsense way. The areas covered by the software include:
Table 6.1 Areas covered by the software.

How to choose a test
The importance of groups in statistics; identifying appropriate tests by data layout; when to use Chi-square, Kruskal-Wallis, Mann-Whitney, oneway analysis of variance, paired and two-sample t-tests, and Wilcoxon.

Basic data description
Descriptive statistics; advantages and disadvantages of median and mean, range and variance, and coefficient of variation; the importance of the normal curve and how the mean and s.d. affect its shape and position; standard error and confidence intervals.

Testing for differences between groups
The differences and similarities between the two-sample t-test and oneway analysis of variance; role of the normal distribution; the differences and similarities between the Mann-Whitney test and the Kruskal-Wallis test; role of box and whisker plots; when and how to use oneway and twoway analysis of variance; developing the ability to visualize data in graph form.

Uncovering hidden influences
Uncovering influences; reducing bias and variance; analysis of covariance (when the influence is a measurement); twoway analysis of variance (when the influence is a category, such as gender); matching groups to prevent bias; how reduced variance enhances the likelihood of a significant result.

Fitting lines to data
Regression; judging the value of a linefit; using the fitted line; describing the line.

Analyzing repeated measurements
Before and after studies; about the normal curve; the paired t-test; why area under a curve?; the differences between areas; different repeated measurement shapes.

Analyzing 2 × 2 classification tables
What are classification tables?; interpreting proportions; risk difference; relative risk and relating two proportions; constructing a hypothesis of no difference between the groups; issues surrounding the Chi-square test (Fisher's Exact Test).

What does p < 0.05 actually mean?
What is going on when a hypothesis is tested?; Type 1 error, Type 2 error, and power; the effect of sample size on the accuracy of a statistical trial; use of blocking in experiments.
The Computer Unit ran a series of regular statistics courses open to research staff and students who had previously had university-level statistics teaching. Their level of knowledge and confidence was assessed before and after CAL-based teaching sessions via a questionnaire, covering such areas as correlation, outliers, the t-test, and so on. Responses were made by marking a 0-10 scale. Information on attitudes to using computers for learning, IT skill levels and so forth was also gathered to ensure that participants were all of a similar skill level and attitude. These questionnaires were received from a total of 51 students. The data was analyzed using a paired t-test after applying the Shapiro-Wilk test to confirm normality.
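The analysis just described can be reproduced in outline. The scores below are invented for illustration (the study's raw questionnaire data are not given here); only the procedure — a normality check on the paired differences followed by a paired t-test — follows the text.

```python
import numpy as np
from scipy import stats

# Invented before/after confidence scores on the 0-10 questionnaire scale.
before = np.array([2.0, 3.5, 1.0, 4.0, 2.5, 3.0, 5.0, 2.0, 3.5, 4.5])
after = np.array([7.5, 8.0, 6.0, 8.5, 7.0, 6.5, 9.0, 7.0, 8.0, 8.5])

# Check that the paired differences look plausibly normal...
w, p_normal = stats.shapiro(after - before)
print(f"Shapiro-Wilk: W={w:.3f}, p={p_normal:.3f}")

# ...then test whether the before/after change is significant.
t, p = stats.ttest_rel(after, before)
print(f"paired t-test: t={t:.2f}, p={p:.2g}")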
Examples of the data game approach
Interactive graphs

The module on linefitting (regression) is a good example of our approach. It can be viewed as three sections:
• Introduction and overview
• General exploration
• Challenges.
In the first part, general concepts are introduced, with animated illustrations. This covers what linefitting is, what it is for (prediction and influence), how to judge the usefulness of a linefit (correlation coefficient), and how to describe the line (gradient and intercept). At this stage we don't include too much detail, just a commonsense definition with a clear demonstration. The next section provides users with a graph, ten movable points, and a line which is automatically recalculated (see Figure 6.1). Next to the graph are the essential parameters: Correlation, Gradient and Intercept. More information on the meaning of these is available to students who require it; however, as it is not important at this stage, it is available only as hypertext. For a period of two minutes, students are invited to move the points and watch how the line changes, and also to observe the effect on the parameters. This section performs a dual function. Primarily, the student begins to gain an intuitive feel for the way a line reacts to data (we have found they pick up particularly on outliers and on data with poor correlation) and begins the process of learning to "eyeball" scatters of points. Since they will not always have a plotted graph to work from in real life, the data values are also given (colour-coded to the points) next to the graph. However, it also painlessly teaches students the minimal software skills which they will need in order to work through the challenges in the next section:
Figure 6.1 A painless way to absorb basic information.
dragging the points, and spotting where on the screen the relevant changes occur. In the final section, the students work through a series of four challenges, in which they have to progressively change the data to produce a Correlation, Intercept and Gradient value by manipulating the points, and finally a really difficult challenge where a particular Correlation and a Gradient must be obtained together (see Figure 6.2). The values for these exercises are generated randomly; this means that they can be repeated as often as desired, without any actual repetition. Although they have been exposed to absolutely no mathematics in this module, all students successfully complete these exercises, and enjoy the learning process. Depending on the complexity of the random challenges and the student's initial knowledge, this may take between 30 minutes and an hour. At the end they have a good grasp of what linefitting is for, what correlation, gradient and intercept are, and what they say about the data. They are also able to make an educated guesstimate of the correlation of a given set of data. In other words, they
Figure 6.2 Repeated, varied and purposeful experimentation. (Challenge shown: "Now arrange these points to get a gradient of .75 with a correlation of .25".)
are aware of all stages of the Information Funnel, and connect them together naturally and easily.
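The three parameters the student watches can be computed as follows. This is a hypothetical re-implementation of the underlying arithmetic, not the package's own code, and the ten points are invented.

```python
import numpy as np

def line_stats(x, y):
    """Least-squares gradient and intercept, plus Pearson correlation."""
    gradient, intercept = np.polyfit(x, y, 1)
    correlation = np.corrcoef(x, y)[0, 1]
    return gradient, intercept, correlation

# Ten movable "game pieces" lying close to the line y = 2x + 1.
x = np.arange(10, dtype=float)
noise = np.array([0.3, -0.2, 0.1, -0.4, 0.2, 0.0, -0.1, 0.3, -0.3, 0.1])
y = 2.0 * x + 1.0 + noise

print(line_stats(x, y))  # gradient near 2, intercept near 1, r close to 1

# Drag the last point far off the line, as a student would on screen,
# and all three parameters react at once.
y_dragged = y.copy()
y_dragged[9] = -10.0
print(line_stats(x, y_dragged))
```

Recomputing and redrawing after every drag is all the interactivity the module needs: the student sees the outlier pull the gradient down and the correlation collapse.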
Pattern recognition

A slightly different approach was used in the module "How to choose a test". Many of those using statistics as a tool (both students and researchers) find the choice of a test to use in a real-life situation a baffling and fundamental problem. This is because statistics textbooks present different sets of data and mathematically generate appropriate tests from first principles. The mysterious use of mathematics in this context merely distracts the student from the data. However, it is possible to concentrate on the data and still choose an appropriate test. By presenting it in terms of pattern recognition (a skill available to all of us) rather than complex maths, we were able to make some basic concepts clear without any attendant bewilderment. As with the linefitting module, it can be viewed in three sections:
• Introduction and overview
• General exploration
• Quiz.
The introduction is a little more detailed than with linefitting, since it often has to overcome a lack of confidence resulting from previous bad experiences and confusion. The three most common data layouts are shown on the screen and explained (see Figure 6.3). Throughout the whole module it is demonstrated that 90 per cent of data obtained in research is based around one of these clearly different layouts. Deciding on a test is simply a matter of matching patterns. The General exploration and Quiz sections are designed to be worked through a number of times. We recommend that students return to the exploration at least twice, as there is a huge volume of information stored within it. The basic screen display remains the same as for the introduction. However, everything on the screen is "hot", and by simply clicking anywhere the student can gain further information. For example, by clicking on a name in the Repeat Measurements panel, the student can
Figure 6.3 Explaining the pattern recognition principle of choosing a test.
Figure 6.4 Interactivity allows the student to follow their own line of interest.
obtain information on the concept of tracking one person at repeated intervals, and how that differs from Group Comparison (see Figure 6.4). This section is totally interactive. The user clicks on items of interest and receives information accordingly. In the Help window they are introduced to further ideas to follow up, such as parametric/nonparametric testing. Perhaps the most important facet of this section is the opportunity to explore the Permissible Tests for each layout. Once this mode has been entered, a mouse click brings up a summary of the appropriate tests, and again clicking on a test calls up detailed information on the test itself (Figures 6.5 and 6.6). After spending some time working through this section, the student acquires a body of information concerning the major statistical tests and the way they work, and the confidence that they can apply them appropriately. Finally, there is a Test Quiz, in which six questions are selected at random from a question bank, which tests students on their acquired level of knowledge in choosing a test (see Figure 6.7). Feedback is provided to the student at the end, so that they can return to the previous section and explore a little further.
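As a thumbnail of the pattern-matching idea, the decision logic might be sketched like this. The layout labels and simple rules below are our own shorthand for illustration; the module's actual decision tree is richer, but the test names follow Table 6.1.

```python
def suggest_test(layout: str, groups: int = 2, normal: bool = True) -> str:
    """Match a data layout pattern to a plausible statistical test.

    layout: "classification" (a 2 x 2 table of counts),
            "repeated" (the same subjects measured before and after), or
            "groups" (a comparison of independent groups).
    """
    if layout == "classification":
        return "Chi-square test"
    if layout == "repeated":
        return "paired t-test" if normal else "Wilcoxon"
    if layout == "groups":
        if groups == 2:
            return "two-sample t-test" if normal else "Mann-Whitney"
        return "oneway analysis of variance" if normal else "Kruskal-Wallis"
    raise ValueError(f"unknown layout: {layout!r}")

print(suggest_test("repeated"))                        # paired t-test
print(suggest_test("groups", groups=3, normal=False))  # Kruskal-Wallis
```

The point of the module is that a student who can recognize which of the three layouts their data falls into has already done most of the work of choosing a test.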
Figure 6.5 Knowing which tests are appropriate.
Figure 6.6 More information on each test is available.
Figure 6.7 Self-testing enables students to monitor their learning in an exploratory environment.
Conclusion

We have been using the software now for some time in the teaching of statistics, and those who have used it report that they feel much more confident of their ability to cope with the subject. In addition to classroom sessions, it is available on the St George's Medical School Network, and in this way it has reached a large number of researchers. These users are particularly pleased with its practical emphasis, and find the method of using the PC on their desk for a quick subject refresher very convenient, and one which does not involve the loss of face of asking for advice. Alarmingly, a number of experienced researchers have confessed to not having understood some very basic concepts before working through this package under their own steam. These results were extremely encouraging, not least because all users enjoyed the experience of learning statistics for the first time. The most effective learning took place when the data game was also accompanied by graphical representations, such as the fitted line in the regression module or multiple normal curves (or box plots) in the Analysis of Variance module, although quizzes were also popular. The overall effect of the software was to make the Information Funnel clearly
Table 6.2 Confidence and understanding before and after using the software.

Question                                                          Before (%)   After (%)   Paired t-test
                                                                                           significance level
How confidently can you explain what the Intercept is?              20.7         84.2        0.0000006
How confidently can you explain what Gradient is?                   29.2         77.8        0.000004
How confidently can you explain what Correlation is?                64.2         85          0.0056
How confident are you about the way outliers affect a linefit?      27.8         78.5        0.00001
Could you explain how the standard deviation affects
the normal curve?                                                   52.8         82.8        0.0006
Could you explain what the two-sample t-test is used
to test for?                                                        48.5         74.2        0.0007
Could you explain the circumstances in which the
two-sample t-test gives a significant difference?                   44.2         67.8        0.0002

Figures are average confidence/understanding (0% implies zero confidence/understanding; 100% implies total confidence/understanding).
visible, with large amounts of raw data at one end, a small number of interactive graphical representations and numeric statistics in the middle, leading to the final simple interpretation. This reflects one function of statistics itself as an informative summary tool. The software has also generated an enormous volume of comments from users and from teachers in other institutions. The feedback was obtained from verbal comments, from questionnaires, and in large part through our Talkback feature, which allows the user to enter comments at any time from within the software. The Talkback feature was originally designed by us as a way of obtaining student feedback on areas of difficulty and misunderstanding, its in-built anonymity encouraging users to be as frank as humanly possible. We used this frankness to perform an evaluation of the software, asking users to enter comments on the usefulness (or otherwise) of learning statistics by the data-game approach. The most common reaction from researchers was that they could not believe that statistics had been so simple all along: "If choosing a test can be made this simple, why hasn't anyone told me before?"
Students enjoy it; classes consistently overrun because students do not want to leave - unusually for statistics. Teams of two to a PC working in the classroom on interactive exercises have been observed competing in a race against time to complete a set of challenges - and even going back for a "best of three"! The accompanying graphical reinforcement of the data was commented on frequently as being especially helpful to an intuitive understanding of the way a test works. The simple description of apparently complex items, such as an analysis of variance table, was also well received. Most students found the package approachable, unlike the typical statistics textbook; however, many said that the experience of using the data-game approach would allow them to look at traditional statistics textbooks again in a more informed light. An unsuspected benefit was highlighted by a number of students who felt that their confidence in, and grasp of, basic numeracy had improved. There is some concern about the lack of basic numeracy skills in a number of disciplines, and in areas such as nursing (Jacobsen et al. 1991) there is growing evidence that this is worsening with increased use of calculators and computers. Having gained an understanding at this level, the user is able to approach lectures and textbooks in a more informed manner. We do not feel that our approach replaces the existing teaching of statistics, but it is a valuable precursor and accompaniment to it.
References

M. Cartwright, "Numeracy needs of the beginning Registered Nurse", Nurse Education Today (School of Nursing and Health Administration, Charles Sturt University, Bathurst, NSW) 16, 1996, pp. 137-43.
B. S. Jacobsen, R. S. Jacobsen, L. Tulman, "The computer as a classroom participant in teaching statistics", Computers in Nursing (School of Nursing, University of Pennsylvania, PA), May-June 1991.
J. Jamart, "Statistical tests in medical research", Acta Oncologica (Cliniques Universitaires de Mont-Godinne, Université Catholique de Louvain, Yvoir, Belgium), 1992.
J. M. Simpson, "Teaching statistics to non-specialists", Statistics in Medicine (Department of Public Health, University of Sydney, NSW), January 1995.
7

CONVERSION OF THE IDEOLOGIES OF WELFARE TO A MULTIMEDIA TEACHING AND LEARNING FORMAT

David Gerrett
This paper describes the process developed for, and lessons learned from, the conversion of the Ideologies of Welfare into a multimedia teaching and learning format. Yardsticks for the time and effort required to convert intellectual material are provided. In the case of the Ideologies of Welfare lesson, producing a second-generation package required 260 hours of staff time at a cost of approximately £4,000. A brief description of the lesson and specific student feedback on the use of the package are included.
Background

The Ideologies of Welfare (IofW) are groupings of often opposing social constructs which provide a rationale for differentiating policy decision-making concerning public welfare. A knowledge of the ideology which currently underpins the direction of the British health service is of particular interest to health care professionals such as pharmacists. They are legally responsible for monitoring the process whereby therapeutic medicines are made available to the public. As drug costs rise they are relied upon to assist in the rationalization of services, and are themselves increasingly becoming the focus of policy-making. As such, their ability to make decisions perceived to be rational by those in power is dependent on their knowledge of the "in vogue" ideology.

DOI: 10.4324/9780429332289-9
There is currently no formal undergraduate instruction on the IofW for pharmacists. The only postgraduate instruction occurs on the Postgraduate Programme in Social and Administrative Pharmacy (The Course) run by the University of Derby. However, the theory and practical application of the IofW may become more important in pharmacy education following an independent and particularly influential assessment of the occupation in 1986, which recognized a general educational deficit in the social sciences (The Nuffield Foundation 1986).
In response, the official body responsible for monitoring curriculum content and ultimately registration of pharmacists, the Royal Pharmaceutical Society of Great Britain, recommended that "teaching of social sciences should be an element of all years of the undergraduate course" (Royal Pharmaceutical Society 1989). Furthermore, postgraduate pharmacy teaching has begun focusing on practice in its broadest sense which necessitates an understanding of human action, a social science domain. The change in emphasis is in keeping with Pharmacy Administration courses in America (Teachers of Pharmacy Administration of the American Association of Colleges of Pharmacy 1985; American Association of Colleges of Pharmacy 1992).
The Course and Multimedia teaching and learning

The University of Derby validated The Course in September 1993. The Course Planning Team (CPT) comprises staff of the School of Health and Community Studies, pharmacists from the School's Academic Pharmacy Practice Unit, plus senior hospital pharmacists at the Derbyshire Royal Infirmary. The first of the four modules making up the Certificate stage of the award is Pharmacy and Health Policy. This core module was validated at 100 hours of student effort, of which 30 hours were allocated to instruction on the IofW.

A critical feature of The Course is its sole use of Multimedia. This term is generally understood to mean the integration of several media, such as text, graphics, video and sound, into a single computer application. At the time of validation, only two other courses in Britain were known to be so essentially dependent on technology. Multimedia teaching and learning (MTL) refers to the use of computers and programmes to present educational information to students in an interactive manner. It is a form of resource-based learning commonly described as self-directed, independent and individual in nature. Through its use the educational needs of a significant population, including those unable or unwilling to attend face-to-face courses, can be satisfied (Gilroy 1992; McDonough, Strivens, Rada 1994). Multimedia teaching and learning has been shown to be a viable alternative to lectures generally (Clem et al. 1992) and, specifically for pharmacy, as part of undergraduate (Stevens & Sewell 1993) and postgraduate (Pugh et al. 1993) provision. Pharmacists are an ideal audience for Multimedia. Many are unable to attend a university, as they are legally committed to be available for discussion with patients concerning medication, yet they have access to and experience with computers.

The conversion of intellectual material to lessons in a Multimedia format is termed "authoring". To ensure that the aims and outcomes of lessons are reflected in student understanding and action, and that the lessons of a module form a cohesive learning experience, a series of stages in the process of authoring was identified for validation of The Course. The flow diagram in Figure 7.1 and the corresponding descriptions in Table 7.1 define the stages involved. These chart the relationships, in the production of lessons, between the CPT, Module Teams, Authors, the Multimedia Teams, the MTL Unit and, most importantly, students. Careful note should be made of the implicit responsibilities of the groups involved in authoring and of the quality assurance implicit in the feedback mechanisms. Note also that, for the full period when a lesson is made available on The Course, evidence of the effectiveness of the student learning experience is required to be compiled and passed between the groups involved. Evidence may take the form of, for example, student comment, assessment or specific research conducted to elicit the student response. In ensuring quality, reports are required to address the central question of whether lessons achieve the student outcomes specified in the module content.
Figure 7.1 Steps in the authoring process (A-M). [Flow diagram charting the two-way discussions and single-direction processes between the Course Planning Team, Module Team, the MTL Unit, students, a representative student group and the Programme Committee; see Table 7.1 for descriptions of (A) to (M).]

Table 7.1 Steps in the authoring process (A-M).

(A) The CPT notify Module Leaders six months prior to delivery of lessons. Module Leaders convene meetings of the Module Team and produce a first report detailing module structure and the aims and outcomes for each lesson. The CPT receive the first report and may provide suggestions and request conditional changes.
(B) Within criteria agreed, Module Leaders commission production of authored lessons.
(C) Lessons are authored. Communication between Authors and Module Teams concerns how authored lessons meet the aims and objectives specified.
(D) Lessons are authored. Communication between Authors and the Multimedia Team concerns the application of current knowledge of the HCI and available technology to optimizing the student learning experience.
(E) The Multimedia Team communicate with the MTL Unit for advice on the latest HCI strategies and provide lessons for student evaluation.
(F) The MTL Unit conduct research to verify the student outcome from interaction with the lesson.
(G) The Multimedia Team communicate results of student evaluation to the Module Team.
(H) The Module Team produce a second report to the CPT, including evidence of the student experience, for approval to run the authored lessons.
(I) The CPT comment on the second report and notify their recommendations to the Course Committee. Within policy laid down by the Course Committee, direction is given to the Module Teams.
(J) Having considered recommendations and met all conditions, Module Teams supervise student access to authored lessons.
(K) Student feedback is monitored by the Module Teams.
(L) Module Teams compile a yearly report for the CPT including student responses to lessons.
(M) The CPT respond to the yearly report and may initiate changes or production of new lessons following steps (B) to (L).
Figure 7.2 Example of a lesson in Ideologies of Welfare. [Screen from the lesson showing a student's current scores.]
A Multimedia Ideologies of Welfare lesson

Following the process outlined previously (Figure 7.1; Table 7.1), on 24 June 1993 the first report (Table 7.1(A)) on the structure and content of the Pharmacy and Health Policy module was sent to the CPT. The description for lesson 4 of the module stated: "Lesson four, 'Ideologies of welfare', takes the negotiated policies in lesson two and considers the underpinning political ideologies which have shaped the Welfare State. This lesson considers developments from the political perspective." The aim of the lesson was: "To direct the student to an understanding of the belief systems which underpin ideological stances." The stated outcome was: "The student will demonstrate an ability to identify and critique the fundamental ideological stances which underpin health policies." Details of the proposed intellectual author were submitted. It was noted in the submission that the topic had been taught for two years at postgraduate level to health care workers attending the Postgraduate Programme in Research Methods. A subsequent meeting of the CPT agreed to progress the lesson, and funds for conversion to Multimedia were released.
After an estimated 20 hours' focused activity, a lesson covering the IofW for conversion to a Multimedia format was committed to paper by the intellectual author. No further reading was necessary: lecture notes had been prepared, and a clear idea of the structure of the Multimedia lesson was already formed. The Multimedia lesson was designed to provide four hours of student interaction and directed approximately 26 hours of reading. Reading time was based on one hour of student effort for every 10 pages of text. Students based in Britain were required to obtain three classic references (Lee & Raban 1983; George & Wilding 1985; Clarke, Cochrane, Smart 1987) and those outside were required to access one further text (Cochrane & Clarke 1993). The reasons for "outsourcing" much of the didactic material were as follows:
1. The amount of technical authoring required was kept to a minimum.
2. Students were saved from either printing off, or reading from the computer, reams of text.
3. Student time in front of the computer was more interactive.
4. Copyright issues for the use of intellectual material in a Multimedia format were, and still are, vague.

The lesson centred on four questions on each of 20 themes, which were piloted and shown to differentiate anti-collectivist, reluctant collectivist, collectivist and revolutionary collectivist ideologies. Students were first asked to respond to the 80 questions and view a graphical representation of their ideology. They were then directed to specific reading and asked to respond a second time to the same questions presented in a different order. Feedback to students was based on an examination of how their ideology may have changed through the reading, as evidenced by a single graphical representation of their pre- and post-reading ideologies. The graphical representation of one student's response is provided in Figure 7.2.
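The chapter does not describe the scoring mechanism in detail, but the behaviour it implies (80 responses tallied into four ideology scores, then compared before and after the directed reading) can be sketched as follows. This is a present-day Python illustration, not the actual Authorware Pro implementation; in particular, the keying of each question to a single ideology and the simple agreement tally are assumptions made for the sketch.

```python
# Hypothetical sketch of the scoring behind Figure 7.2: 80 questions
# (4 per theme across 20 themes), each assumed keyed to one of the
# four ideologies; agreement with a question adds to that ideology.
from collections import Counter

IDEOLOGIES = [
    "anti-collectivist",
    "reluctant collectivist",
    "collectivist",
    "revolutionary collectivist",
]

def profile(responses, keying):
    """Tally a student's agreements per ideology.

    responses: {question_id: True/False} for the 80 questions.
    keying:    {question_id: ideology name} - an assumption here,
               not taken from the chapter.
    """
    tally = Counter({name: 0 for name in IDEOLOGIES})
    for qid, agreed in responses.items():
        if agreed:
            tally[keying[qid]] += 1
    return tally

def shift(pre, post):
    """Per-ideology change between pre- and post-reading profiles,
    i.e. the comparison the lesson displayed graphically."""
    return {name: post[name] - pre[name] for name in IDEOLOGIES}
```

For example, a student who agreed only with anti-collectivist questions before reading, and with reluctant collectivist ones as well afterwards, would see a profile shifting towards reluctant collectivism.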
Students were able to record their views and responses to specific posed situations by typing in text entry boxes. The first version of the Multimedia lesson was designed to run in the Windows™ operating environment. This defined the recommended minimum hardware specification as a 486 processor running at 33 MHz with 4 Mbytes of random access memory (RAM). No other specific software was required. Using Authorware Pro™ as the authoring software tool, an estimated 120 hours of software development was required to produce the first version. In the design of Multimedia lessons the literature (Preece et al. 1994; Christie & Gardiner 1990) was used to provide some guidance for authoring. Text and graphics were the principal media used. Subsequent software testing and quality assurance required a further 40 hours.

Four students provided initial feedback (Table 7.1(F)). In the case of lesson 4 additional student testing was conducted. Twenty students from the 1994 cohort of the Postgraduate Programme in Research Methods, studying the core module entitled Health policy, read the set texts and interacted with the first version of the IofW lesson. Verbal student feedback on the experience was favourable. Indeed, one student indicated that the lesson "allowed me to be ignorant and did not show me up to the rest". It was not felt necessary to go over the material in a formal, traditional lecture-plus-tutorial format; six hours would normally have been allocated to such activity. Subsequent students in this programme have all been provided with the Multimedia lesson and told to "take the lesson" at their own pace, but to have completed their interaction by a set date, thereby releasing valuable staff time.

In April 1994, 12 students on The Course were provided with Module 1, including the Multimedia IofW (Table 7.1(J)). Feedback was automated by the Multimedia programme and saved to a floppy disk. Assessment was dependent on receipt of the feedback disk. Evaluation of the success of the lesson was based on before-and-after reading responses to the 80 questions and on the transcripts of students' rationalization of their ideologies. Further evidence of the student learning experience was provided by students' text entries of their understanding of how, over time, developments in the NHS had paralleled changes in ideologies. By April 1995 all student marks had been externally moderated and all students had passed. Thirty per cent of the coursework mark was allocated to demonstration of an understanding of the IofW. From the 12 students a total of 105 A4 pages of feedback were received; a typical student's feedback contained 3,000 words of text.
In addition, the exact times when the lesson was accessed and the specific path chosen through the lesson were all recorded. Students were encouraged to make comments on their experience of the lesson. These tended to reflect on technical aspects, as the following verbatim examples demonstrate (cited with permission of Student K. Rosenbloom):

When going through the program it would help to know if you have done this before ...
I hope that I have done this already ... and that it is stored
I have changed my views now that I have done this reading what has happened
The program would be better if it knew that you have done this section
the bookmark is not working
this is different to last time, I was a reluctant collectivist
I have done this but the information was lost, I will do this again ... where have my results gone again?
I can't go back and I forgot to press F12
how can you say that 1980 is not anti collectivist?
I give up I tried this one first

From such feedback it was possible to determine the following critical features which required changing in the revised lesson: the ability for the student to:
• move backwards and forwards in all situations
• have permanent records kept of their interaction
• start again if required
• exit from and return to all parts of the lesson
• print all text inserted.

Making these changes to the lesson required a further 80 hours of authoring time. Up to the time of demonstration the IofW lesson had therefore required an estimated 260 hours of academic and authoring effort (20 hours of content preparation, 120 hours of initial software development, 40 hours of testing and quality assurance, and 80 hours of revision). Conservative estimates put the staff costs at £4,000. External assessment of students' coursework has proved the lesson capable of leading students through an academic learning experience. Exactly how this experience compares with that of "chalk 'n' talk" plus tutorials is yet to be determined. It is the author's suspicion that the two experiences are simply different, each with strengths and weaknesses but, demonstrably, the same outcome.

Conversion of the IofW is an important test for Multimedia. In the author's previous experience, this topic is most effectively taught with a minimum of formal lecture input, copious handouts and proportionally greater time allocated to small-group tutorials discussing questions across a wide variety of politically, socially and emotionally charged topics. The test is: how can Multimedia emulate such interaction and lead to similar student learning outcomes?
Where emulation is concerned it clearly fails the real-time interaction test; however, students' questions can be anticipated and the Multimedia software programmed to react on cue. In terms of outcome, our limited experience is that it passes. Exactly how and why have not been elucidated, but the proof is there. Our experience suggests that the limitations of Multimedia in terms of real-time interaction are not terminal, and that other discursive topics may also be amenable to such delivery. The implication for the social sciences is considerable, in that many of its topics are taught in this manner.
In summary, this paper describes the changing orientation of pharmacy undergraduate and postgraduate education in Britain. Many practising pharmacists have not had the benefit of a broader education. Pharmacists typically work in community, hospital and industrial settings. Many are unable to access campus-based courses and require a distance-learning delivery format. The Postgraduate Programme in Social and Administrative Pharmacy has been developed to meet their needs. The challenge has been to convert material which has been successfully taught by lectures and tutorials to the one-to-one, interactive Multimedia format. At validation the process of conversion was laid down. An example of one lesson, the IofW, is described. The incorporation of full navigation facilities is a fundamental principle in authoring; student feedback confirms the desirability of such facilities. Finally, our answer to the group learning experience has been to incorporate student feedback into subsequent versions of the lesson. That Multimedia cannot truly emulate discursive real-time tutorial interaction does not appear to terminally affect student learning outcomes. The implication is that other topics taught in a similar manner may be amenable for conversion to MTL. This is a long-term and expensive commitment to the student learning experience. It remains for future researchers to determine the effect of this approach.
References

American Association of Colleges of Pharmacy, "Mastering change", 93rd annual meeting (Washington DC: American Association of Colleges of Pharmacy, 12-15 July 1992).
B. Christie & M. M. Gardiner, "Evaluation of the human-computer interface", in Evaluation of human work: a practical ergonomics methodology, J. R. Wilson & E. N. Corlett (eds) (London: Taylor & Francis Ltd, 1990).
J. Clarke, A. Cochrane, C. Smart, Ideologies of welfare: from dreams to disillusion (London: Hutchinson, 1987).
J. R. Clem, D. J. Murray, P. J. Perry, B. Alexander, "Performance in a clinical pharmacy clerkship: computer-aided instruction versus traditional lectures", American Journal of Pharmaceutical Education 56, 1992, pp. 259-63.
A. Cochrane & J. Clarke, Comparing welfare states: Britain in international context (London: Sage Publications, 1993).
V. George & P. Wilding, Ideology and social welfare (London: Routledge & Kegan Paul, 1985), pp. 19-119.
L. Gilroy, "Problem-based distance learning, the role of the pharmacist. Part 1", Hospital Pharmacy Practice 2(12), 1992, pp. 743-4, 753.
P. Lee & C. Raban, "Welfare and ideology", in Social policy and social welfare, M. Loney, D. Boswell & J. Clarke (eds) (Milton Keynes: Open University Press, 1983), pp. 18-32.
D. McDonough, J. Strivens, R. Rada, "University courseware development: comparative views of computer-based teaching by users and non-users", Computers & Education 23, 1994, pp. 211-20.
J. Preece, Y. Rogers, H. Sharp, D. Benyon, S. Holland, T. Carey, Human-computer interaction (Wokingham: Addison-Wesley, 1994).
J. Pugh, C. Moss-Barclay, R. Sharratt, N. Boreham, "The effectiveness of an interactive computerised education program", Pharmaceutical Journal 251, 1993, pp. E1-3.
Royal Pharmaceutical Society, "Working party recommends social sciences teaching to undergraduates", Pharmaceutical Journal 243, 1989, p. 228.
R. Stevens & R. Sewell, "The replacement of pharmacology practicals by multimedia computer technology", Pharmaceutical Journal 251, 1993, pp. E11-5.
Teachers of Pharmacy Administration of the American Association of Colleges of Pharmacy, Commissioned report: a history of the discipline of pharmacy administration (Washington: American Association of Colleges of Pharmacy, 1985).
The Nuffield Foundation, Pharmacy: the report of a committee of inquiry (London: The Nuffield Foundation, 1986).
8
DESIGNNET: TRANSNATIONAL DESIGN PROJECT WORK AT A DISTANCE
Stephen Scrivener and Susan Vernon
This paper does not deal specifically with the use of computing in the social sciences; rather, the subject considered is design. We believe, however, that the way of working described has general application in the social sciences. In design, computers are often seen as offering new forms of media, image-making and information resource, for example virtual reality, three-dimensional modelling, painting systems and databases. Working with computer-based media is different from working with pen and paper, paint, models or the like, and design practice is bound to change as practitioners learn to deal with both its limitations and its possibilities. This is understood to the extent that most design courses now include modules dealing with IT, computer-aided design, computer-based image-making and design databases.

Important as these uses of the computer are, other equally important applications of computer-based technology should be considered by both designers and educators. For example, computers can provide an infrastructure for mediating collaborative design. When computers are used in this way the final artefacts, even their visualization and representation during the design process, may be largely non-digital and produced using conventional media and tools. Computer systems that support team communication and collaboration are usually called Computer-Supported Co-operative Work (CSCW), or Groupware, systems (see Scrivener & Clark 1994a for a review of CSCW systems). This application of computer-based technology is likely to have as great an impact on design practice as digital media, modelling and database tools, and yet at present there are few instances where this technology is used in practice or in the curriculum. However, a future can be envisaged in which designers work as part of international teams supported by computer- and electronically mediated communication and CSCW tools. It will be important to prepare designers and students to work in this way. Indeed, we hope to demonstrate that this technology is not only something that students should understand and know how to use, but also a means of making it possible for students to work together as part of multinational and multidisciplinary teams; educators can use the technology to bring such student teams together. Very importantly, the students do not have to be brought together in a given country: it is the technology that brings them together.

This chapter describes the DesignNet project, which aimed to explore further the possibilities of CSCW usage; earlier projects had led us to postulate the following:
1. Users will be committed and motivated to complete a task if it is perceived to be purposeful and valuable.
2. Users' motivation to complete a task will drive them to exploit the resources at their disposal, even if this involves radical changes in work and communication methods.
3. Users will choose what they perceive to be the most efficient and effective means at their disposal in order to complete a task.
4. Users will choose from the resources at their disposal those they perceive to be necessary and sufficient for the task in hand.
(For a fuller account of the collected knowledge from an earlier project (Rococo) on design at a distance, refer to Scrivener & Clark 1994b.)

DOI: 10.4324/9780429332289-10
Computer- and electronically mediated communication was used in the DesignNet project to enable multidisciplinary, transnational student groups separated by distance to work together on a shared design project in order to produce an agreed outcome.
DesignNet

The DesignNet project was partly funded by the Commission of the European Community's Task Force for Human Resources, Education, Training and Youth, as part of the preparatory phase of the Arts Education and Training Initiative. This initiative will seek:

to enhance transnational co-operation between education and training establishments in the European Union Member States in the field of the Arts, to increase mobility of students and teaching staff in this field, to promote the use of innovative techniques through measures to enhance dissemination of information and good practice, to encourage international masterclasses and the production of special modules and courses which add a European dimension to education and training in the arts, and in general to support activities which, through the medium of European co-operation, enhance the quality of education and training in the arts throughout the European Union. (Arts Education and Training Initiative 1994)
The project

The project brief was the result of a collective decision by staff at an initial planning workshop. It was designed to focus on life-style, cultural issues and the interaction of design influences and objects, to encourage an exchange of ideas between different countries. Key words were used in the brief to help overcome language difficulties and misinterpretation. The project provided a unique opportunity to compare different modes of group working and uses of various media technologies; it also highlighted the importance of cultural factors and the positive interaction created between people of different backgrounds. The project culminated in an exhibition, presentation and feedback workshop attended by all staff and students. This provided an opportunity to see, document and evaluate all of the work, which contained elements reflecting both the combination of different skills and different cultural viewpoints.
Primary objectives

The aims of the project, as set out in our proposal, were:
1. To share the experience of, and evaluate, the earlier Rococo electronic link project.
2. To explore the interaction and transition between two-dimensional representation and three-dimensional construction.
3. To explore computer- and electronically mediated distance communication, and how design is communicated across language barriers through visual discourse.
4. To demonstrate how electronic links can support communications between collaborating transnational universities and enhance the student learning experience.
The partners and disciplines

DesignNet involved four institutions: Applied Arts, School of Art and Design, University of Derby, Derby, UK; the Ceramics Institute, Bergen College of Art, Bergen, Norway; the Faculty of Industrial Design, TU Delft, Delft, The Netherlands; and the Centre of Art and Design, Escola Massana, Barcelona, Spain. Ceramics, industrial design, jewellery and graphic design staff and students took part in the project. Six teams were formed, linking Derby with Derby, Delft, Barcelona and Bergen, and Bergen with Barcelona. In all but one case, Derby-Derby, the links were transnational. In this latter case, the students were located at different sites in the city and were unknown to each other.
The media and technology

Some teams had more resources to choose from than others. Fax and telephone were the primary communication resources in four of the six links. The other two links, Derby-Derby and Derby-Delft, were computer-supported, and included Talk, a computer application that allows users present at the same time at each end of a link to "talk" by means of typed messages, and Aspects, which allowed users at both ends of a link to write and draw simultaneously on a common worksurface. The video connection was achieved using CUSeeMe, which allows multiple users to share video. Eudora was available for electronic mail and file transfer, and files could additionally be transferred by FTP. The conclusions presented in this chapter are based on observations during the project and on student and staff feedback at the end of the project.
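The distinction the teams had to manage can be put abstractly: synchronous channels, such as Talk, the telephone and CUSeeMe, only work while both parties are present at the link, whereas asynchronous channels, such as fax, e-mail and FTP, hold a message until it is collected. The sketch below is a conceptual model of that distinction in present-day Python, not a description of any of the actual applications; the class and method names are invented for illustration.

```python
# Conceptual model of the two kinds of communication channel used in
# DesignNet: synchronous (message lost unless both parties are present
# together) versus asynchronous (message queued until collected).
from collections import deque

class Channel:
    def __init__(self, synchronous):
        self.synchronous = synchronous
        self.present = set()   # users currently "at the link"
        self.mailbox = {}      # user -> queue of (sender, text)

    def join(self, user):
        self.present.add(user)
        self.mailbox.setdefault(user, deque())

    def leave(self, user):
        self.present.discard(user)

    def send(self, sender, recipient, text):
        """Deliver a message; a synchronous channel drops it unless
        sender and recipient are present at the same time."""
        self.mailbox.setdefault(recipient, deque())
        if self.synchronous and not {sender, recipient} <= self.present:
            return False       # no meeting, no message
        self.mailbox[recipient].append((sender, text))
        return True

    def collect(self, user):
        """Read and clear a user's queued messages."""
        msgs = list(self.mailbox.get(user, ()))
        self.mailbox[user] = deque()
        return msgs
```

This makes concrete why the more successful groups planned weekly synchronous meetings in advance and corresponded asynchronously in between: on a synchronous channel, nothing happens unless attendance is co-ordinated.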
Electronically and computer-mediated communication

Art and Design is generally taught as an individualistic activity: group work is not the normal mode of working, and neither is distance communication. Students and staff are used to working face-to-face in individual tutorials and seminars; communicating ideas visually and verbally is an inevitable part of the designing process. DesignNet posed quite different working methodologies for staff and students, as face-to-face visual and verbal discussions in the real sense were not possible and alternative technological methods were adopted to enable design communication to take place. Misunderstandings did occur, particularly in the groups where only fax was used and students did not share a common language. Students were required to be more specific about conveying their thoughts and designs than would be necessary in a face-to-face situation, as information could not be gathered from the subtleties of body language and expression.
Electronically and computer-mediated communication All teams were satisfied with the media available to them. This evidence supports Postulate 2 above which predicts that users will adapt their behaviour to accommodate for technological impoverishment in order to complete the task in hand. For example, the students who had only fax and telephone did not express more or less dissatisfaction with the technology than those who also had video and e-mail. Some students commented that while installing the work for the final workshop they came to realize that some misunderstandings had occurred as a result of the restricted forms of communication, but they also agreed that they did not, at the time, associate these problems with the communication media. In other words, in all cases, the students were able to exploit the media in ways they perceived satisfactory. Put another way, users will accept restricted means of communication if they can find ways of completing a task to their satisfaction. What is remarkable is users' robustness; they seem to be able to accommodate very severe restrictions. Adaptation of working method was clear: the teams developed protocols and practices geared to their conditions of working. For example, one group described how they prepared for a synchronous meeting (i.e. of fax exchanges), how they made a telephone connection to agree the agenda for the meeting, and how at the end of the meeting they would summarize the agreements reached during the meeting. It appears that each group devised a different strategy for co-ordinating their activities. The more successful groups maintained regular contact at fixed times, weekly meetings whether by telephone or cscw were planned in advance, and correspondence ensued asynchronously in between real time contact. In all groups students initially needed to "visualize" their partners (except in the Derby-Derby link) by faxing images of themselves. 
or, in the Derby-Delft link, by initially spending more time scrutinizing the video link through CUSeeMe. The initial fax messages exchanged were self-conscious regarding the quality of the drawn and written information being transmitted (e.g. spelling, legibility, finished drawings); this soon loosened up as more spontaneous communication took place, especially when time was an issue. Nearly all groups said later that they would have liked more time, but in many ways the time constraint kept the momentum of the project going.

It is difficult to state categorically that the addition of computer-mediated communication resources led to qualitative enhancements in either process or outcome compared with those obtained using only electronically mediated communication. However, the Derby-Delft students displayed a strong sense of team commitment and identity. Of all the projects, this was probably the most integrated and unified; had one not known otherwise, one would have thought that the work was produced by a single hand. On the evidence of the exhibition and presentations, it is reasonable to hypothesize that computer-mediated communication of the kind used in the DesignNet project offers positive advantages over electronically mediated communication, such as fax and phone. Observation would suggest that the Derby-Delft group communicated more than the other groups. For them, video was very important: although it was of very low bandwidth, and hence poor visual quality, they were able to gauge gesture and facial expression. Interestingly, they did not use the telephone very often, although one was available. First indications are that this reflected a strategy rather than a preference: it seems that they decided to do as much as they could using the computer. Neither did they use Aspects, but this is hardly surprising, as the network latency of the Internet makes drawing difficult.

To summarize so far: design at a distance mediated by electronic and computer technology is perfectly feasible. Students are able to adapt both communication and working strategies to accommodate technological impoverishments. Initial evaluation would suggest that computer-mediated resources such as video, synchronous text exchange, e-mail, file transfer and shared worksurfaces offer positive benefits in comparison with fax and telephone.
The method of working

The brief permitted two ways of working. The first required a team to work together to develop what was essentially an agreed installation; the second required the designers to agree on an object to be produced by each individual. Three teams chose to work in one way and three in the other. The different methods of working were clearly represented in the outcomes: the agreed-installation method led to single, unified pieces; the agreed-object approach to mini-exhibitions of individual pieces, the obvious connection being a common starting point. Perhaps not surprisingly, the teams adopting the agreed-installation method (apart from one group, which we will come to later) appeared to have a stronger sense of team identity than those choosing the agreed-object method. When they presented their work they did so as a team, each member speaking for the team and seeking agreement from other team members about statements made on their behalf. Teams adopting the agreed-object method tended to report as two national teams. Interestingly, the agreed-installation teams also appeared to be more positive about the whole experience overall.
Task meaningfulness

Perhaps the reason for these apparent differences in the value attached to the experience by different groups is related to the purposefulness of the task. If you are being asked to work as part of a team, perhaps the production of an agreed installation seems more purposeful and more valuable than being asked to work on an agreed object. Perceived purposefulness and value (Postulate 1 above: "Users will be committed and motivated to complete a task if it is perceived to be purposeful and valuable") perhaps also explains why the Derby-Derby link, an agreed-installation project, worked less successfully from the teamworking point of view than the other two agreed-installation projects. Obviously, things could have been arranged such that the students in Derby worked together, for all or part of the project. Both options were essentially prohibited, as students were asked not to work face-to-face. Postulate 2 predicts that users will be less tolerant of communication impoverishments when they lack motivation, with Postulate 1 suggesting that the perceived purposefulness and value of the task is a strong motivational factor.
Multi-disciplinary and transnational team working

Generally, students found the experience of working in multi-disciplinary and transnational teams rewarding. Where problems arose between teams this seemed to have little to do with the communication media or language. Indeed, students seemed highly tolerant of problems of this kind. Personality seemed to be an important factor in determining successful group dynamics. One student found it difficult to work with other people, and this caused problems for the group as a whole. Students who made valuable contributions to the group were well motivated and well organized, had a flexible attitude and an ability to value others' opinions. An enthusiastic approach was also important, as was an enjoyment of working with others. Decision-making was a shared activity, with all group members participating. In one instance there was almost telepathic communication: as one design was faxed an almost identical one was received.

Personality was directly related to learning activity: those students who enjoyed working in groups and were more enthusiastic about the process gained more from the experience and felt a greater sense of satisfaction with what was achieved. This was also directly related to the quality of the exhibited work: those groups who gained more from the whole process of communication also produced the most innovative and well co-ordinated work.
Generally, students were able to give and take, even when they wanted to retain a strong individualistic element in the work. Interestingly, when questioned about the quality of the work produced by groups with good team dynamics, the highly independent student admitted to being very impressed. One feature that did emerge in support of multi-disciplinary teams was that team members found it easy to accommodate another team's contributions when they concerned experience and skills not possessed by the others. Finally, although individuals agreed that compromises had to be made, they regarded this as generally enhancing rather than diminishing the outcome.
The work

It is difficult to say whether transnational multi-disciplinary teamwork leads to better outcomes. What we can say with reasonable confidence is that the work produced for DesignNet contained elements that reflected both the combination of different skills and different cultural viewpoints. It is tempting to believe that these conjunctions and unifications enhanced the quality of the work; they certainly produced interesting and novel results.
Conclusions and future work

We have suggested that design at a distance involving multi-disciplinary, transnational teams is likely to be an increasingly common feature of design practice. It may also offer some salutary lessons to other disciplines such as the social sciences. Electronically and computer-mediated communication and collaborative work technology may enhance design practice; this remains to be determined. However, it cannot be disputed that this technology enables design at a distance to be achieved. It has to be recognized that collaboration using this technology is impoverished in terms of media, communication and work-pattern possibilities as compared to working in the same place at the same time. On the other hand, design at a distance offers potential benefits that may counterbalance or override these impoverishments.

The primary aim of our previous studies of design at a distance, and of the DesignNet project, was to investigate the problems and potential of this way of working, with a view to establishing ways of minimizing the problems and maximizing the potential. The DesignNet project extended our earlier work in a number of ways, the most important being the differences in culture, language and discipline of the collaborating individuals. Consequently, the overall aims of the DesignNet project were successfully realized. Both staff and students found it a purposeful, meaningful, valuable, interesting and enjoyable work and social experience. The students gained experience of unfamiliar technology, communication and work methods, and of multi-disciplinary, transnational groupworking. Feedback indicates that students were able to overcome the problems of technology, language, culture, discipline and groupworking, and to draw positive lessons and insights from the experience.

In the first place, student resistance to new technology may be reduced, since a design-at-a-distance project cannot easily be completed without it. Thus students can gain experience of new technology in a non-threatening and meaningful context. Having used new technology in this context, they may be motivated to explore its potential in their day-to-day activities. Secondly, computer-supported communication and work may provide a means of maintaining regular interactions with linked institutions, especially if technology is available to support routine project work forming part of the curriculum.

Staff also gained insights into the benefits and limitations of this kind of working. Routine design-at-a-distance projects of the DesignNet kind would enable an ongoing and constant level of interaction to persist as the foundation of other, perhaps more changeable, forms of interaction. Furthermore, such projects may provide a sensible precursor to staff and student exchange, as they allow staff and students in each collaborating institution to gain some prior knowledge and practical experience of the people, place, values and working and learning methods of their partners. We might expect that this would assist the assimilation of exchange staff and students into the host organization.
Since completing this project, an Art and Design communications module using Internet connections world-wide has been validated; this will help to consolidate institutional links integrated into the curriculum. We have also recently completed DesignNet 2, a pilot scheme using only the Internet, with student teams linking institutions in Finland, Sweden, the USA, Canada and Colombia.
Acknowledgements

Thanks go to all those who contributed to the success of the DesignNet project: Raghu Kalli, Richard Launder, Joan Sunyol, Lindon Ball, Nigel Billson, Paula Bourges, Sean Clark, Gail Ferriman, Tim Willey, Quinten Drakes, Joan Ainley, Gemma Carcaterra, Joost de Keijzer, Marjolein Rains, Irene Osborne, Amanda Simes, Elisabeth Fornas Dos-Santos, Cora Egger, Paul Rodriquez, Sarah Matthews, Nicola Williams, Armeli Belsvik, Heidi Bourgan, Carol Cooling, Roger Davies, Nacho Garcia Del Rio, Franscisco Juan Tent Petrus, Robin Reeves, Lynn Butler, Raakesh Nath, Howard Dean, Elin Andreasson, Anna Maria Jacobsdottir, Lluis Serra, Anna Aibar, Wendy Proctor; to all those who contributed behind the scenes; and to the Arts Education and Training Initiative that partly funded the project.
References

Arts Education and Training Initiative, Preparation Phase: Support for Demonstration Project, Commission of the European Communities, Task Force Human Resources, Education and Training, 22 January 1994, p. 1.

S. A. R. Scrivener & S. M. Clark, "Introducing computer-supported co-operative work", in Computer-supported co-operative work, S. A. R. Scrivener (ed.) (Alperton: Ashgate Publishing, 1994a), pp. 51-66.

S. A. R. Scrivener & S. M. Clark, "Experiences in computer-mediated communication", Information Systems Architecture and Technology Workshop, Szklarska Poreba, Poland, September 1994b.
Section Three
IMPLEMENTING COMPUTER-ASSISTED LEARNING IN THE SOCIAL SCIENCES
9
COMPUTER-AIDED LEARNING AS A TOOL: LESSONS FROM EDUCATIONAL THEORY Graham R. Gibbs and David Robinson
This paper argues against a dominating design philosophy of CAL: the attempt to replace the teacher with technology. This is an example of the wider societal process of deskilling, and recent research in CAL and allied areas suggests that it may be counterproductive if improved teaching and learning is the goal. In developing CAL packages for use in higher education teaching we need to identify what teachers are good at: giving lectures, explaining complex ideas, communicating with large numbers of people, having in-depth knowledge of specific areas. CAL should be used to enhance these skills by providing flexible learning tools rather than seeking to replace or deskill them. Tools which enhance the pedagogic skills of teachers include software gadgets, simulations, databases/hypertext systems, knowledge tools and communications systems. Advantages of the tools approach include recognizing teachers as experts, minimizing the effect of the "not invented here" syndrome, allowing for the different learning styles of students and, in many cases, encouraging learning by exploration and by doing. Examples of learning tools are examined, and the paper concludes with the suggestion that the development and use of learning tools will avoid the pitfalls associated with replacing the expertise of teachers.
DOI: 10.4324/9780429332289-12

Introduction

In recent years there have been several initiatives in the UK, such as the Computers in Teaching Initiative Centres, Phase 1 and 2 (CTI 1996), the
TLTP 1 and 2 (TLTP 1996), and the ITTI (ITTI 1996), each aimed at promoting the use of CAL packages in academic institutions. This paper looks at issues associated with these developments and in particular addresses the key question: "what kind of CAL software is best supportive of teaching in higher education?"

There are many factors which determine whether CAL software will be widely used by lecturers in higher education. Software is more likely to be used where institutions give support to lecturers to develop and modify software, where there are sufficient computers and where students have adequate access. Such factors, however, are generally beyond the control of CAL software developers. On the other hand, there are others which they can control. These include making software relevant to teaching needs and making it easy to integrate into teaching programmes. A key aspect of this is an understanding of how students learn. Indeed, in recent years there has been much investigation of how to produce CAL that is suitable for and supportive of the different ways students learn; see, for example, Kwok & Jones (1995), Patterson & Rosbottom (1995) and Groat & Musson (1995).

However, a focus on learning is only half the equation. It is important for the uptake of CAL not to forget the role of the teacher. We need to develop an understanding of what good teaching practice is and how CAL software can best support it. Sadly, much recent CAL software seems intent on replacing the teacher rather than asking how the teacher can be supported. Even Groat & Musson, who do at least consider the relationship between teaching approaches and learning styles, are still mainly concerned with how CAL can be "tailored" to provide the kind of teaching which matches students' learning styles. Their model is still fundamentally one of teacher substitution.
The developers of CAL, along with many other software developers, have commonly claimed that a major advantage of the software was efficiency gains. In fact, in some cases, an important criterion for the funding of CAL development was the identification of improvements in teaching efficiency. This was true of the TLTP programme, where potential projects had to identify the number of tutor hours their proposals saved. One unfortunate consequence of this focus has been that much CAL has been designed in terms of replacing lectures, tutorials and seminars rather than supporting or enhancing teachers' existing skills.1 Not only does this mean that some of the most satisfactory and often
1. Of course this does not necessarily follow from the need to show efficiency gains. Gains could also arise from the reduced need for remedial work, lower failure rates, faster learning and so on. But it is perhaps indicative of the poverty of much understanding of the teaching process that efficiency is nearly always seen in terms of reducing student contact and/or increasing staff-student ratios.
most efficient parts of teaching are replaced, but it also suggests that CAL designers are using a very limited idea of what teaching expertise is. In this they are repeating the salutary experience of software developers in other fields, such as the use of computer-based decision support in the medical domain (Young, Chapman & Poile 1990). To date there has been little success in producing systems in medicine which are widely used in routine clinical practice. In part this is because the systems on offer are not sufficiently useful to justify the effort required to use them (Rector et al. 1992) but, as Heathfield & Wyatt (1993) argue, it is also because they are generally replacement systems.

In discussing the development of knowledge-based systems, Rector (1989) makes a broad distinction between those systems which aim to augment skilled performance and those which aim to substitute for some skill or knowledge. Substitution systems assume that their users are ignorant, or at most novices, in the field. Augmentation systems assume that their users are "broad experts" who are skilled in the field and exercise ultimate judgement, although they may make slips or lack particular items of knowledge. Augmentation systems are used, whereas substitution systems are consulted (Rector 1989). He argues that the goal of augmentation systems is to "become part of the regular tools of the trade". In contrast, substitution systems tend to deskill the experts they seek to replace, and for Rector this is a mistaken strategy. Not only is it probably an impossible goal to replace the expertise of professionals in this way, but it is based on a false view of the nature of professional expertise, one in which knowledge-based systems are simply a question of capturing and reproducing a large body of internalized knowledge.
Recent research on teaching

The current state of knowledge about the nature of teaching expertise indicates that it is not a simple matter of defining a set of knowledge to be acquired and then producing optimal conditions for the transfer of that knowledge to the student. The work of authors such as Marton & Saljo (1984), Ramsden (1992), Entwistle (1987) and Laurillard (1993) suggests a general consensus that good teaching practice brings together two things. First is an understanding on the part of the teacher of the scientific and intellectual concepts of the subject matter - what is to be learned. Second is an awareness of the learner's conception of that same subject matter, i.e. what their understanding of the "thing" is and the nature of their mistakes. In this essentially phenomenographic
or constructivist approach, the skill of teaching in higher education consists of progressively modifying the learner's conception to bring it closer to the teacher's while dealing with a large number of students. Laurillard suggests there are four prescriptive implications of this approach:

• there must be a continuing dialogue between teacher and student [not a monologue]
• the dialogue must reveal both participants' conceptions
• the teacher must analyse the relationship between the student's and the target conception to determine the focus for the continuation of the dialogue
• the dialogue must be conducted so that it addresses all aspects of the learning process (Laurillard 1993, p. 85).

There are several consequences of these with relevance to CAL development. First, replacing the teacher by software is likely to be detrimental to the dialogue between teacher and learner - it may even eliminate it. Nevertheless, attempts have been made to produce software that substitutes for the teacher's diagnostic role. In particular, some developers have produced Intelligent Tutoring Systems (Anderson 1994; Anderson, Boyle, Yost 1986). These contain within them a cognitive model of the learner, and give feedback and modify the learner's path through the material depending on the kind of misconceptions suggested by a comparison between the learner's answers and the model. However, such systems seem limited at the moment to knowledge domains which are logically well defined and uncontested. Examples include learning computer programming languages and geometry. Unfortunately, in most areas of knowledge in higher education, and in the social sciences in particular, concepts and ideas are commonly challenged and debated. There is as yet no CAL system which can diagnose students' misconceptions in such "ill-structured knowledge domains" (Spiro et al. 1991).
Indeed, Perry suggests that recognizing that a discipline consists not of simple, agreed facts but of contested ideas about which one must take a stand is indicative of advanced learning (Perry 1970). Moreover, recent research with the Geometry Tutor (Schofield et al. 1994) suggests that, in fact, rather than replacing the teacher the program became an additional, enriching resource.

Second, there have to be occasions on which the student's (mis)conceptions are made apparent so that they are amenable to correction by the teacher. This is not just a matter of detecting wrong answers, something which intelligent tutoring systems can do. Learners may be doing the right things but for the wrong reasons. For example, we found, while examining students' use of CAL without intervention from the teacher, that some students while apparently doing sensible things
with the program actually held quite mistaken views about the underlying logic of the subject matter (Gibbs & Robinson 1995).

Third, a key problem is getting students to modify their conceptions (or to develop a coherent one). One difficulty here is that pointed to by Laurillard (1993), namely the distinction between percept and precept. Knowledge of dogs, for example, is a percept, which comes about through ordinary, direct and common experience. In that sense it is easy to acquire. On the other hand, and more common in higher education, is knowledge of a more abstract nature, such as knowledge of molecules. Laurillard calls this knowledge of precepts. These are learned only indirectly, through analogy or metaphor, since we have no direct contact with things such as molecules. Understanding here is thus more difficult to acquire; indeed it is fraught with difficulties arising from both the lack of concrete metaphors and the need to keep analogy within bounds. There is clearly a role here for CAL in providing fertile models and analogies that can make precepts more concrete. The advantages of computers - they can do calculations quickly, they can change with time and they can interact with the learner - suggest a rich seam of applications. At the same time, these are only models or analogies. There remains a role for the teacher in ensuring that the student does not generalize erroneously beyond the limits of the metaphor.

Entwistle & Entwistle (1991), following the principles of cognitive psychology (e.g. Anderson 1990; McKeachie et al. 1990), suggest four functions which teaching has to meet. These put Laurillard's prescriptions into a more concrete form and elaborate the relationship between the use of CAL and teaching skills. The four functions are presentation, remediation, consolidation and elaboration. Each is associated with particular teaching methods and each suggests different roles for CAL.
Presentation

This is most often done via lectures. Lecturers present new knowledge, which should be related to learners' prior understanding and knowledge. It should have a clear, logical structure to help students establish a personal organizing framework. Entwistle (1992) suggests that good lecturing has a deliberate focus and promotes self-confidence in learners and knowledge acquisition by them. It should also strike a balance between serialist and holist thinking. This distinction is made by Pask (1988), who suggests that some students prefer a holist approach in their learning - taking a broad overview, using good illustrations, examples and analogies, but consequently giving insufficient attention to detail. They gain more
from holist approaches in lectures. Others tend to be serialists: they prefer a narrow focus, looking at detail and logical connections in building up to an understanding, but they tend to miss important analogies and connections between ideas. Pask presented evidence which suggests that a mismatch between students' preferred approach and that of lecturers has a detrimental effect on learning. However, the best learning occurred when lecturers used a versatile approach with holist and serialist elements, encouraging both holist and serialist thinking in their students.

On another, but related, dimension, Ramsden (1992) suggests it is important to ensure that students undertake deep learning as well as surface learning. In a parallel to versatile approaches to lecturing, he suggests that students need to be encouraged to adopt flexible approaches to deep and surface learning. The main danger here is that the form in which knowledge is presented, and especially how it is assessed, may encourage learners to adopt mainly surface strategies. This enables them to perform well in immediate tests, but hinders the development of a deeper and more conceptual understanding.

There is clearly a role in presentation for CAL, especially of the hypertext or hypermedia kind, along with lectures and videos. But the dangers here are significant. Unlike a lecture or a video, there is much less control over how the student uses, or navigates around, a hypermedia system. That, indeed, is often thought a major advantage of such systems. However, the dangers are those identified above. Students tend to use the system in a way that simply reflects their prior preferences, holist or serialist, and may fail to develop the versatile approach that produces the best learning. They tend to skim the surface of the content of the hypermedia system and gain little depth of understanding.
Their use of the system may lack focus and, in the absence of guidance, they may just "wander around". Browsing is not learning. Moreover, without some teaching guidance, the information and concepts the system presents them with may be at the wrong level for their stage of understanding - too hard or too simple. None of this rules out the use of CAL in presentation, but it does mean its use is likely to need the continuing intervention of the teaching expert, to give guidance and modify the student's explorations. Admittedly, some of these considerations could be built into the software. For example, the package could ensure that the level of material presented is appropriate for the student by using a built-in pre-test. However, we should be careful of trying too hard to match the content and approach of CAL systems to students' preferred learning styles. There have been several attempts at this (Clarke 1993; Groat & Musson 1995). Done automatically, there is a danger that it will lead merely to the reinforcement of existing habits and the underdevelopment of versatile approaches.
Remediation

Remediation is where students "fill in the gaps" and try to catch up with the concepts and knowledge presented in lectures. Entwistle (1992) suggests this is most often done by students themselves, through reading textbooks. Of course, this could now be achieved through their use of CAL systems. Tutorials can also play a role in helping learners to assimilate new concepts, where their background understanding is insufficient or they are having difficulties grasping new ideas. However, both educational research and the common experience of lecturers suggest that tutorials are notoriously varied in their adequacy. Attendance is often poor and, as a result of student reluctance to engage or of insufficient preparation, tutorials are often far too didactic. Research suggests tutorials work best if their focus is on meta-cognitive strategies. For instance, Baumgart (1976) suggests the best role for teachers in seminars or tutorials is "the reflexive judge" or "probing". A device that is increasingly used in small group discussions is work based on case studies and simulations. This is a good way of promoting problem-solving approaches. But, as Entwistle (1992) cautions, good and careful debriefing after the case study is needed so that students can see the general relevance of the material they have examined.

Thus, both as a replacement for textbooks and as supportive material for case studies, there is a role for CAL. Indeed, there is a current TLTP project in Politics which is aimed at developing computer-supported case study materials. But again, as in the case of presentation, without the intervention of the skilled teacher students may gain only limited benefit from the materials. There is a need for careful debriefing, to identify and correct learners' misconceptions; and again, unless teachers counteract it, learners may just skim the surface, not properly getting to grips with underlying concepts.
In the case of both the presentation and remediation uses of CAL there is a further limiting factor. Teachers and learners want to pick and choose the materials they use. This reflects varying styles and approaches, but is also because teachers need to ensure that materials appropriate to the students' pre-existing conceptual sophistication are used. Printed materials are varied and widely available and fit this need well. Parallel materials in CAL packages, and in hypermedia systems particularly, are much less available, sometimes expensive and much less varied. Also, and this is crucial, whereas it is easy for both lecturers and learners to gauge the level and content of printed materials (readers have already acquired the skills for doing this and the content is relatively open to access), this is much more difficult with hypermedia systems. The open-structuredness of hypermedia, often seen as a major advantage, may in fact militate against their use. It is difficult to find
out quickly whether such programs have contents at the required conceptual level; and, even if they do, there is still a need for the skilled teacher to modify their presentation to meet pedagogic needs.
Consolidation

Consolidation is the use of the newly acquired knowledge, concepts and skills in wider contexts. Students' tentative and initially fragile understanding is reinforced in a variety of settings and ways, such as laboratory work, fieldwork, essays, projects, examinations and other assignments. Again, case studies and simulations are especially good for this. But, as Entwistle (1992) points out, both project work and case studies need good quality teaching resources, and learners need to adapt to a different self-discipline. Learners may be left more to their own devices than they are used to, and need to develop new coping strategies so that they do not procrastinate over getting work done. Entwistle suggests a mix of approaches is best and, above all, that good debriefing is needed.

There are several ways in which CAL can help here. As mentioned above, case studies can be presented through the use of computers. In particular, case study and simulation may be brought together in microworlds (Hartog 1989; Isaacs 1990). Second, the process of thinking about what has been learned, of processing and structuring knowledge, can be supported by other kinds of program, collectively known as mindtools. Software such as Inspiration and Skate (discussed below) can support concept mapping, which encourages the student to investigate the logical structuring of what they know and to investigate both the links between what they know and the gaps in their understanding. Third, assessment can be computer-assisted. While at the moment this is limited to highly structured and deterministic assessments, the approach does have the great advantages that students get almost instantaneous feedback and that most of the assessment requires no teacher involvement.

Again, there are dangers. In addition to the need for debriefing mentioned above, in many cases what is presented in software could just as easily, and probably more cheaply, be presented in printed or paper form.
Indeed the use of mindtools as a form of consolidation often requires generous resources of printed materials which students read in order to build their concept maps. In the case of computer-assisted assessment, the biggest danger is the tendency to focus development efforts on what is assessable in that way rather than on what it is important to assess. Like the drunken man at night looking for his keys under the street light because, although he didn't drop them there,
that's the only place he might see them, the temptation is to assess what is easily assessed even if it is educationally trivial. In particular, deep learning, which is educationally most significant, may never be assessed this way because of the technical problems of writing the software. Even in those domains where assessment can be done successfully by computer, there remains an important role for the skilled teacher in maintaining a balanced diet of assessment and feedback.
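The kind of highly structured, deterministic assessment described above can be sketched in a few lines of Python. This is purely illustrative, not any actual assessment package; the question bank, answers and feedback strings are invented for the example.

```python
# Illustrative sketch of a deterministic, computer-marked assessment item
# with instant feedback. The questions and feedback strings are invented
# here purely to illustrate the approach described in the text.

QUESTIONS = [
    {"prompt": "A correlation coefficient of -0.9 indicates...",
     "options": ["no association", "a strong negative association",
                 "a weak positive association"],
     "answer": 1,
     "feedback": "r close to -1 means a strong negative linear association."},
    {"prompt": "The least-squares regression line minimises...",
     "options": ["the sum of squared residuals", "the range", "the mode"],
     "answer": 0,
     "feedback": "Least squares minimises the sum of squared residuals."},
]

def mark(responses):
    """Score a list of chosen option indices; return (score, feedback lines)."""
    score, feedback = 0, []
    for question, chosen in zip(QUESTIONS, responses):
        if chosen == question["answer"]:
            score += 1
            feedback.append("Correct.")
        else:
            feedback.append("Incorrect. " + question["feedback"])
    return score, feedback

score, notes = mark([1, 2])   # first answer right, second wrong
print(score)                  # 1
print(notes[1])               # instant remedial feedback for the wrong answer
```

The marking is entirely mechanical, which is exactly the limitation the text identifies: only what can be expressed as a fixed answer key is assessable this way.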
Elaboration

The final stage of learning identified by Entwistle (1992) is elaboration. This refers to learning in which students acquire additional meanings, examples and evidence, and a deeper and broader understanding. It is widely accepted within cognitive psychology that greater elaborative processing of information results in better understanding and recall of the material. Whether the elaborations are generated internally by individuals or externally, for example by CAL software or a lecturer, is not necessarily important. What matters is the precision with which the elaboration relates to the concepts and material to be learned and understood. If Anderson (1990) is correct in his assertion that question-making contributes most to elaborative processing, then as teachers we should aim to construct situations that facilitate this process. Traditionally this has been done through lectures, tutorials, textbooks, laboratory work and fieldwork, but clearly there is a role for CAL here, in examples such as mindtools, microworlds and other case studies. However, the potential dangers and pitfalls outlined in the discussion of consolidation also apply here. In particular, there is a need for proper debriefing, so that students recognize what they have learned, and for appropriate monitoring of their elaborations, to ensure that misconceptions are identified and corrected.
Software tools for learning

What kind of picture of appropriate CAL software emerges from the preceding discussion? We suggest four attributes: CAL software should complement the expertise of teachers; it should be flexible and adaptable; it should promote deeper learning; and it should not stand alone, but be integrated into a wider teaching context. Such software has been termed software tools or learning aids (Chute 1995). In not substituting for the knowledge of the teacher, such
software may have little apparent informational content. The function of the tool is not to get the student to learn by acquiring surface knowledge from manifest information in the software, but to get the student to master new ideas by manipulating the program in ways made possible by the latent concepts on which its design is based.

A parallel can be drawn with what is needed to repair a car: a manual and some tools. The manual alone is useless, as are the tools without some knowledge of how the car works. But the tools can be used on a variety of cars, in a variety of ways, and they can be used once the mechanic is experienced enough not to need the manual. We believe CAL software should more often provide the tools for learning rather than the manual. The provision of factual knowledge is done very well at the moment by books, videos and, to some extent, by CD-ROM. However, as we have argued above, presentation of knowledge is merely one stage in learning. Students need remediation, and they need to consolidate and elaborate. For this they need learning tools as well as the knowledge base.

Software tools for learning are relatively free of content; they rely on its provision in other ways. But they support the learner in focusing on remediation, consolidation and elaboration, and they do so in a way that does not eliminate the lecturer but in fact requires the intervention of the skilled teacher to manage the student's learning. Software tools for learning, therefore, do not substitute for teaching skills; rather, they support and augment teachers' essential skill: helping learners to manage their development. Moreover, they do so in a way that is flexible and adaptable.
Because software tools have little in the way of content, they are easier to adapt to specific teaching needs and for that reason are likely to be usable in a wide variety of teaching contexts by a large number of different teachers. Examples of CAL software tools include software gadgets (such as experiment generators); simulations (such as models of political or ecological systems); databases, hypertext systems and knowledge tools (all of which require learners to structure and process the knowledge they are gaining); and communications systems (such as computer-assisted co-operative learning and e-mail, which promote collaboration, peer tutoring and asynchronous co-operation). These general types can be illustrated by three examples.
Correlation explorer

This is a simple model of the statistical concept of correlation (see Figure 9.1). It is based on the scattergram. Students can modify the data set by changing the number of points and by moving them around the
Figure 9.1 Correlation Explorer.
scattergram. As they do so they can see the impact on various statistics and on the regression line. There is almost no text in the program. Instruction about correlation takes place outside the program. Its role is to provide a relatively concrete metaphor for correlation so that in manipulating the scattergram students can consolidate and elaborate their understanding.
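The statistics such a program recomputes each time a student drags a point can be sketched in a few lines of Python. This is an illustrative fragment, not the original software, and the data set is invented.

```python
# Illustrative sketch (not the original Correlation Explorer): the
# statistics recomputed each time a student moves a point on the scattergram.

def correlation_and_regression(points):
    """Return Pearson's r and the least-squares line (slope, intercept)."""
    n = len(points)
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in points)
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    r = sxy / (sxx * syy) ** 0.5
    slope = sxy / sxx
    intercept = my - slope * mx
    return r, slope, intercept

# A perfectly linear data set (y = 2x)...
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]
r, slope, intercept = correlation_and_regression(data)
print(r, slope, intercept)        # r = 1.0, slope = 2.0, intercept = 0.0

# ...and the effect of dragging one point off the line.
data[3] = (4.0, 3.0)
r2, _, _ = correlation_and_regression(data)
print(r2 < 1.0)                   # the correlation weakens
```

The point of the tool, as the text says, is that the student sees this recomputation happen immediately; the instruction about what the numbers mean happens outside the program.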
Inspiration

This is a general concept mapping and chart-making program (see Figure 9.2). Its use as a teaching tool would be either as an analysis tool, for instance as a means of analyzing qualitative data (Weitzman & Miles 1995), or as a concept mapping tool. In the latter case the student would use the program as a means of structuring and representing their understanding of the nature and relationship of ideas, theories, hypotheses, evidence, conjectures and so on from the area they are studying. Inspiration is fairly open about how this can be done; in this sense it is close to a word processor (indeed, like many word processors, Inspiration contains an outline facility). Other similar programs, such as Skate (Reader & Hammond 1994), are stricter about how they can be used. For
Figure 9.2 Inspiration.