This is the second of three blogs that examine some basic assumptions about technology and education, based on a review of three books: ‘THE TOWER AND THE CLOUD‘, ‘CATCHING THE KNOWLEDGE WAVE‘, AND ‘THE INTEGRATION OF INFORMATION AND COMMUNICATIONS TECHNOLOGIES IN THE UNIVERSITY‘. (Click on the titles to see the reviews).
In this blog, I wish to examine and to some extent challenge the following assumption:
‘Assumption 2: information technology is no longer just a useful tool that supports university and college administration and to a lesser extent teaching and learning; rather it is now an integral and essential component of almost all core higher education activities, and as such needs to be used, managed and organised accordingly.’
ICTs as a (sometimes) useful tool
I can imagine people saying ‘duh’ when they read this assumption, especially readers of this blog, who have likely already recognised the importance of technology for teaching. Surely it’s obvious, isn’t it, that technology is an integral part of teaching and administration now?
Well, it may be to YOU, but it is not obvious, accepted or acted on by many of the people in higher education that I meet. Yes, they recognise that computers and the Internet are useful tools, and that they become more useful every day, but they do not accept the next step: that in the 21st century, ICTs (information and communication technologies) are essential for teaching and administration. Many professors will argue that they can teach quite successfully, thank you, without having to use technology. ‘Yes, technology may enhance my teaching, but (especially if you give me smaller classes) I can teach just as well or even better without it if I have to.’ (Direct quote from a colleague). This is the assumption that I want to challenge.
On the administrative side, the feeling that operations could manage quite nicely without IT is probably less prevalent, but many academics and senior managers in post-secondary education still see IT as a necessary evil for administrative services, rather than a guaranteed benefit. The main worries about IT are its high cost and its never-ending demand on scarce resources: not just money, but staff time spent in training and in keeping up with the technology. Many post-secondary institutions were badly burned in the 1990s by very large cost over-runs, huge disruption to services, and poor performance when switching over to database-driven financial services and student records. Even today, these services are often not seamlessly integrated, frequently run independently of the academic side of the house, and are difficult or costly to ‘virtualise’ in the form of web-accessible delivery of services.
In other words, on both the academic and administrative sides it has been difficult to show major benefits at a level that clearly justifies the high investments made in IT. As one dean said to me as we entered a meeting to discuss upgrading (yet again) a learning management system, ‘Come and meet the snake oil salesmen.’
Although it is essential that the claims of IT enthusiasts continue to be cautiously examined, and although we need to do much more to tie IT investment to clearly measurable benefits, I am going to argue that universities and colleges will not meet the needs of students and society in the 21st century unless they fully integrate ICTs within all their core activities.
Integrating ICTs into teaching and learning: the qualitative argument
The more difficult case to make is on the academic side. Academic knowledge has been successfully taught without the use of computers or the Internet for over 100 years. In my last blog, I argued that the nature of knowledge, particularly academic knowledge, has not substantially changed because of technology.
However, what has changed is society, and particularly the economy. In highly developed countries, the majority of jobs are now in service and knowledge-based industries. Knowledge-based industries require people with different skills from those of the majority who formerly worked in mass-production factories. Knowledge-based workers require advanced education: the ability to think critically, to solve problems, to learn independently, and so on. Indeed, knowledge-based companies are often created and owned by people with advanced education who have both the ideas and, as importantly, the entrepreneurial skills that bring success. And critically, these knowledge-based companies depend on computers and the Internet for their business.
It is not just businesses, though, that depend on computers and communications technologies. Social life, entertainment, holidays, travel: all are increasingly dependent on ICTs. Without skill in using such technology, people will be severely disadvantaged in such societies. (We may argue about whether this is a good or a bad thing, and whether it should be headed off in some way, but the reality is that we need to educate at least partly for the world as it exists, as well as for the world we would like it to be.) It will become increasingly difficult to see students as comprehensively educated if they graduate without knowing how to use ICTs in their job or profession.
Because digital technology is now so pervasive, and affects the creation, storage, access, analysis and dissemination of knowledge, all areas of human activity are increasingly being touched by it. Academic knowledge is no different. To be a scholar now means knowing how to find, analyse, organise and apply digital information. Studying without the use of technology is increasingly like learning to dive without water. This is not an argument for teaching generic computer literacy skills, such as how to keyboard or use a word-processor, but for using computers for digital imaging in medicine, for geographic information systems in geology, for using wikis to teach writing skills, and for knowing which databases hold information relevant to solving a particular problem.
In one polytechnic/vocational college where I worked, I went round every department and asked the instructors: ‘Do students need to use computers in their trade?’ I expected answers to vary by trade. When I got to welding, for instance, I expected a negative answer. ‘How do you think we inspect oil pipelines?’, one instructor asked. ‘We send a robot down the pipe. A welder sits at a computer control panel and operates the robot remotely, including welding small cracks.’ Welders still need to know the properties of metals, and how to use a blow torch, but they also need skills in IT. In all twelve trade areas, I was told that computers were an integral part of the trade.
In other words, the use of digital technologies needs to be embedded within and meet the needs of specific areas of academic knowledge. Notice though that the emphasis here is on ICTs to facilitate the learning process. Thus although there is often a content component of using the technology (knowing what it is and what it does), the main function of the technology is to assist the learning of academic knowledge, or to develop specific trade or professional applications. However, since academic knowledge itself is increasingly embedded within digital technologies, the two aspects of using technology become increasingly integrated.
Integrating ICTs into the curriculum: the quantitative argument
So far, I have focused on a qualitative argument: the world is increasingly digital, and we need to embed digital technology within the everyday teaching of the curriculum, so that students develop the skills needed in an information society. However, there is also a quantitative argument for the use of ICTs.
In an industrial society, higher education was deliberately focused on a small elite: the owners, senior executives, diplomats, senior government bureaucrats, and professionals who would run society. Thus, particularly in Britain, university education was reserved for the few (Gilbert, 2005, discusses this in more detail in her book). However, in all advanced countries an increasing proportion of students has gone on to higher education, partly for egalitarian reasons. In Canada and the USA, more than 60 per cent of an age group now participate in some form of post-secondary education. This is one main reason for the explosion of the knowledge-based society: all these new graduates have provided the source of many of the new, knowledge-based companies. The result, though, is that the cost of public higher education has dramatically increased.
As a result, governments have not funded post-secondary institutions in proportion to the additional numbers. This has meant fewer tenured professors per student, increased use of non-tenured adjunct faculty, larger classes, and higher tuition fees. Despite this, the predominant educational paradigm is still based on a time when class sizes were small and interaction between professors and students was high. The lecture, the seminar, and attendance on a campus on a schedule that resembles the industrial work week remain the key organizing principles for teaching. Where technology is used for teaching, it is predominantly added on to the classroom model, rather than being used to increase efficiency or to reduce traditional activities.
However, if technology is fully integrated within the curriculum, there is no need for the 9 to 5, Monday to Friday timetable. Students can and will study when and where they want, because much of the learning will be through digital sources. This of course is not distance education in the old sense, but a truly blended learning approach, where students still come to campus for those activities that are essentially done in a face-to-face or hands-on environment. If properly designed, such teaching can be done with larger numbers, without a loss of interaction. Face-to-face interaction may be reduced; even total interaction time with a professor may be reduced, probably replaced by increased peer-to-peer interaction between students; but students will still be interactively learning.
Implications for teaching and learning
What does this mean, though, for teaching and learning? First of all, we need to look at a new model of curriculum planning that seeks to embed the use of ICTs within the teaching of the subject. This should not be done in a blanket way (i.e. every class and every course must use computers – see John Schinker’s blog, Technology Planning, for more discussion of this), but by examining the knowledge, skills and attitudes required in a program, and working out where ICTs fit into this and how they can best be used. In particular, it will mean working as a team to develop coherent approaches to curriculum that ensure that ICTs are used in a planned and consistent way across a whole degree or certificate program, so that students graduate with the necessary ICT knowledge and skills within the subject area.
Second, it means breaking away from the fixed silos of 13-week, three-credit semesters, with classes scheduled every day from 8.00 am until 9.00 pm (reflecting an industrial ‘discipline’). Students and professors would come together as needed, and credits would be based on total hours studied, not on butts on seats at fixed times.
It will mean designing courses and programs in ways that serve full-time, part-time and fully distance students at the same time (for an example of this, see Learning Technologies@UBC, 2005). Lastly, and perhaps most challengingly, it will mean better faculty training, with a focus on the development of appropriate learning environments, team-based teaching, analysis of curriculum requirements for students (knowledge, skills, assessment, etc.), and an understanding of the strengths and weaknesses of different technologies for teaching.
Implications for administrative services
The issues are somewhat different on the administrative side, where they are more about cost-effectiveness and return on investment. The main difficulties have arisen from the failure to define clearly the intended benefits of IT investment in measurable terms: for example, average service times reduced by 30 per cent, costs reduced by 10 per cent, or increased student satisfaction with services. More focus on measurable benefits will lead to increased support for investment in IT for administrative purposes.
Poor governance of IT is also a common reason for poor decision-making. Richard Katz’s book ‘The Tower and the Cloud‘ makes it very clear that the more IT becomes integrated within an institution, the wider and deeper the range of stakeholders that needs to be involved in decision-making. It has been difficult enough for IT professionals to win the trust and confidence of senior administrators; as control of decision-making regarding ICTs moves out towards the end-users, e.g. professors and student advisors, the more an institution needs to ensure that all stakeholders are fully engaged in decision-making about ICTs. We are all IT users now, and we all need to be involved in decisions about IT.
Governance, though, is just one of several management issues that arise when ICTs are fully integrated. These will be discussed in more detail in the next blog.
In summary, I have argued that ICTs are now mission-critical for both teaching and administration in post-secondary institutions, but this is still not fully accepted in many institutions, or by many professors and instructors, especially with respect to teaching and learning. Once it is accepted that ICTs are mission-critical, there are major implications for institutional policies, priorities and management.
I will argue in the next blog that few institutions are yet fully prepared for this situation, and in particular that senior administrators are poorly prepared for dealing with ICTs as a mission-critical area of decision-making, resulting in inadequate IT governance, poor security, and loss of return on investment.
References
Gilbert, J. (2005) Catching the Knowledge Wave: the Knowledge Society and the Future of Education, Wellington, NZ: New Zealand Council for Educational Research
Katz, R. et al. (2008) The Tower and the Cloud: Higher Education in the Age of Cloud Computing, Boulder, CO: EDUCAUSE