This is the fourth post in a series on selecting and using technology for teaching and learning. The first three were Models for selecting and using technology: 1. the challenge, 2. Models for selecting and using technology: 2. A (very) brief history of educational technology and 3. Models for selecting and using technology: 3. Broadcast or communicative?
In the last post, I looked at the dimension of broadcast versus communicative technologies. In this post I will look at another critical dimension of communications technologies: whether they are synchronous or asynchronous. As we shall see, this is an important consideration for teaching and learning.
Synchronous or asynchronous
Synchronous technologies require all those participating in the communication to participate together, at the same time. Asynchronous technologies enable participants to access information or to communicate at different points in time, usually at a time of the participant's own choosing.

Other terms can be used to describe this dimension, such as scheduled or on-demand, transient or permanent, live or recorded, but they all describe essentially the same distinction as synchronous or asynchronous.
Live television or cable broadcasts, radio programs, video-conferences, live webcasts and live lectures are primarily synchronous technologies or media. You have to 'be there' at the time of transmission, or you miss them.
The main advantages of synchronous technologies for education are immediacy and often increased emotional engagement. Another feature of live or synchronous events is the element of uncertainty (you don’t know what’s coming next and you can’t skim or fast-forward). This is one of the appeals of live sports broadcasts or attending a rock concert.
Books, DVDs, YouTube videos, lectures recorded through lecture capture, Facebook, and online discussion forums are all asynchronous technologies. Learners can log on or access these technologies at times of their own choosing.
It should also be noted that a synchronous technology can be converted into an asynchronous one, but only when two conditions are both met: the event is recorded; and the recording is archived in an accessible way. (Thus a recorded broadcast is still a synchronous technology at the time of broadcasting, unless the recording can afterwards be accessed on an asynchronous basis, for instance through a web site.)
Overall there are huge educational benefits associated with asynchronous technology, because the ability to access information or communicate at any time offers the learner more control and flexibility. These educational benefits have been confirmed in a number of studies. For instance, research at the Open University found that students much preferred to listen to radio broadcasts recorded on cassette than to the actual broadcast, even though the content and format were identical (Grundin, 1981; Bates et al., 1981).
The importance of design
However, even greater benefits were found when the format of the audio was changed to take advantage of the control characteristics of cassettes (stop, replay). Students learned more from 'designed' cassettes than from cassette recordings of broadcasts, especially when the cassettes were co-ordinated or integrated with visual material, such as text or graphics. This was particularly valuable, for instance, in talking students through mathematical formulae (Durbridge, 1983).
This research underlines the importance of changing the design as one moves from synchronous to asynchronous technologies. Thus we can predict that although there are benefits in recording live lectures through lecture capture, in terms of flexibility and access, the learning benefits would be even greater if the lecture were redesigned for asynchronous use, with built-in activities, points where students stop the lecture to do some research or extra reading, and so on.
The ability to access technologies asynchronously is one of the biggest changes in the history of teaching, but the dominant paradigm in higher education is still the live lecture or seminar.
It should be emphasised that broadcast/communicative and synchronous/asynchronous are two separate dimensions. By placing them in a matrix, we can then assign different technologies to different quadrants, as in Figure 1 below. (I have included only a few; you may want to place other technologies on this diagram.)
The significance of the Internet
The Internet is so important because it is an encompassing technology that embraces all these other technologies and forms of communication, thus offering immense possibilities for technology and learning, as can be seen in Figure 2. This enables us, if we wish, to be very specific about how we design our teaching, so that we can exploit all the characteristics or dimensions of technology through this one medium.
It should be noted at this stage that although I have identified some strengths and weaknesses of the four characteristics (broadcast, communicative, synchronous and asynchronous), we still need an evaluative framework for deciding when to use or combine different technologies. This means developing criteria that will enable us to decide, within specific contexts, the optimum choice of technologies. I will attempt to do this in later posts, but in the meantime we still have some other characteristics to explore or define.
In particular, in the next post, I want to discuss the differences between technology and media. Although we often use these terms interchangeably, I believe there are important differences, and understanding them will help us to understand the conditions that enable technology to work better for teaching and learning.
Bates, A. (1981) 'Some unique educational characteristics of television and some implications for teaching or learning' Journal of Educational Television, Vol. 7, No. 3

Durbridge, N. (1983) Design Implications of Audio and Video Cassettes Milton Keynes: Open University Institute of Educational Technology

Grundin, H. (1981) Open University Broadcasting Times and their Impact on Students' Viewing/Listening Milton Keynes: The Open University Institute of Educational Technology
1. Does this categorization of technologies make sense to you?
2. Can you easily place other technologies into the diagrams above? What technologies don’t fit? Why not?
3. Can you imagine a situation where an audio cassette might be a better choice for teaching and learning than Second Life (assuming students have access to both technologies)?