July 17, 2018

Five old educational technologies

Etherington, C. (2018) ‘Five Educational Technologies, circa 1918’, eLearning Inside News, January 1

Despite rumours, I was not around in 1918, but this article is a very nice reminder of what was happening 100 years ago with educational technologies. The five technologies are:

  • magic lanterns
  • chalkboards
  • ink pens
  • abacuses
  • radio

When I started teaching, in 1965, in my school it was still compulsory for students to use ink pens (not ‘nasty Biros’, which were available then). This was a real problem for left-handed pupils, who tended to drag their hand across the wet ink when writing from left to right. I fought hard to get an exemption but my headmistress was adamant – no exceptions were allowed. We have made at least some advances since then regarding accessibility and accommodation to the needs of minorities.

As the article points out, radio was still a couple of years away from actually being used for instructional purposes, although it was increasingly available by 1918. The first BBC adult educational radio program was broadcast in 1924 and was about fleas: a talk on Insects in Relation to Man.

Nevertheless, these old technologies also illustrate how little has changed in many classrooms in terms of pedagogy. PowerPoint is nothing more than a merger of a magic lantern and a chalkboard, but the form of teaching remains the same.

It is easy to identify how the technology has changed over 100 years, but far less progress has been made on improving teaching methods – or do you disagree?

A short history of educational technology

Charlton Heston as Moses. Are the tablets of stone an educational technology? (See Selwood, 2014, for a discussion of the possible language of the Ten Commandments)

The first section of my chapter on ‘Understanding Technology in Education’ for my open textbook on Teaching in a Digital Age was a brief introduction to the challenge of choosing technologies in education. This section aims to provide a little historical background. It will not be anything new to most readers of this blog, but remember the book is not aimed at educational technologists or instructional designers, but at regular classroom teachers, instructors and professors.

Particularly in recent years, technology has changed from being a peripheral factor to becoming more central in all forms of teaching. Nevertheless, arguments about the role of technology in education go back at least 2,500 years.  To understand better the role and influence of technology on teaching, we need a little history, because as always there are lessons to be learned from history. Paul Saettler’s ‘The Evolution of American Educational Technology’ (1990) is one of the most extensive historical accounts, but only goes up to 1989. A lot has happened since then. I’m giving you here the postage stamp version, and a personal one at that.

Technology has always been closely linked with teaching. According to the Bible, Moses used chiseled stone to convey the Ten Commandments, probably around the 7th century BC. But it may be more helpful to summarise educational technology developments in terms of the main modes of communication.

Oral communication

One of the earliest means of formal teaching was oral – through human speech – although over time, technology has been increasingly used to facilitate or ‘back up’ oral communication. In ancient times, stories, folklore, histories and news were transmitted and maintained through oral communication, making accurate memorization a critical skill, and the oral tradition continues in many aboriginal cultures. For the ancient Greeks, oratory and speech were the means by which people learned and passed on learning. Homer’s Iliad and Odyssey were recitative poems, intended for public performance. To be learned, they had to be memorized by listening, not by reading, and transmitted by recitation, not by writing.

Nevertheless, by the fifth century BC, written documents existed in considerable numbers in ancient Greece. If we believe Socrates, education has been on a downward spiral ever since. According to Plato, Socrates caught one of his students (Phaedrus) pretending to recite a speech from memory that he had in fact learned from a written version. Socrates then told Phaedrus the story of how the god Theuth offered the King of Egypt the gift of writing, which would be a ‘recipe for both memory and wisdom’. The king was not impressed. According to the king,

‘it [writing] will implant forgetfulness in their souls; they will cease to exercise memory because they will rely on what is written, creating memory not from within themselves, but by means of external symbols. What you have discovered is a recipe not for memory, but for reminding. And it is no true wisdom that you offer your disciples, but only its semblance, for by telling them many things without teaching them anything, you will make them seem to know much, while for the most part they will know nothing. And as men filled not with wisdom but the conceit of wisdom, they will be a burden to their fellow men.’

Phaedrus, 274c-275, translation adapted from Manguel, 1996

I can just hear some of my former colleagues saying the same thing about social media.

The term ‘lecture’, which comes from the Latin for ‘to read’, is believed to originate from medieval professors (around 1200 AD) reading aloud from manuscripts handwritten by monks. Because the process of copying manuscripts was so labour-intensive, the library would usually hold only one copy, so students were usually forbidden direct access to it. Thus the scarcity of one technology (the written manuscript) drove the predominance of another (oral communication).

Slate boards were in use in India in the 12th century AD, and blackboards/chalkboards came into use in schools around the turn of the 18th century. At the end of World War Two the U.S. Army started using overhead projectors for training, and their use became common for lecturing until they were largely replaced by electronic projectors and presentation software such as PowerPoint around 1990. This may be the place to point out that most technologies used in education were not developed specifically for education but for other purposes (mainly business).

Although the telephone dates from the late 1870s, the standard telephone system never became a major educational tool, not even in distance education, because of the high cost of analogue telephone calls for multiple users, although audio-conferencing has been used to supplement other media since the 1970s. Video-conferencing using dedicated cable systems and dedicated conferencing rooms has been in use since the 1980s. The development of video compression technology and relatively low-cost video servers in the early 2000s led to the introduction of lecture capture systems for recording and streaming classroom lectures in 2008. Webinars are now used largely for delivering lectures over the Internet.

None of these technologies, though, changes the oral basis of communication for teaching.

Written communication

The role of text or writing in education also has a long history. Even though Socrates is reported to have railed against the use of writing, written forms of communication make analytic, lengthy chains of reasoning and argument much more accessible, reproducible without distortion, and thus more open to analysis and critique than the transient nature of speech. The invention of the printing press in Europe in the 15th century was a truly disruptive technology, making written knowledge much more freely available, very much in the same way as the Internet has done today. As a result of the explosion of written documents resulting from the mechanization of printing, many more people in government and business were required to become literate and analytical, which led to a rapid expansion of formal education in Europe. There were many reasons for the development of the Renaissance and the Enlightenment, and the triumph of reason and science over superstition and belief, but the technology of printing was a key agent of change.

Improvements in transport infrastructure in the 19th century, and in particular the creation of a cheap and reliable postal system in the 1840s, led to the development of the first formal correspondence education, with the University of London offering an external degree program by correspondence from 1858. This first formal distance degree program still exists today in the form of the University of London International Program. In the 1970s, the Open University transformed the use of print for teaching through specially designed, highly illustrated printed course units that integrated learning activities with the print medium, based on advanced instructional design.

With the development of web-based learning management systems in the mid-1990s, textual communication, although digitized, became, at least for a brief time, the main communication medium for Internet-based learning, although lecture capture is now changing that.

Broadcasting and video

BBC television studio and radio transmitter, Alexandra Palace, London
Image: © Copyright Oxyman and licensed for reuse under a Creative Commons Licence

The British Broadcasting Corporation (BBC) began broadcasting educational radio programs for schools in the 1920s. The first adult education radio broadcast from the BBC, in 1924, was a talk on Insects in Relation to Man, and in the same year J.C. Stobart, the new Director of Education at the BBC, mused about ‘a broadcasting university’ in the journal Radio Times (Robinson, 1982). Television was first used in education in the 1960s, for schools and for general adult education (one of the six purposes in the BBC’s current Royal Charter is still ‘promoting education and learning’).

In 1969, the British government established the Open University (OU), which worked in partnership with the BBC to develop university programs open to all, originally using a combination of printed materials specially designed by OU staff, and television and radio programs made by the BBC but integrated with the courses. It should be noted that although the radio programs involved mainly oral communication, the television programs did not use lectures as such, but focused more on the common formats of general television, such as documentaries, demonstrations of processes, and case studies (see Bates, 1985). In other words, the BBC focused on the unique ‘affordances’ of television, a topic that will be discussed in much more detail later. Over time, as new technologies such as audio- and video-cassettes were introduced, live broadcasting, especially radio, was cut back for OU programs, although there are still some general educational channels broadcasting around the world (e.g. TVOntario in Canada; PBS, the History Channel, and the Discovery Channel in the USA).

The use of television for education quickly spread around the world. In the 1970s it was seen by some, particularly in international agencies such as the World Bank and UNESCO, as a panacea for education in developing countries. Those hopes quickly faded when the realities became apparent: lack of electricity, cost, security of publicly available equipment, climate, resistance from local teachers, and local language and cultural issues. Satellite broadcasting started to become available in the 1980s, and similar hopes were expressed of delivering ‘university lectures from the world’s leading universities to the world’s starving masses’, but these hopes too quickly faded for similar reasons. However, India, which had launched its own satellite, INSAT, in 1983, used it initially for delivering locally produced educational television programs throughout the country, in several indigenous languages, using Indian-designed receivers and television sets in local community centres as well as schools. India was still using satellites for tele-education in the poorest parts of the country at the time of writing (2014).

In the 1990s, the cost of creating and distributing video dropped dramatically due to digital compression and high-speed Internet access. This reduction in costs also led to the development of lecture capture systems, which allow students to view or review lectures at any time and place with an Internet connection. The Massachusetts Institute of Technology (MIT) started making its recorded lectures available to the public, free of charge, via its OpenCourseWare project in 2002. YouTube started in 2005 and was bought by Google in 2006; it is increasingly being used for short educational clips that can be downloaded and integrated into online courses. The Khan Academy started using YouTube in 2006 for recorded voice-over lectures using a digital blackboard for equations and illustrations. In 2007, Apple Inc. created iTunes U, a portal where videos and other digital materials on university teaching could be collected and downloaded free of charge by end users.

Until lecture capture arrived, learning management systems had integrated basic educational design features, but this required instructors to redesign their classroom-based teaching to fit the LMS environment. Lecture capture, on the other hand, required no changes to the standard lecture model, and in a sense reverted to primarily oral communication supported by PowerPoint or even writing on a chalkboard. Thus oral communication remains as strong today in education as ever, but has been incorporated into or accommodated by new technologies.

Computer technologies

Computer-based learning

In essence, programmed learning aims to computerize teaching by structuring information, testing learners’ knowledge, and providing immediate feedback to learners, without human intervention other than in the design of the hardware and software and the selection and loading of content and assessment questions. B.F. Skinner started experimenting with teaching machines that made use of programmed learning in 1954, based on the theory of behaviourism (see Chapter 3, Section 3.2). Skinner’s teaching machines were one of the first forms of computer-based learning. There has been a recent revival of programmed learning approaches as a result of MOOCs, since machine-based testing scales much more easily than human-based assessment.
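
To make the mechanics concrete, here is a minimal sketch in Python of the kind of loop a Skinner-style linear teaching program embodies: present an item, test the learner, give immediate feedback, and advance only on a correct response. The questions and hint text are invented for illustration; this is not a reconstruction of any actual teaching machine.

```python
# A minimal sketch of a Skinner-style linear teaching program:
# present a frame, test, give immediate feedback, advance only on success.
# The frames and hint text are invented for illustration.

FRAMES = [
    {"prompt": "7 x 8 = ?", "answer": "56", "hint": "Think (7 x 4) x 2."},
    {"prompt": "9 x 6 = ?", "answer": "54", "hint": "Think (9 x 5) + 9."},
]

def run_program(frames):
    for frame in frames:
        while True:  # the learner repeats the frame until correct
            response = input(frame["prompt"] + " ").strip()
            if response == frame["answer"]:
                print("Correct!")  # immediate positive reinforcement
                break
            print("Not quite. " + frame["hint"])  # immediate corrective feedback

if __name__ == "__main__":
    run_program(FRAMES)
```

Note that everything pedagogically interesting – the sequencing, the feedback, the criterion for moving on – is fixed in advance by the designer, which is exactly Skinner’s point and also the approach’s limitation.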

PLATO was a generalized computer-assisted instruction system originally developed at the University of Illinois; by the late 1970s it comprised several thousand terminals worldwide on nearly a dozen different networked mainframe computers (Wikipedia). It was in fact a highly successful system, lasting almost 40 years, and incorporated key online concepts: forums, message boards, online testing, e-mail, chat rooms, instant messaging, remote screen sharing, and multi-player games.

Attempts to replicate the teaching process through artificial intelligence (AI) began in the mid-1980s, with an initial focus on teaching arithmetic. Despite large investments in research into AI for teaching over the last 30 years, the results have generally been disappointing. It has proved difficult for machines to cope with the extraordinary variety of ways in which students learn (or fail to learn). Recent developments in cognitive science and neuroscience are being watched closely, but at the time of writing the gap is still great between the basic science and the ability to analyse or predict specific learning behaviours from it.

More recently we have seen the development of adaptive learning, which analyses learners’ responses and then redirects them to the most appropriate content area, based on their performance. Learning analytics, which also collects data about learner activities and relates it to other data, such as student performance, is a related development. These developments will be discussed in further detail in Section 8.7.
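
As a rough illustration, the core of adaptive learning is a branching rule over learner performance. Here is a minimal sketch in Python; the topic names, thresholds and scores are invented for illustration and are not taken from any particular adaptive learning product.

```python
# A rough sketch of the branching logic behind adaptive learning:
# route each learner to content based on their measured performance.
# Topics, thresholds and scores are invented for illustration.

def next_content(topic: str, score: float) -> str:
    """Choose the next content area from a score between 0 and 1."""
    if score < 0.5:
        return f"remedial material for {topic}"   # re-teach prerequisites
    elif score < 0.8:
        return f"further practice on {topic}"     # consolidate
    return f"advanced material beyond {topic}"    # move on

# Learning analytics, by contrast, would log every (learner, topic, score)
# event and relate the accumulated data to later performance.
print(next_content("fractions", 0.4))   # remedial material for fractions
print(next_content("fractions", 0.9))   # advanced material beyond fractions
```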

Computer networking

ARPANET in the U.S.A. was the first network to adopt the Internet protocol suite, TCP/IP, in 1983. Earlier, in the late 1970s, Murray Turoff and Roxanne Hiltz at the New Jersey Institute of Technology were experimenting with blended learning, using NJIT’s internal computer network. They combined classroom teaching with online discussion forums, and termed this ‘computer-mediated communication’ (CMC) (Hiltz and Turoff, 1978). At the University of Guelph in Canada, a software system called CoSy was developed in the 1980s that allowed online threaded group discussion forums, a predecessor of the forums contained in today’s learning management systems. In 1988, the Open University in the United Kingdom offered a course, DT200, which, as well as the OU’s traditional media of printed texts, television programs and audio-cassettes, included an online discussion component using CoSy. Since this course had 1,200 registered students, it was one of the earliest ‘mass’ open online courses. We see here the emerging division between the use of computers for automated or programmed learning, and the use of computer networks to enable students and instructors to communicate with each other.

The World Wide Web was formally launched in 1991. The Web is basically an application running on the Internet that enables ‘end-users’ to create and link documents, videos and other digital media, without needing to transcribe everything into some form of computer code. The first popular web browser, Mosaic, was made available in 1993. Before the Web, loading text and finding material on the Internet required lengthy and time-consuming methods. Several Internet search engines have been developed since 1993, with Google, created in 1998, emerging as one of the primary search engines.

Online learning environments

In 1995, the Web enabled the development of the first learning management systems (LMSs), such as WebCT (later acquired by Blackboard). LMSs provide an online teaching environment where content can be loaded and organized, with ‘spaces’ for learning objectives, student activities, assignment questions, and discussion forums. The first fully online courses (for credit) started to appear in 1995, some using LMSs, others just loading text as PDFs or slides. The materials were mainly text and graphics. LMSs remained the main means by which online learning was offered until lecture capture systems arrived around 2008.

In 2008, George Siemens, Stephen Downes and Dave Cormier in Canada used web technology to create the first ‘connectivist’ Massive Open Online Course (MOOC): a community of practice that linked webinar presentations and/or blog posts by experts to participants’ blogs and tweets, with just over 2,000 enrollments. The course was open to anyone and had no formal assessment. In 2011, two Stanford University professors launched a lecture-capture-based MOOC on artificial intelligence, attracting more than 100,000 students, and since then MOOCs have expanded rapidly around the world.

Social media

Social media are really a sub-category of computer technology, but their development deserves a section of its own in the history of educational technology. Social media cover a wide range of different technologies, including blogs, wikis, YouTube videos, mobile devices such as phones and tablets, Twitter, Skype and Facebook. Andreas Kaplan and Michael Haenlein (2010) define social media as

a group of Internet-based applications that …allow the creation and exchange of user-generated content, based on interactions among people in which they create, share or exchange information and ideas in virtual communities and networks.

Social media are strongly associated with young people and ‘millennials’ – in other words, many of the students in post-secondary education. At the time of writing, social media are only just being integrated into formal education; to date their main educational value has been in non-formal education, such as fostering online communities of practice, or around the edges of classroom teaching, such as ‘tweets’ during lectures or the rating of instructors. It will be argued, though, that they have much greater potential for learning.

A paradigm shift

It can be seen that education has adopted and adapted technology over a long period of time. There are some useful lessons to be learned from past developments in the use of technology for education, in particular that many claims made for a newly emerging technology are likely to be neither true nor new. Also, a new technology rarely replaces an older one completely. Usually the old technology remains, operating within a more specialised ‘niche’, such as radio, or integrated as part of a richer technology environment, such as video on the Internet.

However, what distinguishes the digital age from all previous ages is the rapid pace of technology development and our immersion in technology-based activities in our daily lives. Thus it is fair to describe the impact of the Internet on education as a paradigm shift, at least in terms of educational technology. We are still in the process of absorbing and applying the implications. The next section attempts to pin down more closely the educational significance of different media and technologies.

Over to you

1. Given the target audience, is this section necessary or useful?

2. Given also the need to be brief, what would you add, change, or leave out?

Next

We start getting to the meat of the chapter in the next section, which examines more closely differences between media and technologies and the concept of educational affordances of technology.

References

Hiltz, R. and Turoff, M. (1978) The Network Nation: Human Communication via Computer, Reading, MA: Addison-Wesley

Kaplan, A. and Haenlein, M. (2010) ‘Users of the world, unite! The challenges and opportunities of social media’, Business Horizons, Vol. 53, No. 1, pp. 59-68

Manguel, A. (1996) A History of Reading, London: HarperCollins

Robinson, J. (1982) Broadcasting Over the Air, London: BBC

Saettler, P. (1990) The Evolution of American Educational Technology, Englewood, CO: Libraries Unlimited

Selwood, D. (2014) ‘What does the Rosetta Stone tell us about the Bible? Did Moses read hieroglyphs?’, The Telegraph, July 15

Watters, A. (2018) ‘A History of Teaching Machines: A Timeline’, Hack Education

Educational technology 30 years on: why hasn’t education changed much?

Apple’s 1984 Super Bowl advert launching the Macintosh. (This will amuse only Mac users.)

Bates, A.W. (ed.) (1984) The Role of Technology in Distance Education London/New York: Routledge/Taylor and Francis

For some inexplicable reason, Routledge, part of the publishing group Taylor and Francis, has decided to revive this book I edited in 1984. As a result, a copy landed on my desk recently. It is easy to forget how much has happened in educational technology over the last 30 years, and in particular how far the technology has advanced. At the same time, it is striking how little has changed in terms of the challenges of using technology to improve the quality of post-secondary education.

How the technology has changed

In 1984, specially designed and printed texts, or ‘course units’, were still the predominant medium of communication with distance education students. However, the process of print production was cumbersome. Word-processing was possible on personal computers, but many faculty preferred to type their drafts on typewriters, and the text then had to be typeset manually from paper copies before printing. Making changes after the units were published was incredibly expensive.

Although the U.K.’s Open University was still using broadcast television and radio in 1984, their use had actually declined since the OU opened in 1971, so that by 1984 broadcasting occupied less than 10 per cent of student study time. Students were flocking to audio and video cassette recordings, because these could be played at the student’s own pace and accessed at any time.

Computer assisted learning (CAL) was just beginning to be experimented with at the Open University, in the form of ‘tutorial’ CAL and some simulations in chemistry. However, in 1984 most students could access computers only at local study centres (less than 30 per cent had a computer at home, and none had Internet access). A typical ‘micro’-computer used MS-DOS, weighed 45 lb, and cost between £1,500 and £2,000 ($2,500–$3,500). Indeed it was in 1984 that Apple introduced its first Macintosh computer (click on the video to see its striking Super Bowl advertisement, where presumably Microsoft’s Personal Computer was Big Brother). The Internet, although in existence, was still in its infancy in the USA and available only to research universities and the military. It would be another four years before the Internet first became available as a public service, and of course the World Wide Web didn’t come into existence for another seven years.

Nevertheless, elements of the future were present in 1984. Teaching by telephone, or telephone tutoring, was becoming widespread, in most cases supporting other media such as printed texts, but in some cases also delivering interactive lectures. Particularly in the USA, some states had built dedicated private telephone systems for educational purposes, such as the Wisconsin Educational Telephone Network, and in Canada, Memorial University in Newfoundland had built an educational telephone network that it shared with 40 other institutions. On public telephone networks, bridge technology was being introduced, enabling between three and nine people to participate at the same time, but most institutions using telephone teaching delivered it through local centres or multiple campuses. The U.K. Open University, working with British Telecom, was using an early form of multimedia teleconferencing called CYCLOPS, which enabled two-way communication of both voice and graphics over the public telephone network, but again through local study centres. Unfortunately the OU decided not to patent the technology, which it must be regretting today. Long-distance charges were expensive and the quality of sound was often variable, but the educational context was not dissimilar to webinars today.

Cable TV and satellites were being used quite heavily in education in 1984, with dedicated educational cable networks such as TVOntario and Knowledge Network in Canada (which are still in existence today, although they are more like specialty documentary channels than educational service providers). But it was satellite broadcasting that was going to do what has been claimed for MOOCs today – lectures from the world’s best professors delivered for free into poor developing countries – and we know what happened to that. Video discs were also big in 1984, and had a lot of educational promise, but the technology turned out to be too expensive for general educational use. Many other technologies that were discussed in the book faded away completely. Anyone remember videotex technologies such as Telidon (Canada), Minitel (France), and Prestel (U.K.)?

So, yes, looking back, it is clear that the Internet – free, readily accessible, and multimedia – and low cost personal computers and social media have revolutionized educational technology in ways that were unimaginable in 1984 (except perhaps by Steve Jobs).

So why hasn’t education changed?

If the technology is so much better and cheaper today, why does post-secondary education still cost as much per student as it did 30 years ago, if not more? Are the learning outcomes any better? It would be hard to make the case that the quality of education has improved over the last 30 years, at least on campus. Class sizes are much larger now, and teaching methods haven’t really changed that much. What has changed is that we have many more students in post-secondary education (and many more students studying online), but the unit costs haven’t dropped.

It’s the system, stupid

The costs of both creating and delivering content have dropped dramatically, and will continue to do so as open content rapidly expands through open textbooks, open research and open educational resources. But I have to admit to being conflicted over why costs are the same as, or indeed somewhat higher than, they were 30 years ago.

What keeps the cost up is the need for learner support – facilitating learning through discourse and dialogue. Technology in fact is still a relatively small item within the overall cost of teaching. Faculty salaries constitute at least two-thirds of all costs, and while we still require an instructor:student ratio of roughly 1:25 in higher education, costs will not come down significantly. However, I am not convinced that we can effectively substitute technology alone for that instructor:learner interaction without losing quality.
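
A back-of-the-envelope calculation shows why the instructor:student ratio dominates. The numbers below are purely illustrative (an assumed salary, and the two-thirds salary share suggested above), sketched in Python:

```python
# Back-of-the-envelope cost model with purely illustrative numbers.
# Assumes faculty salaries are two-thirds of total teaching costs,
# as suggested above; the salary figure is an invented example.

def cost_per_student(salary: float, students_per_instructor: int,
                     salary_share: float = 2 / 3) -> float:
    """Total teaching cost per student implied by the staffing ratio."""
    salary_per_student = salary / students_per_instructor
    return salary_per_student / salary_share  # gross up for non-salary costs

print(cost_per_student(100_000, 25))  # ~6,000 per student at a 1:25 ratio
print(cost_per_student(100_000, 50))  # ~3,000: only the ratio moves the number
```

On these (invented) numbers, halving spending on the non-salary third, technology included, cuts the total by at most one-sixth, while changing the ratio cuts it in half. Hence the dilemma: the ratio is where the money is, but it is also where the quality is.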

But we could still be doing more to reduce costs, and/or improve quality, as follows:

  • implementing open textbooks more widely, saving roughly $1,000 per student per year
  • making savings of up to 10 per cent on the total cost of teaching through greater use of open educational resources and shared content. For instance, in a large system like Ontario or Quebec, do we need 50 different introductory psychology courses? Would it not be better to develop, say, four or five really excellent online courses and share that content across the system, freeing instructors from delivering content via lectures and enabling them to spend more time, or cover more students, in discussion and dialogue? With open content, instructors could also choose different approaches to fit their take on the topic, again without extra cost. This would ensure that there were still different approaches to psychology, and maybe improve the quality of the learning at the same time. For this to happen, though, institutions need to work together collaboratively rather than competitively (hence it is a systemic problem that probably only government can fix)
  • getting faculty to teach more. Over the last 30 years, the average teaching load for full-time university faculty members, in Canada at least, has actually dropped, so many faculty now have a teaching load of roughly four to five courses a year, compared with six or more 30 years ago. In other words, any possible gains from the implementation of technology for teaching have been more than gobbled up by faculty spending less time teaching. (It may feel like more teaching, though, if you are teaching larger classes.)
  • re-organising the teaching of large classes, with a senior faculty member responsible for overall design and assessment methods, supported by a team of lower-paid but still highly qualified (adjunct) faculty and lower-cost teaching assistants, to ensure that every student has adequate learner support and quality assessment.
  • all of this, of course, requires a major re-design of teaching; without changing teaching methods there will be no cost benefits from technology. Instead, technology just adds cost to doing the same things, but with more technology
  • building new institutions for the 21st century and beyond: instead of cramming even more students into existing institutions, why not create some new institutions from scratch, designed around the cost-effective use of technology, as the UK government did in 1969 and the Catalan government did in 1995? Start by building the institution at 50 per cent of the capital cost of a traditional university and 75 per cent of the average operating cost per student, with modern course design to maintain or improve quality standards.

None of this can happen without serious systemic change. This is a major challenge for senior administrators, institutional governing boards, and above all government. The aim also has to be clear: it is not to cut costs alone, but to improve the quality of the output – better qualified students, fit for a digital age. But it is no longer acceptable to continue to invest in technology without demanding better results at the same time.
Developing intellectual and practical skills in a digital age

 


The story so far

Chapter 5 of my open textbook, ‘Teaching in a Digital Age’, is about the design of teaching and learning. I am currently writing and publishing the book as I go.

I started Chapter 5 by suggesting that instructors should think about design through the lens of constructing a comprehensive learning environment in which teaching and learning will take place. I have started to work through the various components of a learning environment, focusing particularly on how the digital age affects the way we need to look at some of these components.

I started by looking at how the characteristics of our learners are changing, and followed that by examining how our perspectives on content are being influenced by the digital age. In this post, I look at how both intellectual and practical skills can be developed to meet the needs of a digital age. The following posts will do the same for learner support, resources and assessment respectively.

This will then lead to a discussion of different models for designing teaching and learning. These models aim to provide a structure for and integration of these various components of a learning environment.

Scenario: Developing historical thinking

© Wenxue City: China During the Early Days of the Reform

Ralph Goodyear is a professor of history in a public Tier 1 research university in the central United States. He has a class of 120 undergraduate students taking HIST 305, ‘Historiography’.

For the first three weeks of the course, Goodyear had recorded a series of short, 15-minute video lectures that covered the following topics:

  • the various sources used by historians (e.g. earlier writings, empirical records such as registries of births, marriages and deaths, eyewitness accounts, artifacts such as paintings and photographs, and physical evidence such as ruins)
  • the themes around which historical analyses tend to be written
  • some of the techniques used by historians, such as narrative, analysis and interpretation
  • three different positions or theories about history (objectivist, Marxist, post-modernist).

Students downloaded the videos according to a schedule suggested by Goodyear, and attended two one-hour classes a week, where specific topics covered in the videos were discussed. Students also had an online discussion forum in the course space on the university’s learning management system, where Goodyear had posted similar topics for discussion. Students were expected to make at least one substantive contribution to each online topic, which counted towards their final grade.

Students also had to read a major textbook on historiography over this three week period.

In the fourth week, Goodyear divided the class into twenty groups of six, and asked each group to research the history of any city outside the United States over the last 50 years or so. They could use whatever sources they could find, including online sources such as newspaper reports, images and research publications, as well as the university’s own library collection. In writing their report, they had to do the following:

  • pick a particular theme that covered the 50 years and write a narrative based around the theme
  • identify the sources they finally used in their report, and discuss why they selected some sources and dismissed others
  • compare their approach to the three positions covered in the lectures
  • post their report in the form of an online e-portfolio in the course space on the university’s learning management system

They had five weeks to do this.

The last three weeks of the course were devoted to presentations by each of the groups, with comments, discussion and questions, both in class and online (the in-class presentations were recorded and made available online). At the end of the course, students assigned grades to each of the other groups’ work. Goodyear took these student gradings into consideration, but reserved the right to adjust the grades, with an explanation of any adjustment he made. Goodyear also gave each student an individual grade, based on both their group’s grade and their personal contribution to the online and class discussions.

Goodyear commented that he was surprised and delighted at the quality of the students’ work. He said: ‘What I liked was that the students weren’t learning about history; they were doing it.’

Based on an actual case, but with some embellishments.

Skills in a digital age

In Chapter 1, Section 1.4, I listed some of the skills that graduates need in a digital age, and argued that this requires a greater focus on developing such skills at all levels of education, but particularly at the post-secondary level, where the focus is often on specialised content. Although skills such as critical thinking, problem solving and creative thinking have always been valued in higher education, the identification and development of such skills are often implicit and almost accidental, as if students will somehow pick these skills up from observing faculty demonstrating them, or through some form of osmosis resulting from the study of content. I also pointed out in the same section, though, that there is substantial research on skills development, but the knowledge deriving from such research is at best applied haphazardly, if at all, to the development of intellectual skills.

Furthermore, the skills required in a digital age are broader and more wide-ranging than the abstract academic skills traditionally developed in higher education. For instance, they need to be grounded just as much in digital communications media as in traditional writing or lecturing, and they include the development of digital competence and expertise within a subject domain, as well as skills such as independent learning and knowledge management. These are not so much new skills as a different emphasis, focus or direction.

It is somewhat artificial to separate content from skills, because content is the fuel that drives the development of intellectual skills. At the same time, in more traditional vocational training we see the reverse trend in a digital age, with much more focus on developing high-level conceptual thinking as well as manual skills. My aim here is not to downplay the importance of content, but to ensure that skills development receives as much focus and attention from instructors, and that we approach intellectual skills development in the same rigorous and explicit way as apprentices are trained in manual skills.

Setting goals for skills development

Thus a critical step is to be explicit about what skills a particular course or program is trying to develop, and to define these goals in such a way that they can be implemented and assessed. In other words, it is not enough to say that a course aims to develop critical thinking; one must state clearly what this would look like in the context of the particular course or content area, in ways that are clear to students. In particular, the ‘skills’ goals should be capable of assessment, and students should be aware of the criteria or rubrics that will be used for assessment.

Thinking activities

A skill is not binary, in the sense that you either have it or you don’t. There is a tendency to talk about skills and competencies in terms of novice, intermediate, expert, and master, but in reality skills require constant practice and application and there is, at least with regard to intellectual skills, no final destination. So it is critically important when designing a course or program to design activities that require students to develop, practice and apply thinking skills on a continuous basis, preferably in a way that starts with small steps and leads eventually to larger ones. There are many ways in which this can be done, such as written assignments, project work, and focused discussion, but these thinking activities need to be thought about, planned and implemented on a consistent basis by the instructor.

Practical activities

It is a given in vocational programs that students need lots of practical activities to develop their manual skills. This though is equally true for intellectual skills. Students need to be able to demonstrate where they are along the road to mastery, get feedback on it, and retry as a result. This means doing work that enables them to practice specific skills.

In the scenario above, students had to cover and understand the essential content in the first three weeks, do research in a group, develop an agreed project report in the form of an e-portfolio, share it with other students and the instructor for comments, feedback and assessment, and present their report orally and online. Ideally, they will have the opportunity to carry many of these skills over into other courses, where they can be further refined and developed. Thus skills development needs a longer-term horizon than a single course, so integrated program planning, as well as course planning, is important.

Discussion as a tool for developing intellectual skills

Discussion is a very important tool for developing thinking skills – but not just any kind of discussion. It was argued in Chapter 2 that academic knowledge requires a different kind of thinking from everyday thinking. It usually requires students to see the world differently, in terms of underlying principles, abstractions and ideas. Thus discussion needs to be carefully managed by the instructor, so that it focuses on the development of the thinking skills that are integral to the area of study. This requires the instructor to plan, structure and support discussion within the class, keeping the discussions in focus, providing opportunities to demonstrate how experts in the field approach the topics under discussion, and comparing students’ efforts with these.

Figure 5.3: Online threaded discussion forums provide students with opportunities for developing intellectual skills, but the instructor needs to design and manage such forums carefully for this to happen

In conclusion

There are many opportunities, even in the most academic courses, to develop intellectual and practical skills that will carry over into work and life activities in a digital age, without corrupting the values or standards of academia. Even in vocational courses, students need opportunities to practice intellectual or conceptual skills such as problem-solving, communication, and collaborative learning. However, this won’t happen merely through the delivery of content. Instructors need to:

  • think carefully about exactly what skills their students need,
  • consider how this fits with the nature of the subject matter,
  • design the kinds of activities that will allow students to develop and improve their intellectual skills, and
  • decide how to give feedback on, and assess, those skills within the time and resources available.

This is a very brief discussion of how and why skills development should be an integral part of any learning environment. We will be discussing skills and skill development in more depth in later chapters.

Over to you

Your views, comments and criticisms are always welcome. In particular:

  • how does the history scenario work for you? Does it demonstrate adequately the points I’m making about skills development?
  • are the skills being developed by students in the history scenario relevant to a digital age?
  • is this post likely to change the way you think about teaching your subject, or do you already cover skills development adequately? If you feel you do cover skills development well, does your approach differ from mine?

Love to hear from you.

Next up

Learner support in a digital age

 

My seven ‘a-ha’ moments in the history of educational technology

A good question

I get asked a lot of questions about online learning, educational technology and distance education, but recently I was asked one that really stumped me, and forced me to reflect on the whole history of educational technology, at least as it has affected me.

The question was simple:

‘You’ve been working in the field now for 44 years. What have been your most seminal moments in terms of what you’ve learned?’

I’ve been able to boil the answer down to seven seminal moments. Here I merely summarize these ‘aha’ moments; I will write a separate post on each, describing both the circumstances that led to the moment and its heuristic implications for making more effective decisions about the use of technology.

1. 1970: Media are different

By this, I mean that different media have different educational effects or affordances. If you just transfer the same teaching to a different medium, you fail to exploit the unique characteristics of that medium. Put more positively, you can do different and often better teaching by adapting it to the medium. That way students will learn more deeply and effectively.

2. 1974: God helps those who help themselves

This stems from my experience of working in developing countries. Ever since I started working in this field, people have argued that ‘Western’ technology is the solution to educational problems in developing countries. This is hubris, and just plain wrong. Progress in education in developing countries has to start at home. Western technology can help, but only as long as it is adapted and transformed locally.

3. 1978: Asynchronous is better

Everyone learns better from media and technologies that allow them to study anywhere, at any time. In particular the ability to repeat and revise recorded material makes learning much more effective than live, synchronous teaching. This ‘insight’ stemmed originally from research on the effectiveness of audio-cassettes compared to broadcast radio, but has subsequently been found true also for television and the Internet.

4. 1986: Computers for communication, not as teaching machines

Until 1986, I had always been skeptical of computers as an effective teaching medium, especially in distance education. Up to then, I had seen them as ‘teaching machines’, attempting, ineffectively, to replace teachers. The Internet changed that. In 1986, I realised that computers could allow learners and teachers to communicate effectively over space and time. This fits much better with my philosophy of teaching and learning. Despite developments since then in artificial intelligence, this seminal moment still holds true today.

5. 1995: WWW: a universal standard

Like most people in education, I was caught cold by the World Wide Web. Until 1995, I was still using non-web technology for teaching online. The web allows rich multimedia material to be transmitted to any computer, on any software system, anywhere in the world with an Internet connection. This has had profound implications for the design of online teaching, which we have still by no means fully understood or exploited.

6. 1995: Convergence of online learning

This was the year I moved from a distance teaching organization to a campus-based university. The move was partly driven by a growing realization that the technologies being introduced into distance education would eventually transform campus-based teaching as well. Eighteen years later, this is just beginning to be fully realised, through developments such as hybrid learning. The challenge now is to identify what is best done on campus, and what online, when students have the choice of both.

7. 1997: Strategy matters

Having worked as a manager for seven years by this time, I was beginning to understand the bigger picture regarding the planning and management of learning technologies, and it wasn’t pretty. For educational technology to be used effectively, it has to be planned and managed well, and at the time there were almost no specific guidelines. Almost everything was left to the IT people. This had to change: academics had to get involved as well. This too is now beginning to happen, but we still have a long way to go to become better planners and managers, despite my two books on the subject.

The time perspective

Why nothing in the last 16 years? Well, the further back in time you go, the more clearly the signal separates from the noise. Also, if something is universally true, you are likely to recognize it sooner rather than later. And in the educational technology field, I doubt that many things are universally true, because it is an area that is still rapidly developing.

The one exception I might make (an eighth ‘aha’) is 2008, when I realised the importance of web 2.0 for enabling more learner-centred teaching and learning, but I still need more time to see its real significance.

In the meantime, I will develop each of these seven themes further in later posts.

Question

What are your seminal ‘aha’ moments in educational technology? Why?