July 20, 2018

Learning analytics, student satisfaction, and student performance at the UK Open University

There is very little correlation between student satisfaction and student performance. Image: Bart Rienties.

Rienties, B. and Toetenel, L. (2016) The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules, Computers in Human Behavior, Vol. 60, pp. 333-341

Li, N. et al. (2017) Online learning experiences of new versus continuing learners: a large-scale replication study, Assessment and Evaluation in Higher Education, Vol. 42, No. 4, pp.657-672

It’s never too late to learn

It’s been a hectic month, with two trips from Vancouver to Ontario and back and one to the UK and back: a total of four keynotes, two panel sessions and two one-day consultancies. By the time I got to the end of the month’s travels I had learned so much that, at a conference in Toronto, I had to go to my room and lie down – I just couldn’t take any more!

At my age, it takes time to process all this new information, but I will try to summarise the main points of what I learned in the next three posts.

Learning analytics at the Open University

The Open University, with over 100,000 students and more than 1,000 courses (modules), and most of its teaching online in one form or another, is an ideal context for the application of learning analytics. Fortunately the OU has some of the world leaders in this field. 

At the conference on STEM teaching at the Open University, where I gave the opening keynote, the closing keynote was given by Bart Rienties, Professor of Learning Analytics at the Institute of Educational Technology at the UK Open University. Using multiple regression models, Rienties and his team linked data from 151 modules (courses) and 111,256 students to students’ behaviour, satisfaction and performance at the Open University UK.

His whole presentation (40 minutes, including questions) can be accessed online, and is well worth viewing, as it provides a clear summary of the results published in the two detailed papers listed above. As always, if you find my summary of the results below interesting or challenging, I strongly recommend you view Bart’s video first, then read the two articles in more detail. Here’s what I took away.

There is little correlation between student course evaluations and student performance

This result is a bit of a zinger. The core dependent variable used was academic retention (the number of learners who completed and passed the module relative to the number of learners who registered for it). As Rienties and Toetenel (p.340) comment, almost as an aside,

it is remarkable that learner satisfaction and academic retention were not even mildly related to each other…. Our findings seem to indicate that students may not always be the best judge of their own learning experience and what helps them in achieving the best outcome.
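The retention measure itself is just a ratio, and is simple to compute. A minimal sketch (the enrolment figures here are invented, not OU data):

```python
def academic_retention(completed_and_passed: int, registered: int) -> float:
    """Academic retention as defined in the study: the number of learners
    who completed and passed the module, relative to the number who
    registered for it."""
    if registered == 0:
        raise ValueError("no registered learners")
    return completed_and_passed / registered

# Hypothetical module: 1,200 learners registered, 780 completed and passed
print(round(academic_retention(780, 1200), 2))  # 0.65
```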

The design of the course matters

One of the big challenges in online and blended learning is getting subject matter experts to recognise the importance of what the Open University calls ‘learning design.’ 

Conole (2012, p.121) describes learning design as:

a methodology for enabling teachers/designers to make more informed decisions in how they go about designing learning activities and interventions, which is pedagogically informed and makes effective use of appropriate resources and technologies. LD is focussed on ‘what students do’ as part of their learning, rather than the ‘teaching’ which is focussed on the content that will be delivered.

Thus learning design is more than just instructional design.

However, Rienties et al. comment that ‘only a few studies have investigated how educators in practice are actually planning and designing their courses and whether this is then implemented as intended in the design phase.’

The OU has done a good job of breaking down the elements of learning design, mapping them across nearly 200 different courses. The elements of this mapping can be seen below (Rienties and Toetenel, 2016, p.335):

Rienties and Toetenel then analysed the correlations between each of these learning design elements and both learner satisfaction and learner performance. What they found is that what OU students liked did not match what improved their performance. For instance, students were most satisfied with ‘assimilative’ activities, which are primarily content focused, and disliked communication activities, which are primarily social. However, better student retention was most strongly associated with communication activities and, overall, with the quality of the learning design.
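Correlating a learning design element with retention across modules is conceptually straightforward. A minimal sketch, using invented figures for the share of study time given to communication activities in five hypothetical modules:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented figures: proportion of study time devoted to communication
# activities in five modules, and each module's retention rate.
communication_share = [0.05, 0.10, 0.15, 0.20, 0.25]
retention = [0.55, 0.60, 0.62, 0.70, 0.72]
print(round(pearson_r(communication_share, retention), 2))  # 0.98
```

A strongly positive coefficient like this toy result mirrors the direction of the published finding, but the actual OU analysis used multiple regression across 151 modules, not a single bivariate correlation.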

Rienties and Toetenel conclude:

although more than 80% of learners were satisfied with their learning experience, learning does not always need to be a nice, pleasant experience. Learning can be hard and difficult at times, and making mistakes, persistence, receiving good feedback and support are important factors for continued learning….

An exclusive focus on learner satisfaction might distract institutions from understanding the impact of LD on learning experiences and academic retention. If our findings are replicated in other contexts, a crucial debate with academics, students and managers needs to develop whether universities should focus on happy students and customers, or whether universities should design learning activities that stretch learners to their maximum abilities and ensuring that they eventually pass the module. Where possible, appropriate communication tasks that align with the learning objectives of the course may seem to be a way forward to enhance academic retention.

Be careful what you measure

As Rienties and Toetenel put it:

Simple LA metrics (e.g., number of clicks, number of downloads) may actually hamper the advancement of LA research. For example, using a longitudinal data analysis of over 120 variables from three different VLE/LMS systems and a range of motivational, emotions and learning styles indicators, Tempelaar et al. (2015) found that most of the 40 proxies of “simple” VLE LA metrics provided limited insights into the complexity of learning dynamics over time. On average, these clicking behaviour proxies were only able to explain around 10% of variation in academic performance.

In contrast, learning motivations, emotions (attitudes), and learners’ activities during continuous assessments (behaviour) significantly improved explained variance (up to 50%) and could provide an opportunity for teachers to help at-risk learners at a relatively early stage of their university studies.
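The ‘explained variance’ figures quoted here are R² values from regression models. A toy sketch (with entirely invented numbers, not the OU or Tempelaar data) shows how R² for a single predictor is computed, and what it means for click counts to explain little of the variance in performance:

```python
def r_squared(xs, ys):
    """Share of variance in ys explained by a single linear (OLS) predictor xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))        # OLS slope
    a = my - b * mx                               # OLS intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Invented data for six students: raw VLE clicks are a poor predictor
# of final grade; continuous-assessment scores are a strong one.
clicks     = [120, 340, 200, 510, 430, 150]
assessment = [50, 60, 68, 57, 78, 63]
grades     = [55, 62, 71, 58, 80, 66]
print(round(r_squared(clicks, grades), 2))      # ≈ 0.04: clicks explain ~4%
print(round(r_squared(assessment, grades), 2))  # ≈ 0.98: assessment explains most
```

In a multiple regression, as in the OU studies, R² is computed from all predictors jointly; the toy data above is simply constructed so that clicks explain little of the variance while continuous assessment explains most of it.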

My conclusions

Student feedback on the quality of a course is really important, but it is more useful as a conversation between students and instructors/designers than as a quantitative ranking of the quality of a course. In fact, using learner satisfaction as a way to rank teaching is highly misleading. Learner satisfaction encompasses a very wide range of factors beyond the teaching of a particular course. It is possible to imagine a highly effective course where teaching in a transmissive or assimilative manner is minimal, but student activities are wide, varied and relevant to the development of significant learning outcomes. Students, at least initially, may not like this, because it may be a new experience for them and because they must take more responsibility for their learning. Thus good communication and explanation of why particular approaches to teaching have been chosen is essential (see my comment to a question on the video).

Perhaps, though, the biggest limitations of student satisfaction for assessing the quality of teaching are the often very low response rates, the limited evaluation questions due to standardization (the same questions irrespective of the nature of the course), and the poor quality of student responses. This is no way to assess the quality of an individual teacher or a whole institution, yet far too many institutions and governments are building this into their evaluation of teachers/instructors and institutions.

I have been fairly skeptical of learning analytics up to now, because of the tendency to focus more on what is easily measurable (simple metrics) than on what students actually do qualitatively when they are learning. The focus on learning design variables in these studies is refreshing and important, but so too will be analysis of students’ learning habits.

Finally, this research provides quantitative evidence of the importance of learning design in online and distance teaching. Good design leads to better learning outcomes. Why then are we not applying this knowledge to the design of all university and college courses, and not just online courses? We need a shift in the power balance between university and college subject experts and learning designers resulting in the latter being treated as at least equals in the teaching process.

References

Conole, G. (2012) Designing for learning in an open world. Dordrecht: Springer.

Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: learning analytics in a data-rich context. Computers in Human Behavior, 47, 157-167. http://dx.doi.org/10.1016/j.chb.2014.05.038

 

Online learning in 2016: a personal review



Image: © Institute for Economics and Peace. Canada is ranked seventh most peaceful. We don’t know where it ranks though in terms of online learning.

A personal review

I am not going to do a review of all the developments in online learning in 2016 (for this, see Audrey Watters’ excellent HackEducation Trends). What I am going to do instead is review what I actually wrote about in 2016 in this blog, indicating what to me was of particular interest in online learning during 2016. I have identified 38 posts I wrote in which I have explored in some detail issues that bubbled up (at least for me) in 2016.

1. Tracking online learning

Building a national survey of online learning in Canada (134 hits)

A national survey of university online and distance learning in Canada (1,529 hits)

In the USA, fully online enrollments continue to grow in 2014 (91 hits)

Are you ready for blended learning? (389 hits)

What the Conference Board of Canada thinks about online learning (200 hits)

I indulged my obsession with knowing the extent to which online learning is penetrating post-secondary education with five posts on this topic. In a field undergoing such rapid changes, it is increasingly important to be able to track exactly what is going on. Thus a large part of my professional activity in 2016 has been devoted to establishing, almost from scratch, a national survey of online learning in Canadian post-secondary institutions. I would have written more about this topic, but until the survey is successfully conducted in 2017, I prefer to keep a low profile on this issue.

However, during 2016 it did become clear to me, partly as a result of pilot testing of the questionnaire, and partly through visits to universities, that blended learning is not only gaining ground in Canadian post-secondary education at a much faster rate than I had anticipated, but is raising critical questions about what is best done online and what face-to-face, and how to prepare institutions and instructors for what is essentially a revolution in teaching.

This can be best summarized by what I wrote about the Conference Board of Canada’s report:

What is going on is a slowly boiling and considerably variable revolution in higher education that is not easily measured or even captured in individual anecdotes or interviews.

2. Faculty development and training

Getting faculty and instructors into online learning (183 hits)

Initiating instructors to online learning: 10 fundamentals (529 hits)

Online learning for beginners: 10. Ready to go (+ nine other posts on this topic = 4,238 hits)

5 IDEAS for a pedagogy of online learning (708 hits)

This was the area to which I devoted the most space, with ten posts on ‘Online Learning for Beginners’, aimed at instructors resisting or unready for online learning. These ten posts were then edited and published by Contact North as the 10 Fundamentals of Teaching Online.

Two fundamental conclusions: we need not only better organizational strategies to ensure that faculty have the knowledge and training they will need for effective teaching and learning in a digital age, but also new teaching strategies and approaches that can exploit the benefits and, even more importantly, avoid the pitfalls of blended learning and learning technologies. I have been trying to make a contribution in this area, but much more needs to be done.

3. Learning environments

Building an effective learning environment (6,173 hits)

EDEN 2016: Re-imagining Learning Environments (597 hits)

Culture and effective online learning environments (1,260 hits)

Closely linked to developing appropriate pedagogies for a digital age is the concept of designing appropriate learning environments, based on learners’ construction of knowledge and the role of instructors in guiding and fostering knowledge management, independent learning and other 21st century skills.

This approach, I argued, is a better ‘fit’ for learners in a digital age than thinking in terms of blended, hybrid or fully online learning. It recognizes not only that technology can be used to design very different kinds of learning environments from school- or campus-based ones, but also that technology is just one component of a much richer learning context.

4. Experiential learning online

A full day of experiential learning in action (188 hits)

An example of online experiential learning: Ryerson University’s Law Practice Program (383 hits)

Is networked learning experiential learning? (163 hits)

These three posts explored a number of ways in which experiential learning is being done online, as this is a key methodology for developing skills in particular.

5. Open education

Acorns to oaks? British Columbia continues its progress with OERs (185 hits)

Talking numbers about open publishing and online learning (113 hits)

Towards an open pedagogy for online learning (385 hits)

These posts also tracked the development of open publishing and open educational resources, particularly in British Columbia, leading me to conclude that the OER ‘movement’ has far too narrow a concept of openness, and that in its place we need an open pedagogy in which open educational resources are just one component, and perhaps not the most significant.

6. Technology applications in online learning

An excellent guide to multimedia course design (659 hits)

Is video a threat to learning management systems? (603 hits)

Some comments on synchronous online learning technologies (231 hits)

Amongst all the hype about augmented reality, learning analytics and the application of artificial intelligence, I found it more useful to look at some of the technologies that are in everyday use in online learning, and how these could best be used.

7. Technology and alienation

Technology and alienation: online learning and labour market needs (319 hits)

Technology and alienation: symptoms, causes and a framework for discussion (512 hits)

Technology, alienation and the role of education: an introduction (375 hits)

Automation or empowerment: online learning at the crossroads (1,571 hits)

Why digital technology is not necessarily the answer to your problem (474 hits)

These were more philosophical pieces, prompted to some extent by wider concerns about the impact of technology on jobs and how that has influenced the Brexit and Trump phenomena.

Nevertheless this issue is also very relevant to the teaching context. In particular I was challenging the ‘Silicon Valley’ assumption that computers will eventually replace the need for teachers, and in particular the danger of using algorithms in teaching without knowing who wrote the algorithms, what their philosophy of teaching is, and thus what assumptions have been built into the use of data.

Image: Applift

8. Learning analytics

Learning analytics and learning design at the UK Open University (90 hits)

Examining ethical and privacy issues surrounding learning analytics (321 hits)

Continuing more or less the same theme of analysing the downside as well as the upside of technology in education, these two posts looked at how some institutions, and the UK Open University in particular, are being thoughtful about the implications of learning analytics, and building in policies for protecting privacy and gaining student ‘social license’ for the use of analytics.

9. Assessment

Developing a next generation online learning assessment system (532 hits)

This is an area where much more work needs to be done. If we are to develop new or better pedagogies for a digital age, we will also need better assessment methods. Unfortunately the focus once again appears to be more on the tools of assessment, such as online proctoring, where large gains were made in 2016, but these tools still support traditional assessment procedures such as time-restricted exams, multiple-choice tests and essay writing. What we need are new methods of assessment that focus on measuring the types of knowledge and skills that are needed in a digital age.

For instance, e-portfolios have held a lot of promise for a long time, but are still being adopted and evaluated at a painfully slow rate. They do, though, offer one method of assessment that much better reflects the needs of assessing 21st century knowledge and skills. However, we need more imagination and creativity in developing new assessment methods for measuring the knowledge and skills needed in a digital age.

That was the year that was

Well, it was 2016 from the perspective of someone no longer teaching online or managing online learning:

  • How far off am I, from your perspective?
  • What were the most significant developments for you in online learning in 2016?
  • What did I miss that you think should have been included? Perhaps I can focus on this next year.

I have one more post looking at 2016 to come, but that will be more personal, looking at my whole range of online learning activities in 2016.

In the meantime have a great seasonal break and I will be back in touch some time in the new year.

Learning analytics and learning design at the UK Open University

Maxim Jean-Louis (President, Contact North) and myself outside Walton Hall, the headquarters of the UK Open University, in 2012. It was my first visit since I left the OU in 1989.

Maxim Jean-Louis (President, Contact North) and myself outside Walton Hall, the headquarters of the UK Open University, in 2012. It was my first visit since I left the OU in 1989.

The Open University (2016) Developing learning design and analytics for student success, Connections, Vol. 21, No. 3

The latest edition of the Commonwealth of Learning’s magazine, Connections, has an interesting, if brief, article on the effective use of learning analytics. There are four key points that I noted:

  • the OU has scaled up its predictive use of learning analytics to cover over 45,000 students, and it works as well in traditional universities as in the OU
  • learning analytics is used in connection with learning design to identify not only students at risk but also to improve the design of the learning materials:

    the OU for the first time can empirically analyse the design of its modules. By linking learning designs with student satisfaction and success measures, it became possible to systematically identify, measure and improve critical aspects of students’ learning experience.

  • the OU is the first university in the world to develop and adopt a policy relating to the ethical uses of student data for learning analytics, involving students themselves in the development of the policy. This makes the adoption and use of learning analytics much easier
  • the OU has appointed a reader in learning analytics, Dr. Bart Rienties: that is treating learning analytics really seriously.

Unfortunately there were no links or ways to follow up the article.

Examining ethical and privacy issues surrounding learning analytics

Image: SecurityCamExpert, 2013

Drachsler, H. et al. (2016) Is Privacy a Show-stopper for Learning Analytics? A Review of Current Issues and Their Solutions, Learning Analytics Review, No. 6, January 2016, ISSN: 2057-7494

About LACE

One of the most interesting sessions for me at last week’s EDEN conference in Budapest was a workshop run by Sally Reynolds of ATiT in Brussels and Dai Griffiths of the University of Bolton, UK. They are both participants in a European Commission project called LACE (Learning Analytics Community Exchange).

The LACE web site states:

LACE partners are passionate about the opportunities afforded by current and future views of learning analytics (LA) and educational data mining (EDM) but we were concerned about missed opportunities and failing to realise value. The project aimed to integrate communities working on LA and EDM from schools, workplace and universities by sharing effective solutions to real problems.

There are a number of reviews and case studies of the use of learning analytics available from the web site which, if you are interested in (or concerned about) the use of learning analytics, are well worth reading.

The EDEN workshop

The EDEN workshop focused on one of the reviews concerned with issues around ethics and privacy in the use of learning analytics, and in particular the use of big data.

I am reasonably familiar with the use of ‘small’ data for learning analytics, such as the use of institutional student data regarding the students in the courses I am teaching, or the analysis of participation in online discussions, both in quantitative and qualitative terms. I am less familiar with the large-scale use of data and especially how data collected via learning management or MOOC registration systems are or could be used to guide teaching and learning.

The focus of the workshop was specifically on ethical and privacy issues, based on the review cited above, but I nevertheless learned a great deal about learning analytics in general through the workshop.

What is the concern?

This is best stated in the review article:

Once the Pandora’s Box of data availability has been opened, then individuals lose control of the data about them that have been harvested. They are unable to specify who has access to the data, and for what purpose, and may not be confident that the changes to the education system which result from learning analytics will be desirable. More generally, the lack of transparency in data collection and analysis exacerbates the fear of undermining privacy and personal information rights in society beyond the confines of education. The transport of data from one context to another can result in an unfair and unjustified discrimination against an individual.

In the review article, these concerns are exemplified by case studies covering schools, universities and the workplace. These concerns are summarized under the following headings:

  • privacy
  • informed consent and transparency in data collection
  • location and interpretation of data
  • data management and security
  • data ownership
  • possibility of error
  • role of knowing and obligation to act

There are in fact a number of guidelines regarding data collection and use that could be applied to learning analytics, such as the Nuremberg Code on research ethics and the OECD Privacy Framework (both of which are general), or the JISC code of practice for learning analytics. However, the main challenge is that some proponents of learning analytics want to approach the issue in ways that are radically different from past data collection methods (like my ‘small’ data analysis). In particular, they propose collecting data indiscriminately, then analysing it with algorithms to identify possible post-hoc applications and interpretations.

It could be argued that educational organizations have always collected data about students, such as registers of attendance, age, address and student grades. However, new technology, such as data trawling and the ability to combine data from completely different sources, as well as automated analysis, completely changes the game, raising the following questions:

  • who determines what data is collected and used within a learning management system?
  • who ensures the security of student (or instructor) data?
  • who controls access to student data?
  • who controls how the data is used?
  • who owns the data?

In particular, increasingly student (and instructor) data is being accessed, stored and used not just outside an institution, but even outside a particular country, and hence subject to laws (such as the U.S. Patriot Act) that do not apply in the country from which the data was collected.

Recommendations from the LACE working group

The LACE working group has developed an eight point checklist called DELICATE, ‘to support a new learner contract, as the basis for a trusted implementation of Learning Analytics.’

Image: the DELICATE checklist

For more on DELICATE see:

Drachsler, H. and Greller, W. (2016) Privacy and Learning Analytics – it’s a DELICATE issue, Heerlen NL: The Open University of the Netherlands

Issues raised in the workshop

First it was pointed out that, by today’s standards, most institutional data doesn’t qualify as ‘big data’. In education, big data would be, for example, student information from across the whole education system. The strategy would be to collect data about or from all students, then apply analysis that may well result in by-passing or even replacing institutions with alternative services. MOOC platforms are possibly the closest to this model, hence their potential for disruption. Nevertheless, even within an institution, it is important to develop policies and practices that take ethics and privacy into account when collecting and using data.

As in many workshops, we were divided into small groups to discuss some of these issues, with a small set of questions to guide the discussion. In my small group of five conference participants, none of the participants was in an institution that had a policy regarding ethics and privacy in the use of learning analytics (or if it existed, they were unaware of it).

There was a concern on our table that increasing amounts of student data around learning are accessible to external organizations (such as LMS software companies and social media organizations such as Facebook). In particular, there was a concern that, in reality, many technology decisions, such as the choice of an institutional learning platform, are strongly influenced by the CIO, who may not take sufficient account of ethical and privacy concerns when negotiating agreements, or even by students themselves, who are often unaware of the implications of data collection and use by technology providers.

Our table ended by suggesting that every post-secondary institution should establish a small data ethics/privacy committee that would include, if available, someone who is a specialist in data ethics and privacy, and representatives of faculty and students, as well as the CIO, to implement and oversee policy in this area.

This was an excellent workshop that tried to find solutions balancing the need to track learner behaviour against privacy and ethical concerns.

Over to you

Some questions for you:

  • is your institution using learning analytics – or considering it?
  • if so, does your institution have a policy or process for monitoring data ethics and privacy issues?
  • is this really a lot of fuss over nothing?

I’d love to hear from you on this.

Automation or empowerment: online learning at the crossroads

Image: AppLift, 2015

You are probably, like me, getting tired of the different predictions for 2016. So I’m not going to do my usual look forward for the year for individual developments in online learning. Instead, I want to raise a fundamental question about which direction online learning should be heading in the future, because the next year could turn out to be very significant in determining the future of online learning.

The key question we face is whether online learning should aim to replace teachers and instructors through automation, or whether technology should be used to empower not only teachers but also learners. Of course, the answer will always be a mix of both, but getting the balance right is critical.

An old but increasingly important question

This question, automation or human empowerment, is not new. It was raised by B.F. Skinner (1968) when he developed teaching machines in the early 1960s. He thought teaching machines would eventually replace teachers. On the other hand, Seymour Papert (1980) wanted computing to empower learners, not to teach them directly. In the early 1980s Papert got children to write computer code to improve the way they think and to solve problems. Papert was strongly influenced by Jean Piaget’s theory of cognitive development, and in particular that children constructed rather than absorbed knowledge.

In the 1980s, as personal computers became more common, computer-assisted learning (CAL, or computer-assisted instruction, CAI) became popular, using computer-marked tests and early forms of adaptive learning. Also in the 1980s the first developments in artificial intelligence were applied, in the form of intelligent math tutoring. Great predictions were made then, as now, about the potential of AI to replace teachers.

Then along came the Internet. Following my first introduction to the Internet in a friend’s basement in Vancouver, I published an article in the first edition of the Journal of Distance Education, entitled ‘Computer-assisted learning or communications: which way for IT in distance education?’ (1986). In this paper I argued that the real value of the Internet and computing was to enable asynchronous interaction and communication between teacher and learners, and between learners themselves, rather than as teaching machines. This push towards a more constructivist approach to the use of computing in education was encapsulated in Mason and Kaye’s book, Mindweave (1989). Linda Harasim has since argued that online collaborative learning is an important theory of learning in its own right (Harasim, 2012).

In the 1990s, David Noble of York University attacked online learning in particular for turning universities into ‘Digital Diploma Mills’:

‘universities are not only undergoing a technological transformation. Beneath that change, and camouflaged by it, lies another: the commercialization of higher education.’

Noble (1998) argued that

‘high technology, at these universities, is often used not to … improve teaching and research, but to replace the visions and voices of less-prestigious faculty with the second-hand and reified product of academic “superstars”.’

However, contrary to Noble’s warnings, for fifteen years most university online courses followed more the route of interaction and communication between teachers and students than computer-assisted learning or video lectures, and Noble’s arguments were easily dismissed or forgotten.

Then along came lecture capture and with it, in 2011, Massive Open Online Courses (xMOOCs) from Coursera, Udacity and edX, driven by elite, highly selective universities, with their claims of making the best professors in the world available to everyone for free. Noble’s nightmare suddenly became very real. At the same time, these MOOCs have resulted in much more interest in big data, learning analytics, a revival of adaptive learning, and claims that artificial intelligence will revolutionize education, since automation is essential for managing such massive courses.

Thus we are now seeing a big swing back to the automation of learning, driven by powerful computing developments, Silicon Valley start-up thinking, and a sustained political push from those who want to commercialize education (more on this later). Underlying these developments is a fundamental conflict of philosophies and pedagogies: automation is driven by an objectivist/behaviourist view of the world, in contrast to the constructivist approach of online collaborative learning.

In other words, there are increasingly stark choices to be made about the future of online learning. Indeed, it is almost too late – I fear the forces of automation are winning – which is why 2016 will be such a pivotal year in this debate.

Automation and the commercialization of education

These developments in technology are being accompanied by a big push in the United States, China, India and other countries towards the commercialization of online learning. In other words, education is increasingly being seen as a commodity that can be bought and sold. This is happening not through the largely discredited for-profit online universities, such as the University of Phoenix, that David Noble feared would become digital diploma mills, but through the encouragement and support of commercial computer companies moving into the education field, such as Coursera, Lynda.com and Udacity.

Audrey Watters and EdSurge both produced lists of ed tech ‘deals’ in 2015, totalling between $1 billion and $2 billion. Yes, that’s right: $1–$2 billion of investment in private ed tech companies in the USA (and China) in one year alone. At the same time, entrepreneurs are struggling to develop sustainable business models for ed tech investment, because with education publicly funded, a ‘true’ market is restricted. Politicians, entrepreneurs and policy makers on the right in the USA increasingly see a move to automation as a way of reducing government expenditure on education, and as one means by which to ‘free up the market’.

Another development that threatens the public education model is the move by very rich entrepreneurs such as the Gates, the Hewletts and the Zuckerbergs to place their massive personal wealth in ‘charitable’ foundations or corporations and use this money for pet ‘educational’ initiatives that also bring indirect benefits to their businesses. Ian McGugan (2015), in the Globe and Mail newspaper, estimates that the Chan Zuckerberg Initiative is potentially worth $45 billion, and one of its purposes is to promote the personalization of learning (another term hi-jacked by computer scientists; it is a more human-sounding way of describing adaptive learning). Since one way Facebook makes its money is by selling personal data, forgive my suspicion that the Zuckerberg initiative is a not-so-obvious way of collecting data on future high earners. At the same time, the Chan Zuckerberg Initiative enables the Zuckerbergs to avoid paying tax on their profits from Facebook. Instead, then, of paying taxes that could be used to support public education, these immensely rich foundations enable a few entrepreneurs to set the agenda for how computing will be used in education.

Why not?

Technology is disrupting nearly every other business and profession, so why not education? Higher education in particular requires a huge amount of money, mostly raised through taxes and tuition fees, and it is difficult to tie results directly to investment. Surely we should be looking at ways in which technology can change higher education so that it is more accessible, more affordable and more effective in developing the knowledge and skills required in today’s and tomorrow’s society?

Absolutely. It is not so much the need for change that I am challenging, but the means by which this change is being promoted. In essence, a move to automated learning, while saving costs, will not improve the learning that matters, and particularly the outcomes needed in a digital age, namely the high-level intellectual skills of critical thinking, innovation, entrepreneurship, problem-solving, high-level multimedia communication, and above all, effective knowledge management.

To understand why automated approaches to learning are inappropriate to the needs of the 21st century we need to look particularly at the tools and methods being proposed.

The problems with automating learning

The main problem with computer-directed learning – information transmission and management through Internet-distributed video lectures, computer-marked assessments, adaptive learning, learning analytics, and artificial intelligence – is that it is based on a model of learning with limited applications. Behaviourism works well in assisting rote memory and basic levels of comprehension, but it does not enable or facilitate deep learning, critical thinking and the other skills that are essential for learners in a digital age.

The Susskinds (2015) in particular argue that we are entering a new age of artificial intelligence and adaptive learning, driven primarily by what they call the brute force of more powerful computing. AI failed so dramatically in the 1980s, they argue, because computer scientists tried to mimic the way that humans think, and computers then did not have the capacity to handle information in the way they do now. When, however, we harness the power of today’s computing, it can solve previously intractable problems by analysing massive amounts of data in ways that humans had not considered.

There are several problems with this argument. The first is that the Susskinds are right that computers operate differently from humans, but this difference is fundamental. Computers are mechanical and work essentially on binary logic. Humans are biological and operate in a far more sophisticated way, capable of language creation as well as language interpretation, and of using intuition as well as deductive thinking. Emotion as well as memory drives human behaviour, including learning. Furthermore, humans are social animals, and depend heavily on social contact with other humans for learning. In essence, humans learn differently from the way machine automation operates.

Unfortunately, computer scientists frequently ignore, or are unaware of, the research into human learning. In particular, they are often unaware that learning is largely developmental and constructed, and instead impose an older and less appropriate model of teaching based on behaviourism and an objectivist epistemology. If, though, we want to develop the skills and knowledge needed in a digital age, we need a more constructivist approach to learning.

Supporters of automation also make another mistake, in over-estimating or misunderstanding how AI and learning analytics operate in education. These tools reflect a highly objectivist approach to teaching, in which procedures can be analysed and systematised in advance. However, although we know a great deal about learning in general, we still know very little about how thinking and decision-making operate biologically in individual cases. Brain research promises to unlock some of these secrets, but most brain scientists argue that, while we are beginning to understand the relationship between brain activity and very specific forms of behaviour, there is a huge distance to travel before we can explain how these mechanisms affect learning in general, or how an individual learns in particular. There are too many variables at play (emotion, memory, perception and communication, as well as neural activity) to find an isomorphic fit between the firing of neurons and computer ‘intelligence’.

The danger then with automation is that we drive humans to learn in ways that best suit how machines operate, and thus deny humans the potential of developing the higher levels of thinking that make humans different from machines. For instance, humans are better than machines at dealing with volatile, uncertain, complex and ambiguous situations, which is where we find ourselves in today’s society.

Lastly, both AI and adaptive learning depend on algorithms that predict or direct human behaviour, and these algorithms are not transparent to the end users. For example, learning analytics are being used to identify students at high risk of failure, based on correlations with the online behaviour of previous students. But should a software program be making the decision as to whether an individual is suitable for higher education or for a particular course? If so, should that person know the grounds on which they are considered unsuitable, and be able to challenge the algorithm, or at least the principles on which it is based? Who makes the decisions about these algorithms – a computer scientist using correlated data, or an educator concerned with equitable access? The more we try to automate learning, the greater the danger of unintended consequences, and the more need for educators rather than computer scientists to control the decision-making.
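To make the transparency concern concrete, here is a deliberately simplified, hypothetical sketch of how such an at-risk prediction can work – a logistic regression trained on past students’ engagement records. The features, data and threshold are entirely invented; no institution’s actual system looks like this. The point is that even in this toy version, the ‘grounds’ for flagging a student are nothing more than learned weights on behavioural correlates:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.1, epochs=2000):
    """Fit logistic-regression weights by stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # prediction error for this past student
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Invented history: [logins per week, assignments submitted];
# label 1 = that past student failed, 0 = passed.
X = [[1, 0], [2, 1], [1, 1], [8, 4], [9, 5], [7, 4]]
y = [1, 1, 1, 0, 0, 0]
w, b = train(X, y)

def risk(student):
    """Predicted probability of failure for a new student's engagement record."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, student)) + b)

print(risk([1, 0]))  # low engagement  -> high predicted risk
print(risk([8, 5]))  # high engagement -> low predicted risk
print(w, b)          # the learned weights are inspectable in principle
```

Note that the weights *can* be printed and examined – the transparency question is whether the student who is flagged, or the educator acting on the flag, ever gets to see them, and who decides what counts as ‘risky’ behaviour in the first place.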

The way forward

I used to think of computer scientists as colleagues and friends in designing and delivering online learning. I now increasingly see at least some of them as the enemy. This is largely to do with the hubris of Silicon Valley, which believes that computer scientists can solve any problem without knowing anything about the problem itself. MOOCs based on recorded lectures are a perfect example, developed primarily by a few computer scientists from Stanford (and unfortunately blindly copied by many people in universities who should have known better).

We need to start with the problem, which is how to prepare learners with the knowledge and skills they will need in today’s society. I have argued (Bates, 2015) that we need to develop, in very large numbers of people, high-level intellectual and practical skills that require the construction and development of knowledge, and that enable learners to find, analyse, evaluate and apply knowledge appropriately.

This requires a constructivist approach to learning which cannot be appropriately automated, as it depends on high quality interaction between knowledge experts and learners. There are many ways to accomplish this, and technology can play a leading role, by enabling easy access to knowledge, providing opportunities for practice in experientially-based learning environments, linking communities of scholars and learners together, providing open access to unlimited learning resources, and above all by enabling students to use technology to access, organise and demonstrate their knowledge appropriately.

These activities and approaches do not easily lend themselves to massive economies of scale through automation, although they do enable more effective outcomes and possibly some smaller economies of scale. Automation can be helpful in developing some of the foundations of learning, such as basic comprehension or language acquisition. But at the heart of developing the knowledge and skills needed in today’s society, the role of a human teacher, instructor or guide will remain absolutely essential. Certainly, the roles of teachers and instructors will need to change quite dramatically, teacher training and faculty development will be critical for success, and we need to use technology to enable students to take more responsibility for their own learning, but it is a dangerous illusion to believe that automation is the solution to learning in the 21st century.

Protecting the future

There are several practical steps that need to be taken to prevent the automation of teaching.

  1. Educators – and in particular university presidents and senior civil servants with responsibility for education – need to speak out clearly about the dangers of automation, and about the alternative uses of technology that still exploit its potential and can lead to greater cost-effectiveness. This is not an argument against the use of technology in education, but for using it wisely, so that we get the kind of educated population we need in the 21st century.
  2. Computer scientists need to show more respect to educators and be less arrogant. This means working collaboratively with educators, and treating them as equals.
  3. We – teachers and educational technologists – need to apply in our own work and disseminate better to those outside education what we already know about effective learning and teaching.
  4. Faculty and teachers need to develop compelling technology alternatives to automation that focus on the skills and knowledge needed in a digital age, such as:
    • experiential learning through virtual reality (e.g. Loyalist College’s training of border service agents)
    • networking learners online with working professionals, to solve real world problems (e.g. by developing a program similar to McMaster’s integrated science program for online/blended delivery)
    • building strong communities of practice through connectivist MOOCs (e.g. on climate change or mental health) to solve global problems
    • empowering students to use social media to research and demonstrate their knowledge through multimedia e-portfolios (e.g. UBC’s ETEC 522)
    • building openly accessible, high-quality, student-activated simulations and games, designed and monitored by experts in the subject area.
  5. Governments need to put as much money into research into learning and educational technology as they do into innovation in industry. Without better and more defensible theories of learning suitable for a digital age, we are open to any quack or opportunist who believes he or she has the best snake oil. More importantly, with better theory and knowledge of learning disseminated and applied appropriately, we can have a much more competitive workforce and a more just society.
  6. We need to educate our politicians about the dangers of commercialization in education through the automation of learning and fight for a more equal society where the financial returns on technology applications are more equally shared.
  7. We need to become edupunks and take back the web from powerful commercial interests, by using open-source, low-cost, easy-to-use tools in education that protect our privacy and enable learners and teachers to control how they are used.

That should keep you busy in 2016.

Your views are of course welcome – unless you are a bot.

References

Bates, A. (1986) Computer assisted learning or communications: which way for information technology in distance education? Journal of Distance Education Vol. 1, No. 1

Bates, A. (2015) Teaching in a Digital Age Victoria BC: BCcampus

Harasim, L. (2012) Learning Theory and Online Technologies New York/London: Routledge

Mason, R. and Kaye, A. (Eds) (1989) Mindweave: Communication, Computers and Distance Education Oxford: Pergamon

McGugan, I. (2015) Why the Zuckerberg donation is not a bundle of joy, Globe and Mail, December 2

Noble, D. (1998) Digital Diploma Mills, Monthly Review http://monthlyreview.org/product/digital_diploma_mills/

Papert, S. (1980) Mindstorms: Children, Computers and Powerful Ideas New York: Basic Books

Skinner, B. (1968) The Technology of Teaching New York: Appleton-Century-Crofts

Susskind, R. and Susskind, D. (2015) The Future of the Professions: How Technology will Change the Work of Human Experts Oxford UK: Oxford University Press

Watters, A. (2015) The Business of EdTech, Hack Edu, undated http://2015trends.hackeducation.com/business.html

Winters, M. (2015) Christmas Bonus! US Edtech Sets Record With $1.85 Billion Raised in 2015 EdSurge, December 21 https://www.edsurge.com/news/2015-12-21-christmas-bonus-us-edtech-sets-record-with-1-85-billion-raised-in-2015