September 20, 2018

Why is innovation in teaching in higher education so difficult? 3. Learning management systems

Reasons for using a Learning Management System

I pointed out in my previous post that the LMS is a legacy system that can inhibit innovation in teaching. In an earlier post I had also pointed to articles about the future of Blackboard and other proprietary LMSs, and commented that

what surprises me is that in an age of multimedia and social media … anyone is using an LMS at all.

This provoked an unusually large number of comments, both on my blog and on Twitter, some supporting my position and many more critical of it. 

The main critical points made were that LMSs have many advantages:

  • convenience: an LMS is the most effective way to organise teaching materials, activities, grading, and student tracking;
  • linked to convenience: it is too much to expect instructors to integrate a range of tools from scratch; the LMS is a simpler way to do this;
  • compliance and security: an LMS is safer than general, public apps (less open to hacking), protects student privacy, and allows for audit/management of grievances.

I will try to address these points below, but note that none of these advantages has anything to do with improving students’ learning – they are mainly instructor, legal, administrative and institutional benefits.

I do not underestimate the importance of convenience to faculty and administrators, and of privacy and security for students, but I would like to see this balanced against the potential learning benefits of using something other than a learning management system. I will also argue that there are other ways to address convenience and privacy/security issues.

What do I mean by an LMS?

One of the issues here is definition. You can define an LMS so broadly that even a physical campus institution could be considered a learning management system. I want to make a distinction in particular between a ‘course’ and an LMS. By LMS I mean basically the off-the-shelf, proprietary software platforms such as Blackboard, Canvas, Brightspace and Moodle that are used in 90% or more of post-secondary institutions, at least in Canada. I don’t include platforms developed on a one-off basis for a particular institution or academic department, or by an individual instructor, as I see these as bespoke rather than off-the-shelf.

Until quite recently, I believed that any of these proprietary LMSs was flexible enough to allow me to teach in the way I wanted. I could post content, determine a schedule for what had to be covered each week, set student activities such as graded or ungraded assignments, communicate individually or in a group with students, set up discussion forums, choose topics for discussion, monitor the discussions, set and mark assessments, grade students, post their grades to the student information system, and give individual or group feedback, all in a secure online environment. 

However, I no longer wish to teach like that. With an LMS, I am given a tool then required to fit my teaching within the boundaries of that tool. I will shortly describe why I want to teach differently, but the essence here is that I want software solutions that fit the way I want to teach.  I want to decide how I want to teach, and more importantly, how I want my students to study, and then find the tool or tools that will allow me and them to do that. If I can be persuaded that an LMS can meet that requirement, fine, but I don’t believe at the moment that this is the case.

Why I want to change my approach to teaching and learning

Basically, in my previous approach, the focus was on me defining the curriculum/what had to be studied, the transmission of this knowledge to students, helping them to develop understanding and critical thinking about this content, and assessing the students. There was a focus on both content and skills, but a limited range of skills. In particular, I was the one who primarily defined what students had to know, and provided or directed them to the relevant content sources.

In a digital age, I don’t believe that this is any longer a satisfactory approach. I was doing most of the hard work, in defining what to read and what students should do. They were limited largely to written or online multiple-choice assessments to demonstrate what they had learned. Of course, students liked this. It was clear what they had to do, not just each week but often daily. They had a clear choice: do what I told them, or fail.

I have written extensively in Teaching in a Digital Age about my ‘new’ approach to teaching and learning (although actually it’s not new – it is similar to an approach that I and some other teachers used when teaching in elementary schools in Britain in the 1960s, which was then called ‘discovery learning’ – see Bruner, 1961).

In essence, there is too much new knowledge being generated every day in every discipline for students to be able to master it all, particularly within the scope of a four year degree or even seven years’ higher education. Secondly, information is everywhere on the Internet. I don’t have to provide most of the content I wish to teach; it’s already out there somewhere.

The challenge now is to know where to find that information, how to analyse it, how to evaluate its reliability and relevance, and then how to organise and apply it in appropriate ways. This means knowing how to navigate the Internet, how to behave responsibly and ethically online, and how to protect one’s privacy and that of others. I used to do all that for students; now I want them to learn how to do it themselves.

I therefore want students not only to know things, but to be able to apply their knowledge appropriately within specific contexts. I want them in particular to develop the skills of independent learning, critical thinking, problem solving, and a broad digital literacy, because these are the skills they will need once they have left post-secondary education (or more accurately, skills that they will continue to develop after completing a formal qualification). 

I realise that this approach will not suit all instructors or fit well with every subject area, although I think these are challenges that most subject disciplines are now facing in a digital era.

What do I need to do to teach in this way?

I think it will help to use the concepts of ‘inside’ and ‘outside’. ‘Inside’ is within the relatively safe, secure confines of the institution (I am still talking digitally here). To be inside you must be a registered student (or an institutionally employed instructor). What happens in Vegas, stays in Vegas. Students can discuss possibly highly controversial issues with other students and their instructors in an open, academic way, without fear of being sued, imprisoned or ridiculed. Their work and grades are secure (unless they choose to make them public). The same applies to instructors. They can communicate individually with students or with the class as a whole, but it remains confidential within the boundaries of the institution.

‘Outside’ is whatever is available publicly through the Internet. This can be open educational resources, public reports, open data, open journals, open textbooks, publicly available YouTube videos, Wikipedia, and social media such as Facebook. It can also be student blogs and wikis, student-made YouTube videos, and those parts of their e-portfolios – a record of their studies – that they choose to make public. Students may also choose to use social media as part of their studies, but they will need to know that this is public rather than private or secure, and what the risks are.

For me, most student learning will be done outside: finding, analysing, demonstrating and testing what they have learned. Some inter-student discussion or engagement with external sources such as the general public may take place outside, but students will be provided with guidelines or even rules about what is appropriate for discussion in public forums. Again, instructors will vary in the amount of learning they want done outside, but in my case I would like to push as much as possible ‘outside’ without compromising student security or safety. However, managing risk is a critical part of the learning process here for student and instructor alike.

It will still be necessary to provide a structure and schedule for the course, in terms of desired learning outcomes, student activities and when they are to be completed, and assessment rubrics. These guidelines can be strict and rigid, or open and vague, depending on the needs of the students and the learning objectives.

Student assessment will be mainly through written or multi-media reporting, organised probably through e-portfolios, which will have both a private and a public section. The students will choose (within guidelines) what to make public. Assessment will be continuous, as the e-portfolio is developed.
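The private/public split described above can be sketched as a simple data structure. This is a hypothetical illustration only – the class and method names are mine, not those of any actual e-portfolio product:

```python
from dataclasses import dataclass, field

@dataclass
class Artefact:
    """A single piece of student work in the e-portfolio."""
    title: str
    content: str
    public: bool = False   # private by default; the student opts in to sharing

@dataclass
class EPortfolio:
    student: str
    artefacts: list = field(default_factory=list)

    def add(self, artefact: Artefact):
        self.artefacts.append(artefact)

    def publish(self, title: str):
        """Student chooses (within guidelines) to make a named artefact public."""
        for a in self.artefacts:
            if a.title == title:
                a.public = True

    def public_view(self):
        """What an outside reader sees; continuous assessment can still
        draw on the full (private) portfolio."""
        return [a.title for a in self.artefacts if a.public]
```

The design point is simply that visibility is a per-artefact decision made by the student, not a property of the whole portfolio.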

Is an LMS necessary for this kind of teaching?

This is where I need help. I am not an IT expert, and I’m not up-to-date with all the tools that are now available. If you can show me that I can do all these things within one of the current proprietary LMSs, then that’s fine with me, but unless they have changed significantly since I last used one, I will be surprised. I will though accept that perhaps for the ‘inside’ work, an LMS might be suitable, but it has to be integrated in some way with the outside work.

Here’s where I need the feedback of my readers. Many of you have to grapple with these issues every day. What I am NOT willing to do though is to compromise my vision of teaching to fit an institutional, proprietary software platform.

So can a current proprietary LMS meet my needs?

Over to you!

Reference

Bruner, J. S. (1961) ‘The act of discovery’, Harvard Educational Review, Vol. 31, No. 1, pp. 21–32.

Woolf University: the Airbnb of higher education or a sheep in wolf’s clothing?

Broggi, J.D. et al. (2018) Building the first blockchain university, Oxford UK, April 3

You are going to hear a lot about Woolf University over the next year or so and possibly much longer. This is in some ways a highly innovative proposal for a new type of university, but in other ways, it is a terribly conservative proposal, an extension of the Platonic dialogue to modern times. It could only have come from Oxford University academics, with its mix of blue sky dreaming, the latest technological buzz, and regression to cloistered academe.

The proposal

As always, I am going to recommend that you read the original paper from cover to cover. It has a number of complex, radical proposals that each need careful consideration (the whitepaper would make an excellent topic for an Oxbridge tutorial).

I am not sure I completely understand the financial aspect of the blockchain tokens (but that probably puts me with 99.99999 per cent of the rest of the world). But the basic ideas behind the university are as follows:

  • Woolf University will issue blockchain-guaranteed ‘contracts’ between an individual professor and an individual student;
  • Woolf University will initially include only professors who have a post-graduate research degree from one of the 200 ‘top-ranked’ universities;
  • the core blockchain contract consists of an agreement to deliver a one hour, one-on-one tutorial, for which the student will directly pay the instructor (in real money, but tied to a blockchain token system which I don’t fully understand);
  • the tutorial can be delivered face-to-face, or over the Internet (presumably synchronously – Skype is suggested), but the maximum number of students per tutorial is set at two;
  • the contract (and payment) is initiated once the student ‘accepts’ the contract with a push of a button on their cell phone. If the tutor fails to deliver the tutorial, the student is automatically refunded (and offered another instructor). Instructors who miss a tutorial will be fined by the university in the form of a deduction from the next tutorial payment;
  • on successful completion of the tutorial (which will include a written essay or other assessable pieces of work from the student) the blockchain registers the grade against the student record;
  • once the student has accumulated enough ‘credits’ within an approved program they will be issued with a Woolf University degree;
  • a full student workload consists of two classes a week over 8 weeks in each of three semesters or a total of 144 meetings over three years for a degree;
  • annual tuition is expected to be in the order of $20,000 a year, excluding scholarships;
  • instructor payments will depend on the number and cost of tutorials, but at four a week would range from $38,000 to $43,000 per annum with fees in the range of $350-$400 per tutorial;
  • colleges of a minimum of 30 individual instructors can join Woolf University and issue their own qualifications, but each college’s qualification requirements must also meet Woolf University’s criteria. Colleges can set their own tutorial fee above a minimum of $150 an hour. Colleges’ instructors must meet the qualification requirements of Woolf University;
  • the first college, called Ambrose, will consist of 50 academics from Oxford University, and Woolf has invited academics from Cambridge University to set up another college;
  • Woolf University will be a not-for-profit institution. There will be a deduction of 0.035% of each financial transaction to build the Woolf Reserve to update and maintain the blockchain system. There will also be a student financial aid program for scholarships for qualified students;
  • Woolf University would be managed by a Faculty Council with voting rights on decision-making from every employed instructor;
  • Ambrose College will deduct 4% from each tuition fee for administrative overheads.

There are other proposals such as a language school, peer review, etc.
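To make the mechanics above concrete, here is a minimal sketch of the contract lifecycle and deductions as I understand them from the whitepaper. The class and figures are my own reconstruction – this is not Woolf’s actual smart-contract code, and the figure of roughly 27 teaching weeks is back-calculated by me from the quoted income range, not stated in the whitepaper:

```python
from dataclasses import dataclass

RESERVE_CUT = 0.00035   # 0.035% of each transaction to the Woolf Reserve
COLLEGE_CUT = 0.04      # 4% Ambrose College deduction from each tuition fee

@dataclass
class TutorialContract:
    student: str
    instructor: str
    fee: float            # per-tutorial fee, e.g. $350-$400
    accepted: bool = False
    delivered: bool = False

    def accept(self):
        """Student pushes the button: the contract (and payment) is initiated."""
        self.accepted = True

    def settle(self):
        """On completion, pay the instructor net of deductions; if the
        tutorial was not delivered, the student is automatically refunded."""
        if not self.accepted:
            return {"instructor": 0.0, "refund": 0.0}
        if not self.delivered:
            return {"instructor": 0.0, "refund": self.fee}
        deductions = self.fee * (RESERVE_CUT + COLLEGE_CUT)
        return {"instructor": self.fee - deductions, "refund": 0.0}

# Rough check of the stated income range: four tutorials a week at
# $350-$400 each over ~27 teaching weeks lands in the $38,000-$43,000
# band quoted above.
for fee in (350, 400):
    print(f"${fee}/tutorial: ~${fee * 4 * 27:,} per annum")
```

The sketch omits the token economics entirely (which, as noted, I don’t fully understand), as well as the fine levied on instructors who miss a tutorial.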

What’s to like?

This is clearly an effort to cut out the institutional middleman of university and institutional administration. Although the tuition fees are close to the average of universities in the UK and the more elite state universities in the USA, students are getting a one-on-one learning experience from an instructor who is highly qualified (at least in terms of content).

I was fortunate to have a tutorial system when I was an undergraduate at the University of Sheffield in the UK, and it worked very well, although we had between two and four students at each tutorial, and only in the last two years of my bachelor’s degree. Such tutorials are excellent for developing critical thinking skills, because each statement you make as a student is likely to be challenged by the professor or one of the other students.

Woolf University has highly idealistic goals for democratic governance – by the faculty – and its main attraction is that it offers alternative and regular employment to the very large number of poorly paid but highly qualified adjunct professors who cannot get tenure at regular universities. However, there is no suggestion of student representation in the governance process, and the use of faculty is demand-driven – if no student wants your course, no money – which seems an even more precarious position than working as an adjunct.

Most of all, though, it is a serious attempt to provide an independent system of academic validation of qualifications through the use of blockchain which could lead to better standardization of degree qualifications.

What’s not to like?

Well, the first thing that jumps to my mind is conflict of interest. If faculty are already employed by a traditional university, Woolf will be a direct, and if successful, a very dangerous competitor. Will universities allow their best faculty to moonlight for a direct competitor? If instructors cannot get employment in a traditional university, will they be as well qualified as the instructors in the regular system? The corollary though is that Woolf may force universities to pay their adjunct faculty better, but that will increase costs for the existing universities.

Second, the tuition fees may be reasonable compared with the absurdly inflated tuition fees in the UK, but they are double or triple the fees in Canada, and much higher than fees in the rest of Europe.

Third, the tutorial is just one mode of teaching. The report recommends (but does not insist) that instructors also provide recorded lectures, but there are now so many other ways for students to learn that it seems absurd to tie Woolf to the one method Oxbridge dons are familiar with. The proposal does not address the issue of STEM teaching or experiential learning. All the examples given are from Greek philosophy. Not all my tutorials were great – much depended on the excellence of the professor as a teacher as well as a scholar, and that varied significantly. (It is also clear from reading the report that the authors have no knowledge of best practices in online teaching.) The whole proposal reeks of the worst kind of elitism in university teaching.

Will it succeed?

Quite possibly, if it can sell the substitute Oxbridge experience to students and if it can explain more clearly its business model and in particular how the blockchain currency will work with regard to the payment of instructors. What can make or break it is the extent to which traditional universities will go to protect their core faculty from being hijacked by Woolf. 

I’m somewhat baffled by the claims that this new business model will be much, much more cost-effective than the current system. Academic salaries make up almost 70% of the costs of a traditional university, so the savings on administration alone are a comparatively small proportion of the costs of higher education, and the proposed tuition fees are still very high. It seems to be more a solution to the problem of unemployed Ph.D.s than to the problem of expanding quality higher education more cost-effectively to large numbers of students.
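A back-of-the-envelope check makes the point. The 70% figure is the one quoted above; the one-third cut in administrative overhead is purely an assumed illustration:

```python
# If academic salaries are ~70% of a traditional university's costs,
# then everything else - administration included - is at most 30%.
academic_salary_share = 0.70
max_other_costs = 1.0 - academic_salary_share

print(f"Ceiling on savings from cutting ALL non-salary costs: {max_other_costs:.0%}")

# Even a generous one-third reduction in that overhead trims total
# costs by only about 10% - nowhere near a transformation of the
# economics of higher education.
admin_cut = max_other_costs * (1 / 3)
print(f"Saving from cutting overhead by a third: {admin_cut:.0%}")
```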

Nevertheless, it is a very interesting development. I am guessing that this will ultimately fail, because establishing its credentials as equivalent to the elite universities will be a hard sell, and costs to students will be too high, but much will be learned about the strengths and weaknesses of blockchain in higher education, resulting in a better/more sustainable higher education model developing in another way. It is definitely a development to be carefully tracked.


Towards an open pedagogy for online learning

Image: © University of Victoria, BC


The problems with OER

I was interviewed recently by a reporter writing an article on OER (open educational resources) and I found myself being much more negative than I expected, since I very much support the principle of openness in education. In particular, I pointed out that OER, while slowly growing in acceptance, are still used for a tiny minority of teaching in North American universities and colleges. For instance, open textbooks are a no-brainer, given the enormous savings they can bring to students, but even in the very few state or provincial jurisdictions that have an open textbook program, take-up is still very slow.

I have written elsewhere in more detail about why this is so, but here is a summary of the reasons:

  • lack of suitable OER: finding the right OER for the right context. This is a problem that is slowly disappearing as more OER become available, but in too many instances it is still difficult to find exactly the right kind of OER to fit a particular teaching context. It is, though, a limitation that I believe will not last much longer (for the reasons, read on).
  • the poor quality of what does exist. This is not so much the quality of the content as the quality of the production. Most OER are created by an individual instructor working alone, or at best with an instructional designer. This is a cottage-industry approach to design. I have been on funding review committees where institutions throughout a province were bidding for funds for course development or OER production. In one case I reviewed requests from about eight different institutions for funds to produce OER for statistics. Each institution (or rather faculty member) made its proposal in isolation from the others. I strongly recommended that the eight faculty members get together and design a set of OER jointly, to benefit from a larger input of expertise and resources. That way all eight institutions would be likely to use the combined OER, and the OER would likely be of much higher quality as a result.
  • the benefits are less for instructors than for students. Faculty, for instance, set the textbook requirement, and in most cases don’t have to pay for the book themselves. With the textbook often comes a whole package of support materials from the publisher, such as tests, supplementary materials and model answers (which is why the textbook is so expensive). This makes life easier for instructors, but it is the students who have to pay the cost.
  • OER take away the ‘ownership’ of knowledge from the instructor. Instructors do not see themselves as merely distributors of information, a conveyor belt along which ‘knowledge’ passes, but as constructors of knowledge. They see their lecture as unique and individual, something the student cannot get from someone else. And often it is unique, with an instructor’s personal spin on a topic. OER take away from instructors what they see as being most important about their teaching: their unique perspective on a topic.
  • and now we come to what I think is the main problem with OER: OER do not make much sense out of context. Too often the approach is to create an OER then hope that others will find applications for it. But this assumes that knowledge is like a set of bricks. All you have to do is to collect bricks of knowledge together, add a little mortar, and lo, you have a course. The instructor chooses the bricks and the students apply the mortar. Or you have a course but you need to fill some holes in it with OER. I suggest these are false metaphors for teaching, or at least for how people learn. You need a context, a pedagogy, where it makes sense to use open resources.

Towards an open pedagogy

I am making three separate but inter-linked arguments here:

  • OER are too narrowly defined and conceptualized
  • we need to design teaching in such a way that it is not just sensible to use OER but unavoidable
  • we should start by defining what we are trying to achieve, then identify how OER will enable this.

So I will start with the last argument first.

Developing the knowledge and skills needed in the 21st century

Again I have written extensively about this (see Chapter 1 of Teaching in a Digital Age), but in essence we need to focus specifically on developing core ‘soft’ or ‘intellectual’ skills in our students, and especially the core skills of independent learning and knowledge management. Put in terms of learning outcomes, in a world where the content component of knowledge is constantly developing and growing, students need to learn independently so they can continue to learn after graduation, and students also need to know how to find, analyse, evaluate, and apply knowledge.

If we want students to develop these and other ‘soft’ skills such as problem-solving, critical thinking, evidence-based argumentation, what teaching methods or pedagogy should we adopt and how would it differ from what we do now?

The need for teaching methods that are open rather than closed

The first thing we should recognise is that in a lecture based methodology, it is the instructor doing the knowledge management, not the student. The instructor (or his or her colleagues) decide the curriculum, the required reading, what should be covered in each lecture, how it should be structured, and what should be assessed. There is little independence for the learner – either do what you are instructed to do, or fail. That is a closed approach to teaching.

I am suggesting that we need to flip this model on its head. It should ultimately be the students learning and deciding what content is important, how it should be structured, how it can be applied. The role of the instructor then would not be to choose, organise and deliver content, but to structure the teaching to enable students to do this effectively themselves.

This should not be a sudden process, in which students switch abruptly from a lecture-based format as undergraduates to a more open structure as post-graduates, but one that is slowly and increasingly developed throughout an undergraduate program, or a two-year college program where soft skills are considered important. One way – although there are many others – of doing this is through project- or problem-based learning, where students start with real challenges and then develop the knowledge and skills needed to address them.

This does not mean we no longer need subject specialists or content experts. Indeed, a deep understanding of a subject domain is essential if students are to be steered and guided and properly assessed. However, the role of the subject specialist is fundamentally changed. He or she is now required to set their specialist knowledge in a context that enables student discovery and exploration, and student responsibility for learning. The specialist’s role now is to support learning, by providing appropriate learning contexts, guidance to students, criteria for assessing the quality of information, and quality standards for problem-solving, knowledge management and critical thinking, etc.

A new definition of open resources

Here I will be arguing for a radical change: the dropping of the term ‘educational’ from OER.

If students are to develop the skills identified earlier, they will need access to resources: research papers, reports from commissions, case-study material, books, first-hand reports, YouTube videos, a wide range of opinions or arguments about particular topics, as well as the increasing amount of specifically designated open educational resources, such as recorded lectures from MIT and other leading research universities.

Indeed, increasingly all knowledge is becoming open and easily accessible online. All publicly funded research in many countries must now be made available through open access journals, increasingly government and even some commercial data (think government commission reports, environmental assessments, public statistics, meteorological models) are now openly accessible online, and this will become more and more the norm. In other words, all content is becoming more free and more accessible, especially online.

With that comes of course more unreliable information, more false truths, and more deliberate propaganda. What better preparation for our students’ future is there than equipping them with the knowledge and skills to sift through this mass of contradictory information?  What better than to make them really good at identifying the true from the false, to evaluate the strength of an argument, to assess the evidence used to support an argument, whatever the subject domain? To do this though means exposing them to a wide range of openly accessible content, and providing the guidance and criteria, and the necessary prior knowledge, that they will need to make these decisions.

But we cannot do this if we restrict our students to already ‘approved’ OER. All content eventually becomes an educational resource, a means to help students to differentiate, evaluate and decide. By naming content as ‘educational’ we are already validating its ‘truth’ – we are in fact closing the mind to challenge. What we want is access to open resources – full stop. Let’s get rid of the term OER and instead fight for an open pedagogy.

Automation or empowerment: online learning at the crossroads

Image: AppLift, 2015

You are probably, like me, getting tired of the different predictions for 2016. So I’m not going to do my usual look forward for the year for individual developments in online learning. Instead, I want to raise a fundamental question about which direction online learning should be heading in the future, because the next year could turn out to be very significant in determining the future of online learning.

The key question we face is whether online learning should aim to replace teachers and instructors through automation, or whether technology should be used to empower not only teachers but also learners. Of course, the answer will always be a mix of both, but getting the balance right is critical.

An old but increasingly important question

This question, automation or human empowerment, is not new. It was raised by B.F. Skinner (1968) when he developed teaching machines in the early 1960s. He thought teaching machines would eventually replace teachers. On the other hand, Seymour Papert (1980) wanted computing to empower learners, not to teach them directly. In the early 1980s Papert got children to write computer code to improve the way they think and to solve problems. Papert was strongly influenced by Jean Piaget’s theory of cognitive development, and in particular the idea that children construct rather than absorb knowledge.

In the 1980s, as personal computers became more common, computer-assisted learning (CAL) became popular, using computer-marked tests and early forms of adaptive learning. Also in the 1980s came the first applications of artificial intelligence, in the form of intelligent math tutoring. Great predictions were made then, as now, about the potential of AI to replace teachers.

Then along came the Internet. Following my first introduction to the Internet in a friend’s basement in Vancouver, I published an article in the first edition of the Journal of Distance Education, entitled ‘Computer-assisted learning or communications: which way for IT in distance education?’ (1986). In this paper I argued that the real value of the Internet and computing was to enable asynchronous interaction and communication between teacher and learners, and between learners themselves, rather than as teaching machines. This push towards a more constructivist approach to the use of computing in education was encapsulated in Mason and Kaye’s book, Mindweave (1989). Linda Harasim has since argued that online collaborative learning is an important theory of learning in its own right (Harasim, 2012).

In the 1990s, David Noble of York University attacked online learning in particular for turning universities into ‘Digital Diploma Mills’:

‘universities are not only undergoing a technological transformation. Beneath that change, and camouflaged by it, lies another: the commercialization of higher education.’

Noble (1998) argued that

‘high technology, at these universities, is often used not to … improve teaching and research, but to replace the visions and voices of less-prestigious faculty with the second-hand and reified product of academic “superstars”.’

However, contrary to Noble’s warnings, for fifteen years most university online courses followed the route of interaction and communication between teachers and students rather than that of computer-assisted learning or video lectures, and Noble’s arguments were easily dismissed or forgotten.

Then along came lecture capture and with it, in 2011, Massive Open Online Courses (xMOOCs) from Coursera, Udacity and edX, driven by elite, highly selective universities, with their claims of making the best professors in the world available to everyone for free. Noble’s nightmare suddenly became very real. At the same time, these MOOCs have resulted in much more interest in big data, learning analytics, a revival of adaptive learning, and claims that artificial intelligence will revolutionize education, since automation is essential for managing such massive courses.

Thus we are now seeing a big swing back to the automation of learning, driven by powerful computing developments, Silicon Valley start-up thinking, and a sustained political push from those that want to commercialize education (more on this later). Underlying these developments is a fundamental conflict of philosophies and pedagogies, with automation being driven by an objectivist/behaviourist view of the world, compared with the constructivist approaches of online collaborative learning.

In other words, there are increasingly stark choices to be made about the future of online learning. Indeed, it is almost too late – I fear the forces of automation are winning – which is why 2016 will be such a pivotal year in this debate.

Automation and the commercialization of education

These developments in technology are being accompanied by a big push in the United States, China, India and other countries towards the commercialization of online learning. In other words, education is being seen increasingly as a commodity that can be bought and sold. This is not through the previous and largely discredited digital diploma mills of the for-profit online universities such as the University of Phoenix that David Noble feared, but rather through the encouragement and support of commercial computer companies moving into the education field, companies such as Coursera, Lynda.com and Udacity.

Audrey Watters and EdSurge both produced lists of ed tech ‘deals’ in 2015 totalling between $1 billion and $2 billion. Yes, that’s right: $1-$2 billion of investment in private ed tech companies in the USA (and China) in one year alone. At the same time, entrepreneurs are struggling to develop sustainable business models for ed tech investment, because, with education funded publicly, a ‘true’ market is restricted. Politicians, entrepreneurs and policy makers on the right in the USA increasingly see a move to automation as a way of reducing government expenditure on education, and one means by which to ‘free up the market’.

Another development that threatens the public education model is the move by very rich entrepreneurs such as the Gates, the Hewletts and the Zuckerbergs to shift their massive personal wealth into ‘charitable’ foundations or corporations and use this money for their pet ‘educational’ initiatives that also have indirect benefits for their businesses. Ian McGugan (2015) in the Globe and Mail newspaper estimates that the Chan Zuckerberg Initiative is worth potentially $45 billion, and one of its purposes is to promote the personalization of learning (another term hi-jacked by computer scientists; it’s a more human way of describing adaptive learning). Since one way Facebook makes its money is by selling personal data, forgive my suspicions that the Zuckerberg initiative is a not-so-obvious way of collecting data on future high earners. At the same time, the Chan Zuckerberg Initiative enables the Zuckerbergs to avoid paying tax on their profits from Facebook. Instead of paying taxes that could be used to support public education, then, these immensely rich foundations enable a few entrepreneurs to set the agenda for how computing will be used in education.

Why not?

Technology is disrupting nearly every other business and profession, so why not education? Higher education in particular requires a huge amount of money, mostly raised through taxes and tuition fees, and it is difficult to tie results directly to investment. Surely we should be looking at ways in which technology can change higher education so that it is more accessible, more affordable and more effective in developing the knowledge and skills required in today’s and tomorrow’s society?

Absolutely. It is not so much the need for change that I am challenging, but the means by which this change is being promoted. In essence, a move to automated learning, while saving costs, will not improve the learning that matters, and particularly the outcomes needed in a digital age, namely, the high-level intellectual skills of critical thinking, innovation, entrepreneurship, problem-solving, high-level multimedia communication, and above all, effective knowledge management.

To understand why automated approaches to learning are inappropriate to the needs of the 21st century we need to look particularly at the tools and methods being proposed.

The problems with automating learning

The main challenge for computer-directed learning such as information transmission and management through Internet-distributed video lectures, computer-marked assessments, adaptive learning, learning analytics, and artificial intelligence is that they are based on a model of learning that has limited applications. Behaviourism works well in assisting rote memory and basic levels of comprehension, but does not enable or facilitate deep learning, critical thinking and the other skills that are essential for learners in a digital age.

R. and D. Susskind (2015) in particular argue that there is a new age in artificial intelligence and adaptive learning, driven primarily by what they call the brute force of more powerful computing. The reason AI failed so dramatically in the 1980s, they argue, was that computer scientists tried to mimic the way humans think, and computers then did not have the capacity to handle information in the way they do now. When, however, we use the power of today’s computing, it can solve previously intractable problems through analysis of massive amounts of data in ways that humans had not considered.

There are several problems with this argument. The first is that the Susskinds are correct: computers operate differently from humans. Computers are mechanical and work basically on a binary operating system. Humans are biological and operate in a far more sophisticated way, capable of language creation as well as language interpretation, and use intuition as well as deductive thinking. Emotion as well as memory drives human behaviour, including learning. Furthermore, humans are social animals, and depend heavily on social contact with other humans for learning. In essence, humans learn differently from the way machine automation operates.

Unfortunately, computer scientists frequently ignore or are unaware of the research into human learning. In particular, they are unaware that learning is largely developmental and constructed, and instead impose an old and less appropriate method of teaching based on behaviourism and an objectivist epistemology. If, though, we want to develop the skills and knowledge needed in a digital age, we need a more constructivist approach to learning.

Supporters of automation also make another mistake in over-estimating or misunderstanding how AI and learning analytics operate in education. These tools reflect a highly objectivist approach to teaching, where procedures can be analysed and systematised in advance. However, although we know a great deal about learning in general, we still know very little about how thinking and decision-making operate biologically in individual cases. At the same time, although brain research is promising to unlock some of these secrets, most brain scientists argue that while we are beginning to understand the relationship between brain activity and very specific forms of behaviour, there is a huge distance to travel before we can explain how these mechanisms affect learning in general or how an individual learns in particular. There are too many variables (such as emotion, memory, perception, communication, as well as neural activity) at play to find an isomorphic fit between the firing of neurons and computer ‘intelligence’.

The danger then with automation is that we drive humans to learn in ways that best suit how machines operate, and thus deny humans the potential of developing the higher levels of thinking that make humans different from machines. For instance, humans are better than machines at dealing with volatile, uncertain, complex and ambiguous situations, which is where we find ourselves in today’s society.

Lastly, both AI and adaptive learning depend on algorithms that predict or direct human behaviour. These algorithms, though, are not transparent to the end users. To give an example, learning analytics are being used to identify students at high risk of failure, based on correlations with the previous online behaviour of earlier students. However, for an individual, should a software program be making the decision as to whether that person is suitable for higher education or a particular course? If so, should that person know the grounds on which they are considered unsuitable and be able to challenge the algorithm, or at least the principles on which that algorithm is based? Who makes the decision about these algorithms – a computer scientist using correlated data, or an educator concerned with equitable access? The more we try to automate learning, the greater the danger of unintended consequences, and the more need for educators rather than computer scientists to control the decision-making.
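To make the transparency problem concrete, here is a deliberately toy sketch of how such an at-risk flag might work. The feature names, weights and threshold are all invented for illustration; real learning analytics systems are far more elaborate, but the point stands: the learner sees only the verdict, not the weights or the threshold behind it.

```python
import math

# Hypothetical weights, as if learned from correlations in previous
# students' online behaviour. The student never sees these numbers.
WEIGHTS = {"logins_per_week": -0.4, "forum_posts": -0.3, "late_submissions": 0.9}
BIAS = 0.2

def risk_of_failure(activity: dict) -> float:
    """Return a 0-1 'risk' score by passing a weighted sum through a logistic function."""
    z = BIAS + sum(WEIGHTS[k] * activity.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def flag_student(activity: dict, threshold: float = 0.5) -> bool:
    # The cut-off, like the weights, is a design decision hidden from the learner.
    return risk_of_failure(activity) >= threshold
```

A quiet student with two late submissions gets flagged; a highly active one does not. The model encodes correlation, not causes, yet the flag may shape a real decision about a real person.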

The way forward

In the past, I used to think of computer scientists as colleagues and friends in designing and delivering online learning. I am now increasingly seeing at least some of them as the enemy. This is largely to do with the hubris of Silicon Valley, which believes that computer scientists can solve any problem without knowing anything about the problem itself. MOOCs based on recorded lectures are a perfect example of this, being developed primarily by a few computer scientists from Stanford (and unfortunately blindly copied by many people in universities who should have known better.)

We need to start with the problem, which is how to prepare learners for the knowledge and skills they will need in today’s society. I have argued (Bates, 2015) that we need to develop, in very large numbers of people, high-level intellectual and practical skills that require the construction and development of knowledge, and that enable learners to find, analyse, evaluate and apply knowledge appropriately.

This requires a constructivist approach to learning which cannot be appropriately automated, as it depends on high quality interaction between knowledge experts and learners. There are many ways to accomplish this, and technology can play a leading role, by enabling easy access to knowledge, providing opportunities for practice in experientially-based learning environments, linking communities of scholars and learners together, providing open access to unlimited learning resources, and above all by enabling students to use technology to access, organise and demonstrate their knowledge appropriately.

These activities and approaches do not easily lend themselves to massive economies of scale through automation, although they do enable more effective outcomes and possibly some smaller economies of scale. Automation can be helpful in developing some of the foundations of learning, such as basic comprehension or language acquisition. But at the heart of developing the knowledge and skills needed in today’s society, the role of a human teacher, instructor or guide will remain absolutely essential. Certainly, the roles of teachers and instructors will need to change quite dramatically, teacher training and faculty development will be critical for success, and we need to use technology to enable students to take more responsibility for their own learning, but it is a dangerous illusion to believe that automation is the solution to learning in the 21st century.

Protecting the future

There are several practical steps that need to be taken to prevent the automation of teaching.

  1. Educators – and in particular university presidents and senior civil servants with responsibility for education – need to speak out clearly about the dangers of automation, and the technology alternatives available that still exploit its potential and will lead to greater cost-effectiveness. This is not an argument against the use of technology in education, but the need to use it wisely so we get the kind of educated population we need in the 21st century.
  2. Computer scientists need to show more respect to educators and be less arrogant. This means working collaboratively with educators, and treating them as equals.
  3. We – teachers and educational technologists – need to apply in our own work and disseminate better to those outside education what we already know about effective learning and teaching.
  4. Faculty and teachers need to develop compelling technology alternatives to automation that focus on the skills and knowledge needed in a digital age, such as:
    • experiential learning through virtual reality (e.g. Loyalist College’s training of border service agents)
    • networking learners online with working professionals, to solve real world problems (e.g. by developing a program similar to McMaster’s integrated science program for online/blended delivery)
    • building strong communities of practice through connectivist MOOCs (e.g. on climate change or mental health) to solve global problems
    • empowering students to use social media to research and demonstrate their knowledge through multimedia e-portfolios (e.g. UBC’s ETEC 522)
    • designing openly accessible, high-quality, student-activated simulations and games, designed and monitored by experts in the subject area.
  5. Governments need to put as much money into research into learning and educational technology as they do into innovation in industry. Without better and more defensible theories of learning suitable for a digital age, we are open to any quack or opportunist who believes he or she has the best snake oil. More importantly, with better theory and knowledge of learning disseminated and applied appropriately, we can have a much more competitive workforce and a more just society.
  6. We need to educate our politicians about the dangers of commercialization in education through the automation of learning and fight for a more equal society where the financial returns on technology applications are more equally shared.
  7. We need to become edupunks and take back the web from powerful commercial interests, by using open source, low-cost, easy-to-use tools in education that protect our privacy and enable learners and teachers to control how they are used.

That should keep you busy in 2016.

Your views are of course welcome – unless you are a bot.

References

Bates, A. (1986) Computer assisted learning or communications: which way for information technology in distance education? Journal of Distance Education Vol. 1, No. 1

Bates, A. (2015) Teaching in a Digital Age Victoria BC: BCcampus

Harasim, L. (2012) Learning Theory and Online Technologies New York/London: Routledge

Mason, R. and Kaye, A. (eds) (1989) Mindweave: communication, computers and distance education Oxford: Pergamon

McGugan, I. (2015) Why the Zuckerberg donation is not a bundle of joy, Globe and Mail, December 2

Noble, D. (1998) Digital Diploma Mills, Monthly Review http://monthlyreview.org/product/digital_diploma_mills/

Papert, S. (1980) Mindstorms: Children, Computers and Powerful Ideas New York: Basic Books

Skinner, B. (1968) The Technology of Teaching New York: Appleton-Century-Crofts

Susskind, R. and Susskind, D. (2015) The Future of the Professions: How Technology will Change the Work of Human Experts Oxford UK: Oxford University Press

Watters, A. (2015) The Business of EdTech, Hack Edu, undated http://2015trends.hackeducation.com/business.html

Winters, M. (2015) Christmas Bonus! US Edtech Sets Record With $1.85 Billion Raised in 2015 EdSurge, December 21 https://www.edsurge.com/news/2015-12-21-christmas-bonus-us-edtech-sets-record-with-1-85-billion-raised-in-2015

Another perspective on the personalisation of learning online


I gave a keynote presentation last week at a large educational conference in the Netherlands, ‘Dé Onderwijsdagen’ (Education Days). I was asked to talk about the personalisation of learning. I agreed, as I think this is one of the many potential advantages of online learning.

However, the personalisation of learning is often looked at through a very narrow lens. I suggest that there are in fact at least seven ways in which online learning can facilitate the personalisation of learning. This is a blog post version of my keynote, which can be seen in full here.

Why personalisation?

Personalisation is one of the buzzwords going around these days in educational circles, like experiential learning or competency-based learning. Sometimes when I look more closely at some of the current buzzwords I end up thinking: ‘Oh, is that what it is? But I’ve always done that – I just haven’t given it that name before.’

However, I think there are good reasons why we should be focusing more on personalisation in post-secondary education:

  • the need to develop a wide range of knowledge and skills in learners for the 21st century;
  • as the system has expanded, so has the diversity of students: in age, language ability, prior learning, and interests;
  • a wider range of modes of delivery for students to choose from (campus, blended, fully online);
  • a wider range of media accessible not only to instructors but also to learners themselves;
  • the need to actively engage a very wide range of preferred learning styles, interests and motivation.

Clearly in such a context one size does not fit all. But with a continuously expanding post-secondary system and more pressures on faculty and instructors, how can we make learning more individualised in a cost-effective manner?

Seven roads to personalisation

I can think of at least seven ways to make learning more personal. In my keynote I discuss the strengths and weaknesses of each of these approaches:

  • adaptive learning;
  • competency-based learning;
  • virtual personal learning environments;
  • multi-media, multi-mode courses and learning materials;
  • modularisation of courses and learning materials;
  • new qualifications/certification (badges, nanodegrees, etc.);
  • disaggregated services.
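The first of these, adaptive learning, is the approach most often automated, so it is worth seeing how mechanical it can be at its core. This is a toy sketch, with invented topic names and update sizes: the software always serves the topic with the lowest estimated mastery, and nudges that estimate after each answer.

```python
# Toy adaptive sequencing: choose the topic the model currently rates
# weakest, then adjust the estimate after each response.
# Topic names and the 0.1 step size are purely illustrative.

def next_topic(mastery: dict) -> str:
    """Return the topic with the lowest mastery estimate."""
    return min(mastery, key=mastery.get)

def update_mastery(mastery: dict, topic: str, correct: bool, step: float = 0.1) -> dict:
    """Return a new mastery dict, nudged up or down and clamped to [0, 1]."""
    new = dict(mastery)
    delta = step if correct else -step
    new[topic] = min(1.0, max(0.0, new[topic] + delta))
    return new

mastery = {"fractions": 0.6, "decimals": 0.3, "percentages": 0.5}
topic = next_topic(mastery)                          # serves "decimals" next
mastery = update_mastery(mastery, topic, correct=True)
```

Real adaptive systems use far richer student models, but the logic remains the same: the machine decides what the learner sees next, based on what it can measure.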

There are probably others and I would be interested in your suggestions. However I recommend that you look at the video presentation, as it provides more ‘flesh’ on each of these seven approaches to personalisation.

An overall design approach to personalisation

Personalisation of learning will work best if it is embedded within an overall, coherent learning design. In my keynote I suggest one approach that fully exploits both the potential of online learning and the personalisation of learning:

  • the development of the core skill of knowledge management within a particular subject domain (other skills development could also be included, such as independent learning, research, critical thinking, and 21st century communication)
  • the use of open content by students, guided and supported by the instructor
  • student-generated multi-media content through online project work
  • active online discussion embedded within and across the different student projects
  • assessment through personal e-portfolios and group project assessment.

Such an ‘open’ design allows for greater choice in topics and approaches by learners while still developing the core skills and knowledge needed by our learners in a digital age. Other designs are also of course possible to reach the same kind of overall learning goals.

The role of the instructor though remains crucial, both as a content expert, guiding students and ensuring that they meet the academic needs of the discipline, and in providing feedback and assessment of their learning.

Conclusion

With knowledge continuing to rapidly grow and change, and a wide range of skills as well as knowledge needed in a knowledge-based society, we need new approaches to teaching that address such challenges.

Also because of increased diversity in our students and a wide range of different learning needs, we need to develop more flexible teaching methods and modes of delivery. This will also mean understanding better the differences between media and using them appropriately in our teaching.

Making learning more personal for our students is increasingly important, but it is only one element in new designs for learning. There are in fact many possibilities, limited only by the imagination and vision of teachers and instructors.