No, no firm predictions from me for 2020 other than more of the same from 2019, plus something unexpected. However, I will suggest three issues that will need special attention during 2020, at least in Canadian online learning.

Time for a new name for online learning?

But before I get into these issues, I'm going to suggest that we need to reconsider the appropriateness of the term 'online learning', as it gets increasingly absorbed into campus-based teaching. OK, we have the term 'blended learning', but that tends to be seen as a separate category from fully online or 'distance' learning.

What we need to come to terms with is digitally-supported teaching and learning (or maybe the older term, e-learning), because 'digital' will increasingly permeate all post-secondary teaching and learning. Why this change of terminology is important will become clear when we look at the three key issues for 2020.

1. OPMs in Canadian institutions?

In the USA, OPMs (online program managers) made significant inroads during 2019 and are likely to continue to do so during 2020. OPMs are private companies hired by institutions to develop and manage their online programs, under a variety of financial arrangements with colleges or universities, from fee for service to revenue sharing. There are nearly 30 of these companies currently operating in the USA (McKenzie, 2018), such as 2U, Academic Partnerships, Learning House, Wiley Education Services and Pearson Online Services. Sometimes MOOC providers such as Coursera and Udacity are included as OPMs (Hill, 2018). Universities using OPMs include Arizona State University, Louisiana State University, and the University of Massachusetts.

So far, most Canadian post-secondary institutions have not signed contracts with an OPM, preferring to develop their online programs internally. However, I am hearing rumours about OPMs lobbying officials and elected representatives in Canadian provincial governments, and I would not be surprised to see one or more Canadian institutions signing an agreement with an OPM in 2020. I will even go so far as to suggest that it will be one of the more prestigious Canadian universities, such as the University of Toronto or McGill University, as they have come to the party a little late in terms of fully online credit programs, although they have been active in MOOCs. For such institutions, an OPM might be seen as a relatively quick and risk-free way of becoming a major player in credit-based online learning.

Kim (2018) provides a good analysis of the pros and cons of using an OPM. However, let's come back to my earlier point about thinking not so much in terms of online learning as of digitally-supported learning. As long as teaching is considered a core competency of universities, it makes no sense to outsource the major development that will shape teaching and learning; if teaching is a core activity, you want those digital learning competencies in house. It seems clear to me that the institutions that have developed successful fully online programs are in the best position to move into quality digitally-supported learning, which affects all students, not just the 20-30% studying fully online.

In particular, this means institutions having a flexible and active strategy for the development of digitally-supported learning, which may or may not include the use of an OPM. The latest report of the Canadian Digital Learning Research Association (2019) clearly identifies that few Canadian institutions are yet in a position to evaluate how an OPM would fit with their strategy, because such a strategy is not yet in place. So maybe a priority for 2020 should be for every Canadian post-secondary institution to put such a strategy in place, and certainly before deciding to sign a contract with an OPM.

2. Metrics

We will certainly see a push from several right-leaning provincial governments in 2020 for better ‘metrics’ from universities and colleges, such as measures of ‘output’ (e.g. degree completion rates, graduate employment) as well as ‘input’ (finances, etc.).

Although the quote 'If you can't measure it, you can't manage it' is often incorrectly attributed to Deming, he did argue for (relevant) data to be collected and used to support decision-making, while also recognising that many things that cannot be measured still need to be managed.

As institutions start to break down the barriers between fully online and fully face-to-face, the issue arises as to how to measure what is happening. To what extent is an institution moving to digitally-supported learning, and in what ways? 

First, why is it important to measure this? Well, if we are to provide not only more flexible access for learners but also the development of 21st century skills, many of which are digitally based, we need to know the extent to which an institution is changing. It is no good having a strategy or plan if you don’t know how well that strategy is being implemented.

So what needs to be measured? First, we need to find ways to define the various forms that digitally-based learning can take, including what students do online as well as in class. The old contact-time credit hour no longer makes much sense as a way of measuring learning: it measures input, not output, and students can now take various routes to the same end. So we need to know what students are actually doing.

Second, this is a very dynamic and fast-moving area. Digitally-supported learning can take many forms, from merely supporting regular classroom-based teaching (e.g. adding a virtual reality experience), through hybrid classes, where some but not all face-to-face time is replaced by online study, to fully online courses. Micro-credentials are also blurring the line between credit and non-credit learning. There are in fact countless ways digitally-based learning can be designed and implemented, making definitions extremely difficult (but not impossible).

So how can the move to digitally-supported learning best be measured? This is an area that needs much more discussion, but to get this discussion going I have some suggestions.

  1. As part of its strategy for digitally-supported learning, an institution should appoint an e-learning co-ordinator, whose primary responsibility is to know what is going on in this area within the institution, and whose secondary responsibility is to remove institutional barriers to knowledge-sharing. Depending on the strategy, another responsibility might be to help academic departments move more aggressively towards meeting targets for digitally-supported learning.
  2. The e-learning co-ordinator will also need to put in place a means of measuring progress. This means bringing together all the interested areas, including the institutional research department, to build a system for tracking developments in digitally-supported learning; a minimal sketch of what such a system might look like follows this list. That will require developing definitions or categories and ensuring that each instructor reports against them annually. Ideally, the e-learning co-ordinator should work with the Canadian Digital Learning Research Association to ensure consistency in definitions and enable inter-institutional comparisons.
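
To make this concrete, here is a minimal sketch, in Python, of what the core of such a tracking system might look like. The delivery-mode categories, course codes and enrolment figures are my own assumptions for illustration only; real definitions should come from the institutional strategy and, ideally, be aligned with CDLRA definitions.

    # A minimal, illustrative sketch of course-level tracking of
    # digitally-supported learning. Category names are hypothetical,
    # not CDLRA definitions.
    from collections import Counter
    from dataclasses import dataclass
    from enum import Enum

    class DeliveryMode(Enum):
        CLASSROOM = "classroom (no required online component)"
        TECH_ENHANCED = "classroom with supplementary digital activities"
        HYBRID = "hybrid (some face-to-face time replaced by online study)"
        FULLY_ONLINE = "fully online"

    @dataclass
    class CourseReport:
        """One instructor's annual report for one course offering."""
        course_code: str
        mode: DeliveryMode
        enrolment: int

    def summarise(reports):
        """Share of total enrolment in each delivery mode."""
        totals = Counter()
        for r in reports:
            totals[r.mode] += r.enrolment
        grand_total = sum(totals.values())
        return {m.value: n / grand_total for m, n in totals.items()}

    # Invented sample data, for illustration only.
    sample = [
        CourseReport("HIST 101", DeliveryMode.CLASSROOM, 120),
        CourseReport("BIOL 200", DeliveryMode.TECH_ENHANCED, 80),
        CourseReport("EDUC 310", DeliveryMode.HYBRID, 45),
        CourseReport("CPSC 425", DeliveryMode.FULLY_ONLINE, 60),
    ]
    for mode, share in summarise(sample).items():
        print(f"{mode}: {share:.0%}")

Even a simple scheme like this would let an institution report, year on year, what share of its enrolments falls into each category, which is exactly the kind of output measure governments are starting to ask for.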

Whatever system is put in place, the need for better measurement of the move to digitally-supported learning is not going to go away, so this should be a focus of attention during 2020.

3. Learning analytics and other AI applications

Although we are still probably a few years away from a major breakthrough in the use of AI for teaching and learning in higher education, there will no doubt be further moves toward the use of learning analytics. Once again, though, the issue here is measurement.

Too often, analytics rely on dubious proxies for learning, such as the number of interactions with the LMS, tests of memory, and 'personalisation' that in essence merely re-routes learners back through previously unsuccessful 'modules' of teaching, all at a very micro level of learning.
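
To see why such proxies are dubious, consider a deliberately naive sketch of the kind of 'engagement score' many analytics dashboards compute (the event data and scoring here are invented for illustration):

    # A deliberately naive 'engagement score': every click counts equally,
    # so it says nothing about whether any learning actually occurred.
    # Event data are invented for illustration.
    lms_events = [
        {"student": "A", "action": "page_view"},
        {"student": "A", "action": "quiz_attempt"},
        {"student": "B", "action": "page_view"},
        {"student": "B", "action": "page_view"},
        {"student": "B", "action": "forum_post"},
    ]

    def engagement_score(events, student):
        # Student B 'out-engages' student A simply by clicking more,
        # whatever the quality of the activity.
        return sum(1 for e in events if e["student"] == student)

    for s in ("A", "B"):
        print(s, engagement_score(lms_events, s))

A student who reads one page deeply scores lower than one who clicks through many pages superficially: activity, not learning, is what gets measured.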

Part of the problem is that many applications of AI in education are being driven by computer scientists who misunderstand the nature of learning and the purposes of education. Another part is that using AI for advanced-level learning, where the goal should be to develop high-level skills of critical thinking, 'messy' problem solving and creativity, is much more difficult than applying it to, say, corporate training or basic courses, where there are standard procedures and a well-defined set of content to be learned.

However, it is just not good enough now for academics to sit back and say 'we can do this but AI can't'. Where's the evidence? We saw last year that HEQCO had a hard time finding any evidence that skills such as numeracy, literacy and critical thinking are in fact being developed in Ontario universities, mainly because the tools for objectively measuring such high-level skills are lacking (Finnie et al., 2018). Once again, then, we come back to metrics. How do we measure the skills and knowledge that students need in a digital age, and what teaching methods are most likely to deliver such skills?

Thus the issues that need to be addressed in 2020 are: what core outcomes are we trying to achieve in higher education, how do we teach to deliver them, and how do we measure them? If educators do not do this, the issues will be defined by computer scientists and the outcomes fitted to the parameters of artificial intelligence. That would not be a good outcome in the long run.

Welcome to 2020

In Chinese mythology, the Rat is yang and denotes the beginning of a new day. Will 2020 denote the beginning of a new era in online learning – or digitally-supported learning? Well, I guess this will depend on you, dear reader. The future is in your hands, so have a wonderful 2020.

References

Finnie, R. et al. (2018) Measuring Critical-thinking Skills of Postsecondary Students, Toronto, ON: Higher Education Quality Council of Ontario

Hill, P. (2018) Online Program Management: Spring 2018 view of the market landscape, eLiterate, April

Kim, J. (2018) 5 Misconceptions About Online Program Management Providers, Inside Higher Education, July 11

McKenzie, L. (2018) A Tipping Point for OPM? Inside Higher Education, June 4
