June 24, 2017

Ontario funds research and innovation in online learning

eCampus Ontario (2017) Research and Innovation: Funded Projects, Toronto ON: eCampus Ontario

A few days ago, eCampus Ontario officially announced nearly $2.5 million of grants for research and innovation in online learning for Ontario universities and colleges. This is a separate fund from their grants for developing online courses.

The 45 grants, selected from a total of 135 proposals, ranged from $17,000 to $100,000. Ryerson University and Mohawk College each had five projects funded, but the University of Waterloo received the most in total grants at $396,000, with Ryerson close behind at $380,000. Mohawk received a total of $259,000, and Algonquin College $186,000. Of the 45 grants, 14 involved two or more institutions working collaboratively.

The one common factor among all the proposals was their variety. No one area of online learning dominated, although six of the proposals were directly concerned with assessing quality in online courses. Four of the grants were to study ways to improve the course development process or to better support faculty in online teaching.

Then there were several grants looking at the effectiveness of particular technologies, including four for games/gaming, three for the use of animations or simulations, and grants for exploring virtual labs or the application of virtual reality. About four grants focused on the use of online learning for skills development, including one on evaluating competency-based learning.

Lastly, there was a very significant grant of $80,000 to Ryerson University to support the national survey of online and distance education that I am leading.

Comment

Even setting aside my gratitude for my own grant, eCampus Ontario and the Ontario government deserve praise for investing in research and development at this level. There has been a desperate lack of funding for research or development in online learning in Canada, at least in recent years, and hopefully a great deal of learning, new developments and innovation in online learning will emerge from this process. 

The major challenge now will be to ensure that the projects disseminate their results across the system, so that major innovations do not just hide within tiny corners of the institutions. I am eagerly looking forward to seeing what emerges from these grants.

EDEN Research Workshop, October, 2016


The city of Oldenburg
Image: © Marcus Thielen, 2015

What: Forging New Pathways of research and innovation in open and distance learning: reaching from the roots

The Ninth EDEN Research Workshop in Oldenburg, Germany, will bring together researchers from all walks of life and provide a platform for engaging in discussion and debate, exchanging research ideas, and presenting new developments in ODL, with the goal of creating dialogues and forming opportunities for research collaboration.

Workshop Themes:

  • emerging distance education systems and theories
  • management and organizational models and approaches
  • evolving practices in technology-enhanced learning and teaching

Keynotes:

  • Olaf Zawacki-Richter, Carl von Ossietzky University, Oldenburg
  • Inge de Waard, The Open University, UK
  • Adnan Qayyum, Penn State University, USA
  • Som Naidu, Monash University, Australia
  • Paul Prinsloo, University of South Africa
  • George Veletsianos, Royal Roads University, Canada
  • Isa Jahnke, University of Missouri, USA

Types of sessions:

  • paper presentations
  • hands-on workshops
  • posters
  • demonstrations
  • ‘synergy’ sessions (to share and discuss EU projects)
  • training sessions

Where: Carl von Ossietzky University, Oldenburg, Germany. Oldenburg is a charming city in north-west Germany between Bremen and Groningen.

When: 4-7 October, 2016

Who: The European Distance and e-Learning Network and the Centre for Distance Education, Carl von Ossietzky University. The university is a partner with the University of Maryland University College in offering a fully online Master of Distance Education and E-Learning, which has been running for many years. The Centre for Distance Education has published 15 books on distance education and e-learning in its ASF series.

How: Registration opens mid-August. For more details on registration, fees and accommodation go to the conference web site.

Comment: EDEN Research Workshops are one of my favourite professional development activities. They bring together online learning researchers from all over Europe, and it is a remarkably efficient way to keep up to date not only with the latest research but also the technology trends in open and distance education that are getting serious attention. The conference is usually small (about 100-200 participants) and very well focused on practical aspects of research and practice in online learning and distance education.


Online learning for beginners: 2. Isn’t online learning worse than face-to-face teaching?

Distance education: anyone sitting more than 10 rows from the front

The short answer to this question is: no, online learning is neither inherently worse – nor better – than face-to-face teaching; it all depends on the circumstances.

The research evidence

There have been thousands of studies comparing face-to-face teaching to teaching with a wide range of different technologies, such as televised lectures, computer-based learning, and online learning, or comparing face-to-face teaching with distance education.

With regard to online learning there have been several meta-studies. A meta-study combines the results of many ‘well-conducted scientific’ studies, usually studies that use the matched comparisons or quasi-experimental method (Means et al., 2011; Barnard et al., 2014). Nearly all such ‘well-conducted’ meta-studies find no or little significant difference in the modes of delivery, in terms of the effect on student learning or performance. For instance, Means et al. (2011), in a major meta-analysis of research on blended and online learning for the U.S. Department of Education, reported:

In recent experimental and quasi-experimental studies contrasting blends of online and face-to-face instruction with conventional face-to-face classes, blended instruction has been more effective, providing a rationale for the effort required to design and implement blended approaches. When used by itself, online learning appears to be as effective as conventional classroom instruction, but not more so.

However, the ‘no significant difference’ finding is often misinterpreted. If there is no difference, then why do online learning? I’m comfortable teaching face-to-face, so why should I change?

This is a misinterpretation of the findings. Within any particular study there may indeed be large differences between conditions (face-to-face vs. online), but these differences cancel each other out across a wide range of studies. Moreover, matched comparisons look only at very specific, strictly comparable conditions that rarely exist in a real teaching context.

For instance, the ‘base’ variable chosen is nearly always the traditional classroom. In order to make a ‘scientific’ comparison, the same learning objectives and the same treatment (teaching) are applied to the comparative condition (online learning). This means using exactly the same kind of students, for instance, in both conditions. But what if (as is the case) online learning better suits non-traditional students, or achieves better learning outcomes when the teaching is designed differently to suit the context of online learning?
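To see why individual differences can ‘wash out’ across studies, it helps to look at the arithmetic meta-analyses actually perform. A minimal sketch of fixed-effect (inverse-variance) pooling, using entirely made-up effect sizes rather than data from Means et al. or any real study:

```python
import math

# Illustrative standardized mean differences (face-to-face vs. online)
# from three hypothetical studies, and the variance of each estimate.
# These numbers are invented for the sketch, not drawn from real research.
effects = [0.30, -0.10, 0.05]
variances = [0.04, 0.02, 0.05]

# Inverse-variance weights: more precise studies count for more.
weights = [1.0 / v for v in variances]

# Pooled effect: weighted average of the individual effects.
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled effect

# 95% confidence interval (normal approximation).
ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se

print(round(pooled, 3), round(ci_low, 3), round(ci_high, 3))
```

Here one study favours face-to-face, one favours online, and the pooled estimate lands near zero with a confidence interval straddling zero: a ‘no significant difference’ result, even though the individual studies differed substantially.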

Asking the right questions

Indeed, it is the variables or conditions for success that we should be examining, not just the technological delivery. In other words, we should be asking a question first posed by Wilbur Schramm as long ago as 1977:

What kinds of learning can different media best facilitate, and under what conditions?

In terms of making decisions then about mode of delivery, we should be asking, not which is the best method overall, but:

What are the most appropriate conditions for using face-to-face, blended or fully online learning respectively? 

So what are the conditions that best suit online learning?

There are a number of possible answers:

  • learners:
    • fully online learning best suits more mature, adult, lifelong learners who already have good independent learning skills and for work and family reasons don’t want to come on campus
    • blended learning or a mix of classroom and fully online courses best suits full time undergraduate students who are also working part-time to keep their debt down, and need the flexibility to do part of their studies online
    • ‘dependent’ learners who lack self-discipline or who don’t know how to manage their own learning probably will do better with face-to-face teaching; however independent learning is a skill that can be taught, so blended learning is a safe way to gradually introduce such students to more independent study methods
  • learning outcomes:
    • embedding technology within the teaching may better enable the development of certain ’21st century skills’, such as independent learning, confidence in using information technologies within a specific subject domain, and knowledge management
    • online learning may provide more time on task to enable more practice of skills, such as problem-solving in math
    • redesigning very large lecture classes, so that lectures are recorded and students come to class for discussion and questions, can make the classes more interactive and hence improve learning outcomes

Even this is really putting the question the wrong way round. A better question is:

What are the challenges I am facing as an instructor (or my learners are facing as students) that could be better addressed through online learning? And what form of online learning will work best for my students?

Quality

However, the most important condition influencing the effectiveness of both face-to-face and online teaching is how well it is done. A badly designed and delivered face-to-face class will have worse learning outcomes than a well designed online course – and vice versa. Ensuring quality in online learning will be the topic of the last few blogs in this series.

Implications

  1. Don’t worry about the effectiveness of online learning. Under the right conditions, it works well.
  2. Start with the challenges you face. Keep an open mind when thinking about whether online learning might be a better solution than continuing in the same old way.
  3. If you think it might be a solution for some of your problems, start thinking about the necessary conditions for success. The next few blog posts should help you with this.

Follow up

Here is some suggested further reading on the effectiveness of online learning:

Up next

‘Aren’t MOOCs online learning?’ (to be posted later in the week July 18-22, 2016)

Comparing modes: horses for courses

MIT aims to expand its research into learning

Diffusion tensor imaging Satrajit Ghosh, MIT

Chandler, D. (2016) New initiatives accelerate learning research and its applications, MIT News, February 2

The President of MIT has announced a significant expansion of the Institute’s programs in learning research and online and digital education, through the creation of the MIT Integrated Learning Initiative (MITili).

The integrated science of learning — now emerging as a significant field of research — will be the core of MITili (to be pronounced “mightily”), a cross-disciplinary, Institute-wide initiative to foster rigorous quantitative and qualitative research on how people learn.

MITili will combine research in cognitive psychology, neuroscience, economics, engineering, public policy, and other fields to investigate what methods and approaches to education work best for different people and subjects. The effort will also examine how to improve the educational experience within MIT and in the world at large, at all levels of teaching.

The findings that spin out of MITili will then be applied to improve teaching on campus and online.

Comment

First, I very much welcome this initiative by a prestigious research university to research seriously what MIT calls the ‘science of learning’. Research into learning has generally been relatively poorly funded compared with research into science, engineering and computing.

However, I hope that MIT will approach this in the right way and avoid the hubris they displayed when moving into MOOCs, where they ignored all previous research into online learning.

It is critical that those working in MITili do not assume that there is nothing already known about learning. Exploring what the physical sciences, such as biological research into the relationship between brain function and learning, can contribute to our understanding of learning is welcome. But as much attention needs to be paid to the environmental conditions that support or inhibit learning, to the kinds of teaching approaches that encourage different kinds of learning, and to the previous, well-grounded research into the psychology of learning.

In other words, not only a multi-disciplinary, but also a multi-epistemological approach will be needed, drawing as much from educational research and the social sciences as from the natural sciences. Is MIT willing and able to do this? After all, learning is a human, not a mechanical activity, when all is said and done.

Lessons about researching technology-enhanced instruction

Meiori, Amalfi Coast

Meiori, Amalfi Coast – when it’s not raining

Lopes, V. and Dion, N. (2015) Pitfalls and Potential: Lessons from HEQCO-Funded Research on Technology-Enhanced Instruction, Toronto ON: Higher Education Quality Council of Ontario

Since it’s raining heavily here on the Amalfi Coast today for the first time in months, I might as well do another blog post.

What this report is about

HEQCO (the Higher Education Quality Council of Ontario) is an independent advisory agency funded by the Ontario Ministry of Training, Colleges, and Universities to provide recommendations for improving quality, accessibility, inter-institutional transfer, system planning, and effectiveness in higher education in Ontario. In 2011, HEQCO:

issued a call for research projects related to technology-enhanced instruction…. Now that the technology studies have concluded and that most have been published, this report draws some broader conclusions from their methods and findings.

What are the main conclusions?

1. There is no clear definition of what ‘technology’ means or what it refers to in many studies that investigate its impact on learning:

One assumes that the nature of the tools under investigation would have an impact on research design and on the metrics being measured. Yet little attention is paid to this problem, which in turn creates challenges when interpreting study findings.

2. There is no clear definition of blended or hybrid learning:

The proportion of online to face-to-face time, as well as the nature of the resources presented online, can both differ considerably. In a policy context, where we may wish to discuss issues across institutions or at a system level, the lack of consensus definitions can be particularly disruptive. In this respect, a universal definition of blended learning, applied consistently to guide practice across all colleges and universities, would be helpful.

3. Students need orientation to/training in the use of the technologies used in their teaching: they are not digital natives in the sense of being intuitively able to use technology for study purposes.

4. Instructors and teaching assistants should also be trained on the use and implementation of technology.

5. The simple presence of technology will rarely enhance a classroom. Instead, some thought has to go into integrating it effectively.

6. New technologies should be implemented not for their own sake but with a specific goal or learning outcome in mind.

7. Many of the HEQCO-funded studies, including several of those with complex study designs and rigorous methodologies, concluded that the technology being assessed had no significant effect on student learning.

8. Researchers in the HEQCO-funded studies faced challenges encouraging student participation, which often led to small sample sizes in situations where classroom-based interventions already limited the potential pool of participants.

9. The integration of technology in postsecondary education has progressed to such a point that we no longer need to ask whether we should use technology in the classroom, but rather which tool to use and how.

10. There is no single, unified, universally accepted model or theory that could be applied to ensure optimal learning in all educational settings.

Comment

I need to be careful in my comments, not because I’m ticked off with the weather here (hey, I live in Vancouver – we know all about rain), but because I’ve spent most of my working life researching technology-enhanced instruction, so what appears blindingly obvious to me is not necessarily obvious to others. So I don’t really know where to start in commenting on this report, except to say I found it immensely depressing.

Let me start by saying that there is really nothing in this report that was not known before the research was done (in other words, if they had asked me, I could have told HEQCO what to expect). I am a great supporter of action or participant research, because the person doing the research learns a great deal. But it is almost impossible to generalise such results, because they are so context-specific, and because the instructor is not usually trained in educational research, there are often – as with these studies – serious methodological flaws.

Second, trying to define technology is like trying to catch a moonbeam. The whole concept of defining a fixed state so that generalisations can be made to the same fixed state is entirely the wrong kind of framework for researching technology influences, because the technology is constantly changing. (This is just another version of the objectivist vs constructivist debate.)

So one major problem with this research is HEQCO’s expectation that the studies would lead to generalisations that could be applied across the system. If HEQCO wants that, it needs to use independent researchers and fund the interventions on a large enough scale – which of course means putting much more money into educational research than most governments are willing to risk. It also means sophisticated design that moves away from matched, controlled comparisons to in-depth case studies, though using rigorous qualitative research methodology.

This illustrates a basic problem with most educational research. It is done on such a small scale that the interventions are unlikely to lead to significant results. If you tweak just a little bit of a complex environment, any change is likely to be swamped by changes in other variables.
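The scale problem can be made concrete with a back-of-envelope power calculation. A sketch of the standard normal-approximation formula for the number of participants per group needed in a two-group comparison (alpha = 0.05 two-sided, power = 0.80; the effect sizes below are illustrative, not from the HEQCO studies):

```python
import math

def n_per_group(d, z_alpha=1.96, z_beta=0.8416):
    """Participants per group to detect a standardized effect d.

    Uses the usual normal approximation for a two-sample comparison:
    n = 2 * ((z_alpha + z_beta) / d)^2, with z values for a two-sided
    5% significance level and 80% power.
    """
    return math.ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# A 'small' effect (d = 0.2) - the realistic size of a modest classroom
# tweak - needs hundreds of students per group to detect reliably.
print(n_per_group(0.2))

# A 'large' effect (d = 0.8) could be detected within a single class.
print(n_per_group(0.8))
```

With typical class sizes of 30–50 students, a small genuine improvement is almost guaranteed to come out as ‘no significant effect’, which is exactly the pattern the HEQCO studies reported.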

The second problem in most of the studies appears to be the failure to link technology-based interventions to changes in learning outcomes. In other words, did the use of technology lead to a different kind of learning? For instance, did the application of the technology lead students to think more critically or manage information well rather than reproduce or memorize what was being taught before? So another lesson is that you have to ask the right kind of research questions that focus on different kinds of learning outcomes.

Thus it is pointless to ask whether technology-based interventions lead to better learning outcomes than classroom teaching. There are too many other variables than technology to provide a definitive answer. The question to ask instead is: what are the required conditions for successful blended or hybrid learning, and what counts as success? The last part of the question means being clear on what different learning outcomes are being sought.

Indeed, there is a case to be made that it may be better not to set firm outcomes before the intervention, but to provide enough flexibility in the teaching context to see what happens when instructors and students have choices to make about technology use. This might mean looking backwards rather than forwards by identifying what most would deem highly successful technology interventions, then working back to see what conditions enabled this success.

But fiddling with the research methods won’t produce much if the intervention is too small in scale. Nineteen little, independent studies are great for the instructors, but if we are to learn things that can be generalized, we need fewer but larger, more sophisticated, and more integrated studies. In the meantime, we are no further ahead in being able to improve the design of blended or hybrid learning than before these research studies were done, which is why I am depressed.