November 20, 2017

Responses to the Canadian survey of online and distance learning

Hi, everyone, and welcome back. I hope you all had a great summer. As many readers will know, I am leading a team conducting a survey of online and distance learning in Canadian public post-secondary educational institutions. More general information about the survey is available in earlier posts.

Over the summer the survey team has been extremely busy. We have now completed the collection of data and have started on the analysis and report writing.

Thanks to support from Contact North, we are building a web site for the survey which will contain news about the survey, access to the reports, and opportunities to discuss the results and their implications. However, this won’t be ready for a couple of weeks, so I wanted to provide an update on where we are at the moment, especially as I know some of you have been engaged in collecting data for the survey (many thanks!).

Building a database of institutions

As this is the first year for the survey, the focus is exclusively on provincially funded and accredited post-secondary educational institutions, which still represent by far the majority of post-secondary institutions and students in Canada.

One challenge the survey faced was the lack of a commonly used, publicly accessible database of all Canadian public post-secondary educational institutions. We worked our way through the membership listings of Universities Canada, Colleges and Institutes Canada (CICAN), Maclean’s EduHub, and provincial government web sites. From Statistics Canada we could find only aggregate data on student enrolments broken down by province and by part-time or full time students, but not data for individual institutions. 

We ended up with a list of 203 institutions, once we had eliminated duplications, incorporated affiliated colleges and universities with the main institution awarding the qualification, and removed institutions not funded by provincial governments. We also identified institutions by language (anglophone or francophone) and their total student headcount (full-time and part-time), almost entirely from information publicly available through provincial government web sites, although not all provinces provide this information. We then had to identify the appropriate contact person in each institution (usually Provosts or VPs Education).
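For readers curious about the mechanics, here is a minimal, hypothetical sketch (in Python with pandas) of how membership lists from several sources could be merged, filtered and de-duplicated along the lines described above. The file names, column names and matching rule are illustrative assumptions only, not the survey team’s actual workflow.

  # Hypothetical consolidation script; file and column names are assumptions,
  # not the survey's actual schema.
  import pandas as pd

  # Each source list is assumed to have been exported to CSV with at least these columns:
  # name, province, language, headcount, parent_institution, provincially_funded
  sources = ["universities_canada.csv", "cican.csv", "provincial_listings.csv"]
  institutions = pd.concat([pd.read_csv(path) for path in sources], ignore_index=True)

  # Remove institutions not funded by provincial governments
  institutions = institutions[institutions["provincially_funded"]]

  # Fold affiliated colleges into the institution that awards the qualification
  institutions["name"] = institutions["parent_institution"].fillna(institutions["name"])

  # Eliminate duplicates that appear in more than one membership list, keeping for
  # each name/province pair the record with the largest reported headcount
  institutions = (institutions
                  .sort_values("headcount", ascending=False, na_position="last")
                  .drop_duplicates(subset=["name", "province"], keep="first"))

  print(f"{len(institutions)} institutions in the population database")

In practice, of course, the hard part is the manual matching and checking behind each of these steps, but the same filter-merge-deduplicate logic applies whatever tool is used.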

This process resulted in:

  • 72 universities (35%),
  • 81 colleges outside Québec (40%), and
  • 50 CEGEPs/colleges within Québec (25%).

Of the 203 institutions, 70 (34%) were either francophone institutions or bilingual institutions with a separate francophone program.

One thing that became clear even at this stage is that there is no consistency among the provinces and Statistics Canada in how data about students is collected or reported. Several different measures are used: student headcount (full-time only, or full-time and part-time); student course enrolments; student FTEs (full-time equivalents); and student program enrolments, with variations within each of these broad categories. Also, some data include non-credit, continuing education students as well as students taking courses for credit. All this variation in student statistics makes inter-provincial comparisons very difficult. In the end, for the database of all institutions, we relied primarily on official provincial student headcounts, the measure most common across all provinces.

Statistics Canada’s most recent figures for Canadian post-secondary student enrolments are for the fall of the 2014/2015 academic year (in our survey, we are looking at fall 2016 enrolments). Statistics Canada’s enrolment numbers are based on program counts and not student counts. If a student is enrolled in more than one program as of the snapshot date, then all of their programs are included in the count.

Table 1: Comparison of StatCan student enrolment numbers and student headcount totals from institutions in the survey population base

Without knowing more about the basis on which Statistics Canada built its data, we cannot explain the difference between the two population sets, but the differences are relatively small, except for CEGEPs. We are confident we have included all the CEGEP institutions, but we have probably not counted all their enrolled students, only those for whom the Québec provincial government provides funding, since that is the source from which we derived the data. Nevertheless, if we take the Statistics Canada data as the comparator, our population base appears to represent a very large proportion (93%) of students studying for institutional credit at Canadian public post-secondary institutions.

A list of all the institutions included in the population database will be provided on the survey web site.

Response rates

The questionnaire itself was online and was accessed through a link unique to each participating institution. The final cut-off date for the full questionnaire was June 30, 2017. At that point, institutions that had not yet responded were invited to complete a shorter questionnaire that excluded the questions on student enrolments.

Table 2: Response rate by type of institution

It can be seen that 128 institutions (63%) completed the full questionnaire, and 140 (69%) completed either the full or the shorter version of the questionnaire. The response rate was lower for small institutions (59% overall for institutions with fewer than 2,000 students, compared with 79% for institutions with more than 10,000 students). The responding institutions were spread proportionately across all provinces and nearly all territories.

If we look at the response rate by the number of student enrolments, Table 3 below indicates that the survey covered institutions with 78% of the overall Canadian student population in public post-secondary education.

Table 3: Student headcounts for institutions responding compared to overall student headcounts

Conclusion

It should be remembered that this was a voluntary survey, with no formal government requirement to complete it. Our target was a 75% response rate, which we achieved in terms of the number of students covered by the survey, although the number of institutions covered fell a little short of the target at 69%. Nevertheless, we think the response rate is large enough to make valid and reliable statements about the state of online and distance learning in Canadian post-secondary education.

This would not have been possible without, first of all, a huge effort by the institutions to provide the data, and secondly a great deal of support from various professional associations such as CICAN, Universities Canada, the eCampuses in Ontario, Manitoba, Alberta and British Columbia, Contact North, REFAD, and others too numerous to list in a short blog post.

Next steps

We are now in the process of analyzing the results. We expect to have a draft report that will go out to selected readers in two weeks’ time. We will then produce two ‘public’ reports:

  • an executive report that covers the main findings (in English and French)
  • a full research report that provides an analysis of all the data collected from the survey.

Both these reports will be ready for publication and launch at the ICDE World Conference on Online Learning in Toronto on October 17, 2017.

We will also be developing a number of sub-reports, such as one on francophone institutions, and one on Ontario (which was a primary funder of the survey).

In the meantime, as soon as the survey web site is ready, I will let you know. It will contain preliminary results and an update on activities surrounding the survey, such as future plans and developments, and, from October 17, copies of all the reports as they become available.

Ontario funds research and innovation in online learning

eCampus Ontario (2017) Research and Innovation: Funded Projects Toronto ON: eCampus Ontario

A few days ago, eCampus Ontario officially announced nearly $2.5 million of grants for research and innovation in online learning for Ontario universities and colleges. This is a separate fund from their grants for developing online courses.

The 45 grants, chosen from a total of 135 proposals, ranged from $17,000 to $100,000. Ryerson University and Mohawk College each had five projects funded, but the University of Waterloo received the most in total grants at $396,000, with Ryerson close behind at $380,000. Mohawk received a total of $259,000, and Algonquin College $186,000. Of the 45 grants, 14 involved two or more institutions working collaboratively.

The one common factor among all the proposals was their variety. No single area of online learning dominated, although six of the proposals were directly concerned with assessing quality in online courses. Four of the grants were to study ways to improve the course development process or to better support faculty in online teaching.

There was also a cluster of grants looking at the effectiveness of particular technologies, including four on games/gaming, three on the use of animations or simulations, and others exploring virtual labs or applications of virtual reality. About four grants focused on the use of online learning for skills development, including one on evaluating competency-based learning.

Lastly, there was a very significant grant of $80,000 to Ryerson University to support the national survey of online and distance education that I am leading.

Comment

Even setting aside my gratitude for my own grant, eCampus Ontario and the Ontario government deserve praise for investing in research and development at this level. There has been a desperate lack of funding for research and development in online learning in Canada, at least in recent years, and hopefully a great deal of new learning, development and innovation in online learning will emerge from this process.

The major challenge now will be to ensure that the projects disseminate their results across the system, so that major innovations do not remain hidden in small corners of individual institutions. I am eagerly looking forward to seeing what emerges from these grants.

EDEN Research Workshop, October, 2016

The city of Oldenburg
Image: © Marcus Thielen, 2015

What: Forging New Pathways of research and innovation in open and distance learning: reaching from the roots

The Ninth EDEN Research Workshop in Oldenburg, Germany, will bring together researchers from all walks of life and provide a platform for engaging in discussion and debate, exchanging research ideas, and presenting new developments in ODL, with the goal of creating dialogues and forming opportunities for research collaboration.

Workshop Themes:

  • emerging distance education systems and theories
  • management and organizational models and approaches
  • evolving practices in technology-enhanced learning and teaching

Keynotes:

  • Olaf Zawacki-Richter, Carl von Ossietzky University, Oldenburg
  • Inge de Waard, The Open University, UK
  • Adnan Qayyum, Penn State University, USA
  • Som Naidu, Monash University, Australia
  • Paul Prinsloo, University of South Africa
  • George Veletsianos, Royal Roads University, Canada
  • Isa Jahnke, University of Missouri, USA

Types of sessions:

  • paper presentations
  • hands-on workshops
  • posters
  • demonstrations
  • ‘synergy’ sessions (to share and discuss EU projects)
  • training sessions

Where: Carl von Ossietzky University, Oldenburg, Germany. Oldenburg is a charming city in north-west Germany, between Bremen and Groningen.

When: 4-7 October, 2016

Who: The European Distance and e-Learning Network (EDEN) and the Centre for Distance Education, Carl von Ossietzky University. The university is a partner with the University of Maryland University College in offering a fully online Master in Distance Education and e-Learning, which has been running for many years. The Centre for Distance Education has published 15 books on distance education and e-learning in its ASF series.

How: Registration opens in mid-August. For more details on registration, fees and accommodation, go to the conference web site.

Comment: EDEN Research Workshops are one of my favourite professional development activities. They bring together online learning researchers from all over Europe and are a remarkably efficient way to keep up to date not only with the latest research but also with the technology trends in open and distance education that are getting serious attention. The conference is usually small (about 100-200 participants) and very well focused on practical aspects of research and practice in online learning and distance education.

 

Online learning for beginners: 2. Isn’t online learning worse than face-to-face teaching?

Distance learning: anyone sitting more than 10 rows from the front

The short answer to this question is: no, online learning is neither inherently worse – nor better – than face-to-face teaching; it all depends on the circumstances.

The research evidence

There have been thousands of studies comparing face-to-face teaching to teaching with a wide range of different technologies, such as televised lectures, computer-based learning, and online learning, or comparing face-to-face teaching with distance education.

With regard to online learning there have been several meta-studies. A meta-study combines the results of many ‘well-conducted scientific’ studies, usually studies that use matched comparisons or quasi-experimental methods (Means et al., 2011; Barnard et al., 2014). Nearly all such ‘well-conducted’ meta-studies find little or no significant difference between the modes of delivery in terms of the effect on student learning or performance. For instance, Means et al. (2011), in a major meta-analysis of research on blended and online learning for the U.S. Department of Education, reported:

In recent experimental and quasi-experimental studies contrasting blends of online and face-to-face instruction with conventional face-to-face classes, blended instruction has been more effective, providing a rationale for the effort required to design and implement blended approaches. When used by itself, online learning appears to be as effective as conventional classroom instruction, but not more so.

However, the ‘no significant difference’ finding is often misinterpreted. If there is no difference, then why do online learning? I’m comfortable teaching face-to-face, so why should I change?

This is a misinterpretation of the findings. Within any particular study there may indeed be large differences between the conditions (face-to-face vs. online), but these differences cancel each other out across a wide range of studies; and matched comparisons look only at very specific, strictly comparable conditions that never exist in a real teaching context.

For instance, the ‘base’ variable chosen is nearly always the traditional classroom. In order to make a ‘scientific’ comparison, the same learning objectives and the same treatment (teaching) are applied to the comparative condition (online learning). This means using exactly the same kind of students, for instance, in both conditions. But what if (as is the case) online learning better suits non-traditional students, or achieves better learning outcomes when the teaching is designed differently to suit the context of online learning?

Asking the right questions

Indeed, it is the variables or conditions for success that we should be examining, not just the technological delivery. In other words, we should be asking a question first posed by Wilbur Schramm as long ago as 1977:

What kinds of learning can different media best facilitate, and under what conditions?

In terms of making decisions then about mode of delivery, we should be asking, not which is the best method overall, but:

What are the most appropriate conditions for using face-to-face, blended or fully online learning respectively? 

So what are the conditions that best suit online learning?

There are a number of possible answers:

  • learners:
    • fully online learning best suits more mature, adult, lifelong learners who already have good independent learning skills and who, for work and family reasons, don’t want to come on campus
    • blended learning, or a mix of classroom and fully online courses, best suits full-time undergraduate students who are also working part-time to keep their debt down and who need the flexibility to do part of their studies online
    • ‘dependent’ learners who lack self-discipline or who don’t know how to manage their own learning will probably do better with face-to-face teaching; however, independent learning is a skill that can be taught, so blended learning is a safe way to gradually introduce such students to more independent study methods
  • learning outcomes:
    • embedding technology within the teaching may better enable the development of certain ’21st century skills’, such as independent learning, confidence in using information technologies within a specific subject domain, and knowledge management
    • online learning may provide more time on task to enable more practice of skills, such as problem-solving in math
    • redesigning very large lecture classes, so that lectures are recorded and students come to class for discussion and questions, can make classes more interactive and hence improve learning outcomes

Even this is really putting the question the wrong way round. A better question is:

What are the challenges I am facing as an instructor (or my learners are facing as students) that could be better addressed through online learning? And what form of online learning will work best for my students?

Quality

However, the most important condition influencing the effectiveness of both face-to-face and online teaching is how well it is done. A badly designed and delivered face-to-face class will have worse learning outcomes than a well designed online course – and vice versa. Ensuring quality in online learning will be the topic of the last few blogs in this series.

Implications

  1. Don’t worry about the effectiveness of online learning. Under the right conditions, it works well.
  2. Start with the challenges you face. Keep an open mind when thinking about whether online learning might be a better solution than continuing in the same old way.
  3. If you think it might be a solution for some of your problems, start thinking about the necessary conditions for success. The next few blog posts should help you with this.

Follow up

Here is some suggested further reading on the effectiveness of online learning.

Up next

‘Aren’t MOOCs online learning?’ (to be posted later in the week July 18-22, 2016)

Comparing modes: horses for courses

MIT aims to expand its research into learning

Diffusion tensor imaging
Image: Satrajit Ghosh, MIT

Chandler, D. (2016) New initiatives accelerate learning research and its applications, MIT News, February 2

The President of MIT has announced a significant expansion of the Institute’s programs in learning research and online and digital education, through the creation of the MIT Integrated Learning Initiative (MITili).

The integrated science of learning — now emerging as a significant field of research — will be the core of MITili (to be pronounced “mightily”), a cross-disciplinary, Institute-wide initiative to foster rigorous quantitative and qualitative research on how people learn.

MITili will combine research in cognitive psychology, neuroscience, economics, engineering, public policy, and other fields to investigate what methods and approaches to education work best for different people and subjects. The effort will also examine how to improve the educational experience within MIT and in the world at large, at all levels of teaching.

The findings that spin out of MITili will then be applied to improve teaching on campus and online.

Comment

First, I very much welcome this initiative by a prestigious research university to research seriously what MIT calls the ‘science of learning’. Research into learning has generally been relatively poorly funded compared with research into science, engineering and computing.

However, I hope that MIT will approach this in the right way and avoid the hubris they displayed when moving into MOOCs, where they ignored all previous research into online learning.

It is critical that those working in MITili do not assume that there is nothing already known about learning. Exploring what the physical sciences, such as biological research into the relationship between brain function and learning, can contribute to our understanding of learning is welcome, but as much attention needs to be paid to the environmental conditions that support or inhibit learning, to the teaching approaches that encourage different kinds of learning, and to the previous, well-grounded research into the psychology of learning.

In other words, not only a multi-disciplinary but also a multi-epistemological approach will be needed, drawing as much from educational research and the social sciences as from the natural sciences. Is MIT willing and able to do this? After all, learning is a human, not a mechanical, activity.