September 21, 2017

Responses to the Canadian survey of online and distance learning

Hi, everyone, and welcome back. I hope you all had a great summer. As many readers will know, I am leading a team conducting a survey of online and distance learning in Canadian public post-secondary educational institutions. You can find more general information about the survey in earlier posts on this blog.

During the summer the survey team has been extremely busy. We have now completed the collection of data and have started on the analysis and report writing.

Thanks to support from Contact North, we are building a web site for the survey which will contain news about the survey, access to the reports, and opportunities to discuss the results and their implications. However, this won’t be ready for a couple of weeks, so I wanted to provide an update on where we are at the moment, especially as I know some of you have been engaged in collecting data for the survey (many thanks!).

Building a database of institutions

As this is the first year for the survey the focus is exclusively on provincially funded and accredited post-secondary educational institutions, which still represent by far the majority of post-secondary institutions and students in Canada.

One challenge the survey faced was the lack of a commonly used, publicly accessible database of all Canadian public post-secondary educational institutions. We worked our way through the membership listings of Universities Canada, Colleges and Institutes Canada (CICAN), Maclean’s EduHub, and provincial government web sites. From Statistics Canada we could find only aggregate data on student enrolments broken down by province and by part-time or full time students, but not data for individual institutions. 

We ended up with a list of 203 institutions, once we had eliminated duplications, folded affiliated colleges and universities into the main institution awarding the qualification, and removed institutions not funded by provincial governments. We also identified each institution’s language (anglophone or francophone) and total student headcount (full-time and part-time), almost entirely from information publicly available through provincial government web sites, although not all provinces provide this information. We then had to identify the appropriate contact person in each institution (usually the Provost or VP Education).
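
As a rough illustration of the kind of cleaning this involved, here is a minimal sketch in Python (pandas). The institution names, record fields and ordering of steps are invented stand-ins for what the team actually did, not the survey’s real pipeline.

```python
import pandas as pd

# Invented records standing in for entries gathered from several membership lists
raw = pd.DataFrame([
    {"name": "Example University",        "parent": None,                 "funded": True},
    {"name": "Example University",        "parent": None,                 "funded": True},   # duplicate across lists
    {"name": "St. Example College",       "parent": "Example University", "funded": True},   # affiliated college
    {"name": "Example Private Institute", "parent": None,                 "funded": False},  # not provincially funded
])

# Fold affiliated colleges/universities into the institution awarding the qualification
raw["name"] = raw["parent"].fillna(raw["name"])

# Eliminate duplicates, then remove institutions not funded by provincial governments
institutions = raw.drop_duplicates(subset="name")
institutions = institutions[institutions["funded"]]

print(institutions["name"].tolist())  # ['Example University']; the real list had 203
```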

This process resulted in:

  • 72 universities (35%),
  • 81 colleges outside Québec (40%), and
  • 50 CEGEPs/colleges within Québec (25%).

Of the 203 institutions, 70 (34%) were either francophone institutions or bilingual institutions with a separate francophone program.

One thing that became clear even at this stage is that there is no consistency between provinces, or with Statistics Canada, in how data about students are collected or reported. Several different measures are used: student headcount (full-time only, or full-time plus part-time); student course enrolments; student FTEs (full-time equivalents); and student program enrolments, with variations within each of these broad categories. Also, some data include non-credit, continuing education students as well as students taking courses for credit. All this variation makes inter-provincial comparisons very difficult. In the end, for the database of all institutions, we relied primarily on official provincial student headcounts, the measure most common across all provinces.
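
To see why the choice of measure matters, consider a toy example (the numbers are invented): the same three students can be reported as a headcount of 3, as 10 course enrolments, or as 2.0 FTEs, depending on the measure and on the assumed full-time course load.

```python
# Three (invented) students at one institution in the fall term
students = [
    {"id": "A", "courses": 5, "status": "full-time"},
    {"id": "B", "courses": 2, "status": "part-time"},
    {"id": "C", "courses": 3, "status": "part-time"},
]

FULL_TIME_LOAD = 5  # assumed number of courses in a full-time load

headcount = len(students)                                                # 3
full_time_headcount = sum(s["status"] == "full-time" for s in students)  # 1
course_enrolments = sum(s["courses"] for s in students)                  # 10
ftes = course_enrolments / FULL_TIME_LOAD                                # 2.0

print(headcount, full_time_headcount, course_enrolments, ftes)
```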

Statistics Canada’s most recent figures for Canadian post-secondary student enrolments are for the fall of the 2014/2015 academic year (in our survey, we are looking at fall 2016 enrolments). Statistics Canada’s enrolment numbers are based on program counts and not student counts. If a student is enrolled in more than one program as of the snapshot date, then all of their programs are included in the count.

Table 1: Comparison of StatCan student enrolment numbers and student headcount totals from institutions in the survey population base

Without knowing more about the basis on which Statistics Canada built its data, we cannot explain the difference between the two population sets, but the differences are relatively small, except for CEGEPs. We are confident we have included all the CEGEP institutions, but we probably have not counted all enrolled students, just those for whom the Québec provincial government provides funding, since that funding data was our source. Nevertheless, if we take Statistics Canada data as the comparator, our population base appears to represent a very large proportion (93%) of students studying for institutional credit at Canadian public post-secondary institutions.

We will provide a list of all the institutions included in the population database on the survey web site.

Response rates

The questionnaire itself was online and was accessed using a link unique to each participating institution. The final cut-off date for the full questionnaire was June 30, 2017. At this point, institutions that had not responded were invited to complete a shorter questionnaire that excluded the questions on student enrolments.

Table 2: Response rate by type of institution

It can be seen that 128 institutions (63%) completed the full questionnaire, and 140 (69%) completed either the full or the shorter version. The response rate was lower for small institutions (59% overall for institutions with fewer than 2,000 students, compared with 79% for institutions with more than 10,000 students). The responding institutions were spread proportionately across all provinces and nearly all territories.
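
For the record, the percentages above follow directly from the raw counts; here is a quick check using only the figures reported in this post:

```python
population = 203        # institutions in the database
full = 128              # completed the full questionnaire
full_or_short = 140     # completed either version

print(f"Full questionnaire:    {full / population:.0%}")           # 63%
print(f"Full or short version: {full_or_short / population:.0%}")  # 69%
print(f"Short version only:    {full_or_short - full}")            # 12 institutions
```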

If we look at the response rate by the number of student enrolments, Table 3 below indicates that the survey covered institutions with 78% of the overall Canadian student population in public post-secondary education.

Table 3: Student headcounts for institutions responding compared to overall student headcounts.

Conclusion

It should be remembered that this was a voluntary survey, with no formal government requirement to complete it. Our target was a 75% response rate, which we have achieved in terms of the number of students covered by the survey, although the number of institutions covered fell a little short of the target at 69%. Nevertheless, we think the response rate is large enough to make valid and reliable statements about the state of online and distance learning in Canadian post-secondary education.

This would not have been possible without, first, a huge effort by the institutions to provide the data, and second, a great deal of support from professional associations such as CICAN, Universities Canada, the eCampuses in Ontario, Manitoba, Alberta and British Columbia, Contact North, REFAD, and others too numerous to list in a short blog post.

Next steps

We are now in the process of analyzing the results. We expect to have a draft report that will go out to selected readers in two weeks’ time. We will then produce two ‘public’ reports:

  • an executive report covering the main findings (in English and French)
  • a full research report that provides an analysis of all the data collected from the survey.

Both these reports will be ready for publication and a launch at the ICDE World Conference on Online Learning in Toronto on October 17, 2017. 

We will also be developing a number of sub-reports, such as one on francophone institutions, and one on Ontario (which was a primary funder of the survey).

In the meantime, as soon as the survey web site is ready I will let you know. This will contain preliminary results and an update on activities surrounding the survey, such as future plans and developments, and, from October 17, copies of all the reports as they become available.

A better ranking system for university teaching?

Who is top dog among UK universities?
Image: © Australian Dog Lover, 2017 http://www.australiandoglover.com/2017/04/dog-olympics-2017-newcastle-april-23.html

Redden, E. (2017) Britain Tries to Evaluate Teaching Quality, Inside Higher Ed, June 22

This excellent article describes in detail a new three-tiered rating system of teaching quality at universities introduced by the U.K. government, and offers a thoughtful discussion of it. As I have a son and daughter-in-law teaching in a U.K. university, and grandchildren who are either students or potential students, I have more than an academic interest in this topic.

How are the rankings done?

Under the government’s Teaching Excellence Framework (TEF), universities in England and Wales will get one of three ‘awards’: gold, silver and bronze (apparently there are no other categories, such as tin, brass, iron or dross for those whose teaching really sucks). A total of 295 institutions opted to participate in the ratings.

Universities are compared on six quantitative metrics that cover:

  • retention rates
  • student satisfaction with teaching, assessment and academic support (from the National Student Survey)
  • rates of employment/post-graduate education six months after graduation.

However, awards are relative rather than absolute since they are matched against ‘benchmarks calculated to account for the demographic profile of their students and the mix of programs offered.’ 

This process generates a “hypothesis” of gold, silver or bronze, which a panel of assessors then tests against additional evidence submitted for consideration by the university (higher education institutions can make up to a 15-page submission to TEF assessors). Ultimately the decision of gold, silver or bronze is a human judgment, not the pure product of a mathematical formula.
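
To make the process concrete, here is a toy sketch of how a benchmark-based ‘hypothesis’ might be generated before the human judgment stage. The flagging threshold, the scoring, and the mapping to awards are my own illustrative assumptions, not the actual TEF rules.

```python
def initial_hypothesis(metrics: dict, benchmarks: dict, threshold: float = 2.0) -> str:
    """Flag each metric materially above (+1) or below (-1) its benchmark
    (in percentage points), then map the net score to an award hypothesis."""
    score = 0
    for name, value in metrics.items():
        diff = value - benchmarks[name]
        if diff > threshold:
            score += 1
        elif diff < -threshold:
            score -= 1
    if score >= 2:
        return "gold"
    if score <= -2:
        return "bronze"
    return "silver"

# Invented example: metrics vs. benchmarks adjusted for student demographics
metrics = {"retention": 93.0, "nss_satisfaction": 88.0, "employment": 91.0}
benchmarks = {"retention": 90.0, "nss_satisfaction": 84.0, "employment": 88.0}

print(initial_hypothesis(metrics, benchmarks))  # gold -> then tested by assessors
```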

What are the results?

Not what you might think. Although Oxford and Cambridge universities were awarded gold, so were some less prestigious universities, such as Loughborough University, while some more prestigious universities received a bronze. So at least it provides an alternative ranking system to those that focus mainly on research and peer reputation.

What is the purpose of the rankings?

This is less clear. Ostensibly (i.e., according to the government) it is initially aimed at giving potential students a better way of knowing how universities stand with regard to teaching. However, knowing the Conservative government in the UK, it is much more likely to be used to link tuition fees to institutional performance, as part of the government’s free market approach to higher education. (The U.K. government allowed universities to set their own fees, on the assumption that the less prestigious universities would offer lower tuition fees, but guess what – they almost all opted for the highest level possible, and still were able to fill seats).

What are the pros and cons of this ranking?

For a more detailed discussion, see the article itself, but here is my take on it.

Pros

First, this is a more thoughtful approach to ranking than the other systems. It focuses on teaching (which will be many potential students’ initial interest in a university) and provides a useful counter-balance to the emphasis on research in other rankings.

Second, it takes a more sophisticated approach than just counting up scores on different criteria. It has an element of human judgement and gives universities an opportunity to make their case about why they should be ranked highly. In other words, it tries to tie institutional goals to teaching performance, and to take into account the very large differences between universities in the U.K. in terms of student socio-economic background and curricula.

Third, it provides a simple ‘award’ system for categorizing universities on their quality of teaching that students and their parents can at least understand.

Fourth, and most important of all, it sends a clear message to institutions that teaching matters. This may seem obvious, but for many universities – and especially faculty – the only thing that really matters is research. Whether this form of ranking will be enough to get institutions to pay more than lip service to teaching, though, remains to be seen.

Cons

However, there are a number of cons. First, the National Union of Students (NUS) is against it, partly because it is heavily weighted by student satisfaction ratings based on the National Student Survey, which thousands of students have been boycotting (I’m not sure why). One would have thought that students in particular would value some accountability regarding the quality of teaching. But then, the NUS has bigger issues with the government, such as the appallingly high tuition fees (C$16,000 a year; the opposition Labour party has promised free tuition).

More importantly, the general arguments about university rankings still apply to this one. They measure institutional performance, not individual department or instructor performance, which can vary enormously within the same institution. If you want to study physics, it doesn’t help if a university has an overall gold ranking but its physics department is crap, or if you get the one instructor who shouldn’t be allowed in the building.

Also, the quantitative measures are surrogates for actual teaching performance. No one observed the teaching in order to develop the rankings, except the students, and student ratings, while an important measure, can themselves be highly misleading, influenced by instructor personality and by how hard the instructor makes students work to get a good grade.

The real problem here is two-fold. First, there is the difficulty of assessing quality teaching in the first place: one man’s meat is another man’s poison. There is no general agreement, even within an academic discipline, as to what counts as quality teaching (for instance, understanding, memory of facts, or skills of analysis – maybe all three are important, but can the way one teaches to develop these different attributes be assessed separately?).

The second problem is the lack of quality data on teaching performance – it just isn’t tracked directly. Since a student may take courses from up to 40 different instructors across several different disciplines/departments in a bachelor’s program, it is no mean task to assess the collective effectiveness of their teaching. So we are left with surrogates of quality, such as completion rates.

So is it a waste of time – or worse?

No, I don’t think so. People are going to be influenced by rankings, whatever. This particular ranking system may be flawed, but it is a lot better than the other rankings, which are so heavily influenced by tradition and elitism. It could be used in ways that the data do not justify, such as justifying tuition fee increases or decreased government funding to institutions. It is, though, a first systematic attempt at a national level to assess quality in teaching, and with patience and care it could be considerably improved. Most of all, it is an attempt to ensure accountability for the quality of teaching that takes account of the diversity of students and the different mandates of institutions. It may make both university administrations and individual faculty pay more attention to the importance of teaching well, and that is something we should all support.

So I give it a silver – a good try, but there is definitely room for improvement.

Thanks to Clayton Wright for drawing my attention to this.

Next up

I’m going to be travelling for the next three weeks, so my opportunity to blog will be limited – but that has been the case for the last six months. My apologies – I promise to do better. However, a four-hour layover at Pearson Airport does give me some time for blogging!

Update on Canadian survey of online learning

This update builds on two earlier posts on this blog.

The online questionnaire has now been distributed by e-mail to every public university and college in Canada, 215 institutions in all. The questionnaire will have been routed through the office of the Provost or VP Education, although several people in each institution will probably be involved in collecting the data for it.

There are in fact five versions of the questionnaire:

  • anglophone universities
  • francophone universities
  • anglophone colleges
  • francophone colleges (outside Québec)
  • CEGEPs 

The questionnaire asks for data on:

  • distance education enrolments, irrespective of method of delivery
  • online student enrolments (headcount and student course registrations) at different academic levels and in different program areas
  • how many years the institution has been offering online courses
  • the current status of blended and hybrid courses
  • the main technologies being used
  • information about any MOOCs offered
  • future institutional directions in online learning
  • benefits and challenges of online learning.

The deadline for completion has been set at June 12. 

We anticipate the main report will be ready in September, with sub-reports for the following sectors:

  • all universities (anglophone and francophone)
  • all colleges, institutes and CEGEPs
  • all francophone institutions (report in French)

We will also produce other sub-reports on request (for example, a provincial analysis) as well as infographics.

The reports will be available free on request. The data will be housed at the Ontario College Application Service and, subject to privacy requirements, will be open to other researchers.

There will be a full presentation of the report and its results at the ICDE Conference on Online Learning in Toronto in October.

We are reliant on e-mail addresses and contact information being up to date, and sometimes e-mails with attachments get filtered out as spam. So, if you are working in a Canadian public post-secondary institution and are not aware that data is being collected for this survey, please contact your Provost’s Office to check that the invitation has been received. We need responses from as many institutions as possible to ensure that the results are valid.

However, we are pleased with the response to date: over 20 full responses within the first week.

One business case for OER examined

A video on electricity from the OpenLearn platform

Law, P. and Perryman, L.-A. (2017) How OpenLearn supports a business model for OER, Distance Education, Vol. 38, No. 1

The journal: ‘Distance Education’

Distance Education is one of the oldest and most established journals in the field. It is the journal of the Open and Distance Learning Association of Australia (ODLAA), and over the years it has published some of the best research in distance education. However, it is not an open access journal, so I am providing my own personal review of one of the articles in this generally excellent edition. I should point out, though, that I am a member of the editorial board, so I do have an interest in supporting this journal.

Editorial

Som Naidu, the editor, does an excellent job of introducing the articles in the journal under the heading of ‘Openness and flexibility are the norm, but what are the challenges?’ He correctly points out that

While distance education is largely responsible for the articulation and spearheading of openness and flexibility as desirable value principles, these educational goals are fast becoming universally attractive across all sectors and modes of education.

The rapid move to blended and more flexible learning, and the slow but increasing use of open educational resources (OER) in campus-based institutions, is indeed challenging the uniqueness of distance education in terms of openness and flexibility. It is easy to argue that distance education is now no more than just another delivery option. Nevertheless, there are still important differences, and Som Naidu draws out some interesting comparisons between the experience of on-campus and distance learning that are still valid.

A business model for OER?

In this latest issue of Distance Education, Patrina Law and Leigh-Anne Perryman have written a very interesting paper about the business case for OER, based on three surveys of users (in 2013, 2014, and 2015) of the UK Open University’s OpenLearn project. First, some information about OpenLearn:

  • OpenLearn is an open content platform. Initially it used samples of course content from the OU’s undergraduate and postgraduate ‘modules’ (courses), but it now also hosts specially commissioned audio, video and other interactive materials, as well as short online courses offering free certificates and badges;
  • OpenLearn now offers the equivalent of 850 free courses representing 5% of the undergraduate and graduate degree content;
  • 6 million people visit each year with a total of 46 million unique visitors since it was established in 2006; 
  • 13% of users go on to enquire about the OU’s formal degree programs (equivalent of about 1,000 student enrolments per year).

Law and Perryman provide an excellent review of the business cases for OER put forward by others such as the OECD and Creative Commons, then use the survey data from OpenLearn users to test these arguments. Here’s what they found:

  • provision of OER is complementary rather than competitive with the OU’s formal degree programming
  • over half the users are UK-based
  • about 20% reported a disability
  • median age was 36-45
  • about 20% indicated that English was not their first language
  • 70% had some form of post-secondary qualification
  • 16% were part-time or full-time students
  • about two-thirds of the users were ‘tasting’ or ‘testing’ content before making a decision about whether to take a formal program (at either the OU or another institution)
  • almost half (45%) used OpenLearn to find out more about the UK OU (22% had never heard of it before and altogether over half knew nothing or little previously about the OU)
  • the average cost of conversion to OER was between £1,500 and £2,000 per course
  • 13% of OpenLearn users clicked through to make a formal enquiry resulting in about 1,000 new student registrations.

Comment

Very importantly, Law and Perryman link the growing use of OpenLearn to the sudden increase in tuition fees in the UK (£9,000 a year in general, and £5,000 per year for an OU full-time degree). Students are not willing to risk this cost without being sure they stand a chance of success and have an interest in the subject. OpenLearn allows them to test this.

This is an important point. The UK government policy of very high tuition fees does appear to be negatively impacting access for many potential students, or at least making them think very carefully before committing to such a large investment. The OU in particular has lost student enrolments as its fees have gone up. There is a danger, to my mind, that OER can be used politically as a diversion from ‘true’ open education: education for credit that is available to everyone, irrespective of their means. The best form of open education remains a well-funded state system.

This leads to my one serious criticism of the article. Apart from the cost of conversion, no proper analysis of the true cost of OpenLearn is given, so the title is misleading. The paper does not describe a business model, with full input costs and output benefits stated in monetary terms, but a business case, which offers uncosted but positive arguments based on factors other than cost.

This is a really important distinction, because the business model depends heavily on adequate funding for the formal degree programs that provide the base for the OpenLearn materials. Without that funding, OpenLearn would quickly become unsustainable. It is not a parasite in the negative sense of the word, but it can’t exist without the funding for the core function of the OU. Without a sense of the full cost of OpenLearn, it remains difficult to judge whether the obvious benefits are worth the drain on the OU’s other resources, as the money has to come from somewhere.
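
To underline the point, here is a back-of-envelope sketch using only the figures quoted above (course count, conversion cost, click-through registrations, and the approximate OU fee). It deliberately omits platform, staffing and infrastructure costs, which is exactly why a true business model cannot be reconstructed from the paper.

```python
# Figures quoted in this post; everything else about OpenLearn's costs is unknown here
courses = 850
conversion_cost_per_course = 1_750      # midpoint of the £1,500-£2,000 range
new_registrations_per_year = 1_000      # from the 13% click-through
fee_per_year = 5_000                    # approximate OU full-time fee (£)

one_off_conversion_cost = courses * conversion_cost_per_course
annual_fee_income = new_registrations_per_year * fee_per_year

print(f"One-off conversion cost: £{one_off_conversion_cost:,}")  # £1,487,500
print(f"Annual fee income:       £{annual_fee_income:,}")        # £5,000,000
# Looks favourable, but the missing running costs could swamp either figure.
```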

Otherwise this is a very good article that should be read carefully by anyone concerned with policy regarding the use of OER.

Latest data on USA distance education enrolments

An extract from the Digital Learning Compass infographic

Digital Learning Compass (2017) Distance Education Enrolment Report 2017, Wellesley MA

A new partnership for the analysis of distance education data in the USA

First, a little background. Most readers of this blog will be familiar with the reports from the Babson Survey Research Group (BSRG) on the state of online learning in the USA. When the U.S. Department of Education’s Integrated Postsecondary Education Data System (IPEDS) began collecting data on distance learning enrolments in the fall of 2012, BSRG stopped collecting its own data and formed a partnership with e-Literate and WCET to create Digital Learning Compass, with the following goal:

To be the definitive source of information on the patterns and trends of U.S. postsecondary distance learning.

The Distance Education Enrolment Report 2017 is Digital Learning Compass’s analysis of the data collected by IPEDS for the fall of 2015.

Main results

In brief, in the USA in 2015:

  • distance education enrolments increased by almost 4% 
  • almost 30% of all post-secondary students in the USA are taking at least one DE course
  • 14% of all students are taking only DE courses
  • 83% of DE enrolments are in undergraduate courses
  • just over two-thirds of DE enrolments are in public universities or colleges
  • although DE enrolments have continued to grow in public and in non-profit private universities, enrolments in for-profit institutions declined in 2015 for the third year in a row, driven by substantial decreases at just a few of the for-profit institutions
  • almost half of all DE enrolments are concentrated in less than 5% of all institutions, with the top 47 institutions accounting for almost a third of all DE students (a sketch of this kind of concentration calculation follows this list)
  • the following institutions saw the greatest year-on-year growth in DE enrolments:
    • Southern New Hampshire University (from 11,286 to 56,371 in one year)
    • Western Governors University,
    • Brigham Young University-Idaho,
    • University of Central Florida,
    • Grand Canyon University
  • the number of students studying on a campus has dropped by almost one million (931,317) between 2012 and 2015.
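
The concentration figures mentioned above come from a simple cumulative-share calculation. A minimal sketch, with synthetic, roughly power-law enrolment numbers standing in for the IPEDS per-institution data:

```python
# Synthetic enrolment figures for ~4,700 institutions; the real calculation
# would use the IPEDS per-institution DE enrolments.
enrolments = sorted((round(60_000 / rank ** 0.9) for rank in range(1, 4_701)),
                    reverse=True)

total = sum(enrolments)
top_47_share = sum(enrolments[:47]) / total
top_5_percent = int(len(enrolments) * 0.05)
top_5_percent_share = sum(enrolments[:top_5_percent]) / total

print(f"Top 47 institutions: {top_47_share:.0%} of DE enrolments")
print(f"Top 5% ({top_5_percent} institutions): {top_5_percent_share:.0%}")
```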

More detailed analysis can be found in the report itself.

Comment

First, a declaration of interest: I am working closely with both Jeff Seaman of Babson and Russ Poulin of WCET on the Canadian national survey of online and distance education.

Despite a small drop in overall post-secondary enrolments in the USA in 2015, DE enrolments continued to grow, although the pace of growth slowed over the three years from 2012 to 2015. The main change was in the for-profit sector, probably affected by federal pressure on the use of student loans and congressional pressure for greater regulation of for-profit institutions under the Obama administration.

Indeed, there has been a considerable shake-up in the for-profit sector in the USA, the purchase of Kaplan by Purdue, a state-funded university, being the latest example. It will be interesting to watch what happens to for-profit DE enrolments under the more liberal regulatory environment being brought in by the Trump administration. Will they rebound?

However, perhaps the most shocking result is the drop in campus-based enrolments of almost one million, no doubt due to the increased cost of attending college in the USA – or is this in fact due to the impact of six million enrolments in distance education courses?

Once again, here in Canada we are peering over the wall at our much larger and richer neighbours, wondering what’s going on, but at least it is now a well-lit property, thanks to these reports.