September 21, 2017

Responses to the Canadian survey of online and distance learning

Hi, everyone, and welcome back. I hope you all had a great summer. As many readers will know, I am leading a team conducting a survey of online and distance learning in Canadian public post-secondary educational institutions. You can get more general information about the survey from earlier posts.

During the summer the survey team has been extremely busy. We have now completed the collection of data and have started on the analysis and report writing.

Thanks to support from Contact North, we are building a web site for the survey which will contain news about the survey, access to the reports, and opportunities to discuss the results and their implications. However, this won't be ready for a couple of weeks, so I wanted to provide an update on where we are at the moment, especially as I know some of you have been engaged in collecting data for the survey (many thanks!).

Building a database of institutions

As this is the first year for the survey the focus is exclusively on provincially funded and accredited post-secondary educational institutions, which still represent by far the majority of post-secondary institutions and students in Canada.

One challenge the survey faced was the lack of a commonly used, publicly accessible database of all Canadian public post-secondary educational institutions. We worked our way through the membership listings of Universities Canada, Colleges and Institutes Canada (CICAN), Maclean’s EduHub, and provincial government web sites. From Statistics Canada we could find only aggregate data on student enrolments broken down by province and by part-time or full time students, but not data for individual institutions. 

We ended up with a list of 203 institutions, once we had eliminated duplications, incorporated affiliated colleges and universities into the main institution awarding the qualification, and removed institutions not funded by provincial governments. We also identified institutions by language (anglophone or francophone) and their total student headcount (full-time and part-time), almost entirely from information publicly available through provincial government web sites, although not all provinces provide this information. We then had to identify the appropriate contact person in each institution (usually Provosts or VPs Education).

This process resulted in 

  • 72 universities (35%),
  • 81 colleges outside Québec (40%), and
  • 50 CEGEPs/colleges within Québec (25%).

Of the 203 institutions, 70 (34%) were either francophone institutions or were bi-lingual institutions with a separate francophone program. 
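For readers who want to check the arithmetic, the percentages above follow directly from the institutional counts (a minimal illustrative sketch; the counts are those from the database described in the text):

```python
# Survey population base: counts by type of institution, as reported above.
institutions = {
    "universities": 72,
    "colleges outside Quebec": 81,
    "CEGEPs/colleges within Quebec": 50,
}

total = sum(institutions.values())  # 203 institutions in all

for name, count in institutions.items():
    print(f"{name}: {count} ({round(100 * count / total)}%)")
```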

One thing that became clear even at this stage is that there is no consistency between provinces and Statistics Canada in how data about students are collected or reported. Several different measures are used: student headcount (full-time, or full-time and part-time); student course enrolments; student FTEs (full-time equivalents); and student program enrolments, with variations within each of these broad categories. Also, some data include non-credit, continuing education students as well as students taking courses for credit. All this variation in student statistics makes inter-provincial comparisons very difficult. In the end, for the database of all institutions, we used primarily official provincial student headcounts, the measure most common across all provinces.

Statistics Canada’s most recent figures for Canadian post-secondary student enrolments are for the fall of the 2014/2015 academic year (in our survey, we are looking at fall 2016 enrolments). Statistics Canada’s enrolment numbers are based on program counts and not student counts. If a student is enrolled in more than one program as of the snapshot date, then all of their programs are included in the count.

Table 1: Comparison of StatCan student enrolment numbers, and student headcount totals from institutions in the survey population base

Without knowing more about the basis on which Statistics Canada built its data, we cannot explain the difference between the two population sets, but the differences are relatively small, except for CEGEPs. We are confident we have included all the CEGEP institutions, but we probably do not have all enrolled students counted, just those for whom the Québec provincial government provides funding, since that was the source of our data. Nevertheless, if we take Statistics Canada data as the comparator, our population base appears to represent a very large proportion (93%) of students studying for institutional credit at Canadian public post-secondary institutions.

We will provide a list of all the institutions included in the population database on the survey web site.

Response rates

The questionnaire itself was online and was accessed using a link unique for each participant institution. The final cut-off date for the full questionnaire was June 30, 2017. At this point, for those institutions that had not responded, an invitation was sent to complete a shorter questionnaire that excluded questions on student enrolments.

Table 2: Response rate by type of institution

It can be seen that 128 institutions (63%) completed the full questionnaire, and 140 (69%) completed either the full or the shorter version of the questionnaire. The response rate was lower for small institutions (59% overall for institutions with fewer than 2,000 students, compared with 79% for institutions with more than 10,000 students). The responding institutions were spread proportionately across all provinces and nearly all territories.
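As a quick sanity check, the overall response rates quoted above follow from the counts already given in this post (an illustrative sketch; 203 is the population base established earlier):

```python
# Response rates implied by the counts reported above.
population = 203        # institutions in the survey population base
full_responses = 128    # completed the full questionnaire
full_or_short = 140     # completed either the full or the shorter version

full_rate = round(100 * full_responses / population)      # 63
combined_rate = round(100 * full_or_short / population)   # 69

print(f"full questionnaire: {full_rate}%")
print(f"full or short questionnaire: {combined_rate}%")
```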

If we look at the response rate by the number of student enrolments, Table 3 below indicates that the survey covered institutions with 78% of the overall Canadian student population in public post-secondary education.

Table 3: Student headcounts for institutions responding compared to overall student headcounts.

Conclusion

It should be remembered that this was a voluntary survey, with no formal government requirement to complete it. Our target was a 75% response rate, which we have achieved in terms of the number of students covered by the survey, although the number of institutions covered fell a little short of the target at 69%. Nevertheless, we think we have a large enough response rate to make valid and reliable statements about the state of online and distance learning in Canadian post-secondary education.

This would not have been possible without first of all a huge effort by the institutions to provide the data, and secondly a great deal of support from the various professional associations such as CICAN, Universities Canada, the eCampuses in Ontario, Manitoba, Alberta and British Columbia, Contact North, REFAD, and others too numerous to describe in a short blog post.

Next steps

We are now in the process of analyzing the results. We expect to have a draft report that will go out to selected readers in two weeks' time. We will then produce two 'public' reports:

  • a main executive report that covers the main findings (in English and French)
  • a full research report that provides an analysis of all the data collected from the survey.

Both these reports will be ready for publication and a launch at the ICDE World Conference on Online Learning in Toronto on October 17, 2017. 

We will also be developing a number of sub-reports, such as one on francophone institutions, and one on Ontario (which was a primary funder of the survey).

In the meantime, as soon as the survey web site is ready I will let you know. This will contain preliminary results and an update on activities surrounding the survey, such as future plans and developments, and, from October 17, copies of all the reports as they become available.

A better ranking system for university teaching?

Who is top dog among UK universities?
Image: © Australian Dog Lover, 2017 http://www.australiandoglover.com/2017/04/dog-olympics-2017-newcastle-april-23.html

Redden, E. (2017) Britain Tries to Evaluate Teaching Quality, Inside Higher Ed, June 22

This excellent article describes in detail a new three-tiered rating system of teaching quality at universities introduced by the U.K. government, and offers a thoughtful discussion of its implications. As I have a son and daughter-in-law teaching in a U.K. university and grandchildren either as students or potential students, I have more than an academic interest in this topic.

How are the rankings done?

Under the government’s Teaching Excellence Framework (TEF), universities in England and Wales will get one of three ‘awards’: gold, silver and bronze (apparently there are no other categories, such as tin, brass, iron or dross for those whose teaching really sucks). A total of 295 institutions opted to participate in the ratings.

Universities are compared on six quantitative metrics that cover:

  • retention rates
  • student satisfaction with teaching, assessment and academic support (from the National Student Survey)
  • rates of employment/post-graduate education six months after graduation.

However, awards are relative rather than absolute since they are matched against ‘benchmarks calculated to account for the demographic profile of their students and the mix of programs offered.’ 

This process generates a “hypothesis” of gold, silver or bronze, which a panel of assessors then tests against additional evidence submitted for consideration by the university (higher education institutions can make a submission of up to 15 pages to TEF assessors). Ultimately the decision of gold, silver or bronze is a human judgment, not the pure product of a mathematical formula.

What are the results?

Not what you might think. Although Oxford and Cambridge universities were awarded gold, so were some less prestigious universities such as the University of Loughborough, while some more prestigious universities received a bronze. So at least it provides an alternative ranking system to those that focus mainly on research and peer reputation.

What is the purpose of the rankings?

This is less clear. Ostensibly (i.e., according to the government) it is initially aimed at giving potential students a better way of knowing how universities stand with regard to teaching. However, knowing the Conservative government in the UK, it is much more likely to be used to link tuition fees to institutional performance, as part of the government’s free market approach to higher education. (The U.K. government allowed universities to set their own fees, on the assumption that the less prestigious universities would offer lower tuition fees, but guess what – they almost all opted for the highest level possible, and still were able to fill seats).

What are the pros and cons of this ranking?

For a more detailed discussion, see the article itself, but here is my take on it.

Pros

First, this is a more thoughtful approach to ranking than the other systems. It focuses on teaching (which will be many potential students' initial interest in a university) and provides a useful counter-balance to the emphasis on research in other rankings.

Second, it has a more sophisticated approach than just counting up scores on different criteria. It has an element of human judgement and an opportunity for universities to make their case about why they should be ranked highly. In other words, it tries to tie institutional goals to teaching performance, and tries to take into account the very large differences between universities in the U.K. in terms of student socio-economic background and curricula.

Third, it does provide a simple 'award' system for categorizing universities on their quality of teaching that students and their parents can readily understand.

Fourth, and most important of all, it sends a clear message to institutions that teaching matters. This may seem obvious, but for many universities – and especially faculty – the only thing that really matters is research. Whether, though, this form of ranking will be sufficient to get institutions to pay more than lip service to teaching remains to be seen.

Cons

However, there are a number of cons. First, the National Union of Students (NUS) is against it, partly because it is heavily weighted by student satisfaction ratings based on the National Student Survey, which thousands of students have been boycotting (I'm not sure why). One would have thought that students in particular would value some accountability regarding the quality of teaching. But then, the NUS has bigger issues with the government, such as the appallingly high tuition fees (about C$16,000 a year; the opposition party in parliament, Labour, has promised free tuition).

More importantly, the general arguments against university rankings still apply to this one. Rankings measure institutional performance, not individual department or instructor performance, which can vary enormously within the same institution. If you want to study physics, it doesn't help if a university has an overall gold ranking but its physics department is crap, or if you get the one instructor who shouldn't be allowed in the building.

Also, the quantitative measures are surrogates for actual teaching performance. No-one has observed the teaching in order to develop the rankings, except the students; and student ratings, while an important measure, can be highly misleading, influenced by instructor personality and by how hard the instructor makes students work to get a good grade.

The real problem here is two-fold. First, there is the difficulty of assessing quality teaching in the first place: one man's meat is another man's poison. There is no general agreement, even within an academic discipline, as to what counts as quality teaching (for instance, understanding, memory of facts, or skills of analysis – maybe all three are important, but can the way one teaches to develop these diverse attributes be assessed separately?).

The second problem is the lack of quality data on teaching performance – it just isn’t tracked directly. Since a student may take courses from up to 40 different instructors and from several different disciplines/departments in a bachelor’s program, it is no mean task to assess the collective effectiveness of their quality of teaching. So we are left with surrogates of quality, such as completion rates.

So is it a waste of time – or worse?

No, I don't think so. People are going to be influenced by rankings, whatever. This particular ranking system may be flawed, but it is a lot better than the other rankings, which are so much influenced by tradition and elitism. It could, admittedly, be used in ways that the data do not justify, such as supporting tuition fee increases or cuts in government funding to institutions. It is, though, a first systematic attempt at a national level to assess quality in teaching, and with patience and care it could be considerably improved. But most of all, it is an attempt to ensure accountability for the quality of teaching that takes account of the diversity of students and the different mandates of institutions. It may make both university administrations and individual faculty pay more attention to the importance of teaching well, and that is something we should all support.

So I give it a silver – a good try but there is definitely room for improvement. 

Thanks to Clayton Wright for drawing my attention to this.

Next up

I'm going to be travelling for the next three weeks so my opportunity to blog will be limited – but that has been the case for the last six months. My apologies – I promise to do better. However, a four-hour layover at Pearson Airport does give me some time for blogging!

A brighter future for Athabasca University?

Mid-career retraining is seen as one possible focus for Athabasca University’s future

Coates, K. (2017) Independent Third-Party Review of Athabasca University, Saskatoon, SK

This report, 45 pages in length plus extensive appendices, was jointly commissioned by the Government of Alberta and the Governors of Athabasca University.

Why the report?

Because Athabasca University, established in 1971 as a fully distance, open university, has been in serious trouble over the last 10 years. In 2015, its Acting President issued a report saying that 'Athabasca University (AU) will be unable to pay its debt in two years if immediate action is not taken.' It needed an additional $25 million just to solve its IT problems. Two years earlier, AU's senior administrators were savagely grilled by provincial legislators about the financial management of the university, to such an extent that it seemed the Government of Alberta might well pull the plug on the university.

However, a recent provincial election brought a radical change of government, leading to a new Board and a new President with a five-year term. Although these changes are essential for establishing a secure future for the university, in themselves they are not sufficient. The financial situation of the university is temporarily more secure, but the underlying problem of expenses not being matched by revenue remains. It desperately needs more money from a government that is short of revenues since the oil industry tanked. Also, its enrolments have started to drop, due to competition from campus-based universities now offering fully online programs. Lastly, it still has the same structural problems: an outdated course design and development model and poor student support services, especially on the academic side.

So although the newish government was willing to suspend judgement, it really needed an independent review before shovelling any new money AU’s way – hence this report.

What does the report say?

I will try to summarise briefly the main findings and recommendations, but as always, it is worth reading the full report, which is relatively concise and easy to read:

  • there is substantial student demand in Alberta, across Canada and internationally for AU’s programs, courses and services;
  • the current business model is not financially sustainable and will not support the institution in the coming decades – but ‘it has the potential if significant changes are made to its structure, approach and program mix, to be a viable, sustainable and highly relevant part of the Alberta post-secondary system’;
  • more money is needed to support its operations, especially if it is to remain headquartered in the (small and somewhat remote) Town of Athabasca; the present government funding arrangement is inadequate for the university’s mix of programs and students, especially regarding the support needed for disadvantaged students and those requiring more flexibility in delivery;
  • the emergence of dozens of credible online university alternatives has undermined AU’s competitive advantage – it no longer has a clear and obvious role within the Provincial post-secondary system;
  • AU should re-brand itself as the leading Canadian centre for online learning and 21st century educational technology, but although it has the educational technology professionals needed to provide leadership, it lacks the ICT model and facilities to rise to this opportunity;
  • Open access: AU should expand its activities associated with population groups that are under-represented in the Alberta and Canadian post-secondary system: women in STEM subjects, new Canadians, Indigenous Peoples and students with disabilities;
  • diversification of the student body is necessary to achieve economies of scale; in other words it should expand its reach across Canada and internationally and not limit itself just to Alberta;
  • AU should expand its efforts to educate lifelong learners and should expand its career-focused and advanced educational opportunities – particularly mid-career training and training for new work;
  • although there is overwhelming faculty and staff support for AU’s mandate and general approach, there are considerable institutional and financial barriers to effecting a substantial reorientation in AU operations; however, such a re-orientation is critical for its survival.

My comments

Overall, this is an excellent report. Wisely, it does not dwell on the historical reasons why Athabasca University got itself into its current mess but instead focuses on what its future role should be, what it can uniquely contribute to the province, and what is needed to right the ship, including more money.

However, the main challenges, in my view, remain more internal than external. The Board of Governors, senior administration, faculty, staff and students still need to develop together a clear and shared vision for the future of the institution that presents a strong enough value proposition to the government to justify the increased operational and investment funding that is needed. Although the external reviewer does a good job suggesting what some of the elements of such a vision might be, it has to come from the university community itself. This is long overdue and cannot be delayed much longer; otherwise the government's patience will understandably run out. Money itself is not the issue – it is the value proposition that will persuade the government to prioritise funding for AU that still needs to be made by the university itself. In other words, it's a trust issue: if we give you more money, what will you deliver?

The second major challenge, while strongly linked to vision and funding, is the institutional culture. Major changes in course design, educational technology, student support and administration, marketing and PR are urgently needed to bring AU into advanced 21st century practice in online and distance learning. I fear that while there are visionary faculty and staff at AU who understand this, there is still too much resistance from traditionalists and those who see change as undermining academic excellence or threatening their comfort zone. Without these necessary structural and cultural changes, though, AU will not be able to implement its vision, no matter how persuasive it is. So there is also a competency issue: if we give you more money, can you deliver on your promises?

I think these are still open questions, but at least the external review offers a vote of confidence in the university. Now it is up to the university community to turn this opportunity into something more concrete. But it needs to move quickly: the window of opportunity is closing fast.

Update on Canadian survey of online learning

This update builds on two earlier posts.

The online questionnaire has now been distributed by e-mail to every public university and college in Canada, 215 institutions in all. The questionnaire has been routed through the office of the Provost or VP Education, although it is probable that several people in each institution will be involved in collecting data for the questionnaire.

There are in fact five versions of the questionnaire:

  • anglophone universities
  • francophone universities
  • anglophone colleges
  • francophone colleges (outside Québec)
  • CEGEPs 

The questionnaire asks for data on

  • distance education enrolments, irrespective of method of delivery
  • online student enrolments (headcount and student course registrations) at different academic levels and in different program areas
  • how many years the institution has been offering online courses
  • the current status of blended and hybrid courses
  • the main technologies being used
  • information about any MOOCs offered
  • future institutional directions in online learning
  • benefits and challenges of online learning.

The deadline for completion has been set at June 12. 

We anticipate the main report will be ready in September, with sub-reports for the following sectors:

  • all universities (anglophone and francophone)
  • all colleges, institutes and CEGEPs
  • all francophone institutions (report in French)

We will also produce other sub-reports on request (for example, a provincial analysis) as well as infographics.

The reports will be available for free on request and the data will be housed at the Ontario College Application Service, and, subject to privacy requirements, will be open to other researchers.

There will be a full presentation of the report and its results at the ICDE Conference on Online Learning in Toronto in October.

We are reliant on e-mails and contact information being up-to-date and sometimes e-mails with attachments get filtered out as spam. So, if you are working in a Canadian public post-secondary institution and are not aware that this data is being collected for this survey, please contact your Provost’s Office to check that the invitation has been received. We need a high response rate from every institution to ensure that the results are valid.

However, we are pleased with the response to date: we already have over 20 full responses within the first week.

One business case for OER examined

A video on electricity from the OpenLearn platform

Law, P. and Perryman, L.-A. (2017) How OpenLearn supports a business model for OER, Distance Education, Vol. 38, No. 1

The journal: ‘Distance Education’

Distance Education is one of the oldest and most established journals in the field. It is the journal of the Open and Distance Learning Association of Australia (ODLAA), and over the years it has published some of the best research in distance education. However, it is not an open access journal, so I am providing my own personal review of one of the articles in this generally excellent edition. I should point out, though, that I am a member of the editorial board and so have an interest in supporting this journal.

Editorial

Som Naidu, the editor, does an excellent job of introducing the articles in the journal under the heading of ‘Openness and flexibility are the norm, but what are the challenges?’ He correctly points out that

While distance education is largely responsible for the articulation and spearheading of openness and flexibility as desirable value principles, these educational goals are fast becoming universally attractive across all sectors and modes of education.

The rapid move to blended and more flexible learning, and the slow but increasing use of open educational resources (OER) in campus-based institutions, is indeed challenging the uniqueness of distance education in terms of openness and flexibility. It is easy to argue that distance education is now no more than just another delivery option. Nevertheless, there are still important differences, and Som Naidu draws out some interesting comparisons between the experience of on-campus and distance learning that are still valid.

A business model for OER?

In this latest issue of Distance Education, Patrina Law and Leigh-Anne Perryman have written a very interesting paper about the business case for OER based on three surveys of users (in 2013, 2014, and 2015) of the UK Open University’s OpenLearn project. First some information about OpenLearn:

  • OpenLearn is an open content platform. Initially it used samples of course content from the OU’s undergraduate and postgraduate ‘modules’ (courses) but now hosts specially commissioned audio, video and other interactive materials and short online courses including free certificates and badges;
  • OpenLearn now offers the equivalent of 850 free courses representing 5% of the undergraduate and graduate degree content;
  • 6 million people visit each year with a total of 46 million unique visitors since it was established in 2006; 
  • 13% of users go on to enquire about the OU’s formal degree programs (equivalent of about 1,000 student enrolments per year).

Law and Perryman provide an excellent review of the business cases for OER put forward by others such as the OECD and Creative Commons, then use the survey data from OpenLearn users to test these arguments. Here’s what they found:

  • provision of OER is complementary rather than competitive with the OU’s formal degree programming
  • over half the users are UK-based
  • about 20% reported a disability
  • median age was 36-45
  • about 20% indicated that English was not their first language
  • 70% had some form of post-secondary qualification
  • 16% were part-time or full-time students
  • about two-thirds of the users were ‘tasting’ or ‘testing’ content before making a decision about whether to take a formal program (at either the OU or another institution)
  • almost half (45%) used OpenLearn to find out more about the UK OU (22% had never heard of it before and altogether over half knew nothing or little previously about the OU)
  • the average cost of conversion to OER was between £1,500 and £2,000 per course
  • 13% of OpenLearn users clicked through to make a formal enquiry resulting in about 1,000 new student registrations.
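One figure the article does not give is the total conversion cost these numbers imply. Here is a back-of-envelope estimate, my own extrapolation from the per-course cost and course count above, not a number from the article:

```python
# Rough implied total conversion cost: 850 free course equivalents at an
# average conversion cost of £1,500-£2,000 each (figures as reported).
# This is an illustrative extrapolation, not a figure from the article.
courses = 850
cost_low, cost_high = 1_500, 2_000

print(f"implied total: £{courses * cost_low:,} to £{courses * cost_high:,}")
```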

Comment

Very importantly, Law and Perryman link the growing use of OpenLearn to the sudden increase in tuition fees in the UK (£9,000 a year in general, and £5,000 per year for an OU full-time degree). Students are not willing to risk this cost without being sure they stand a chance of success and have an interest in the subject. OpenLearn allows them to test this.

This is an important point. The UK government policy of very high tuition fees does appear to be negatively impacting access for many potential students, or at least making them think very carefully before committing to such a large investment. The OU in particular has lost student enrolments as its fees have gone up. There is a danger in my mind that OER can be politically used as a diversion from ‘true’ open education for credit that is available to everyone, irrespective of their means. The best form of open education remains a well-funded state system.

This leads to my one serious criticism of the article. Apart from the cost of conversion, no proper analysis of the true cost of OpenLearn is given, so the title is misleading. It does not describe a business model, with full input costs and output benefits stated in monetary terms, but a business case, which offers positive but uncosted arguments based on factors other than cost.

This is a really important distinction, because the business model depends heavily on adequate funding for the formal degree programs which provide the base for the OpenLearn materials. Without that funding, OpenLearn will quickly become unsustainable. It is not a parasite in the negative sense of the word, but it cannot exist without the funding for the core function of the OU. Without a sense of the full cost of OpenLearn, it remains difficult to judge whether the obvious benefits are worth the drain on the OU's other resources, as the money has to come from somewhere.

Otherwise this is a very good article that should be read carefully by anyone concerned with policy regarding the use of OER.