June 18, 2018

Some very good news for Athabasca University (and its students)

Athabasca University convocation

Graney, J. (2018) Athabasca University gets $4.9 million grant to upgrade outdated IT Edmonton Journal, June 8

One year on from the delivery of the Coates Report, an external review of the university, the Alberta provincial government has announced a one-off additional grant of almost $5 million to the university to help it overcome some of the problems it has been facing. The money is earmarked as follows:

  • $1.5 million to implement the university’s new strategic plan, which is in response to the recommendations in the Coates Report
  • $1.5 million to develop and implement a plan to improve student delivery services
  • $1.5 million to implement the university’s plan to upgrade its IT system, moving to a cloud-based system
  •  $400,000 to develop a long-range plan to renew the university’s teaching and learning framework.

The grant will enable Athabasca University to significantly modernize its digital learning environment and upgrade its existing IT infrastructure.

Marlin Schmidt, the Minister for Advanced Education in Alberta, is quoted as saying:

I am pleased with the progress made by the university to ensure that the recommendations in the Coates Report are implemented. I know these additional investments will support the university’s long-term success.

Comment

This is very good news for both the university and especially its students. It indicates that the Alberta government has confidence in the future of the university, and the funding provides necessary resources for modernizing and improving the quality of its teaching and other student services.

Once again, though, I am disappointed by the headline in the Edmonton Journal. ‘Athabasca University gets $4.9 million to become a world leader in digital learning’ would have been a more accurate headline.

True, the university needs to upgrade its IT infrastructure, which was the subject of a scathing audit by the provincial auditor-general, but the majority of the funding has quite rightly gone to ensuring that the overall strategic plan is implemented, and to improving the quality of student services and the quality of teaching.

Congratulations to everyone at AU on getting this far so quickly since the Coates Report. Now you just have to do it.

 

A new survey of online learning in Canadian universities and colleges for 2018

The News

Following the success of the 2017 national survey of online learning in Canadian post-secondary education, an invitation to participate in the 2018 version of the survey will go out to all Canadian universities and colleges in the next few days.

The team

This year the team is being led by Tricia Donovan, formerly Director of eCampus Alberta, with support from Eric Martel, Denis Mayer, Vivian Forssman, Brian Desbiens, Ross Paul, Jeff Seaman, Russ Poulin, and myself.

Funding

With support so far confirmed from eCampus Ontario, Contact North, Campus Manitoba and BCcampus, we have the minimum funding required to guarantee the survey this year, but we are also in discussions with other sponsors.

Questionnaire

The questionnaire will be similar to last year’s, but there will be some changes in the light of experience. The focus, however, will still be on obtaining accurate data about online and distance learning enrolments and institutional policies.

Distribution

As a result of the 2017 survey, we now have a more complete list of institutions and more accurate contact information for each institution. The invitation will go to the main contact in each institution, with a copy to other contacts on our list. The questionnaire will continue to have both anglophone and francophone versions. We have added to the existing database some federal institutions, some private colleges with significant public funding, and some institutions we missed last year, especially in Québec.

Once again, we will be asking a wide range of organizations to help in the promotion of the 2018 survey.

Response time

We will be asking all institutions to complete the survey within three weeks of receiving the invitation, as we did last year. We anticipate having the 2018 reports ready by November 2018.

Organization

With the help of the Ontario College Admission System, we have established a non-profit organization, the Canadian Digital Learning Research Association/Association Canadienne de Recherche sur la Formation en Ligne, to administer the funding and management of the survey. The Directors of the Association are Tricia Donovan, Denis Mayer and myself.

We will also be establishing a longer-term advisory group, but our priority at the moment is to get out this year’s questionnaire.

Web sites

The two existing survey web sites, onlinelearningsurveycanada.ca and formationenlignecanada.ca, will continue. We will maintain all the 2017 reports and data, but we are creating new spaces for the 2018 survey.

What you can do

If you work in a Canadian university or college, please lend your support to this survey. Last year’s results have already had a tremendous impact on institutional and government policies.

In most cases the invitation will have gone to the Provost’s Office or the Office of the VP Education, with copies to other centres such as Continuing Studies, Institutional Research, the Registry or the Centre for Teaching and Learning, depending on the institutional organization.

If you think your institution should have received an invitation to participate but you have heard nothing by June 22, 2018, please contact tricia.donovan01@gmail.com or tony.bates@ubc.ca.

We know that internal communication can sometimes be a problem!

And thank you!

If you are involved in providing data or answers to the questionnaire, we thank you sincerely. We realise the survey involves quite a lot of work, and we really do appreciate your efforts.

Is there online learning in North Korea?

An online lecture from a North Korean university

Kang, T-J. (2018) Online learning in North Korea The Diplomat, May 25

You may have noticed that North Korea has been in the news quite a bit recently, so the question arose in my mind, is there online learning in North Korea?

No, your intrepid reporter did not hop on a plane to Pyongyang and interview the Supreme Leader, Kim Jong-un. No need: this article from the Diplomat answered the question quite nicely.

Yes, North Korea has (fairly recently) started delivering streamed lectures at a distance through some of its more prestigious universities, such as Kim Il-sung University, which recently awarded degrees for the first time to students who completed their programs through distance learning. You can even watch a promotional video from a North Korean web site. (It helps if you speak Korean, which I don’t, and it took over 10 minutes to download the 48-second video.) Students can watch the programs on laptops, tablets or mobile phones.

But how many have mobile phones? The Diplomat reports that in 2015 the number of mobile phone subscribers in North Korea reached 3.24 million (about 13% of the population), and that about 60 per cent of the population of Pyongyang, the capital, aged between 20 and 50 use mobile phones. (If you deduct for government exaggeration and add for technology development since 2015, these figures are probably a reasonable estimate.)

However, students are prohibited from accessing the Internet internationally.

So while online learning may be allowing for more flexibility in delivery, it is not necessarily widening access. You still have to be admitted to a prestigious university to get the online courses.

Comment

North Korea appears to be in roughly the same position as China in the mid-1980s, when China created the China Central Radio and TV University, which is now well established and has millions of students. Cuba also has online distance education, but students there are not permitted to access the Internet internationally either.

However, as in China in the 1980s, North Korea is relying largely on streamed or broadcast lectures, which do not fully exploit the power of the Internet and, in particular, put a heavy emphasis on information transmission at the expense of skills development and knowledge management – but then that’s not so different from the practice of many online courses in Canada and the USA.

The lesson clearly is that it is not enough just to use the technology; you also need to change the teaching method to get the full benefits of online learning. But at least North Korea is moving into online learning.

If anyone has more information about online learning in North Korea, please share!

Athabasca University’s Centre for Distance Education to close

The news

As my mother used to say when she had the goods on me, ‘A little birdie told me…’. Well, a (different) little birdie has told me that the Centre for Distance Education at Athabasca University is being closed on June 1 and the academic staff from the Centre are being moved into the Faculty of Humanities and Social Sciences.

What is the Centre for Distance Education and what does it do?

The Centre (CDE) currently has about 10 academic staff and several distinguished adjunct professors, such as Randy Garrison and George Siemens, as well as some very distinguished emeritus professors, such as:

  • Dominique Abrioux – Former AU President
  • Terry Anderson – Former Editor of IRRODL and Professor, Centre for Distance Education (Retired 2016)
  • Jon Baggaley – Former Professor, Centre for Distance Education
  • Patrick Fahy – Former Professor, Centre for Distance Education (Retired 2017)
  • Tom Jones – Former Associate Professor, Centre for Distance Education (Retired 2017)
  • Robert Spencer – Former Chair/Director, Centre for Distance Education

CDE currently offers a Master of Education in Distance Education and a Doctor of Education in Distance Education as well as post-baccalaureate certificates and diplomas in educational technology and instructional design. It is therefore the major centre in Canada for the education and training of professionals in online learning, educational technology and distance education.

On a lesser scale, it has also been a major centre for research into distance education. The Canadian Initiative for Distance Education Research (CIDER) is a research initiative of the International Review of Research in Open and Distributed Learning (IRRODL) and the Centre for Distance Education. 

IRRODL is a globally recognised leading journal published by Athabasca University but run mainly out of the Centre (its editors are currently Rory McGreal and Dianne Conrad, both CDE academics).

Thus the Centre for Distance Education has been a critical part of the infrastructure for distance education in Canada, providing courses and programs, research and leadership in this field.

Why is it being closed?

Good question. This was a decision apparently made in the Provost’s Office but, as far as I know, no official reason has been given for its closure and the transfer of staff to the Faculty of Humanities and Social Sciences. It appears that the programs will continue, but under the aegis of the Faculty of Humanities and Social Sciences.

However, the CDE was a bit of an organisational oddity, as it was not attached to any major faculty (there is no Faculty of Education at Athabasca), and it therefore made AU’s organisational structure look a little untidy. There may have been financial reasons for its closure, but it is hard to see how moving existing staff and programs into another faculty is going to save money, unless the long-term goal is to close down the programs and research, which in my view would be catastrophic for the future of the university.

Why does it matter?

Indeed at no time has AU been in greater need of the expertise in the CDE for building new, more flexible, digitally based teaching and learning models for AU (see my post on the independent third-party review of AU). In a sense, the reorganisation does move the Centre staff closer organisationally to at least some faculty members in one Faculty, but it really should have a university-wide mandate to support new learning designs across the university.

The issue of course is that it is primarily an academic unit, not a learning technology support unit, but it should not be impossible for it to be structured so that both functions are met (for instance see the Institute of Educational Technology at the British Open University). This might have meant the Centre – or a restructured unit – being either a part of the Provost’s Office or directly reporting to it, which is not going to happen once all the Centre’s faculty are housed in the Faculty of Humanities and Social Sciences.

What disturbs me most is that there does not seem to have been extensive consultation or discussion of the role of the CDE and its future before this decision was made. From the outside it appears to be a typical bureaucratic fudge, more to do with internal politics than with vision or strategy.

Given the importance of the CDE not just to Athabasca University but also to distance education in Canada in general, it is to be hoped that the administration at AU will come forward with a clear rationale and vision for the future of AU and explain exactly how the transfer of the Centre’s staff to the Faculty of Humanities and Social Sciences will help move this vision and strategy forward. The dedicated and expert academic staff in the Centre deserve no less, and the university itself will suffer if there is no such clear strategy for making the most of the expertise that previously resided in the CDE. 

Postscript

For the views of the Centre’s Director, and a response from the Provost, see the following article:

Lieberman, M. (2018) Repositioning a prominent distance education centre Inside Higher Education, May 23

Learning analytics, student satisfaction, and student performance at the UK Open University

There is very little correlation between student satisfaction and student performance. Image: Bart Rienties.

Rienties, B. and Toetenel, L. (2016) The impact of learning design on student behaviour, satisfaction and performance: A cross-institutional comparison across 151 modules, Computers in Human Behavior, Vol. 60, pp. 333-341

Li, N. et al. (2017) Online learning experiences of new versus continuing learners: a large-scale replication study, Assessment and Evaluation in Higher Education, Vol. 42, No. 4, pp.657-672

It’s never too late to learn

It’s been a hectic month, with two trips from Vancouver to Ontario and back and one to the UK and back: a total of four keynotes, two panel sessions and two one-day consultancies. By the time I got to the end of the month’s travels, I had learned so much that at a conference in Toronto I had to go to my room and lie down – I just couldn’t take any more!

At my age, it takes time to process all this new information, but I will try to summarise the main points of what I learned in the next three posts.

Learning analytics at the Open University

The Open University, with over 100,000 students and more than 1,000 courses (modules), and most of its teaching online in one form or another, is an ideal context for the application of learning analytics. Fortunately the OU has some of the world leaders in this field. 

At the conference on STEM teaching at the Open University, where I gave the opening keynote, the closing keynote was given by Bart Rienties, Professor of Learning Analytics at the Institute of Educational Technology at the UK Open University. Rienties and his team linked students’ behaviour, satisfaction and performance across 151 modules (courses) and 111,256 students at the Open University UK, using multiple regression models.
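To make the method a little more concrete, here is a minimal sketch in Python of the kind of module-level multiple regression described above. It runs on synthetic data with made-up variable names: it illustrates the technique, and is not the OU’s actual dataset or code.

```python
# A minimal sketch (synthetic data, made-up variable names), NOT the OU's
# actual dataset or code: a module-level multiple regression predicting
# academic retention from learner satisfaction and learning design measures.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_modules = 151  # same number of modules as the study, but the data here is invented

# Hypothetical module-level measures, all on a 0-1 scale
satisfaction = rng.uniform(0.6, 1.0, n_modules)    # proportion of satisfied learners
communication = rng.uniform(0.0, 0.4, n_modules)   # proportion of workload on communication activities
assimilative = rng.uniform(0.3, 0.8, n_modules)    # proportion of workload on content-focused activities

# In this toy example, retention is driven by the design variable, not by satisfaction
retention = 0.5 + 0.6 * communication + rng.normal(0, 0.05, n_modules)

X = sm.add_constant(pd.DataFrame({
    "satisfaction": satisfaction,
    "communication": communication,
    "assimilative": assimilative,
}))
model = sm.OLS(retention, X).fit()
print(model.summary())  # coefficients, p-values and R² for each predictor
```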

His whole presentation (40 minutes, including questions) can be accessed online, and is well worth viewing, as it provides a clear summary of the results published in the two detailed papers listed above. As always, if you find my summary of results below of interest or challenging, I strongly recommend you view Bart’s video first, then read the two articles in more detail. Here’s what I took away.

There is little correlation between student course evaluations and student performance

This result is a bit of a zinger. The core dependent variable used was academic retention (the number of learners who completed and passed the module relative to the number of learners who registered for it). As Rienties and Toetenel (p.340) comment, almost as an aside,

it is remarkable that learner satisfaction and academic retention were not even mildly related to each other…. Our findings seem to indicate that students may not always be the best judge of their own learning experience and what helps them in achieving the best outcome.

The design of the course matters

One of the big challenges in online and blended learning is getting subject matter experts to recognise the importance of what the Open University calls ‘learning design.’ 

Conole (2012, p.121) describes learning design as:

a methodology for enabling teachers/designers to make more informed decisions in how they go about designing learning activities and interventions, which is pedagogically informed and makes effective use of appropriate resources and technologies. LD is focussed on ‘what students do’ as part of their learning, rather than the ‘teaching’ which is focussed on the content that will be delivered.

Thus learning design is more than just instructional design.

However, Rienties et al. comment that ‘only a few studies have investigated how educators in practice are actually planning and designing their courses and whether this is then implemented as intended in the design phase.’

The OU has done a good job of breaking down the elements of learning design, and has mapped these elements across nearly 200 different courses (see Rienties and Toetenel, 2016, p.335, for the full mapping).

Rienties and Toetenel then analysed the correlations of each of these learning design elements with both learner satisfaction and learner performance. What they found is that what OU students liked did not match what predicted learner performance. For instance, students were most satisfied with ‘assimilative’ activities, which are primarily content focused, and disliked communication activities, which are primarily social. However, better student retention was most strongly associated with communication activities and, overall, with the quality of the learning design.
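As a rough illustration of this element-by-element analysis, the sketch below (again in Python, with synthetic data and hypothetical activity categories rather than the OU’s actual mapping) correlates the proportion of time given to each activity type with satisfaction and with retention. The toy data is deliberately constructed so that the two outcomes favour different elements, echoing the pattern reported above.

```python
# A rough, hedged illustration (synthetic data, hypothetical activity categories),
# NOT the OU's actual learning design mapping: correlating the proportion of time
# given to each activity type with learner satisfaction and with retention.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_modules = 151

design = pd.DataFrame({
    "assimilative": rng.uniform(0.3, 0.8, n_modules),    # content-focused activities
    "communication": rng.uniform(0.0, 0.3, n_modules),   # discussion / social activities
    "assessment": rng.uniform(0.1, 0.3, n_modules),
})

# Toy outcomes, constructed so that satisfaction tracks assimilative time
# while retention tracks communication time (echoing the reported pattern)
satisfaction = 0.5 + 0.4 * design["assimilative"] + rng.normal(0, 0.05, n_modules)
retention = 0.4 + 0.8 * design["communication"] + rng.normal(0, 0.05, n_modules)

for name, outcome in [("satisfaction", satisfaction), ("retention", retention)]:
    print(f"\nPearson correlation of each design element with {name}:")
    print(design.corrwith(outcome).round(2))
```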

Rienties and Toetenel conclude:

although more than 80% of learners were satisfied with their learning experience, learning does not always need to be a nice, pleasant experience. Learning can be hard and difficult at times, and making mistakes, persistence, receiving good feedback and support are important factors for continued learning….

An exclusive focus on learner satisfaction might distract institutions from understanding the impact of LD on learning experiences and academic retention. If our findings are replicated in other contexts, a crucial debate with academics, students and managers needs to develop whether universities should focus on happy students and customers, or whether universities should design learning activities that stretch learners to their maximum abilities and ensuring that they eventually pass the module. Where possible, appropriate communication tasks that align with the learning objectives of the course may seem to be a way forward to enhance academic retention.

Be careful what you measure

As Rienties and Toetenel put it:

Simple LA metrics (e.g., number of clicks, number of downloads) may actually hamper the advancement of LA research. For example, using a longitudinal data analysis of over 120 variables from three different VLE/LMS systems and a range of motivational, emotions and learning styles indicators, Tempelaar et al. (2015) found that most of the 40 proxies of “simple” VLE LA metrics provided limited insights into the complexity of learning dynamics over time. On average, these clicking behaviour proxies were only able to explain around 10% of variation in academic performance.

In contrast, learning motivations, emotions (attitudes), and learners’ activities during continuous assessments (behaviour) significantly improved explained variance (up to 50%) and could provide an opportunity for teachers to help at-risk learners at a relatively early stage of their university studies.
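To make the explained-variance point concrete, here is a minimal sketch on synthetic data (the variable names and effect sizes are my assumptions, not Tempelaar et al.’s actual measures) comparing the R² of a clicks-only regression with one that also includes motivation and continuous-assessment indicators.

```python
# A minimal sketch on synthetic data (the variable names and effect sizes are my
# assumptions, not Tempelaar et al.'s actual data): comparing how much variance in
# performance is explained (R²) by simple click counts versus richer indicators.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_learners = 500  # hypothetical cohort

clicks = rng.poisson(200, n_learners).astype(float)   # 'simple' VLE click counts
motivation = rng.normal(0, 1, n_learners)             # e.g. a motivation scale score
assessment_activity = rng.normal(0, 1, n_learners)    # engagement with continuous assessment

# Toy performance measure: weakly related to clicks, more strongly to the richer indicators
clicks_z = (clicks - clicks.mean()) / clicks.std()    # standardised click counts
performance = (0.45 * clicks_z
               + 0.7 * motivation
               + 0.7 * assessment_activity
               + rng.normal(0, 1, n_learners))

simple_model = sm.OLS(performance, sm.add_constant(clicks)).fit()
rich_model = sm.OLS(performance, sm.add_constant(
    np.column_stack([clicks, motivation, assessment_activity]))).fit()

print(f"R² with clicks only:       {simple_model.rsquared:.2f}")  # low – around 10% in the study
print(f"R² with richer indicators: {rich_model.rsquared:.2f}")    # substantially higher – up to 50%
```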

My conclusions

Student feedback on the quality of a course is really important but it is more useful as a conversation between students and instructors/designers than as a quantitative ranking of the quality of a course.  In fact using learner satisfaction as a way to rank teaching is highly misleading. Learner satisfaction encompasses a very wide range of factors as well as the teaching of a particular course. It is possible to imagine a highly effective course where teaching in a transmissive or assimilative manner is minimal, but student activities are wide, varied and relevant to the development of significant learning outcomes. Students, at least initially, may not like this because this may be a new experience for them, and because they must take more responsibility for their learning. Thus good communication and explanation of why particular approaches to teaching have been chosen is essential (see my comment to a question on the video).

Perhaps, though, the biggest limitations of student satisfaction measures for assessing the quality of teaching are the often very low response rates from students, the limited evaluation questions due to standardization (the same questions irrespective of the nature of the course), and the poor quality of student responses. This is no way to assess the quality of an individual teacher or a whole institution, yet far too many institutions and governments are building this into their evaluation of teachers/instructors and institutions.

I have been fairly skeptical of learning analytics up to now, because of the tendency to focus more on what is easily measurable (simple metrics) than on what students actually do qualitatively when they are learning. The focus on learning design variables in these studies is refreshing and important, but so too will be the analysis of students’ actual learning habits.

Finally, this research provides quantitative evidence of the importance of learning design in online and distance teaching. Good design leads to better learning outcomes. Why then are we not applying this knowledge to the design of all university and college courses, and not just online courses? We need a shift in the power balance between university and college subject experts and learning designers, so that the latter are treated as at least equals in the teaching process.

References

Conole, G. (2012). Designing for learning in an open world. Dordrecht: Springer

Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: learning analytics in a data-rich context. Computers in Human Behavior, 47, 157-167. http://dx.doi.org/10.1016/j.chb.2014.05.038