June 18, 2018

Learning analytics, student satisfaction, and student performance at the UK Open University

There is very little correlation between student satisfaction and student performance. Image: Bart Rienties.

Rienties, B. and Toetenel, L. (2016) The impact of learning design on student behaviour, satisfaction and performance: A cross-institutional comparison across 151 modules, Computers in Human Behavior, Vol. 60, pp. 333-341

Li, N. et al. (2017) Online learning experiences of new versus continuing learners: a large-scale replication study, Assessment and Evaluation in Higher Education, Vol. 42, No. 4, pp.657-672

It’s never too late to learn

It’s been a hectic month with two trips from Vancouver to Ontario and back and one to the UK and back: a total of four keynotes, two panel sessions and two one-day consultancies. By the time I got to the end of the month’s travels, I had learned so much that at a conference in Toronto I had to go to my room and lie down – I just couldn’t take any more!

At my age, it takes time to process all this new information, but I will try to summarise the main points of what I learned in the next three posts.

Learning analytics at the Open University

The Open University, with over 100,000 students and more than 1,000 courses (modules), and most of its teaching online in one form or another, is an ideal context for the application of learning analytics. Fortunately, the OU has some of the world’s leaders in this field.

At the conference on STEM teaching at the Open University, where I gave the opening keynote, the closing keynote was given by Bart Rienties, Professor of Learning Analytics at the Institute of Educational Technology at the UK Open University. Rienties and his team linked data from 151 modules (courses) and 111,256 students to measures of students’ behaviour, satisfaction and performance at the Open University UK, using multiple regression models.
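Neither paper includes code, but to make the method concrete, here is a minimal sketch in Python of the kind of module-level regression being described, using invented data and variable names (they are mine, not the OU’s):

```python
# Hypothetical sketch of regressing module-level retention on satisfaction
# and a learning design variable. Data, names and effect sizes are invented
# for illustration only; this is not the OU's dataset or model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_modules = 151

# Invented module-level variables: mean satisfaction score (1-5) and the
# proportion of study time designed as communication activities.
satisfaction = rng.uniform(3.5, 4.8, n_modules)
communication = rng.uniform(0.0, 0.4, n_modules)

# Simulate retention so that it depends on the design variable but not on
# satisfaction, mimicking the pattern the papers report.
retention = 0.55 + 0.5 * communication + rng.normal(0, 0.05, n_modules)

X = sm.add_constant(pd.DataFrame(
    {"satisfaction": satisfaction, "communication": communication}))
model = sm.OLS(retention, X).fit()
print(model.params)    # near-zero coefficient for satisfaction
print(model.rsquared)  # variance in retention explained by the model
```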

His whole presentation (40 minutes, including questions) can be accessed online, and is well worth viewing, as it provides a clear summary of the results published in the two detailed papers listed above. As always, if you find my summary of results below of interest or challenging, I strongly recommend you view Bart’s video first, then read the two articles in more detail. Here’s what I took away.

There is little correlation between student course evaluations and student performance

This result is a bit of a zinger. The core dependent variable used was academic retention (the number of learners who completed and passed the module relative to the number of learners who registered for each module). As Rienties and Toetenel (p.340) comment, almost as an aside,

it is remarkable that learner satisfaction and academic retention were not even mildly related to each other…. Our findings seem to indicate that students may not always be the best judge of their own learning experience and what helps them in achieving the best outcome.

The design of the course matters

One of the big challenges in online and blended learning is getting subject matter experts to recognise the importance of what the Open University calls ‘learning design.’ 

Conole (2012, p.121) describes learning design as:

a methodology for enabling teachers/designers to make more informed decisions in how they go about designing learning activities and interventions, which is pedagogically informed and makes effective use of appropriate resources and technologies. LD is focussed on ‘what students do’ as part of their learning, rather than the ‘teaching’ which is focussed on the content that will be delivered.

Thus learning design is more than just instructional design.

However, Rienties et al. comment that ‘only a few studies have investigated how educators in practice are actually planning and designing their courses and whether this is then implemented as intended in the design phase.’

The OU has done a good job of breaking down the elements of learning design, and has mapped these elements across nearly 200 different courses (see the taxonomy of learning activity types in Rienties and Toetenel, 2016, p.335).

Rienties and Toetenel then analysed the correlations between each of these learning design elements and both learner satisfaction and learner performance. What they found was that what OU students liked did not match what helped them perform. For instance, students were most satisfied with ‘assimilative’ activities, which are primarily content-focused, and disliked communication activities, which are primarily social. However, better student retention was most strongly associated with communication activities and, overall, with the quality of the learning design.
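To see how that decoupling can show up in the data, here is a small hypothetical illustration (again, the numbers and variable names are invented, not the OU’s): the activity types that correlate with satisfaction need not be the ones that correlate with retention.

```python
# Hypothetical illustration: assimilative time tracks satisfaction,
# while communication time tracks retention. All data are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 151
assimilative = rng.uniform(0.2, 0.8, n)   # proportion of content-focused time
communication = rng.uniform(0.0, 0.4, n)  # proportion of communication time

df = pd.DataFrame({
    "assimilative": assimilative,
    "communication": communication,
    # students like content, so satisfaction tracks assimilative time...
    "satisfaction": 3.0 + 2.0 * assimilative + rng.normal(0, 0.3, n),
    # ...but retention tracks communication time instead
    "retention": 0.55 + 0.5 * communication + rng.normal(0, 0.05, n),
})

# Correlate each design element with both outcomes
print(df.corr().loc[["assimilative", "communication"],
                    ["satisfaction", "retention"]])
```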

Rienties and Toetenel conclude:

although more than 80% of learners were satisfied with their learning experience, learning does not always need to be a nice, pleasant experience. Learning can be hard and difficult at times, and making mistakes, persistence, receiving good feedback and support are important factors for continued learning….

An exclusive focus on learner satisfaction might distract institutions from understanding the impact of LD on learning experiences and academic retention. If our findings are replicated in other contexts, a crucial debate with academics, students and managers needs to develop whether universities should focus on happy students and customers, or whether universities should design learning activities that stretch learners to their maximum abilities and ensuring that they eventually pass the module. Where possible, appropriate communication tasks that align with the learning objectives of the course may seem to be a way forward to enhance academic retention.

Be careful what you measure

As Rienties and Toetenel put it:

Simple LA metrics (e.g., number of clicks, number of downloads) may actually hamper the advancement of LA research. For example, using a longitudinal data analysis of over 120 variables from three different VLE/LMS systems and a range of motivational, emotions and learning styles indicators, Tempelaar et al. (2015) found that most of the 40 proxies of “simple” VLE LA metrics provided limited insights into the complexity of learning dynamics over time. On average, these clicking behaviour proxies were only able to explain around 10% of variation in academic performance.

In contrast, learning motivations, emotions (attitudes), and learners’ activities during continuous assessments (behaviour) significantly improved explained variance (up to 50%) and could provide an opportunity for teachers to help at-risk learners at a relatively early stage of their university studies.
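Since ‘explained variance’ can be an abstract idea, here is a hedged sketch of what that 10% versus 50% comparison means in practice, using synthetic stand-ins for Tempelaar et al.’s variables (the coefficients are tuned only to reproduce the rough pattern reported):

```python
# Synthetic comparison of explained variance (R^2): a clicks-only model
# versus one that adds continuous-assessment behaviour. Variables and
# coefficients are invented to roughly reproduce the ~10% vs ~50% pattern.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
clicks = rng.poisson(200, n).astype(float)  # a 'simple' VLE metric
assessment = rng.normal(60, 10, n)          # continuous-assessment behaviour
performance = 0.22 * clicks + 0.63 * assessment + rng.normal(0, 7.1, n)

for name, X in [("clicks only", clicks.reshape(-1, 1)),
                ("clicks + assessment", np.column_stack([clicks, assessment]))]:
    fit = sm.OLS(performance, sm.add_constant(X)).fit()
    print(f"{name}: R^2 = {fit.rsquared:.2f}")  # ~0.10 vs ~0.50
```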

My conclusions

Student feedback on the quality of a course is really important, but it is more useful as a conversation between students and instructors/designers than as a quantitative ranking of the quality of a course. In fact, using learner satisfaction as a way to rank teaching is highly misleading. Learner satisfaction encompasses a very wide range of factors beyond the teaching of a particular course. It is possible to imagine a highly effective course where teaching in a transmissive or assimilative manner is minimal, but student activities are wide, varied and relevant to the development of significant learning outcomes. Students, at least initially, may not like this, because it may be a new experience for them and because they must take more responsibility for their learning. Thus good communication and explanation of why particular approaches to teaching have been chosen are essential (see my comment to a question on the video).

Perhaps, though, the biggest limitations of student satisfaction ratings for assessing the quality of teaching are the often very low response rates, the limited evaluation questions that result from standardization (the same questions irrespective of the nature of the course), and the poor quality of student responses. This is no way to assess the quality of an individual teacher or a whole institution, yet far too many institutions and governments are building this into their evaluation of teachers/instructors and institutions.

I have been fairly skeptical of learning analytics up to now, because of the tendency to focus more on what is easily measurable (simple metrics) than on what students actually do qualitatively when they are learning. The focus on learning design variables in these studies is refreshing and important, but analysis of students’ actual learning habits will be just as important.

Finally, this research provides quantitative evidence of the importance of learning design in online and distance teaching. Good design leads to better learning outcomes. Why then are we not applying this knowledge to the design of all university and college courses, and not just online courses? We need a shift in the power balance between university and college subject experts and learning designers resulting in the latter being treated as at least equals in the teaching process.

References

Conole, G. (2012). Designing for learning in an open world. Dordrecht: Springer

Tempelaar, D. T., Rienties, B. and Giesbers, B. (2015) In search for the most informative data for feedback generation: learning analytics in a data-rich context, Computers in Human Behavior, Vol. 47, pp. 157-167. http://dx.doi.org/10.1016/j.chb.2014.05.038

 

Online learning in 2016: a personal review



Image: © Institute for Economics and Peace. Canada is ranked seventh most peaceful. We don’t know where it ranks though in terms of online learning.

A personal review

I am not going to review all the developments in online learning in 2016 (for that, see Audrey Watters’ excellent HackEducation Trends). Instead, I am going to review what I actually wrote about in this blog, indicating what was of particular interest to me in online learning during 2016. I have identified 38 posts in which I explored in some detail issues that bubbled up (at least for me) in 2016.

1. Tracking online learning

Building a national survey of online learning in Canada (134 hits)

A national survey of university online and distance learning in Canada (1,529 hits)

In the USA, fully online enrollments continue to grow in 2014 (91 hits)

Are you ready for blended learning? (389 hits)

What the Conference Board of Canada thinks about online learning (200 hits)

I indulged my obsession with knowing the extent to which online learning is penetrating post-secondary education with five posts on this topic. In a field undergoing such rapid changes, it is increasingly important to be able to track exactly what is going on. Thus a large part of my professional activity in 2016 has been devoted to establishing, almost from scratch, a national survey of online learning in Canadian post-secondary institutions. I would have written more about this topic, but until the survey has been successfully conducted in 2017, I have preferred to keep a low profile on this issue.

However, during 2016 it did become clear to me, partly as a result of pilot testing of the questionnaire, and partly through visits to universities, that blended learning is not only gaining ground in Canadian post-secondary education at a much faster rate than I had anticipated, but is raising critical questions about what is best done online and what face-to-face, and how to prepare institutions and instructors for what is essentially a revolution in teaching.

This can be best summarized by what I wrote about the Conference Board of Canada’s report:

What is going on is a slowly boiling and considerably variable revolution in higher education that is not easily measured or even captured in individual anecdotes or interviews.

2. Faculty development and training

Getting faculty and instructors into online learning (183 hits)

Initiating instructors to online learning: 10 fundamentals (529 hits)

Online learning for beginners: 10. Ready to go (+ nine other posts on this topic = 4,238 hits)

5 IDEAS for a pedagogy of online learning (708 hits)

This was the area to which I devoted the most space, with ten posts on ‘Online Learning for Beginners’, aimed at instructors resisting or unready for online learning. These ten posts were then edited and published by Contact North as the 10 Fundamentals of Teaching Online.

Two fundamental conclusions: we need not only better organizational strategies to ensure that faculty have the knowledge and training they will need for effective teaching and learning in a digital age, but also new teaching strategies and approaches that can exploit the benefits, and even more importantly avoid the pitfalls, of blended learning and learning technologies. I have been trying to make a contribution in this area, but much more needs to be done.

3. Learning environments

Building an effective learning environment (6,173 hits)

EDEN 2016: Re-imagining Learning Environments (597 hits)

Culture and effective online learning environments (1,260 hits)

Closely linked to developing appropriate pedagogies for a digital age is the concept of designing appropriate learning environments, based on learners’ construction of knowledge and the role of instructors in guiding and fostering knowledge management, independent learning and other 21st century skills.

This approach, I argued, is a better ‘fit’ for learners in a digital age than thinking in terms of blended, hybrid or fully online learning. It recognizes not only that technology can be used to design very different kinds of learning environments from school- or campus-based ones, but also that technology is just one component of a much richer learning context.

4. Experiential learning online

A full day of experiential learning in action (188 hits)

An example of online experiential learning: Ryerson University’s Law Practice Program (383 hits)

Is networked learning experiential learning? (163 hits)

These three posts explored a number of ways in which experiential learning is being done online, as this is a key methodology for developing skills in particular.

5. Open education

Acorns to oaks? British Columbia continues its progress with OERs (185 hits)

Talking numbers about open publishing and online learning (113 hits)

Towards an open pedagogy for online learning (385 hits)

These posts also tracked the development of open publishing and open educational resources, particularly in British Columbia, leading me to conclude that the OER ‘movement’ has far too narrow a concept of openness, and that in its place we need an open pedagogy in which open educational resources are just one component, and perhaps not the most significant.

6. Technology applications in online learning

An excellent guide to multimedia course design (659 hits)

Is video a threat to learning management systems? (603 hits)

Some comments on synchronous online learning technologies (231 hits)

Amongst all the hype about augmented reality, learning analytics and the application of artificial intelligence, I found it more useful to look at some of the technologies that are in everyday use in online learning, and how these could best be used.

7. Technology and alienation

Technology and alienation: online learning and labour market needs (319 hits)

Technology and alienation: symptoms, causes and a framework for discussion (512 hits)

Technology, alienation and the role of education: an introduction (375 hits)

Automation or empowerment: online learning at the crossroads (1,571 hits)

Why digital technology is not necessarily the answer to your problem (474 hits)

These were more philosophical pieces, prompted to some extent by wider concerns about the impact of technology on jobs and how that influenced the Brexit and Trump phenomena.

Nevertheless, this issue is also very relevant to the teaching context. In particular, I was challenging the ‘Silicon Valley’ assumption that computers will eventually replace the need for teachers, and especially the danger of using algorithms in teaching without knowing who wrote the algorithms, what their philosophy of teaching is, and thus what assumptions have been built into the use of data.

Image: Applift

8. Learning analytics

Learning analytics and learning design at the UK Open University (90 hits)

Examining ethical and privacy issues surrounding learning analytics (321 hits)

Continuing more or less the same theme of analysing the downside as well as the upside of technology in education, these two posts looked at how some institutions, and the UK Open University in particular, are being thoughtful about the implications of learning analytics, and building in policies for protecting privacy and gaining student ‘social license’ for the use of analytics.

9. Assessment

Developing a next generation online learning assessment system (532 hits)

This is an area where much more work needs to be done. If we are to develop new or better pedagogies for a digital age, we will also need better assessment methods. Unfortunately, the focus once again appears to be more on the tools of assessment, such as online proctoring, where large gains were made in 2016, but which still support traditional assessment procedures such as time-restricted exams, multiple-choice tests and essay writing. What we need are new methods of assessment that focus on measuring the types of knowledge and skills that are needed in a digital age.

For instance, e-portfolios have held a lot of promise for a long time, but are still being adopted and evaluated at a painfully slow rate. They do, though, offer one method of assessment that much better reflects the knowledge and skills needed in a digital age. We need more of that imagination and creativity in developing new assessment methods.

That was the year that was

Well, it was 2016 from the perspective of someone no longer teaching online or managing online learning:

  • How far off am I, from your perspective?
  • What were the most significant developments for you in online learning in 2016?
  • What did I miss that you think should have been included? Perhaps I can focus on this next year.

I have one more post looking at 2016 to come, but that will be more personal, looking at my whole range of online learning activities in 2016.

In the meantime have a great seasonal break and I will be back in touch some time in the new year.

Learning analytics and learning design at the UK Open University

Maxim Jean-Louis (President, Contact North) and myself outside Walton Hall, the headquarters of the UK Open University, in 2012. It was my first visit since I left the OU in 1989.

The Open University (2016) Developing learning design and analytics for student success, Connections, Vol. 21, No. 3

The latest edition of the Commonwealth of Learning’s magazine, Connections, has an interesting, if brief, article on the effective use of learning analytics. There are four key points that I noted:

  • the OU has scaled up its predictive use of learning analytics to cover over 45,000 students, and it works as well in traditional universities as in the OU (a hypothetical sketch of what such a predictive model might look like follows this list)
  • learning analytics is used in connection with learning design to identify not only students at risk but also to improve the design of the learning materials:

    the OU for the first time can empirically analyse the design of its modules. By linking learning designs with student satisfaction and success measures, it became possible to systematically identify, measure and improve critical aspects of students’ learning experience.

  • the OU is the first university in the world to develop and adopt a policy relating to the ethical uses of student data for learning analytics, involving students themselves in the development of the policy. This makes the adoption and use of learning analytics much easier
  • the OU has appointed a reader in learning analytics, Dr. Bart Rienties: that is treating learning analytics really seriously.
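The article itself gives no technical detail, but as a purely hypothetical illustration, a predictive model of the kind referred to in the first point might look something like the following sketch (the indicators, data and threshold are all my inventions, not the OU’s):

```python
# Purely hypothetical sketch of flagging at-risk students from early
# engagement data. Indicators and data are invented for illustration;
# the OU's actual predictive models are far more sophisticated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 1000

# Invented early-warning indicators for each student
weekly_logins = rng.poisson(4, n).astype(float)
assignments_submitted = rng.integers(0, 3, n).astype(float)

# Simulate completion so that it depends on both indicators
logit = -2.0 + 0.3 * weekly_logins + 1.0 * assignments_submitted
completed = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([weekly_logins, assignments_submitted])
model = LogisticRegression().fit(X, completed)

# Flag the students with the lowest predicted probability of completing
p_complete = model.predict_proba(X)[:, 1]
at_risk = np.argsort(p_complete)[:50]  # e.g. the 50 most at-risk students
print(f"lowest predicted completion probability: {p_complete[at_risk[0]]:.2f}")
```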

Unfortunately there were no links or ways to follow up the article.

Report on SFU’s experiences of teaching with technology

Simon Fraser University’s Burnaby campus (on a rare day when it wasn’t raining)

I always enjoy going to a university or college and seeing how they are using learning technologies. I am always a little surprised, and usually intrigued by some unexpected application, and today’s DemoFest at Simon Fraser University was no exception.

About Simon Fraser University

SFU has just over 35,000 students with campuses in Burnaby, Vancouver downtown, and Surrey, all in the lower mainland of British Columbia, Canada.

For a long time it has had the largest distance education program in British Columbia, but the rapid development of fully online and blended learning elsewhere in BC and Canada means that other institutions are rapidly gaining ground. It is also the academic base for Linda Harasim, Professor of Communications at SFU.

As with many Canadian universities, most of the DE programs are run out of the Centre for Online and Distance Learning in Continuing Studies at SFU. However, the university also has a large Teaching and Learning Centre, which provides a range of services including learning technology support to the faculty on campus.

The university recently adopted Canvas as its main LMS.

I was spending most of the day at SFU for two reasons:

  • to identify possible cases for Contact North’s ‘pockets of innovation’ project
  • to report on the survey of online learning in Canadian post-secondary institutions.

I will be giving more information on both these projects in separate blog posts coming shortly.

The DemoFest

DEMOfest 2016 is about how instructors are using … technologies in ways that produce exciting and original educational experiences leading to student engagement and strong learning outcomes.

Making lectures interactive

Not surprisingly, several of the short, 10-minute presentations focused on tools used in classroom teaching or lecturing. In particular, the tools are going mobile, in the form of apps that students can use on their mobile phones, tablets or laptops. I was particularly impressed with TopHat, which incorporates online quizzes and tests, attendance checks, and discussion. REEF Polling, developed by iClicker, is a similar tool, effectively a mobile app version of iClicker. Both provide students and instructors with an online record of their classroom activity on the app.

There were also a couple of sessions on lecture theatre technologies. As in other universities, lecturers face a range of different interfaces for managing lecture theatre facilities. SFU has a project that will result in a common, simple interface available throughout the university’s different campuses, much to the relief of faculty and visiting speakers who at the moment have no idea what to expect when entering an unfamiliar lecture theatre or classroom. There was also a session on the limits of lecture capture and how to use video to make learning more engaging.

Online learning development

However, I found nothing here (or anywhere else, for that matter) that has convinced me that there is a future in the large lecture class. Most of the technology enhancements, although improvements on the straight ‘talk’ lecture, are still just lipstick on a pig.

The online learning developments were much more interesting:

  • online proctoring: Proctorio. This was a demonstration of the ingenuity of students in cheating in online assessment, and the even greater ingenuity in preventing them from doing it. Proctorio is a powerful web-based automated proctoring system that basically takes control of whatever device the student is using to do an online assessment and records their entire online activity during the exam. Instructors/exam supervisors have options as to exactly which features they control, such as locked screens, blocking use of other URLs, etc. Students just sign in and take the exam at any time set by the instructor. Proctorio provides the instructor with a complete record of students’ online activity during the exam, including a rating of the ‘suspiciousness’ of the student’s online exam activity.
  • peer evaluation and team-based learning: SFU has a graduate diploma in business where students are required to work in teams, specifically to build team approaches to problem-solving and business solutions. Although the instructor assesses both the individual and group assignments, students evaluate each other on their contribution to the team activities. The demonstration also showed how peer assessment was handled within the Canvas LMS. It was a good example of best practices in peer-to-peer assessment.
  • Dialectical Map: an argument visualization tool developed at SFU. Joan Sharp, Professor of Biological Sciences, and her research colleague, Hui Niu, have developed a simple, interactive, web-based tool that facilitates the development of argumentation for science students. Somewhat to my surprise, research evidence shows that science students are often poor at argumentation, even in the upper years of an undergraduate program. This tool enables a question to be posed by an instructor at the top of the map, such as ‘Should the BC government allow fracking for oil?’ or ‘Should the BC government stop the culling of wolves to protect caribou?’ The online map is split into two parts, ‘pro’ and ‘con’, with boxes for each rationale, and linked boxes for the evidence to support each rationale offered. Students type their answers into the boxes (both pro and con) and have a box at the bottom in which to write their conclusion(s) from the argument. Students can also rate the strength of each rationale. All the boxes in a map can be printed out, giving a detailed record of the arguments for and against, the evidence in support of the arguments, and the student’s conclusion (a sketch of how such a map might be represented follows this list). Hui Niu has done extensive research on the effectiveness of the tool, and has found that its use has substantially increased students’ performance on argument-based assignments/assessment.
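The tool itself is web-based and I have not seen its code, but as a rough sketch of the idea, a dialectical map could be represented by a structure like this (the names and layout are my guesses, not the actual implementation):

```python
# Hypothetical representation of a dialectical map: a question at the top,
# pro and con rationales each with supporting evidence and a strength
# rating, and the student's conclusion at the bottom. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class Rationale:
    text: str
    evidence: list[str] = field(default_factory=list)
    strength: int | None = None  # student-assigned rating, e.g. 1-5

@dataclass
class DialecticalMap:
    question: str                              # posed by the instructor
    pros: list[Rationale] = field(default_factory=list)
    cons: list[Rationale] = field(default_factory=list)
    conclusion: str = ""                       # the student's conclusion

    def render(self) -> str:
        """Produce the printable record of the whole argument."""
        lines = [f"Q: {self.question}"]
        for label, side in (("PRO", self.pros), ("CON", self.cons)):
            for r in side:
                lines.append(f"  [{label}] {r.text} (strength: {r.strength})")
                lines.extend(f"      evidence: {e}" for e in r.evidence)
        lines.append(f"Conclusion: {self.conclusion}")
        return "\n".join(lines)

m = DialecticalMap("Should the BC government allow fracking for oil?")
m.pros.append(Rationale("Economic benefits", ["royalty revenues"], strength=3))
m.cons.append(Rationale("Groundwater contamination risk",
                        ["hydrogeology studies"], strength=5))
m.conclusion = "The environmental risks outweigh the economic benefits."
print(m.render())
```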

General comments

I was very grateful for the invitation and enjoyed nearly all the presentations. The Teaching and Learning Centre is encouraging research into learning technologies, particularly developing a support infrastructure for OERs and looking at ways to use big data for the analysis and support of learning. This practical, applied research is being led by Lynda Williams, the Manager of the Learn tech team, and is being done in collaboration with both faculty and graduate students from different departments.

Students and a professor of computer science worked with the IT division and Ancillary Services to develop a student app for the university called SFU Snap, as part of a computer science course. This not only provides details of the bus services to and from SFU at any time, but also provides students with an interactive map so they can find their classrooms. Anyone who has tried to find their way around SFU (built on multiple levels into a mountain) will understand how valuable such an app must be, not just to students but also to visitors.

So thank you, everyone at the Teaching and Learning Centre at SFU for a very interesting and useful day.

 

Scary tales of online learning and educational technology

The Centre for Digital Media, Vancouver BC

The Educational Technology Users Group (ETUG) of British Columbia held an appropriately Halloween-themed get together today called ‘The Little Workshop of Horrors’ at which participants were encouraged to share tales of failure and horror stories in the use of learning technologies.

This seemed to me a somewhat risky strategy, but it actually worked really well. First, the workshop was held in ‘the Hangar’, a large, covered space in (or rather beside) the Centre for Digital Media, a shared building used by UBC, Simon Fraser University, BCIT and the Emily Carr University of Art and Design. The Centre itself is a good example of collaboration and sharing in developing media-based programs, such as its Master of Digital Media. The Hangar lent itself to a somewhat spooky atmosphere, enhanced by a DJ who often accompanied presenters with ghoulish music.

Audrey’s Monsters

The workshop got off to an excellent start with a brilliant keynote from Audrey Watters on the Monsters of Educational Technology (The link will take you to her book on the subject). She identified a range of monsters (the examples are partly Audrey’s, partly mine):

  • Frankenstein’s monster that went wrong because its (hir?) master failed to provide it (em?) with love or social company (teaching machines?): in Audrey’s words, ‘a misbegotten creature of a misbegotten science’,
  • vampires that suck the blood of students, e.g. by using their personal data (learning analytics?),
  • zombies, i.e. technologies or ed tech ideas that rise and die then rise again (e.g. technology will remove the need for schools, an idea that goes back to the early 1900s),
  • giants that become obsolete and die (Skinner, Merrill)
  • the Blob, which grows bigger and bigger and invades every nook and cranny (MOOCs?)
  • and the dragons: the libertarian, free-market, Silicon Valley types that preach the ‘destruction’ and ‘re-invention’ of education.

Audrey Watters’ larger point is that if we are not careful, educational technology easily turns itself into a monster that drives out all humanity in the teaching and learning process. We need to be on constant watch, and, whenever we can, we need to take control away from large technology corporations whose ultimate purpose is not educational.

Not only was it a great, on-topic presentation, but it was also such a pleasure to meet Audrey in person at last, as I am a huge fan of her blog.

He was a monster, not because he was a machine, but because he wasn't loved

Confessions

Then came the confessional, at which a series of speakers confessed their sins – or rather, classic failures – about educational technology, often in very funny ways. What was interesting about most of the tales, though, was that although there was a disaster, in most cases a lot of good came out of it. (As one speaker said, ‘Success is failing many times without losing your optimism’; or, ‘A sailor gets to know the sea only after he has waded ashore.’)

One presenter reported going to a university to ‘sell’ Blackboard, but was so nervous that her presentation went so badly they ended up going with Canvas (you see what I mean about some good coming out of these disasters!). Another described how for over 20 years she has been trying to move faculty into more interactive and engaging technology than learning management systems, yet here she is, still spending most of her time supporting faculty using an LMS.

One talked about spending years trying to promote IMS-based learning objects, only to find that Google’s search engine made meta-data identification redundant. Revealingly, he felt he knew at the time that the meta-data approach to learning objects was too complex to work, but he had to do it because that was the only way he could get funding. More than one speaker noted that Canada in the past has spent millions of dollars on programs that focused heavily on software solutions (anyone remember EduSource?) but almost nothing on evaluating the educational applications of technology or on research on new or even old pedagogies.

Another spoke about the demise of the Technical University of British Columbia, a new university purpose-built around an ‘integrated learning’ approach, combining heavy use of online learning with mixed face-to-face course structures – in 1999. However, by 2002 it had only about 800 FTEs, and a new incoming provincial government, desperate to save money and eager to diminish the previous government’s legacy, closed the university and transferred the students (but not the programs) to Simon Fraser University. Nevertheless, the legacy did live on, with many of the learning technology staff later moving into senior positions within the Canadian higher education system.

I see instructional designers, educational technologists or learning ecology consultants (which was a new title for me) as the Marine Corps of the educational world. They have seen many battles and have (mostly) survived. They have even learned how to occasionally win battles. That’s the kind of wisdom of which academic leaders and faculty and instructors should make much better use.

One participant had such a bad ed tech experience at Simon Fraser University that she thinks of it as ‘the haunted house on the hill.’

Happy Halloween, everyone!