June 21, 2018

Learning analytics, student satisfaction, and student performance at the UK Open University

There is very little correlation between student satisfaction and student performance. Image: Bart Rienties.

Rienties, B. and Toetenel, L. (2016) The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules, Computers in Human Behavior, Vol. 60, pp. 333-341

Li, N. et al. (2017) Online learning experiences of new versus continuing learners: a large-scale replication study, Assessment & Evaluation in Higher Education, Vol. 42, No. 4, pp. 657-672

It’s never too late to learn

It’s been a hectic month, with two trips from Vancouver to Ontario and back and one to the UK and back: a total of four keynotes, two panel sessions and two one-day consultancies. By the time I got to the end of the month’s travels I had learned so much that, at a conference in Toronto, I had to go to my room and lie down – I just couldn’t take any more!

At my age, it takes time to process all this new information, but I will try to summarise the main points of what I learned in the next three posts.

Learning analytics at the Open University

The Open University, with over 100,000 students and more than 1,000 courses (modules), and with most of its teaching online in one form or another, is an ideal context for the application of learning analytics. Fortunately, the OU has some of the world's leaders in this field.

At the conference on STEM teaching at the Open University where I gave the opening keynote, the closing keynote was given by Bart Rienties, Professor of Learning Analytics at the Institute of Educational Technology at the UK Open University. Using multiple regression models, Rienties and his team linked the learning designs of 151 modules (courses), taken by 111,256 students, with students' behaviour, satisfaction and performance at the Open University UK.

His whole presentation (40 minutes, including questions) can be accessed online, and is well worth viewing, as it provides a clear summary of the results published in the two detailed papers listed above. As always, if you find my summary of results below of interest or challenging, I strongly recommend you view Bart’s video first, then read the two articles in more detail. Here’s what I took away.

There is little correlation between student course evaluations and student performance

This result is a bit of a zinger. The core dependent variable used was academic retention (the number of learners who completed and passed the module, relative to the number of learners who registered for it). As Rienties and Toetenel (p.340) comment, almost as an aside,

it is remarkable that learner satisfaction and academic retention were not even mildly related to each other… Our findings seem to indicate that students may not always be the best judge of their own learning experience and what helps them in achieving the best outcome.

The design of the course matters

One of the big challenges in online and blended learning is getting subject matter experts to recognise the importance of what the Open University calls ‘learning design.’ 

Conole (2012, p.121) describes learning design as:

a methodology for enabling teachers/designers to make more informed decisions in how they go about designing learning activities and interventions, which is pedagogically informed and makes effective use of appropriate resources and technologies. LD is focussed on ‘what students do’ as part of their learning, rather than the ‘teaching’ which is focussed on the content that will be delivered.

Thus learning design is more than just instructional design.

However, Rienties et al. comment that ‘only a few studies have investigated how educators in practice are actually planning and designing their courses and whether this is then implemented as intended in the design phase.’

The OU has done a good job of breaking down the elements of learning design, and has mapped those elements in nearly 200 different courses. The elements of this mapping are shown below (Rienties and Toetenel, 2016, p.335):

Rienties and Toetenel then analysed the correlations between each of these learning design elements and both learner satisfaction and learner performance. What they found is that what OU students liked did not match what improved learner performance. For instance, students were most satisfied with ‘assimilative’ activities, which are primarily content focused, and disliked communication activities, which are primarily social. However, better student retention was most strongly associated with communication activities and, overall, with the quality of the learning design.
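The shape of this analysis is easy to sketch. Purely as an illustration (with invented data and column names, not the OU's actual dataset or code), correlating the proportion of study time each module devotes to an activity type with module-level satisfaction and retention might look like this in Python:

```python
import pandas as pd

# Hypothetical module-level data: the proportion of study time devoted to
# two of the OU's learning design activity types, plus outcome measures.
modules = pd.DataFrame({
    "assimilative":  [0.55, 0.40, 0.62, 0.35, 0.48],  # content-focused activities
    "communication": [0.05, 0.20, 0.03, 0.25, 0.10],  # social/discussion activities
    "satisfaction":  [0.88, 0.79, 0.90, 0.76, 0.84],  # proportion of satisfied students
    "retention":     [0.62, 0.71, 0.60, 0.74, 0.66],  # completed and passed / registered
})

# Correlate each design element with satisfaction and with retention.
for activity in ("assimilative", "communication"):
    sat = modules[activity].corr(modules["satisfaction"])
    ret = modules[activity].corr(modules["retention"])
    print(f"{activity}: r(satisfaction) = {sat:.2f}, r(retention) = {ret:.2f}")
```

(The published studies used multiple regression across the 151 modules rather than simple pairwise correlations, but the principle is the same: design mix in, satisfaction and retention out.)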

Rienties and Toetenel conclude:

although more than 80% of learners were satisfied with their learning experience, learning does not always need to be a nice, pleasant experience. Learning can be hard and difficult at times, and making mistakes, persistence, receiving good feedback and support are important factors for continued learning….

An exclusive focus on learner satisfaction might distract institutions from understanding the impact of LD on learning experiences and academic retention. If our findings are replicated in other contexts, a crucial debate with academics, students and managers needs to develop whether universities should focus on happy students and customers, or whether universities should design learning activities that stretch learners to their maximum abilities and ensuring that they eventually pass the module. Where possible, appropriate communication tasks that align with the learning objectives of the course may seem to be a way forward to enhance academic retention.

Be careful what you measure

As Rienties and Toetenel put it:

Simple LA metrics (e.g., number of clicks, number of downloads) may actually hamper the advancement of LA research. For example, using a longitudinal data analysis of over 120 variables from three different VLE/LMS systems and a range of motivational, emotions and learning styles indicators, Tempelaar et al. (2015) found that most of the 40 proxies of “simple” VLE LA metrics provided limited insights into the complexity of learning dynamics over time. On average, these clicking behaviour proxies were only able to explain around 10% of variation in academic performance.

In contrast, learning motivations, emotions (attitudes), and learners’ activities during continuous assessments (behaviour) significantly improved explained variance (up to 50%) and could provide an opportunity for teachers to help at-risk learners at a relatively early stage of their university studies.
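The ‘variation explained’ in these quotes is the R² statistic from a regression model. Here is a minimal simulation of the comparison – entirely invented numbers, just to show what the gap means, and not Tempelaar et al.’s data or code:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 500

# Simulated students: clicks are only a weak, noisy proxy for the things
# that actually drive performance (motivation, assessment activity).
motivation = rng.normal(size=n)
assessment_activity = rng.normal(size=n)
clicks = 0.3 * motivation + rng.normal(size=n)
performance = (0.2 * clicks + 0.6 * motivation
               + 0.5 * assessment_activity + rng.normal(size=n))

X_clicks = clicks.reshape(-1, 1)
X_rich = np.column_stack([clicks, motivation, assessment_activity])

r2_clicks = LinearRegression().fit(X_clicks, performance).score(X_clicks, performance)
r2_rich = LinearRegression().fit(X_rich, performance).score(X_rich, performance)

print(f"R² from clicks alone: {r2_clicks:.2f}")     # low, like the ~10% above
print(f"R² with richer predictors: {r2_rich:.2f}")  # much higher, like the ~50%
```

The point of the quote is exactly this gap: models built only on click counts leave most of the variation in performance unexplained.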

My conclusions

Student feedback on the quality of a course is really important, but it is more useful as a conversation between students and instructors/designers than as a quantitative ranking of the quality of a course. In fact, using learner satisfaction as a way to rank teaching is highly misleading, because learner satisfaction encompasses a very wide range of factors beyond the teaching of a particular course. It is possible to imagine a highly effective course where teaching in a transmissive or assimilative manner is minimal, but student activities are wide, varied and relevant to the development of significant learning outcomes. Students, at least initially, may not like this, because it may be a new experience for them, and because they must take more responsibility for their learning. Thus good communication and explanation of why particular approaches to teaching have been chosen is essential (see my comment on a question in the video).

Perhaps the biggest limitations of student satisfaction as a measure of the quality of teaching, though, are the often very low response rates from students, the limited evaluation questions due to standardization (the same questions irrespective of the nature of the course), and the poor quality of the student responses. This is no way to assess the quality of an individual teacher or a whole institution, yet far too many institutions and governments are building this into their evaluation of teachers/instructors and institutions.

I have been fairly skeptical of learning analytics up to now, because of the tendency to focus more on what is easily measurable (simple metrics) than on what students actually do, qualitatively, when they are learning. The focus on learning design variables in these studies is refreshing and important, but so too will be the analysis of students' actual learning habits.

Finally, this research provides quantitative evidence of the importance of learning design in online and distance teaching. Good design leads to better learning outcomes. Why then are we not applying this knowledge to the design of all university and college courses, and not just online courses? We need a shift in the power balance between university and college subject experts and learning designers resulting in the latter being treated as at least equals in the teaching process.

References

Conole, G. (2012). Designing for learning in an open world. Dordrecht: Springer

Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: learning analytics in a data-rich context. Computers in Human Behavior, 47, 157-167. http://dx.doi.org/10.1016/j.chb.2014.05.038

 

‘Making Digital Learning Work’: why faculty and program directors must change their approach

Completion rates for different modes of delivery at Houston Community College

Bailey, A. et al. (2018) Making Digital Learning Work. Boston MA: The Boston Consulting Group/Arizona State University

Getting blended learning wrong

I’ve been to several universities recently where faculty are beginning to develop blended or ‘hybrid’ courses which reduce but do not eliminate time on campus. I must confess I have mixed feelings about this. While I welcome such moves in principle, I have been alarmed by some of the approaches being taken.

The main strategy appears to be to move some of the face-to-face lectures online, without changing either the face-to-face or the online lecture format. In particular there is often a resistance to asynchronous approaches to online learning.  In one or two cases I have seen, faculty have insisted that students watch the Internet lectures live so that there can be synchronous online discussion, thus severely limiting the flexibility of ‘any time, any place’ for students.

Even more alarming, academic departments seem to be approaching the development of new blended learning programs the same way as their on-campus programs – identify faculty to teach the courses and then let them loose without any significant faculty development or learning design support. Even worse, there is no project management to ensure that courses are ready on time. Why discuss the design of the online lectures when you don’t do that for your classroom lectures? 

Trying to move classroom lectures online without adaptation is bound to fail, as we saw from the early days of fully online learning (and MOOCs). I recognise that blended or hybrid learning is different from fully online learning, but it is also different from face-to-face teaching. The challenge is to identify the added value of the face-to-face component, when most teaching can be done as well or better, and much more conveniently for students, online, and then to combine the two modes of delivery so as to achieve better learning outcomes more cost-effectively. In particular, faculty are missing the opportunity to change their teaching methods in order to get better learning outcomes, such as the development of high-level intellectual skills.

The real danger here is that poorly designed blended courses or programs will ‘fail’, and it is ‘blended learning’ that will be blamed, when really the fault lies in ignorance of best teaching practices on the part of faculty, and especially program directors. The problem is that faculty, and particularly senior faculty such as Deans and program directors, don’t know what they don’t know, which is why the report ‘Making Digital Learning Work’ is so important. The report provides evidence that digital learning requires a complete change in culture and in approaches to course and program development and delivery for most academic departments. Here’s why.

The report

The Arizona State University Foundation and The Boston Consulting Group, funded by the Bill & Melinda Gates Foundation, conducted a study of the return on investment (ROI) of digital learning, based on case studies of six institutions that have been pioneers in post-secondary digital education:

  • Arizona State University
  • University of Central Florida
  • Georgia State University
  • Houston Community College
  • The Kentucky Community and Technical College System
  • Rio Salado Community College.

These are all large institutions (over 30,000 students each) and relatively early adopters of online learning. 

The study had three aims:

  • define what ROI means in terms of digital education, and identify appropriate metrics for measuring ROI
  • assess the impact of digital learning formats on institutions’ enrolments, student learning outcomes, and cost structures
  • examine how these institutions implemented digital learning, and identify lessons and promising practices for the field.

The study compared results from three different modes of delivery:

  • face-to-face courses
  • mixed-modality courses, offering a mix of online and face-to-face components, with the online component typically replacing some traditional face-to-face teaching (what I would call ‘hybrid learning’)
  • fully online courses.

The ROI framework

The study identified three components of ROI for digital learning:

  • impact on student access to higher education
  • impact on learning and completion outcomes
  • impact on economics (the costs of teaching, administration and infrastructure, and the cost to students).

The report is particularly valuable in the way it has addressed the economic issues. Several factors were involved:

  • differences in class size between face-to-face and digital teaching and learning
  • differences in the mix of instructors (tenured and adjunct, full-time and part-time)
  • allocation of additional expenses such as faculty development and learning design support
  • impact of digital learning on classroom and other physical capacity 
  • IT costs specifically associated with digital learning.

The report summarised this framework in the following graphic:

While there are some limitations which I will discuss later, this is a sophisticated approach to looking at the return on investment in digital learning and gives me a great deal of confidence in the findings.
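To see how the cost side of such an ROI calculation works, here is a deliberately simple, hypothetical sketch in Python. All the numbers are invented for illustration; the report's own model is far more detailed:

```python
# Hypothetical cost-per-credit-hour comparison, reflecting the factors above:
# class size, instructor cost, design support, IT costs, and classroom costs.
# All figures are invented for illustration only.

def cost_per_credit_hour(instructor_cost, design_support, it_cost,
                         space_cost, students, credit_hours):
    """Total section cost divided by total credit hours delivered."""
    total = instructor_cost + design_support + it_cost + space_cost
    return total / (students * credit_hours)

# A 3-credit face-to-face section: smaller class, full classroom cost.
f2f = cost_per_credit_hour(instructor_cost=15000, design_support=0,
                           it_cost=500, space_cost=4000,
                           students=30, credit_hours=3)

# A 3-credit online section: larger class, design and IT overheads, no classroom.
online = cost_per_credit_hour(instructor_cost=15000, design_support=3000,
                              it_cost=2500, space_cost=0,
                              students=45, credit_hours=3)

print(f"face-to-face: ${f2f:.2f} per credit hour")
print(f"online:       ${online:.2f} per credit hour")
print(f"saving:       ${f2f - online:.2f} per credit hour")
```

Under these made-up assumptions the online section comes out roughly $65 per credit hour cheaper, which happens to fall within the $12-$66 range of savings reported below; change the class sizes or overheads and the result can easily reverse, which is precisely why the allocation decisions listed above matter.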

Results

Evidence from the six case studies resulted in the following findings, comparing digital learning with face-to-face teaching.

Digital learning resulted in:

  • equivalent or improved student learning outcomes
  • faster time to degree completion
  • improved access, particularly for disadvantaged students
  • a better return on investment (at four of the institutions): savings for online courses ranged from $12 to $66 per credit hour.

If you have problems believing or accepting these results then I recommend you read the report in full. I think you will find the results justified.

Conditions for success

This is perhaps the most valuable part of the report, because although most faculty may not be aware of this, those of us working in online learning have been aware for some time of the benefits of digital learning identified above. What this report makes clear though are the conditions that are needed for digital learning to succeed:

  • take a strategic portfolio approach to digital learning. This needs a bit of unpacking because of the terminology. The report argues that the greatest potential to improve access and outcomes while reducing costs lies in increasing the integration of digital learning into the undergraduate experience through mixed-modality (i.e. hybrid learning). This involves not just one single approach to course design but a mix, dependent on the demands of the subject and the needs of students. However, there should be somewhat standard course design templates to ensure efficiency in course design and to reduce risk.
  • build the necessary capabilities and expertise to design for quality in the digital realm. The experience of the six institutions emphasises that significant investment needs to be made in instructional design, learning sciences and digital tools and capacity (and – my sidebar – faculty need to listen to what instructional designers tell them)
  • provide adequate student support that takes account of the fact that students will often require that support away from the campus (and 24/7)
  • fully engage faculty and provide adequate faculty development and training by fostering a culture of innovation in teaching
  • tap outside vendors strategically: determine the strategic goals for digital learning first, then decide where outside vendors can add value to in-house capacity
  • strengthen analytics and monitoring: the technology provides better ways to track student progress and difficulties

My comments on the report

This report should be essential reading for anyone concerned with teaching and learning in post-secondary education, but it will be particularly important for program directors. 

It emphasises that blended learning is not so much about delivery but about achieving better learning outcomes and increased access through the re-design of teaching to incorporate the best of face-to-face and online teaching. However, this requires a major cultural change in the way faculty and instructors approach teaching, as indicated by the following:

  • holistic program planning involving all instructors, instructional designers and probably students as well
  • careful advance planning that follows best practices, including project management and learning design
  • focusing as much on the development of skills as delivering content
  • identifying the unique ‘affordances’ of face-to-face teaching and online learning: there is no general formula for this but it will require discussion and input from both content experts and learning designers on a course by course basis
  • systematic evaluation and monitoring of hybrid learning course designs, so best (and poor) practices can be identified

I have a few reservations about the report:

  • The case study institutions were carefully selected: they are institutions with a long history of, and/or considerable experience in, online learning. I would like to see more cases from more traditional universities or colleges that have been able to move successfully into online and especially blended learning
  • the report did not really deal with the unique context of mixed-modality courses. Many of the results were swamped by the much more established fully online courses. However, hybrid learning is still new, so this presents a challenge in comparing results.

However, these are minor quibbles. Please print out the report and leave it on the desk of your Dean, the Provost, the AVP Teaching and Learning and your program director – after you’ve read it. You could also give them:

Bates, A. and Sangra, A. (2011) Managing Technology in Higher Education San Francisco: Jossey-Bass/John Wiley

But that may be too much reading for the poor souls, who now have a major crisis to deal with.

Tracking innovations in online learning in Canada

Rue St Jean, Québec City. Temperatures ranged from -17°C to -23°C, without wind chill added

I’ve not been blogging much recently because, frankly, I’ve been too busy, and not on the golf course or ski slopes, either. (Yeah, so what happened to my retirement? Failed again).

Assessing the state of online learning in Canada

I am working on two projects at the moment:

  • a national survey of online learning in Canadian post-secondary institutions
  • case studies of innovation in online learning for Contact North's Pockets of Innovation project.

These two projects in fact complement one another nicely, with the first aiming to provide a broad and accurate picture of the extent of online learning in Canada, and the other focusing on the more qualitative aspects of innovation in online learning. Both will be completed in time not only for the 150th anniversary of Confederation in Canada (which was really the creation of a new, independent state in North America) but also for ICDE’s World Congress on Online Learning in Toronto in October, whose theme is, guess what, Teaching in a Digital Age (now there’s a coincidence).

Of course, I’m not doing this on my own. In both projects I am working with a great group of people.

Methodology

My mandate for Contact North is to identify 8-12 cases of innovation in online learning from all of Canada other than Ontario. I started, of course, in British Columbia early in January, and last week I visited six post-secondary institutions in four cities in Québec.

To find the cases, I have gone to faculty development workshops where instructors showcase their innovations, or I have contacted instructional designers I know in different institutions to recommend cases. The institutions are chosen to reflect provinces, and universities and colleges within each province.

Each visit involves an interview with the instructor responsible for the innovation, and where possible a demonstration or examples of the innovation. (One great thing about online learning is that it leaves a clear footprint that can be captured).

I then write up a short report, using a set of headings provided by Contact North, and return it to the instructor to ensure that it is accurate, before submitting the case report to Contact North.

I am not sure whether Contact North will publish all the cases I report on its web site, as I will certainly cover many more than 8-12 cases in the course of this project. However, it is hoped that at least some of the instructors featured will showcase their innovations at the World Congress of Online Learning.

Progress to date

I have conducted interviews (but not finished the reports yet) for the following:

British Columbia

  • the use of an online dialectical map to develop argumentation skills in undergraduate science students (Simon Fraser University – SFU)
  • peer evaluation as a learning and assessment strategy for building teamwork skills in business school programs (SFU)
  • the development of a mobile app for teaching the analysis of soil samples (University of British Columbia)
  • PRAXIS: software to enable real-time, team-based decision-making skills through simulations of real-world emergency situations (Justice Institute of British Columbia)

Québec

  • comodal synchronous teaching, enabling students to choose between attending a live lecture or participating at the same time from home/at a distance (Laval University)
  • synchronous online teaching of the use of learning technologies in a teacher education program (Université du Québec à Trois-Rivières – UQTR)
  • achieving high completion rates in a MOOC on the importance of children’s play (UQTR)
  • a blended course on effective face-to-face teaching for in-service teachers (TÉLUQ)
  • use of iBook Author software for content management for cardiology students and faculty in a teaching hospital (Centre Hospitalier Universitaire de Sherbrooke – Sherbrooke University Hospital: CHUS)
  • a decision-making tool to develop active and coherent learning scenarios that leverage the use of learning technologies (Université de Montréal).
  • Mathema-TIC: francophone open educational resources for teaching mathematics in universities and colleges (Université de Montréal).

These visits would not have been possible without the assistance of France Lafleur, an online instructor from UQTR who not only arranged many of the meetings but also did all the driving. Anyone from outside Québec who has tried to drive across the province in winter, and especially tried to navigate and drive to several parts of Montréal the same day, will understand why this help was invaluable.

Response and reaction

Faculty and instructors often receive a lot of criticism for being resistant to change in their teaching. This project, however, starts from the opposite position: what are faculty and instructors actually doing in terms of innovation in their teaching? What can we learn from this regarding change and the development of new teaching approaches? What works and what doesn’t?

It is dangerous at this stage to start drawing conclusions. This is not a representative selection even of innovative projects, and the project – in terms of my participation – has just started. The definition of innovation is also imprecise. Innovation is a bit like an elephant: difficult to describe to someone who’s never seen one, but you know it when you see it.

However, even with such a small sample, some things are obvious:

  • innovation in online teaching is alive and well in Canadian post-secondary education: there is a lot going on. It was not difficult to identify these 11 cases; I could have easily found many more if I had the time;
  • the one common feature across all the instructors I have interviewed is their enthusiasm and passion for their projects. They are all genuinely excited by what they are doing. Their teaching has been galvanised by their involvement in the innovation;
  • in some of the cases, there are measured improvements in student learning outcomes, or, more importantly, new ’21st century skills’ such as teamwork, evidence-based argumentation, and knowledge management are being developed as a result of the innovation;
  • although again these are early days for me, there seems to be a widening gap between what is actually happening on the ground and what we read or hear about in the literature and at conferences on innovation in online learning. The innovation I am seeing is often built around simple but effective changes, such as a web-based map, or a slight change of teaching approach, such as opening up a lecture class to students who don’t want to – or can’t – come in to the campus on a particular day. However, these innovations are radically changing the dynamics of classroom teaching;
  • blended learning is breaking out all over the place. Most of these cases involve a mix of classroom and online learning, but there is no standard model – such as flipped classrooms – emerging. They all vary quite considerably from each other; 
  • the innovations are still somewhat isolated, although a couple have gone beyond the original instructor and been adopted by colleagues. However, there is usually no institutional strategy or process for evaluating innovations and making sure they are taken up across a wider range of teaching, although instructional designers working together provide one means of doing this. Evaluation of the innovation is usually left to the innovator, with all the risks that this entails in terms of objectivity.

Next steps

I still have at least one more case from another institution in British Columbia to follow up, and I now have a backlog of reports to do. I hope to have these all finished by the end of this month.

I have two more trips to organise. The first will be to the prairie provinces:

  • Alberta, Saskatchewan and Manitoba, which I hope to do in mid-March.

The next will be to the Maritimes:

  • Nova Scotia, New Brunswick, PEI, and Newfoundland, which I will do probably in April or May.

No further cases or institutions have been identified at this moment, and I am definitely open to suggestions in these provinces if you have any. The criteria for choice are as follows:

  • The focus is first and foremost on practice, on actual teaching and learning applications – not policy, funding, planning issues, descriptions of broad services, or broader concerns.
  • The interest is in applications of pedagogy using technology for classroom, blended, and online learning with the emphasis on student learning, engagement, assessment, access, etc. The pedagogy is as important as the technology in terms of innovation.
  • The emphasis is on innovative practices that can be replicated or used by other instructors.
  • We are particularly looking for cases where some form of evaluation of the innovation has been conducted or where there is clear evidence of success.

If you can recommend a case that you think fits well these parameters, please drop me a line at tony.bates@ubc.ca.

In the meantime, look out for the case studies being posted to Contact North’s Pockets of Innovation web site over the next few months. There are also more cases from Ontario being done at the same time.

Report on SFU’s experiences of teaching with technology


Simon Fraser University’s Burnaby campus (on a rare day when it wasn’t raining)

I always enjoy going to a university or college and seeing how they are using learning technologies. I am always a little surprised, and usually intrigued, by some unexpected application, and today’s DemoFest at Simon Fraser University was no exception.

About Simon Fraser University

SFU has just over 35,000 students with campuses in Burnaby, Vancouver downtown, and Surrey, all in the lower mainland of British Columbia, Canada.

For a long time it has had the largest distance education program in British Columbia, although the rapid development of fully online and blended learning elsewhere in BC and across Canada means that other institutions are quickly gaining ground. It is also the academic base of Linda Harasim, Professor of Communications at SFU.

As with many Canadian universities, most of the DE programs are run out of the Centre for Online and Distance Learning in Continuing Studies at SFU. However, the university also has a large Teaching and Learning Centre, which provides a range of services including learning technology support to the faculty on campus.

The university recently adopted Canvas as its main LMS.

I was spending most of the day at SFU for two reasons:

  • to identify possible cases for Contact North’s ‘pockets of innovation’ project
  • to report on the survey of online learning in Canadian post-secondary institutions.

I will be giving more information on both these projects in separate blog posts coming shortly.

The DemoFest

DEMOfest 2016 is about how instructors are using … technologies in ways that produce exciting and original educational experiences leading to student engagement and strong learning outcomes.

Making lectures interactive

Not surprisingly, several of the short, ten-minute presentations focused on tools used in classroom teaching or lecturing. In particular, the tools are going mobile, in the form of apps that students can use on their phones, tablets or laptops. I was particularly impressed with Top Hat, which incorporates online quizzes and tests, attendance checks, and discussion. REEF Polling, developed by iClicker, is a similar tool: effectively a mobile app version of the iClicker. Both provide students and instructors with an online record of their classroom activity on the app.

There were also a couple of sessions on lecture theatre technologies. As in other universities, lecturers at SFU face a range of different interfaces for managing lecture theatre facilities. SFU has a project that will result in a common, simple interface available across the university’s different campuses, much to the relief of faculty and visiting speakers, who at the moment have no idea what to expect when entering an unfamiliar lecture theatre or classroom. There was also a session on the limits of lecture capture and how to use video to make learning more engaging.

Online learning development

However, I found nothing here (or anywhere else, for that matter) that has convinced me that there is a future in the large lecture class. Most of the technology enhancements, although improvements on the straight ‘talk’ lecture, are still just lipstick on a pig.

The online learning developments were much more interesting:

  • online proctoring: Proctorio. This was a demonstration of the ingenuity of students in cheating in online assessment, and the even greater ingenuity in preventing them from doing it. Proctorio is a powerful web-based automated proctoring system that basically takes control of whatever device the student is using for an online assessment and records their entire online activity during the exam. Instructors/exam supervisors have options as to exactly what features they control, such as locked screens, blocking the use of other URLs, etc. Students just sign in and take the exam at a time set by the instructor. Proctorio provides the instructor with a complete record of students’ online activity during the exam, including a rating of the ‘suspiciousness’ of the student’s online exam activity.
  • peer evaluation and team-based learning: SFU has a graduate diploma in business where students are required to work in teams, specifically to build team approaches to problem-solving and business solutions. Although the instructor assesses both the individual and group assignments, students evaluate each other on their contribution to the team activities. The demonstration also showed how peer assessment was handled within the Canvas LMS. It was a good example of best practices in peer-to-peer assessment.
  • Dialectical Map: an argument visualization tool developed at SFU. Joan Sharp, Professor of Biological Sciences, and her research colleague, Hui Niu, have developed a simple, interactive, web-based tool that facilitates the development of argumentation for science students. Somewhat to my surprise, research evidence shows that science students are often poor at argumentation, even in the upper years of an undergraduate program. This tool enables a question to be posed by an instructor at the top of the map, such as ‘Should the BC government allow fracking for oil?’ or ‘Should the BC government stop the culling of wolves to protect caribou?’ The online map is split into two parts, ‘pro’ and ‘con’, with boxes for the rationale, and linked boxes for the evidence to support each rationale offered. Students type their answers into the boxes (both pro and con) and have a box at the bottom in which to write their conclusion(s) from the argument. Students can rate the strength of each rationale. All the boxes in a map can be printed out, giving a detailed record of the arguments for and against, the evidence in support of the arguments, and the student’s conclusion. Hui Niu has done extensive research on the effectiveness of the tool, and has found that its use has substantially increased students’ performance on argument-based assignments and assessments. (A speculative sketch of how such a map might be modelled in code follows this list.)
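From the description above, the data model behind a dialectical map is easy to imagine. The following Python sketch is my own reconstruction, not the SFU tool's actual code; all the class and field names are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Rationale:
    """One 'pro' or 'con' box: a rationale, its linked evidence, and a rating."""
    text: str
    evidence: list[str] = field(default_factory=list)
    strength: int | None = None  # student's rating of the rationale's strength

@dataclass
class DialecticalMap:
    """Question at the top, pro and con columns, conclusion at the bottom."""
    question: str
    pros: list[Rationale] = field(default_factory=list)
    cons: list[Rationale] = field(default_factory=list)
    conclusion: str = ""

    def render(self) -> str:
        """Flatten the map into a printable record of the whole argument."""
        lines = [f"Q: {self.question}"]
        for label, side in (("PRO", self.pros), ("CON", self.cons)):
            for r in side:
                lines.append(f"  {label}: {r.text} (strength: {r.strength})")
                lines.extend(f"    evidence: {e}" for e in r.evidence)
        lines.append(f"Conclusion: {self.conclusion}")
        return "\n".join(lines)

# Example, using one of the questions mentioned above:
m = DialecticalMap("Should the BC government stop the culling of wolves to protect caribou?")
m.pros.append(Rationale("Culling has not halted the caribou decline",
                        evidence=["population surveys"], strength=4))
m.cons.append(Rationale("Wolf predation is a major cause of caribou mortality",
                        evidence=["telemetry studies"], strength=3))
m.conclusion = "Habitat protection should be tried before further culling."
print(m.render())
```

The printable record the tool produces corresponds to something like the `render()` output here: every rationale, its evidence, and the student's conclusion in one flat document.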

General comments

I was very grateful for the invitation and enjoyed nearly all the presentations. The Teaching and Learning Centre is encouraging research into learning technologies, particularly developing a support infrastructure for OERs and looking at ways to use big data for the analysis and support of learning. This practical, applied research is being led by Lynda Williams, the Manager of the Learn tech team, and is being done in collaboration with both faculty and graduate students from different departments.

Students and a professor of computer science worked with the IT division and Ancillary Services to develop a student app for the university called SFU Snap, as part of a computer science course. This not only provides details of the bus services to and from SFU at any time, but also gives students an interactive map so they can find their classrooms. Anyone who has tried to find their way around SFU (built on multiple levels into a mountain) will understand how valuable such an app must be, not just to students but also to visitors.

So thank you, everyone at the Teaching and Learning Centre at SFU for a very interesting and useful day.

 

Welcome back and what you may have missed in online learning over the summer

Working in my study

Not a lot of work done this summer!

I hope you all had a great summer break and have come back fully charged for another always challenging year in teaching. I thought it might be helpful to pull together some of the developments in online learning that occurred over the summer that you may have missed. My list, of course, is very selective and personal.

Online learning for beginners

During the summer I developed a series of ten posts aimed at those considering teaching online, or brand new to online teaching.

This was in response to concerns that many instructors and faculty were not well briefed or aware of best practices and what we already know about effective (and more importantly, ineffective) approaches to online teaching.

The posts of course were linked to my online, open textbook, Teaching in a Digital Age. However, the book itself is likely to appeal to those who have already made a major commitment to teaching well online. The blog posts in contrast aim to address some common myths and misconceptions about online learning and online teaching, and in particular to help instructors make decisions about whether or not to do online learning in the first place, and if so, what they need to know to do it well. Think of it as a prep for the book itself.

This won’t be directly relevant to most readers of this blog, but please direct any instructors or faculty in your institution who are struggling to decide whether or not to teach online, or must undertake it but are fearful, to these posts, as well as the book itself.

Contact North will be repackaging these blog posts and re-issuing them this fall; watch this space for more details.

Upcoming conferences

The big conference announcement is that the next ICDE World Conference on Online Learning and Distance Education will be held in Toronto in October 2017, with Contact North as the lead organiser. This global conference is one of the major events in the world of online and distance learning, and it’s the first time since 1982 that it’s been held in Canada. Next year’s theme is, guess what, Teaching in a Digital Age. Well, that’s a coincidence, isn’t it?

Another major conference coming up at the end of this year is the OEB conference in Berlin in December.

Registration is also now open for the EDEN Research Workshop in Oldenburg, Germany, in October this year.

AACE’s World Conference on eLearning takes place in Virginia, USA, in November this year.

And, if you hurry, you might just make the 4th E-Learning Innovations Conference and Expo in Nairobi, Kenya from September 12-16.

Reports and journals

These are reports that have been published (or which I found) over the summer. I have blogged about one or two of them but for the rest I’ve not had the time. (Well, the weather’s been glorious here in Vancouver this summer and golf called and was answered.)

Centre for Extended Learning (2016) How do we create useful online learning experiences? Waterloo ON: University of Waterloo.

This is an excellent guide to multimedia course design, combining Peter Morville’s user experience (UX) honeycomb and Richard Mayer’s theory and research on the use of multimedia for learning, to create a well-designed set of guidelines for online course design.

Daniel, J. (2016) Combatting Corruption and Enhancing Integrity: A Contemporary Challenge for the Quality and Integrity of Higher Education (Advisory Statement for Effective International Practice). Washington DC/Paris: CHEA/UNESCO.

No need to say more other than some of these corruptions will almost certainly be found in your institution. A great read and very disturbing.

Contact North (2016) Connecting the Dots: Technology-enabled Learning and Student Success Toronto ON: Nelson.

This is the result of a symposium organized by Nelson in Toronto earlier in the year, and looks particularly at three main issues in online learning: 1. the notion of “program”; 2. the role of faculty; 3. the nature of student support services.

Garrett, R. and Lurie, H. (2016) Deconstructing CBE. Boston MA: Ellucian/Eduventures/ACE.

This is a report on a three-year study to help higher education leaders better understand competency-based education (CBE), including the diversity of institutional practices and paths forward.

Bacigalupo, M. et al. (2016) The Entrepreneurship Competence Framework. Brussels: European Commission JRC Science for Policy.

“The EntreComp Framework is made up of 3 competence areas: ‘Ideas and opportunities’, ‘Resources’ and ‘Into action’. Each area includes 5 competences, which, together, are the building blocks of entrepreneurship as a competence.” Something concrete at last on one of the key 21st century skills. Don’t ask me though whether I believe it – read it for yourself, if you can stand European Commission English.

IRRODL, Vol. 17, No. 4

From Rory McGreal’s editorial: ‘This one is packed with 19 articles and a book review. We begin with three articles from Africa on access, entrepreneurship, and openness. Then the focus changes to the teacher with a critique and a look at expectations and perceptions. Learning design issues are the focus of the next group of articles, including open design and guidelines. Investigations into factors affecting learning follow…. Finally, mobile learning issues are addressed in the last two articles.’ Something for everyone here.

Distance Education, Vol. 37, No. 2 (journal): special issue on building capacity for sustainable distance e-learning provision.

This is a specially commissioned set of papers around the theme of the last ICDE conference in South Africa. I found it difficult though to identify a consistent message between what are individually interesting papers.

I am well aware that there are many other ‘must-read’ reports that slipped by without my paying attention to them. Any further suggestions from readers will be welcome.

So the world didn’t stop while you were away. Enjoy your teaching this academic year.