November 22, 2014

EDEN research papers: OERs (inc. MOOCs), quality/assessment, social media, analytics and research methods



EDEN has now published a second report on my review of papers submitted to the EDEN research workshop in Oxford a couple of weeks ago. All the full papers for the workshop can be accessed here.

Main lessons (or unanswered questions) I took away:

OERs and MOOCs

  • what does awarding badges or certificates for MOOCs or other OERs actually mean? For instance, will institutions give course exemption or credits for the awards, or accept such awards for admission purposes? Or will the focus be on employer recognition? How will participants who are awarded badges know what their ‘currency’ is worth?
  • can MOOCs be designed to go beyond comprehension or networking to develop other critical 21st century skills such as critical thinking, analysis and evaluation? Can they lead to ‘transformational learning’ as identified by Kumar and Arnold (see Quality and Assessment below)?
  • are there better design models for open courses than MOOCs as currently structured? If so, what would they look like?
  • is there a future for learning object repositories when nearly all academic content becomes open and online?

Quality and assessment

  • research may inform but won’t resolve policy issues
  • quality is never ‘objective’ but is value-driven
  • the level of intervention must be sustained and substantial enough to result in significant learning gains
  • there’s already plenty of research indicating the necessary conditions for the successful use of online discussion forums, but if these conditions are not present, learning will not take place
  • the OU’s traditional model of course design constrains the development of successful collaborative online learning.

Use of social media in open and distance learning

There were surprisingly few papers on this topic. My main takeaway:

  • the use of social media needs to be driven by sound pedagogical theory that takes into account the affordances of social media (as in Sorensen’s study described in an earlier post under course design)

Data analytics and student drop-out

  • institutions/registrars must pay attention to how student data is tagged/labelled for analytic purposes, so there is consistency in definitions, aggregation and interpretation
  • when developing or applying an analytics software program, consideration needs to be given to the level of analysis and what potential users of the data are looking for; this means working with instructional designers, faculty and administrators from the beginning
  • analytics need to be integrated with action plans to identify and support at-risk students early

Research methods


If these bullets interest you at all, then I strongly recommend you go and read the original papers in full – click here. My summary is of necessity personal and abbreviated and the papers provide much greater richness of context.



Conference in Crete on quality in open education



Heraklion, Crete: but the conference may not be here

What: SCOPE 2014: Changing the trajectory: quality for opening up education

‘In order to make open learning and education more relevant and feasible for organizations as well as learners, innovations have to be combined with well-proven learning traditions and flexible quality standards. In addition new models for recognition of open learning are needed: education institutions need a better understanding of how open education processes can contribute to excellent learning and high quality education provision, and certification schemes need to incorporate more flexible concepts of open education.’

Who: EFQUEL (European Foundation for Quality in e-Learning)

When: 7-9 May, 2014

Where: Somewhere on the Greek island of Crete in the Mediterranean: the exact venue will be announced soon 

How: Submissions of scientific papers related to the conference (max 8 pages) must be sent by February 10th, 2014, using the official template. Interactive workshop proposals can make use of another template, also available on the website.

Tom Carey’s reflections on the HEQCO report on online learning and productivity: 2 – What we left out – and why.

©, 2013

Carey, T., & Trick, D. (2013). How Online Learning Affects Productivity, Cost and Quality in Higher Education: An Environmental Scan and Review of the Literature. Toronto: Higher Education Quality Council of Ontario.

Tom Carey is one of the authors of the above study, and as an example of the best of reflective practice, he has kindly provided his thoughts about the report, now that it is finished. His first thoughts were published yesterday. This is the second part.

Tom Carey: Part II of Reflections on Researching and Writing on Emerging Developments in Online Learning

In yesterday’s guest post, I provided some reflections on the process and product of a research project for the Higher Education Quality Council of Ontario (HEQCO): How Online Learning Affects Productivity, Cost and Quality in Higher Education: An Environmental Scan and Review of the Literature. That post described the Results that Surprised in the report, from my perspective as an author. Today’s post provides some insights about what we chose to not include in the report, following the old advice of expert film editors that the most interesting scenes in a movie may be those “left behind on the cutting room floor”.

Developments We Didn’t Include

There were some emerging developments on our original target list for which we could not find compelling examples at scale: the Semantic Web, Mobile Learning, Ubiquitous Connectivity, etc. I am sure these are going to be important, but in the interests of preserving the Teachable Moment aspects we focused only on developments with convincing data for impacts on learning outcomes and productivity: convincing in the context of Ontario higher education institutions. For example, the Ithaka study of the Open Learning Initiative software allowed us to highlight Adaptive Learning Systems at scale (and the recent follow-up book by William Bowen contains several other insights we could cite if we were starting now).

Similarly, the report only deals with Open Educational Resources as a sideline in the discussion of Open and Affordable Textbooks: the rationale was that the textbook developments were a hot topic in “peer” higher education systems ‒ British Columbia, California, New York, etc. ‒ and represented low-hanging fruit in terms of potential for building collaborations amongst students, faculty and academic leaders within institutions. And we didn’t do any justice to the Canadian developments in connectivist MOOCs, mostly because we had our hands full trying to help our target audience make sense of the instructionist MOOCs that were hogging the headlines and couldn’t work out how to get beyond that without losing their attention. (I have already apologized to George for this: Stephen, Dave et al. can consider themselves included in the apology.) These choices about which innovations to highlight may have bypassed the disruptive in favour of the radical, but helping decision makers to make sense of – and act on – opportunities for radical change was more than enough for us to bite off.

Issues We Couldn’t Include

Some of the content we wrote but could not include in the report was just not clear enough or complete enough to be included in the public document. For a few topics, we were keenly aware that more work had to be done but that we had not made sense of what that work might be. We didn’t want to go on at length in the report about these points for several reasons: calling for further research sounds like too familiar an ending to a Research Report, including more than a quick mention for what is not yet clear seemed to detract from our Teachable Moment goal, and some of the further exploration needed would be an outcome of our proposed Call for Action through collaborative sense-making across institutions.  For those interested in ‘where to next’ in terms of understanding the impact of emerging developments, here is my personal list of high priority issues that need more clarity.

The different roles of various online learning interactions in various contexts: I would like to have referenced the work by Terry Anderson and others on how increases in one form of learning interaction can result in a decreased need for another type of interaction. This was implicit in our Call to Action around understanding and leveraging scalability: use more scalable interactions where appropriate in order to redirect resources – especially time – into other interactions which are less readily scaled.

Here is my current woefully incomplete attempt to reframe our analysis of emerging developments in online learning using different types of learning interactions – whether online or not:

  • Learner-content interactions can be used effectively to advance Quality and Productivity for technical mastery outcomes, e.g., performance tasks with single solutions and predictable pathways to completion (allowing adaptive systems to provide task guidance)
  • Learner-learner interactions can be used effectively to advance Quality and Productivity for some of the question-and-answer and formative feedback roles traditionally carried out through learner-instructor interactions, and seem to be essential (at the moment?) for outcomes involving complex challenges with diverse ways of knowing.
  • Learner-instructor interactions appear to be essential for outcomes involving deep personal change related to learning itself:  grappling with threshold concepts, enhancing learning mindsets and strategies, and ‘getting better at getting better’ for knowledge-intensive work
  • Learner-expert interactions are required for formation of learners’ identity and practice as members of knowledge-building communities, whether in professional/career contexts or in their roles as community members and global citizens.

Much more work to be done in this area, including ensuring that the outcomes listed above that are not readily scaled don’t get left out in the quest for greater productivity:  if we neglect such outcomes, where would the ‘higher’ be in higher education?

Institutional productivity gains may be possible @ scale in traditional institutions: you may have noticed that the list of interactions above has unbundled the role of “instructor” (who can apply expertise in pedagogical content knowledge) from the role of “practice expert” who can help learners transition into full engagement with knowledge-practice networks. Traditional institutions may struggle to unbundle such roles, or even to respect their differing contributions.

Traditional institutions – and in Ontario higher education, that is all we have at the moment ‒ may also struggle to reinvest the results of productivity gains from online learning beyond a course context. As long as we think of ‘workload’ in terms of ‘courses taught’, any savings in effort may disappear into other localized activities. How can we reframe the work, and workload, of teaching to optimize educator and learner time, without resorting to an alien ‘managerial’ language? (Mention “activity-based costing” in a budget meeting and the challenges to reinvesting productivity gains become all too clear.)

I tried adapting the idea of Constructive Alignment from instructional design, with an expansion into “Productive Alignment” where educators also include in their designs the goal of optimizing resource usage. If certain students can achieve certain learning outcomes with reduced learner-instructor interactions, e.g., with MOOC resources used in a hybrid course format, then effective instructional design requires that we achieve this Productive Alignment to optimize time and resources. I couldn’t explain this notion effectively enough to include it in the report, but I am convinced that some such changes in the ways we talk about educator roles and responsibilities are going to be needed if the full potential of online learning is to be achieved. And this is going to be both more necessary and more difficult with the ‘higher’ learning experiences and outcomes listed above, which develop slowly over the course of a program and are not readily described as discrete competencies to be tested in a short-term performance task.

Dealing with Quality and Productivity in tandem is a fiscal, political and pedagogical necessity: finally, I wish we had been able to make a better case for the pedagogical rationale for dealing with Scalability, Quality and Productivity issues in parallel. We did include some rationale for determined action now on systematic collaborations across institutions to understand and leverage the emerging potential of online learning.

However, that argument was framed mostly in terms of fiscal and political realities. The fiscal reality across higher education systems requires that we get more focused on deploying the least resources to achieve the highest level of outcomes, and the political reality requires that we in public higher education either ‘do or be done to’ in our dealings with Quality and Productivity.

But there is another rationale that is more closely linked to our purposes and ideals in higher education. Even if we did not have our current fiscal constraints and the expectations of stricter constraints in the future…even if public higher education had the full confidence of political leaders as to our ability to change and adapt to our changing circumstances…if our students see us clinging to traditional practices and structures rather than taking on our challenges with boldness and confidence, what model are we presenting to them about how to deal with challenges in their workplaces, their families and communities, in the earth’s environment and the global knowledge economy?  Will our plea for engagement with the knowledge and wisdom of the past, present and future fall on deaf ears if we don’t practice what we preach?

Making that case was beyond our reach in this project, but it remains on my personal to-do list. I keep thinking of Parker Palmer’s concise formulation in The Courage to Teach: “How we teach is a critical part of what we teach”. That is the pedagogical rationale for our taking charge of higher education’s fate by applying emerging knowledge and wisdom about online learning…with care, compassion and courage.


Tom Carey’s reflections on the HEQCO report on online learning and productivity: 1 – Catching a teachable moment


© Leblanc, P., EDUCAUSE, 2013

Carey, T., & Trick, D. (2013). How Online Learning Affects Productivity, Cost and Quality in Higher Education: An Environmental Scan and Review of the Literature. Toronto: Higher Education Quality Council of Ontario.

Tom Carey is one of the authors of the above study, and as an example of the best of reflective practice, he has kindly provided his thoughts about the report, now that it is finished. The reflection is in two parts. The second part will follow tomorrow.

Tom Carey:

Tony’s last post presented a summary and response to a report authored by David Trick and myself for the Higher Education Quality Council of Ontario (HEQCO): How Online Learning Affects Productivity, Cost and Quality in Higher Education: An Environmental Scan and Review of the Literature. In parallel, I prepared this guest post with my reflections on what surprised me during the process of researching and writing the Environmental Scan of emerging developments in online learning. Tomorrow’s guest post will reflect on what we did not or could not include in the final version – partly because we were left with a lot of sense-making still to do by the time we had hit our limits for pages and time (looking for help here, folks).

David may perhaps want to chime in on the surprises and omissions of the Literature Review where he so capably took the lead. Tony was also involved as an expert advisor, along with George Siemens and Ed Walker – our thanks again to all of you. We hope that our complementary perspectives will spark some more dialogue about these important issues: there have also been comments from Terry Anderson and visual notes by Giulia Forsythe.

Results that Surprised

My colleague Carl Bereiter pointed out to me long ago that the most interesting question to ask about a research project concerns the “results that surprise” – I think this was his opening query at every Ph.D. thesis defense. As someone with a long involvement in online learning, OER and distance education, here is my list of surprises:

We didn’t end up writing a research report: I’d better clarify that for our helpful partners at HEQCO, who sponsored the report and supported us in the process. We did write a Research Report as stated in the contract, but we found the decisions we made about content and tone were influenced equally by the mandate for a Research Report, the opportunity for a Teachable Moment and the growing sense that a Call for Collective Action through collaboration across institutions was needed.

The Teachable Moment came in part from the clamour about MOOCs, where the participation of prominent institutions had caused a sudden jump in attention – and perhaps even credibility – for online learning amongst some academic and political leaders. We tried to seize that moment, however brief it might be, to make the case for online learning as an ally in the challenges faced by higher education across the world: how to educate more students, with deeper and more lasting learning outcomes, in the face of fiscal constraints driven by demographic and economic factors over which we had no control. (More on the Call for Collective Action below…)

It seemed to help if we presented MOOCs as a cumulative development: as we worked on the report, we came to the conclusion that one way to make sense of MOOCs was to consider them as the aggregation of several other factors with longer histories and more evidence of success. The report positions MOOCs (the instructionist type, at least) as building on the other developments in online learning that we highlighted: Affordable Online Texts, Adaptive Systems, Learning Analytics and a variety of approaches to optimize student-instructor interactions. Framing MOOCs this way, as leveraging all of these advances at once to drive down the marginal cost of more students, seemed to reduce the magic of offering courses without charge. We make no claim that this is a complete portrayal of the MOOC phenomenon as of May 2013 – it certainly was not complete – and leave it to readers to assess how helpful that particular perspective may be in diminishing the image of MOOCs as a totally new (and alien?) invention.

There are still open questions about ‘significant difference’: as someone who had institutional oversight for online learning, I was already familiar with the body of research demonstrating that online learners could do at least as well as their counterparts in traditional courses. There were some caveats of course, and David summarized very clearly what the research said about what kinds of students might be best able to take advantage of online learning.  While data at the course level was available, we did not find similar evidence about ‘no significant difference’ at the program level…or for deeper learning outcomes that do not lend themselves to large scale objective measures that can be replicated across different student cohorts.

Of course, a big part of that lack of data derives from the difficulty of measuring such outcomes within a program – let alone across different ways to offer and support programs over time. It happened that as we were wrestling with these issues I was also working on program-level outcomes with several professional schools at a university on Canada’s west coast. Many of the outcomes we were trying to define and assess, such as ‘professional formation’ and ‘epistemic fluency’ (a term I learned about from Peter Goodyear of the University of Sydney), struck me as requiring interactions quite different from any of the course examples where ‘no significant difference’ results had appeared. Steve Gilbert of the TLT Group shared a label for this: The Learning That Matters Most. I describe below how this evolved into The Learning That Scales Least (at least as far as our current evidence shows).

The concluding Call to Collective Action was about scalability much more than technology: my thanks go out to the members of the Ontario Universities Council on E-Learning for getting on to me about this when I presented the report’s conclusions to them earlier this month. They convinced me to soften the focus on online learning in the concluding recommendations, in favour of a more level playing field around the evidence – at this moment and subject to change – about scaling up teaching and learning environments. If I were writing that part of the report now, my recommendation for the collaborative action required across the colleges and universities in the province would be something like this (but more concise, clearer, etc.):

We need to work together on understanding and leveraging emerging developments to scale up learning experiences ‒ wherever appropriate ‒ so that the resulting gains in productivity will allow us to sustain and advance the student outcomes requiring other kinds of learning experiences (that are not readily scaled).

The “collective” part of this conclusion came out of our difficulty in making sense of the emerging developments that we had highlighted: to paraphrase race car driver Mario Andretti, we concluded that “if you think you know what is happening, you’re not going deep enough”. We didn’t put anything that folksy in the report, but we did conclude that we couldn’t get much further on definitive statements about “where are MOOCs going” and the like. If we couldn’t make sense of emerging directions, perhaps we could “make sense of making sense”: outline a course of action in which institutions could collaborate to share the risks of the necessary investment in systematic experimentation. (There is more on this Call to Action in tomorrow’s guest post on Themes We Couldn’t Include…).

Thanks, Tom!


A review of the HEQCO report on productivity and quality in online learning in higher education


The view from HEQCO, Toronto


Carey, T., & Trick, D. (2013). How Online Learning Affects Productivity, Cost and Quality in Higher Education: An Environmental Scan and Review of the Literature. Toronto: Higher Education Quality Council of Ontario

Why this paper is important

In July, the Higher Education Quality Council of Ontario published the above report. This is a very important development for online learning in post-secondary education as it takes a very hard look at quality, cost and productivity and comes forward with recommendations to government. This is a paper that is likely to be read (and should be read) by legislators, state and government policy makers, university and college boards and senior university and college administrators.

I am also exploring, through a series of blog posts, the issue of productivity and online learning, partly out of dissatisfaction with the current state of thinking about this issue, which became apparent while working on this project.

For this reason, I am setting aside my hat as an Advisory Board member who commented on the penultimate draft, and am here providing a full analytic review of the paper. To do this, I have had to reproduce key parts of the document, but I strongly recommend that the HEQCO document be read in full. Quotes from the actual paper are in italics, although I have edited and abbreviated in part.

The paper focuses on the following questions:

  • What are the cost implications of a shift to online learning? Specifically, does a greater use of online instruction save institutions or systems money and, if so, under what circumstances?
  • What do we know about the relationship between online learning and important variables that are often considered when discussing the “quality” of an institution or of a system?

Main findings

  • The evidence reviewed suggests that, for a range of students and learning outcomes, fully online instruction produces learning that is on par with face-to-face instruction.
  • the students most likely to benefit are those who are academically well prepared and highly motivated to learn independently. Students who are not well prepared to learn at the postsecondary level or do not devote the necessary time to learning are less likely to benefit from online learning and may in fact do better in a face-to-face setting.
  • the provincial government… should have an interest in making sure [well-prepared and motivated students] have online learning opportunities available to them. These opportunities should serve students’ learning needs, and – if carried out at large scale – should produce cost efficiencies for higher education institutions, the student or both.
  • there is no evidence that all of the learning outcomes expected of postsecondary students in Ontario can be achieved solely by online learning. 

Main recommendations to the Ontario provincial government and Ontario universities and colleges

  • set a target that, within three years, a specified list of high-demand university and college programs that are primarily or entirely online will be available to Ontario students.
  • set a target that, within three years, a specified list of high-demand courses will be available online and will be accepted for credit at all Ontario universities and colleges that offer a program in that discipline.
  • a set of high-quality degree programs that qualify the student for admission to any Ontario graduate school, and a set of high-quality courses that are accepted for credit by every Ontario institution, will be preferable to a multiplicity of courses and programs that operate on a small scale.
  • By working with other institutions in Ontario and elsewhere, Ontario colleges and universities can leverage and help shape emerging developments in online learning.
  • Coordination will be required to ensure that economies of scale are achieved in an environment of rapid technological change. 
  • Ontario colleges and universities should be encouraged to work with peer institutions to ensure that engagement with advances in online learning fully supports the province’s strategic goals for quality and access in a time of constrained funding. 
  • An effective government strategy will begin by adapting existing regulatory infrastructure to remove unnecessary barriers to high-quality online education. 
  • Hybrid courses that blend online learning with face-to-face instruction should also be encouraged where they improve learning outcomes. Hybrid courses fit well within the government’s existing regulatory structure and so present fewer policy challenges. 


We have looked especially for meta-analyses which compare traditional versus online education at a system, course or activity level. We have made only secondary use of studies and reports from individual instances or instructors where institutionalization and sustained use have not been addressed.


There is remarkably little empirical literature that documents the costs of online education relative to face-to-face education. So very little evidence on costs is available in this report

The authors do, however, provide an extensive list of barriers to cost reduction.

The authors conclude this section as follows:

To the extent that online education reduces costs, there is no consensus about who should or would benefit from the reduction. Students seek lower tuition fees; governments seek reduced subsidies for higher education; university employees seek better compensation. This situation presents a principal-agent problem: it is difficult to motivate change when those affected by change will not receive the contemplated financial benefit.

Emerging developments

The following emerging developments are discussed:

  • Affordable and open textbooks
  • Adaptive interactions with learning resources
  • Optimizing student-instructor interaction time
  • Targeting instructional effort based on student program data
  • Minimizing marginal costs via Massive Open Online Courses

The authors also identify several common themes across the individual developments:

  • Aligning Support to the Student’s Individual Learning Needs
  • “Thinking Globally, Acting Locally” to Achieve Benefits at Scale
  • Transparency and Knowledge Intensity in Instructional Design
  • Reputational Capital From and For Online Learning
  • The Challenge of Investment at/for Scale

Observations (for recommendations, see above)

Fully online education presents opportunities for major economies of scale. By definition, these economies can only be achieved if a large scale is reached.

Fully online education has the potential to provide a high-quality education – for some students, in some fields of study – at significantly lower unit costs than traditional forms of instruction. The cost savings have the potential to help fund the cost of improving traditional learning, including the costs of introducing hybrid models that lead to better learning outcomes. The challenge is to make it happen.

What is striking to us about these viewpoints is the agreement that what is least likely to be done effectively at scale and with technological mediation is precisely what matters most in higher education.

My critique

Overall, this is an excellent report that will be valuable to policy makers, if they read it in full. The danger is that they will jump to the recommendations, which are not really the strength of this report. Its value lies in exploring assumptions and beliefs about online learning and productivity and providing data and evidence that sometimes supports such beliefs, and other times challenges them. The section on emerging developments is particularly strong, especially the analysis of common themes across the individual developments.

Comparative quality of online learning

Although the authors focused their literature review on ‘meta-analyses of rigorous experimental studies’, the result is a master lesson in why such studies are usually a waste of time, particularly with regard to ‘quality’, defined in this report as whether online learning achieves learning outcomes equal to face-to-face teaching. Studies comparing different media and technologies for delivering education date back to the early 1970s, and the results are consistent: mode of delivery matters less than method of teaching and multiple other factors. In statistical terms, the variance within experimental groups is larger than the variance between experimental groups. In plain language, the pedagogy matters, a point recognized by the authors later in the document when they acknowledge the importance of instructional design.

This is one reason why I am cautious about the research on ‘non-traditional’ students that suggests that online learning works less well for them. While I do not disagree with this in general, online learning can work well for some students in this group when it is designed to meet their specific needs. The problem is that the Jaggars and Di Xu research quoted to support this conclusion in the HEQCO report is based on data from U.S. community colleges, many of which have a very poor record of using instructional design and best practices in online learning. You have to look at the quality of the teaching (in both modes), not just the delivery method. The HEQCO authors also correctly note that while many of the Jaggars/Di Xu findings point to performance differences between online and face-to-face learning that are statistically significant, the differences are fairly small.


Costs of online learning

This is by far the most disappointing part of the study. The report draws on only two actual studies of the costs of online learning (both from the USA), neither of which is very helpful.

For reasons of time pressure and consistency, the authors decided to limit their research review to studies published in the last five years. As a result, studies such as my own on the cost of the University of British Columbia’s fully online Master in Educational Technology (originally published in 2003) are not included, even though that study provides a comprehensive analysis of the costs and, more importantly, the cost structures of a program that is still running on much the same cost basis as in 2003. The program has been remarkably successful, with the following features:

  • fully cost-recoverable (including overheads and planning) from tuition fees alone
  • tuition fees the same as for on-campus graduate programs (fee level regulated by government)
  • over 300 students in the program each year with over 900 course enrolments
  • courses can be taken and paid for individually
  • 70-80 admissions a year and 70-80 graduates a year, giving a degree completion rate (for those enrolling in the full degree program) of over 90%.

This program alone has more than doubled the number of graduate students in the whole of the Faculty of Education, and UBC has adopted the cost model for a number of its other professionally based master’s programs, such as rehab science and creative writing. Not to include the study because it was done 10 years ago is almost perverse, since it shows that for certain kinds of courses, and certain kinds of students, online learning can be far more productive than face-to-face teaching. It is doubly perverse because real productivity gains only become apparent over time: a five-year window is often too small to see the full benefits.
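The completion-rate claim above can be sanity-checked with some quick arithmetic. This is my own rough sketch, not the original study's method, and it assumes the program is in a steady state where annual graduations can be compared directly with annual admissions:

```python
# Rough steady-state check of the completion rate quoted above.
# Assumption (mine, not the study's): intake and graduation are in
# steady state, so completion rate is roughly graduates / admissions.
admissions = (70, 80)   # admissions per year, range quoted in the post
graduates = (70, 80)    # graduates per year, range quoted in the post

low = graduates[0] / admissions[1]   # fewest graduates vs. most admissions
midpoint = 75 / 75                   # midpoint of both ranges

print(f"completion rate: {low:.1%} (low) to {midpoint:.0%} (midpoint)")
# -> completion rate: 87.5% (low) to 100% (midpoint)
```

Even the pessimistic pairing of the two ranges gives a rate near 90%, so the ‘over 90%’ figure is consistent with matched annual cohorts.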

Emerging developments

For me, this was by far the strongest part of the paper, particularly the analysis of common themes across the developments. The paper is worth reading for this section alone.


Recommendations

Although I would support all the recommendations, they are very cautious. That is partly unavoidable: given the weakness or lack of research into online learning, costs and productivity, cautious recommendations were probably inevitable.

However, since HEQCO itself is a government-funded policy research organization, perhaps an obvious recommendation would have been for more research on the costs of online learning, given the paucity of studies. Another area for research would be the institutional barriers and government policies that prevent greater scalability or adoption of online learning in Ontario universities.

It is still shocking to me that Ontario has such a poor system of credit transfer, even between universities, that it is almost impossible to set up consortium programs or to enable students to select combinations of courses and programs from different universities, given that a main advantage of online learning is that students could take courses from any university in Ontario. Maybe government regulation is necessary in this area, since the universities and colleges were given (I believe) $65 million over a year ago to solve this problem and have not yet done so.

None of the recommendations really addresses the issue of scale. I’m not sure I agree with the statement on p. 43: ‘What is striking to us about these viewpoints is the agreement that what is least likely to be done effectively at scale and with technological mediation is precisely what matters most in higher education, i.e. modelling, coaching, enabling students to construct knowledge, etc.’ Certainly, most MOOCs don’t do this, but that doesn’t necessarily mean that, with a focused effort on instructional design, we could not design more cost-effective, high-quality learning experiences through online programs on a larger scale than at present, though not necessarily at a massive level. This would combine lower cost per student with higher-quality learning: the Nirvana of educational productivity.

Thus I would like to have seen a recommendation to government and the institutions to put in the same level of investment as for MOOCs, but to develop a model that combines best practices in online learning with new technologies such as social media, to build partly self-supporting student learning communities on a larger scale than current campus-based programs, with high-quality learning outcomes and completion rates. I think it could be done, but it needs substantial investment beyond the risk level of most individual universities, which is why government should be a partner.


In conclusion

Despite my criticisms, this is an excellent report on a difficult topic, completed within a tight timeframe. It provides grist for productive discussions on costs and quality and really advances our understanding of the challenges of increasing productivity without losing quality in higher education.


Reference

Bates, A. and Sangra, A. (2013) Managing Technology in Higher Education, San Francisco: Jossey-Bass (especially Chapter 7: Resources, Money and Decision-Making).