August 27, 2014

Developing intellectual and practical skills in a digital age


The story so far

Chapter 5 of my open textbook, ‘Teaching in a Digital Age’, which I am currently writing and publishing as I go, is about the design of teaching and learning.

I started Chapter 5 by suggesting that instructors should think about design through the lens of constructing a comprehensive learning environment in which teaching and learning will take place. I have started to work through the various components of a learning environment, focusing particularly on how the digital age affects the way we need to look at some of these components.

I started by looking at how the characteristics of our learners are changing, and followed that by examining how our perspectives on content are being influenced by the digital age. In this post, I look at how both intellectual and practical skills can be developed to meet the needs of a digital age. The following posts will do the same for learner support, resources and assessment respectively.

This will then lead to a discussion of different models for designing teaching and learning. These models aim to provide a structure for and integration of these various components of a learning environment.

Scenario: Developing historical thinking

© Wenxue City: China During the Early Days of the Reform

Ralph Goodyear is a professor of history in a public Tier 1 research university in the central United States. He has a class of 120 undergraduate students taking HIST 305, ‘Historiography’.

For the first three weeks of the course, Goodyear had recorded a series of short, 15-minute video lectures that covered the following topics:

  • the various sources used by historians (e.g. earlier writings, empirical records including registries of birth, marriage and death, eyewitness accounts, artifacts such as paintings and photographs, and physical evidence such as ruins)
  • the themes around which historical analyses tend to be written
  • some of the techniques used by historians, such as narrative, analysis and interpretation
  • three different positions or theories about history (objectivist, Marxist, post-modernist).

Students downloaded the videos according to a schedule suggested by Goodyear. Students attended two one-hour classes a week, where specific topics covered in the videos were discussed. Students also had an online discussion forum in the course space on the university’s learning management system, where Goodyear had posted similar topics for discussion. Students were expected to make at least one substantive contribution to each online topic, which counted towards their final grade.

Students also had to read a major textbook on historiography over this three week period.

In the fourth week, Goodyear divided the class into twelve groups of six, and asked each group to research the history of any city outside the United States over the last 50 years or so. They could use whatever sources they could find, including online sources such as newspaper reports, images, research publications, and so on, as well as the university’s own library collection. In writing their report, they had to do the following:

  • pick a particular theme that covered the 50 years and write a narrative based around the theme
  • identify the sources they finally used in their report, and discuss why they selected some sources and dismissed others
  • compare their approach to the three positions covered in the lectures
  • post their report in the form of an online e-portfolio in the course space on the university’s learning management system

They had five weeks to do this.

The last three weeks of the course were devoted to presentations by each of the groups, with comments, discussion and questions, both in class and online (the in-class presentations were recorded and made available online). At the end of the course, students assigned grades to each of the other groups’ work. Goodyear took these student gradings into consideration, but reserved the right to adjust the grades, with an explanation of why he made any adjustment. Goodyear also gave each student an individual grade, based on both their group’s grade and their personal contribution to the online and class discussions.

Goodyear commented that he was surprised and delighted at the quality of the students’ work. He said: ‘What I liked was that the students weren’t learning about history; they were doing it.’

Based on an actual case, but with some embellishments.

Skills in a digital age

In Chapter 1, Section 1.4, I listed some of the skills that graduates need in a digital age, and argued that developing such skills needs a greater focus at all levels of education, but particularly at the post-secondary level, where the emphasis is often on specialised content. Although skills such as critical thinking, problem solving and creative thinking have always been valued in higher education, the identification and development of such skills is often implicit and almost accidental, as if students will somehow pick them up by observing faculty demonstrating those skills, or through some form of osmosis resulting from the study of content. I also pointed out in the same section, though, that while there is substantial research on skills development, the knowledge deriving from such research is at best applied haphazardly, if at all, to the development of intellectual skills.

Furthermore, the skills required in a digital age are broader and more wide-ranging than the abstract academic skills traditionally developed in higher education. For instance, they need to be grounded just as much in digital communications media as in traditional writing or lecturing, and include the development of digital competence and expertise within a subject domain, as well as skills such as independent learning and knowledge management. These are not so much new skills as a different emphasis, focus or direction.

It is somewhat artificial to separate content from skills, because content is the fuel that drives the development of intellectual skills. At the same time, in traditionally vocational training we see the reverse trend in a digital age, with much more focus on developing high-level conceptual thinking alongside manual skills. My aim here is not to downplay the importance of content, but to ensure that skills development receives as much focus and attention from instructors as content does, and that we approach intellectual skills development in the same rigorous and explicit way as apprentices are trained in manual skills.

Setting goals for skills development

Thus a critical step is to be explicit about what skills a particular course or program is trying to develop, and to define these goals in such a way that they can be implemented and assessed. In other words, it is not enough to say that a course aims to develop critical thinking; the course needs to state clearly, in ways that students can understand, what critical thinking would look like in the context of the particular course or content area. In particular, the ‘skills’ goals should be capable of assessment, and students should be aware of the criteria or rubrics that will be used for assessment.

Thinking activities

A skill is not binary, in the sense that you either have it or you don’t. There is a tendency to talk about skills and competencies in terms of novice, intermediate, expert, and master, but in reality skills require constant practice and application and there is, at least with regard to intellectual skills, no final destination. So it is critically important when designing a course or program to design activities that require students to develop, practice and apply thinking skills on a continuous basis, preferably in a way that starts with small steps and leads eventually to larger ones. There are many ways in which this can be done, such as written assignments, project work, and focused discussion, but these thinking activities need to be thought about, planned and implemented on a consistent basis by the instructor.

Practical activities

It is a given in vocational programs that students need lots of practical activities to develop their manual skills. This, though, is equally true for intellectual skills. Students need to be able to demonstrate where they are along the road to mastery, get feedback on it, and then retry as a result. This means doing work that enables them to practice specific skills.

In the scenario above, students had to cover and understand the essential content in the first three weeks, do research in a group, develop an agreed project report in the form of an e-portfolio, share it with other students and the instructor for comments, feedback and assessment, and present their report orally and online. Ideally, they will have the opportunity to carry over many of these skills into other courses, where the skills can be further refined and developed. Thus skills development needs a longer-term horizon than a single course, so integrated planning at the program level as well as at the course level is important.

Discussion as a tool for developing intellectual skills

Discussion is a very important tool for developing thinking skills. However, not any kind of discussion. It was argued in Chapter 2 that academic knowledge requires a different kind of thinking from everyday thinking. It usually requires students to see the world differently, in terms of underlying principles, abstractions and ideas. Thus discussion needs to be carefully managed by the instructor, so that it focuses on the development of the thinking skills that are integral to the area of study. This requires the instructor to plan, structure and support discussion within the class, keeping the discussions in focus, providing opportunities to demonstrate how experts in the field approach topics under discussion, and comparing students’ efforts with those of experts.

Figure 5.3: Online threaded discussion forums provide students with opportunities for developing intellectual skills, but the instructor needs to design and manage such forums carefully for this to happen

In conclusion

There are many opportunities in even the most academic courses to develop intellectual and practical skills that will carry over into work and life activities in a digital age, without corrupting the values or standards of academia. Even in vocational courses, students need opportunities to practice intellectual or conceptual skills such as problem-solving, communication skills, and collaborative learning. However, this won’t happen merely through the delivery of content. Instructors need to:

  • think carefully about exactly what skills their students need,
  • consider how these skills fit with the nature of the subject matter,
  • design the kinds of activities that will allow students to develop and improve their intellectual skills, and
  • decide how to give feedback on and assess those skills, within the time and resources available.

This is a very brief discussion of how and why skills development should be an integral part of any learning environment. We will be discussing skills and skill development in more depth in later chapters.

Over to you

Your views, comments and criticisms are always welcome. In particular:

  • how does the history scenario work for you? Does it demonstrate adequately the points I’m making about skills development?
  • are the skills being developed by students in the history scenario relevant to a digital age?
  • is this post likely to change the way you think about teaching your subject, or do you already cover skills development adequately? If you feel you do cover skills development well, does your approach differ from mine?

Love to hear from you.

Next up

Learner support in a digital age

 

WCET’s analysis of U.S. statistics on distance education


U.S. Department of Education (2014) Web Tables: Enrollment in Distance Education Courses, by State: Fall 2012 Washington, DC: U.S. Department of Education, National Center for Education Statistics

Hill, P. and Poulin, R. (2014) A response to new NCES report on distance education, e-Literate, June 11

The U.S. Department of Education’s Institute of Education Sciences operates a National Center for Education Statistics which in turn runs the Integrated Postsecondary Education Data System (IPEDS). IPEDS is:

a system of interrelated surveys conducted annually by the U.S. Department’s National Center for Education Statistics (NCES). IPEDS gathers information from every college, university, and technical and vocational institution that participates in the federal student financial aid programs. The Higher Education Act of 1965, as amended, requires that institutions that participate in federal student aid programs report data on enrollments, program completions, graduation rates, faculty and staff, finances, institutional prices, and student financial aid. These data are made available to students and parents through the College Navigator college search Web site and to researchers and others through the IPEDS Data Center

Recently IPEDS released “Web Tables” containing results from their Fall Enrollment 2012 survey. This was the first survey in over a decade to include institutional enrollment counts for distance education students. In the article above, Phil Hill of e-Literate and Russell Poulin of WCET have co-written a short analysis of the Web Tables released by IPEDS.

The Hill and Poulin analysis

The main points they make are as follows:

  • overall the publication of the web tables in the form of a pdf is most welcome, in particular by providing a breakdown of IPEDS data by different variables such as state jurisdiction, control of institution, sector and student level
  • according to the IPEDS report there were just over 5.4 million students enrolled in distance education courses in the fall semester 2012 (NOTE: this number refers to students, NOT course enrollments).
  • roughly a quarter of all post-secondary students in the USA are enrolled in a distance education course.
  • the bulk of students in the USA taking distance education courses are in publicly funded institutions (85% of those taking at least some DE courses), although about one third of those taking all their classes at a distance are in private, for-profit institutions (e.g. University of Phoenix)
  • these figures do NOT include MOOC enrollments
  • as previously identified by Phil Hill in e-Literate, there is a major discrepancy in the number of students taking at least one online course between the IPEDS study and the regular annual surveys conducted by Allen and Seaman at Babson College – 7.1 million for Babson and 5.5 million for IPEDS. Jeff Seaman, one of the two Babson authors, is also quoted in e-Literate on his interpretation of the differences. Hill and Poulin comment that the NCES report would have done well to at least refer to these significant differences.
  • Hill and Poulin claim that there has been confusion over which students get counted in IPEDS reporting and which do not. They suspect that there is undercounting in the hundreds of thousands, independent of distance education status.

Comment

There are lies, damned lies and statistics. Nevertheless, although the IPEDS data may not be perfect, it does a pretty good job of collecting data on distance education students across the whole of the USA. However, it does not distinguish between modes of delivery of distance education (are there still mainly print-based courses around?).

So we now have two totally independent analyses of distance education students in the USA, with a minimum of 5.5 million and a maximum of 7.1 million, i.e. between roughly a quarter and a third of all post-secondary students. From the Allen and Seaman longitudinal studies, we can also reasonably safely assume that online enrollments have been increasing by 10-20% per annum over the last 10 years, compared with overall enrollment growth of 2-5% per annum.

By contrast, in Canada we have no national data on either online or distance education students. It’s hard to see how Canadian governments or institutions can take evidence-based policy decisions about online or distance education without such basic information.

Lastly, thank you, Phil and Russ, for a very helpful analysis of the IPEDS report.

Update

For a more detailed analysis, see also:

Haynie, D. (2014) New Government Data Sheds Light on Online Learners US News, June 13

 

Opening up: chapter one of Teaching in a Digital Age


The view when I was writing Chapter 1, from the Island of Brač, Croatia

I’ve not been blogging much recently, because (a) I’ve been on holiday for a month in the Mediterranean and (b) I’ve been writing my book.

Teaching in a Digital Age

As you are probably aware, I’m doing this as an open textbook, which means learning to adapt to a new publishing environment. As well as writing a darned good book for instructors on teaching in a digital age, my aim is to push the boundaries a little with open publishing, to move it out of the traditional publishing mode into a truly open textbook, with the help of the good folks at BCcampus who are running their open textbook project.

You will see that there’s still a long way to go before we can really exploit all the virtues of openness in publishing, and I’m hoping you can help me – and BCcampus – along the way with this.

What I’d like you to do

What I’m hoping you will do is find the time to browse the content list and preface (which is not yet finalized) and read more carefully Chapter 1, Fundamental Change in Higher Education, then give me some feedback. To do this, just go to: http://opentextbc.ca/teachinginadigitalage/

The first thing you will realise is that there is nowhere to comment on the published version. (Ideally I would like to have a comment section after every section of each chapter.) I will be publishing another post about some of the technical features I feel are still needed within PressBooks, but in the meantime, please use the comment page on this post (in which case your comment will be public), or use the e-mail facility at the bottom of the chapter or preface (in which case your comment will be private). Send to tony.bates@ubc.ca.

What kind of feedback?

At this stage, I’m looking more for comments on the substance of the book, rather than the openness (my next post will deal with the technical issues). To help you with feedback, here are some of the questions I’m looking for answers to:

  1. Market: from what you’ve read so far, does there appear to be a need for this type of book? Are there other books that already do what I’m trying to do?
  2. Structure: does Chapter 1 have the right structure? Does it flow and is it logically organized? How could it be improved?
  3. Content: is there anything missing, dubious or just plain wrong? References that I have missed that support (or challenge) the content would also be useful.
  4. Do the activities work for you? Are there more interesting activities you can think of? How best to provide feedback? (e.g. does the use of ‘Parts’ work for this?)
  5. Presentation: are there other media/better images I could use? Is the balance between text and media right?

What’s in it for you?

First, I hope the content will be useful. Chapter 1 is probably the least useful of all the chapters to come for readers of this blog, because it’s aimed at instructors who are not comfortable with using technology, but if the material is useful to you, you are free to use it in whatever way you wish, within the constraints of a Creative Commons license.

Second, the whole point of open education is to share and collaborate. I’m opening up my book and the process; in return can I get some help and advice? In anticipation and with a degree of nervousness I look forward to your comments.

A balanced research report on the hopes and realities of MOOCs


Hollands, F. and Tirthali, D. (2014) MOOCs: Expectations and Reality New York: Columbia University, Teachers College, Center for Benefit-Cost Studies of Education, 211 pp

We are now beginning to see a number of new research publications on MOOCs. The journal Distance Education will be publishing a series of research articles on MOOCs in June, but now Hollands and Tirthali have produced a comprehensive research analysis of MOOCs.

What the study is about

We have been watching for evidence that MOOCs are cost-effective in producing desirable educational outcomes compared to face-to-face experiences or other online interventions. While the MOOC phenomenon is not mature enough to afford conclusions on the question of long-term cost-effectiveness, this study serves as an exploration of the goals of institutions creating or adopting MOOCs and how these institutions define effectiveness of their MOOC initiatives. We assess the current evidence regarding whether and how these goals are being achieved and at what cost, and we review expectations regarding the role of MOOCs in education over the next five years. 

The authors used interviews with over 80 individuals covering 62 institutions ‘active in the MOOCspace’, cost analysis, and analysis of other research on MOOCs to support their findings. They identified six goals among the 29 institutions in the study that offered MOOCs, with the following analysis of success or otherwise in accomplishing each goal:

1. Extending reach (65% of the 29 institutions)

Data from MOOC platforms indicate that MOOCs are providing educational opportunities to millions of individuals across the world. However, most MOOC participants are already well-educated and employed, and only a small fraction of them fully engages with the courses. Overall, the evidence suggests that MOOCs are currently falling far short of “democratizing” education and may, for now, be doing more to increase gaps in access to education than to diminish them. 

2. Building and maintaining brand (41%)

While many institutions have received significant media attention as a result of their MOOC activities, isolating and measuring impact of any new initiative on brand is a difficult exercise. Most institutions are only just beginning to think about how to capture and quantify branding-related benefits.

3. Reducing costs or increasing revenues (38%)

….revenue streams for MOOCs are slowly materializing but we do not expect the costs of MOOC production to fall significantly given the highly labor-intensive nature of the process. While these costs may be amortized across multiple uses and multiple years, they will still be additive costs to the institutions creating MOOCs. Free, non-credit bearing MOOCs are likely to remain available only from the wealthiest institutions that can subsidize the costs from other sources of funds. For most institutions, ongoing participation in the current MOOC experimentation will be unaffordable unless they can offer credentials of economic value to attract fee-paying participants, or can use MOOCs to replace traditional offerings more efficiently, most likely by reducing expensive personnel. 

4. Improving educational outcomes (38%)

for the most part, actual impact on educational outcomes has not been documented in any rigorous fashion. Consequently, in most cases, it is unclear whether the goal of improving educational outcomes has been achieved. However, there were two exceptions, providing evidence of improvement in student performance as a result of adopting MOOC strategies in on-campus courses.

5. Innovation in teaching and learning (38%)

It is abundantly clear that MOOCs have prompted many institutions and faculty members to engage in new educational activities. The strategies employed online such as frequent assessments and short lectures interspersed with questions are being taken back on-campus. It is less clear what has been gained by these new initiatives because the value of innovation is hard to measure unless it can be tied to a further, more tangible objective. We …. conclude that most institutions are not yet making any rigorous attempt to assess whether MOOCs are more or less effective than other strategies to achieve these goals. 

6. Research on teaching and learning (28%)

A great deal of effort is being expended on trying to improve participant engagement and completion of MOOCs and less effort on determining whether participants actually gain skills or knowledge from the courses ….While the potential for MOOCs to contribute significantly to the development of personalized and adaptive learning is high, the reality is far from being achieved. 

Cost analysis

The report investigates the costs of developing MOOCs compared with those for credit-based online courses, but finds wide variations and a lack of reliable data.

Conclusions from the report

The authors came to the following conclusions:

1. there is no doubt that online and hybrid learning is here to stay and that MOOCs have catalyzed a shift in stance by some of the most strongly branded institutions in the United States and abroad.

2. MOOCs could potentially affect higher education in more revolutionary ways by:

  • offering participants credentials of economic value

  • catalyzing the development of true adaptive learning experiences

However, either of these developments faces substantial barriers and will require major changes in the status quo.

My comments on the report

First, this is an excellent, comprehensive and thoughtful analysis of the expectations and realities of MOOCs. It is balanced, but where necessary critical of the unjustified claims often made about MOOCs. This report should be required reading for anyone contemplating offering MOOCs.

Different people will take away different conclusions from this report, as one would expect from a balanced study. From my perspective, though, it has done little to change my views about MOOCs. MOOC providers to date have made little effort to identify the actual learning that takes place. It seems to be enough for many MOOC proponents to just offer a course, on the assumption that if people participate they will learn.

Nevertheless, MOOCs are evolving. Some of the best practices that have been used in credit-based online courses are now being gradually adopted as more MOOC players enter the market with experience of credit-based online learning. MOOCs will eventually occupy a small but important niche as an alternative form of non-formal, continuing and open education. They have proved valuable in making online learning more acceptable within traditional institutions that have resisted online learning previously. But no-one should fear them as a threat to credit-based education, either campus-based or online.

Tracking online learning in the USA – and Ontario


Allen, I. and Seaman, J. (2014) Grade Change: Tracking Online Learning in the United States Wellesley, MA: Babson College/Sloan Foundation

This is the eleventh annual report in this invaluable series tracking online education in the United States of America. It is invaluable because, through the continuing support of the Sloan Foundation, the Babson College annual survey has applied a consistent methodology that allows the growth and development of online learning in the USA to be tracked over more than a decade.

There is nothing comparable in Canada, but nevertheless I will use this post to try to draw some comparisons between the development of online learning in the USA and at least the largest system in Canada, that of Ontario, which does have at least some data. Also, Ontario has just established Ontario Online, a system-wide initiative aimed at strengthening Ontario’s online learning activities. The Sloan/Babson surveys have important lessons for Ontario’s new initiative.

Methodology

The survey is sent to the Chief Academic Officer (CAO) of every higher education institution in the USA (private and public, universities and two-year colleges), over 4,600 in all. Over 2,800 responses were received, from institutions that accounted for just over 80% of all higher education enrollments in the USA (most non-responses came from small institutions, i.e. those with 1,500 students or fewer, which as a sector were far less likely to have online courses).

An online course is defined in this report as one in which at least 80 percent of the course content is delivered online as a normal part of an institution’s program. MOOCs are therefore considered a completely different category from the ‘normal’ credit-based online courses in this report.

What is the report about?

The scope of the report can best be described from the questions the report seeks to answer:

  • What is Online Learning, what is a MOOC?
  • Is Online Learning Strategic?
  • Are Learning Outcomes in Online Comparable to Face-to-Face?
  • Schools Without Online Offerings
  • How Many Students are Learning Online?
  • Do Students Require More Discipline to Complete Online Courses?
  • Is Retention of Students Harder in Online Courses?
  • What is the Future of Online Learning?
  • Who offers MOOCs?
  • Objectives for MOOCs
  • Role of MOOCs

Main findings

This relatively short report (40 pages, including tables) is so stuffed with data that it is somewhat invidious to pick and choose results. Because it is short and simply written, you are strongly recommended to read it yourself in full. However, here are the main points I take away:

Growth of credit-based online learning continues but is slowing

Sounds a bit like an economic report on China, doesn’t it? Allen and Seaman claim that a total of 7.1 million students are now taking at least one online course, or roughly 34% of all enrollments. (Note: ‘% taking at least one course’ is not the same as ‘% of all course enrollments’, which would be a better measure.) Online learning enrollments were up 6.5% in 2013, a slowing of the rate of growth, which had been in the 10-15% range per annum in recent years. Nevertheless, online enrollments are still growing five times faster than enrollments in general in the USA, and most CAOs anticipate that this growth in online learning enrollments will continue into the future.

MOOCs are still a very small component of online learning

The number of institutions offering MOOCs rose from 2.6% in 2012 to 5% in 2013. The majority of institutions offering MOOCs are doctoral/research institutions, and a high proportion are in the private, not-for-profit sector. This sector has historically been less involved in credit-based online learning.

Less than a quarter of CAOs believe that MOOCs represent a sustainable method for offering online courses, down from 28 percent in 2012, and a majority of academic leaders (64%) have concerns that credentials for MOOC completion will cause confusion about higher education degrees.

Sector differences

The report identifies some clear differences between the different sectors in the USA’s very diverse post-secondary education system. Small institutions (fewer than 1,500 students) and doctoral/research institutions are far less likely to offer online courses. CAOs from institutions not offering online learning tend to be more critical of the quality of online learning and far less likely to think it essential to their future.

Of the CAOs from institutions offering online courses, nearly one-quarter believe online outcomes to be superior, slightly under 20 percent think them inferior, and the remainder (57%) report that the learning outcomes are the same as for classroom delivery.

What about Canada – and Ontario in particular?

I have long lamented that we have no comparable data on online learning in Canada. The government of Ontario did do a census of all its universities and colleges in 2010 and found just under 500,000 online course registrations, or 11% of all university and college enrollments, with online enrollments in universities (13%) higher than in two-year colleges (7%). If we extrapolate from the USA figures (highly dubious, I know), which showed a 16% increase in online enrollments between fall 2010 and fall 2012, this would put Ontario’s online enrollments in 2012 at approximately 563,000.
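The arithmetic behind this extrapolation is simple; here is a minimal sketch, assuming the ‘just under 500,000’ 2010 baseline is roughly 485,000 registrations (that exact figure is my assumption, chosen only to reproduce the approximate 563,000 quoted above):

```python
# Illustrative sketch of the extrapolation described above.
# Assumption: "just under 500,000" registrations in 2010 is taken as ~485,000.

ontario_2010_registrations = 485_000   # assumed 2010 baseline from the Ontario census
us_growth_2010_to_2012 = 0.16          # 16% growth in US online enrollments, fall 2010 to fall 2012

ontario_2012_estimate = ontario_2010_registrations * (1 + us_growth_2010_to_2012)
print(f"Estimated Ontario online registrations, 2012: {ontario_2012_estimate:,.0f}")  # ~562,600
```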

More significantly, the Ontario government survey provided hard data on course completion rates:

  • the median in the college sector for the 20 colleges that responded to the question was 76.1% with most institutions reporting results between 70% and 79%.
  • the median in the university sector for the 15 universities that responded was 89% with most universities reporting results from 85% to 95%.

Contact North did a ‘cross-country check-up’ in 2012. It concluded (p.14):

Using proxy data (estimates provided by a variety of different organizations and a standard measure of full-time equivalent student set at 9.5 course registrations per FTE), we can estimate that there are between 875,000 and 950,000 registered online students in Canada (approximately 92,105 – 100,000 full-time students) at college and universities studying a purely online course at any one time.
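For clarity, here is a minimal sketch of the proxy conversion described in the quotation above (the registration range and the 9.5 course registrations per FTE measure are Contact North’s figures; the code itself is only illustrative):

```python
# Illustrative sketch: converting Contact North's estimated online course
# registrations into full-time-equivalent (FTE) students.

registrations_low, registrations_high = 875_000, 950_000  # estimated online course registrations
registrations_per_fte = 9.5                               # course registrations per FTE student

fte_low = registrations_low / registrations_per_fte
fte_high = registrations_high / registrations_per_fte
print(f"Estimated FTE online students: {fte_low:,.0f} to {fte_high:,.0f}")  # ~92,105 to 100,000
```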

The problem, though, is that these are one-off studies. While the government of Ontario is to be congratulated on doing the 2010 survey, it decided not to continue it in the following years (or, more accurately, it did not decide to repeat it). The Contact North data is at best a rough estimate, again valuable in itself, but such work needs to be done on a more systematic and regular basis across the country (Canada’s higher education system is devolved to the individual provinces and territories, with no federal responsibility or office for post-secondary education, and Statistics Canada has been cut back in recent years by the current Conservative government).

However, there is now hope. The government of Ontario has just established Ontario Online, a collaborative Centre of Excellence that will be governed and operated by the province’s colleges and universities. It has a start-up budget of $42 million. One of the first things it should do is to repeat and expand the 2010 survey, to establish a baseline for measuring the province’s progress in online learning. The expansion should also include measurement of hybrid/blended learning (preferably using the same definitions as the Babson survey, for comparative purposes). To do this accurately, institutions will need to categorize the types of courses they are offering in their course databases, if they have not already done so. Without such a baseline of data, it will be almost impossible to assess not just the success of Ontario Online, but that of online learning in general in Ontario.

I would also hope that as the country’s largest province, with probably the greatest number of online courses and enrollments, Ontario will take leadership at the national Council of Ministers of Education, Canada (CMEC) to get the survey it has developed adopted and administered by all provinces across Canada. Politicians and experts can huff and puff all they like about the importance of online learning, but if they don’t measure it, it’s all just hot air.

In summary, many thanks to Sloan and Babson College for their invaluable work. Ontario has done far more than any other province in Canada to identify the extent of online learning, and is poised to go even further through its new Ontario Online initiative. However, systematic data collection is essential for measuring the success of any online learning initiatives or strategies.