March 24, 2017

What counts when you cost online learning?

Poulin, R. and Straut, T. (2017) Distance Education Price and Cost Report Boulder CO: WCET

This highly controversial report has generated considerable discussion in WCET’s own Forum, and has received a good deal of media coverage. When you read the report you will see why.

Much of the media coverage has focused on the finding that respondents to the survey on which this report is based were by and large of the opinion that distance education costs more than classroom teaching. But you need to read the report more carefully to understand why respondents responded in this way. It all comes down to how you cost distance education or online learning. In particular, you need to understand the context of the report. 

As always, you should read the report itself, not my summary, especially if you disagree with what’s in my summary.

The context

The context for this report is very political and very American (by which I mean USAnian, i.e. applying specifically to the USA). The report is more about price – what institutions charge students – than it is about cost.

The cost of tuition (the fees students or their parents pay to the institution) continues to increase in the USA way beyond the rate of inflation, and many institutions not only charge the same tuition rates for online or distance education courses, but also add additional fees. In other words, many American institutions increase the price for an online or distance course compared to its face-to-face equivalent.

However, the political perception, especially in state legislatures, is that distance education is cheaper than on-campus teaching, so some states (e.g. Wisconsin and Florida) have introduced legislation or initiatives to reduce the price of online learning courses below that of face-to-face programs.

As the authors note:

Historically, distance education’s mission has been to overcome the barriers of place or time. The mission was not to control costs. In fact, to reach some locations is costly. Distance education should not be held accountable to a mission it was never given.

distance education professionals are caught in a higher education economics ethos that shuns open examination of price and cost…and are expected to answer to a “controlling cost” mission that was not given them in the first place.

It is within this context that WCET decided to do the study in order to challenge some of the assumptions about the price and cost of distance education.


Methodology

As always, you need to know the methodology in order to interpret the results. The report is admirably transparent about its methodology: rather than being tucked away in an appendix or omitted altogether (a practice that seems to be increasing in many so-called ‘studies’ these days), it is front and centre in the report.


The authors provide the following definitions:

  • Price – This is the amount of money that is charged to a student for instruction. The components are tuition and fees. In the questions, we will be clear as to which “price” component (tuition, fees, or total price) is being queried. 
  • Cost – This is the amount of money that is spent by the institution to create, offer, and support instruction. 
  • Distance Education – When thinking of “distance education,” we favor the Babson Survey Research Group definition of 80% or more of the course being taught at a distance.


WCET surveyed mainly its own members and members of other distance education consortia. Overall, 197 responded.

We had hoped for more participation in the survey. It is important to note that the responses provided represent only the institutional representatives who answered the survey questions. Even though we provide comparisons between the responding population and the overall higher education population, we do not assert that the results may be generalized to the universe of all institutions of higher education in the U.S. and Canada that offer distance education courses.

What can be said is that the responses came mainly from distance education and educational technology professionals, rather than faculty or senior administrators, and mostly from public HE institutions.

Main results

I will deal with these very briefly, although the detailed findings are more nuanced.

  1. The price of DE is generally higher than for face-to-face teaching. More than half (54%) of the respondents reported that their institution charged more for distance education courses than for on-campus courses.
  2. A majority of respondents believed that the cost of DE was higher than for face-to-face teaching on certain defined components (e.g. faculty development, technologies, instructional design, assessments, state authorization – a long and complex process of ‘accrediting’ DE courses unique to the USA).
  3. ‘Experts’ in the costs of DE tended to disagree that the costs of DE are necessarily higher.
  4. The experts also noted that cost discussions are often avoided by higher education leadership and that more could be done to control costs, not just in distance education.

The report’s main conclusions

The conclusions were split into recommendations for legislators and institutions:

For legislators

  • focus questions on future costs, and in particular the likely impact on the cost to students of investing in buildings vs distance education
  • provide more incentives for institutions to reduce the price to students
  • don’t be prescriptive but help institutions develop a vision for state higher education that is realistic and shared

For institutions

  • pay as much attention to the cost to students as to the cost to the institution of various delivery methods
  • be more open about costs and track them for all modes of delivery
  • recognize that changing the cost structure requires structural changes in how programs are designed and delivered; this requires leadership from the senior administration.

My comments on the report

The report is right to draw attention to the creeping costs to students (i.e. price) resulting from institutional policies in the USA. What is also apparent is that there is a large disconnect between institutions and government about the cost of distance education. Many educators believe that DE is more expensive; government thinks it should be cheaper. Somewhere in the middle is a discussion about quality: does cheaper mean worse?

Cherry-picking costs

Unfortunately, though, for methodological reasons, I fear the report has confused rather than clarified the discussion about costs and price. In particular, by focusing on components that are specific to distance education, such as faculty support, the use of technologies, and the cost of state authorization of DE, the report has clearly given the impression that most educators believe that distance education is more expensive. It can be, but it doesn’t have to be.

It is unfortunate that the report has given this impression because you cannot just look at the costs of specific components of distance education without looking also at specific components of face-to-face teaching that are not represented in the costs of distance education, in particular the very substantial ‘sunk’ costs of buildings, parking, etc. There are better ways of measuring the costs of distance education and online programs – see Chapter 7 in Bates and Sangra (2011).

Making DE cost-effective

While we can develop cost-effective fully online programs, this normally depends on generating new revenues from new students. Offering online courses as an alternative to already existing students on campus, while increasing access and student flexibility, is much more financially risky.

Again, this can be managed cost-effectively, but it depends on having enough students taking both the on-campus and online versions of the course, and on using additional adjunct professors for online courses with more than 30 students. Bringing in new students whom you would not get without the courses being online is the best bet for ensuring economic viability. ‘Diluting’ your on-campus enrolments by offering the same course online will add costs unless the numbers can justify it.

What about the costs of blended learning?

One last point. I think we are going to have a period of considerable cost turmoil as we move to blended learning, because this really does add costs unless there are dramatic redesigns, especially of the large first- and second-year classes. Carol Twigg of the National Center for Academic Transformation has for many years been able to bring down costs – or, more often, increase effectiveness for the same cost – for these large lecture classes by using blended learning designs (although there are some criticisms of her costing methodology).

By and large, though, while fully online courses can perhaps increase enrolments by 10-15% and therefore help pay their way, we will have major cost or academic time problems if we move to nearly all courses being blended without increased training for faculty, so that they can manage without the level of support from instructional designers, etc., that is normally provided for fully online courses (see ‘Are we ready for blended learning?‘).

Moving forward 

I’m glad, then, that WCET has produced a report that focuses not only on the costs of distance education to institutions but also on pricing policies. There is in my view no economic justification for charging more for an online course than for a face-to-face course as a matter of principle. You need to do the sums, and institutions are very bad at doing this in a way that tracks the costs of activities, rather than throwing everything into one bucket and then doling it out at historical rates to different departments.

Institutions need to develop more rigorous methods for tracking the costs of different modes of delivery while also building in a measure of the benefits as well. If the report at least moves institutions towards this, it will have been well worth it.

In the USA, fully online enrollments continue to grow in 2014

Image: WCET, 2015


Straut, T.T. and Poulin, R. (2015) Highlights of Distance Education Trends from IPEDS Fall 2014, WCET Frontiers, 21 December


WCET (the Western Co-operative for Educational Technology) has once again done an excellent job in analysing the U.S. Department of Education’s National Center for Educational Statistics (NCES)’  Integrated Postsecondary Education Data System (IPEDS) data that reports Distance Education (DE) student enrollment for the Fall of 2014.


Enrollments by students ‘Exclusively in Distance Education’ continued to rise in 2014. There were 2,824,334 fully online enrollments in 2014, compared to 2,659,203 in 2013 – a 6% increase in just one year. Fully online enrollments now represent just under 13% of total enrollments.

The number of students taking some fully online courses, but not an entirely online program, also increased, from 2,806,048 in 2013 to 2,926,083 in 2014 (a 4% increase). [Note: these are not students taking blended or hybrid courses, but students taking some fully online courses as well as campus-based courses.]

At the same time, overall enrollments dropped slightly (by just under 1%). Thus online learning continues to grow faster than conventional higher education. Taken together, at least 28% of all U.S. higher education students are taking at least some fully online courses.
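As a quick check, the growth percentages quoted above can be reproduced from the raw enrollment counts (this sketch is mine, not part of the WCET analysis):

```python
# Recompute the growth rates cited in the text from the raw IPEDS counts.
fully_online_2013 = 2_659_203
fully_online_2014 = 2_824_334
some_online_2013 = 2_806_048
some_online_2014 = 2_926_083

growth_fully = (fully_online_2014 - fully_online_2013) / fully_online_2013
growth_some = (some_online_2014 - some_online_2013) / some_online_2013

print(f"fully online growth:  {growth_fully:.1%}")  # ~6.2%, the '6%' in the text
print(f"partly online growth: {growth_some:.1%}")   # ~4.3%, the '4%' in the text

# Combined, these two groups give the 'at least some fully online' total:
print(f"combined students: {fully_online_2014 + some_online_2014:,}")
```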

Image: WCET, 2015


However, perhaps more interesting is where this growth occurred. The biggest increase in fully online courses came from the more prestigious private, non-profit sector (a 22% increase), while the for-profit sector (University of Phoenix, etc.) actually declined by 11%. Indeed, the for-profit sector now accounts for less than one third of all fully online enrollments.


The IPEDS data is relatively new (this is the third year of reporting). There are problems of definition (‘distance education’ and ‘fully online’ are not necessarily the same), and there appears in past years to have been inconsistent reporting across institutions.

WCET will be following up on this initial report with more detailed reports in 2016, including an analysis of the reliability of the data.


Despite the cautions, this data, based on a census of all U.S. higher education institutions, is probably the most reliable to date.

Despite the (assumed) growth in blended learning, fully online learning appears to be more than holding its own. One reason is clear. Many of the more prestigious private, non-profit institutions have room to grow in their adoption of online learning, being slower initially to move in this direction.

To what extent this growth of online learning in the private, non-profit sector is owed to the publicity from or experience with MOOCs remains to be assessed, but the growth of for-credit online learning in this sector is an indication of the increasingly broad acceptance now of fully online learning.

What is needed now is more data on – and clearer definitions of – blended learning, as it seems reasonable to assume that as on-campus programs become more flexible through blended learning, this will eventually impact fully online enrollments. But kudos to the U.S. Department of Education for setting up these surveys and to WCET for helping with the analysis. Now if only Canada… Justin?

Lies, Damned Lies and Statistics: WCET’s analysis of distance education enrolments in the USA


Russell Poulin and Terri Straut have done an invaluable analysis of recent data on distance education enrolments in the USA in the following three blog posts:

Straut, T. and Poulin, R. (2015) IPEDS Fall 2013: Higher Ed Sectors Vary Greatly in Distance Ed Enrollments Boulder CO: Western Co-operative for Educational Technologies

Straut, T. and Poulin, R. (2015) IPEDS Fall 2013: Distance Education Data Reveals More Than Overall Flat Growth Boulder CO: Western Co-operative for Educational Technologies

Straut, T. and Poulin, R. (2015) IPEDS Fall 2013: Less than Half of Fully Distant Students Come from Other States Boulder CO: Western Co-operative for Educational Technologies

These reports should be read in conjunction with these equally valuable posts:

Hill, P. and Poulin, R. (2014) Investigation of IPEDS Distance Education Data: System Not Ready for Modern Trends Boulder CO: Western Co-operative for Educational Technologies/e-Literate

Allen, I.E. and Seaman, J. (2013) Changing Course: Ten Years of Tracking Online Education in the United States  Wellesley MA: Babson College/Quahog Research Group

I am pulling this together in this one post for convenience, but I strongly recommend that you read carefully the original reports.

There are serious methodological issues in the USA data

Over the last ten years or so, the most consistent analyses of enrolments in online learning have been the annual Babson College surveys conducted by Elaine Allen and Jeff Seaman, with support from the Sloan Foundation. However, this was a voluntary survey, based on a carefully drawn sample of chief academic officers across the USA. The Babson Surveys showed consistent growth in online course enrolments in the order of 10-20 per cent per annum over the last 10 years, compared with around 2-3 per cent growth in on-campus enrolments; by 2013, approximately one third of all higher education students in the USA were taking at least one fully online course.

However, since the Babson surveys were voluntary, sample-based and dependent on the good will of participating institutions, there was always a concern about the reliability of the data, and especially that the returns might be somewhat biased towards enrolments from institutions actively engaged in online learning, thus suggesting more online enrolments than in reality. Despite these possible limitations the Babson Surveys were invaluable because they provided a comparable set of national data across several years. So while the actual numbers may be a little shaky, the trends were consistent.

Then in 2012 the U.S. Federal Integrated Postsecondary Education Data System (IPEDS) survey, conducted by the National Center for Education Statistics, a division of the U.S. Federal Department of Education, for the first time included distance education in its compulsory annual survey of enrolments in higher education. (One might ask why it took until 2012 to ask for data on distance education, but hey, it’s a start.) Since this is a census rather than a survey, and since it is obligatory, one would expect that the IPEDS data would be more reliable than the Babson surveys.

However, it turns out that there are also major problems with the IPEDS survey. Phil Hill (of the blog e-Literate) and Russell Poulin have indicated the following limitations with IPEDS:

  • problems of definition: Babson focused only on students enrolled in fully online courses; IPEDS asks for enrolments in distance education. Although many institutions have moved their print-based courses online, there are still many print-based distance education courses out there. How many? We don’t know. Also, the IPEDS definition rules out reporting on blended or hybrid courses, and is not precise enough to ensure that different institutions interpret whom to include and whom to exclude on a consistent basis
  • under-reporting: IPEDS collected data on the assumption that all students enrolled through continuing education departments were taking non-credit distance education courses, and therefore these enrolments were to be excluded. However, in many institutions, continuing education departments have continued to administer for-credit online courses, which institutions have seen as just another form of distance education. (In other institutions, distance education departments have been integrated with central learning technology units, and are thus included in enrolment counts.)
  • the IPEDS survey does not work for innovative programs such as those with continuous enrolments, competency-based learning, or hybrid courses.

Hill and Poulin come to the following conclusions about the 2012 survey:

  • we don’t know the numbers – there are too many flaws in the data collection methods
  • thus the 2012 numbers are not a credible baseline for future comparisons
  • there are hundreds of thousands of students who have never been reported on any IPEDS survey that has ever been conducted.

It is against this background that we should now examine the recent analyses by Straut and Poulin of the IPEDS data for 2013. However, note their caveat:

Given the errors that we found in colleges reporting to IPEDS, the Fall 2012 distance education reported enrollments create a very unstable base for comparisons.

Main results for 2013

1. Most DE enrolments are in public universities

For those outside the USA, it helps to know that there are quite different types of HE institution in the USA, depending on whether they are publicly or privately funded, and whether they operate for profit or not for profit. Distance education in the USA is often associated with diploma mills, or with for-profit private institutions such as the University of Phoenix or Kaplan. As it turns out, this is a fundamental misconception. Nearly three-quarters of all DE enrolments are in publicly funded universities. Less than 10% of all DE enrolments are in for-profit private institutions.

2. Students studying exclusively at a distance

Students studying exclusively at a distance constitute about 13% of all enrolments. However, the for-profits rely much more on distance students, who make up half their enrolments. Less than 10% of students in public universities are studying exclusively at a distance. The significance of this is that for most students in public universities, DE is a relatively small part of their studies – an option that they exercise occasionally and as needed, not a replacement for campus-based studies. On the other hand, there is a substantial if small minority for whom DE is the only option, and for many of these, the for-profits are the only option if their local public universities do not offer such programs in the discipline they want.

3. DE enrolments were down slightly in 2013

IPEDS shows an overall decrease in DE enrolments of 4% from 2012 to 2013. The biggest decline was in the for-profits, which dropped by 17%. The drop in public universities for those taking fully online courses was a marginal 2%. However, this is a major difference from the trends identified by the Babson Surveys.

This is probably the most contentious of the conclusions, because the differences are relatively small and probably within the margin of error, given the unreliability of the data. The for-profit sector has been particularly badly hit by changes to federal financial aid for students.

However, I have been predicting that the rate of students taking fully online courses in the USA (and Canada) is likely to slow in the future for two reasons:

  • there is a limit to the market for fully online studies and after 10 years of fairly large gains, it is not surprising that the rate now appears to be slowing down
  • as more and more courses are offered in a hybrid mode, students have another option besides fully online for flexible study.

The counter trend is that public universities still have much more scope for increasing enrolments in fully online professional masters programs, as well as for certificates, diplomas and badges.

4. Students studying fully online are still more likely to opt for a local university

Just over half of all students enrolled exclusively in DE courses take their courses from within state. This figure jumps to between 75% and 90% for those enrolled in a public university. On the other hand, 70% of students enrolled in a DE course at a for-profit take their courses from out-of-state. This is not surprising, since although for-profits have to have their headquarters somewhere, they operate on a national basis.

The proportion of DE students reported to be located outside the U.S. remains small, no more than 2% in any sector. This may be a reporting anomaly, as 21% of institutions reported that they have students located outside the U.S. Probably of more concern is that many institutions did not report data on the location of their DE students at all. This may have something to do with the need for institutions to obtain authorization to operate outside their home state – a uniquely American can of worms that I don’t intend to open.

Not good, but it’s better than nothing

I have an uncomfortable feeling about the IPEDS data. It needs to be better, and it’s hard to draw any conclusions or make policy decisions on what we have seen so far.

However, it’s easy for someone outside the USA to criticise the IPEDS data, but at least it’s an attempt to measure what is an increasingly significant – and highly complex – area of higher education. We have nothing similar in Canada. At least the IPEDS data is likely to improve over time, as institutions press for clearer definitions, and are forced to collect better and more consistent data.

Also, I can’t praise too highly first of all Elaine Allen and Jeff Seaman for their pioneering efforts to collect data in this area, and Phil Hill, Russell Poulin and Terri Straut for guiding us through the minefield of IPEDS data.

For a nice infographic on this topic from WCET, click on the image below:

WCET infographic 2

WCET’s analysis of U.S. statistics on distance education


U.S. Department of Education (2014) Web Tables: Enrollment in Distance Education Courses, by State: Fall 2012 Washington DC: U.S. Department of Education National Center for Education Statistics

Hill, P. and Poulin, R. (2014) A response to new NCES report on distance education e-Literate, June 11

The U.S. Department of Education’s Institute of Education Sciences operates a National Center for Education Statistics which in turn runs the Integrated Postsecondary Education Data System (IPEDS). IPEDS is:

a system of interrelated surveys conducted annually by the U.S. Department’s National Center for Education Statistics (NCES). IPEDS gathers information from every college, university, and technical and vocational institution that participates in the federal student financial aid programs. The Higher Education Act of 1965, as amended, requires that institutions that participate in federal student aid programs report data on enrollments, program completions, graduation rates, faculty and staff, finances, institutional prices, and student financial aid. These data are made available to students and parents through the College Navigator college search Web site and to researchers and others through the IPEDS Data Center

Recently IPEDS released “Web Tables” containing results from their Fall Enrollment 2012 survey. This was the first survey in over a decade to include institutional enrollment counts for distance education students. In the article above, Phil Hill of e-Literate and Russell Poulin of WCET have co-written a short analysis of the Web Tables released by IPEDS.

The Hill and Poulin analysis

The main points they make are as follows:

  • overall the publication of the web tables in the form of a pdf is most welcome, in particular by providing a breakdown of IPEDS data by different variables such as state jurisdiction, control of institution, sector and student level
  • according to the IPEDS report there were just over 5.4 million students enrolled in distance education courses in the fall semester 2012 (NOTE: this number refers to students, NOT course enrollments).
  • roughly a quarter of all post-secondary students in the USA are enrolled in a distance education course.
  • the bulk of students in the USA taking distance education courses are in publicly funded institutions (85% of those taking at least some DE courses), although about one third of those taking all their classes at a distance are in private, for-profit institutions (e.g. University of Phoenix)
  • these figures do NOT include MOOC enrollments
  • as previously identified by Phil Hill in e-Literate, there is a major discrepancy in the number of students taking at least one online course between the IPEDS study and the regular annual surveys conducted by Allen and Seaman at Babson College – 7.1 million for Babson and 5.5 million for IPEDS. Jeff Seaman, one of the two Babson authors, is also quoted in e-Literate on his interpretation of the differences. Hill and Poulin comment that the NCES report would have done well to at least refer to the significant differences.
  • Hill and Poulin claim that there has been confusion over which students get counted in IPEDS reporting and which do not. They suspect that there is undercounting in the hundreds of thousands, independent of distance education status.


There are lies, damned lies and statistics. Nevertheless, although the IPEDS data may not be perfect, it does a pretty good job of collecting data on distance education students across the whole of the USA. However, it does not distinguish between modes of delivery of distance education (are there still many print-based courses around?).

So we now have two totally independent analyses of distance education students in the USA, with a minimum of 5.5 million and a maximum of 7.1 million – i.e. between roughly a quarter and a third of all post-secondary students. From the Allen and Seaman longitudinal studies, we can also reasonably safely assume that online enrollments have been increasing by 10-20% per annum over the last 10 years, compared with overall enrollment growth of 2-5% per annum.
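The ‘quarter to a third’ range can be sanity-checked against a ballpark figure for total U.S. post-secondary enrollment; the ~21 million total below is my own assumption for illustration, not a number from either study:

```python
# Hypothetical sanity check: the two DE student counts as shares of all
# U.S. post-secondary students. total_students is an assumed round figure.
total_students = 21_000_000

ipeds_count = 5_500_000   # lower bound (IPEDS, from the text)
babson_count = 7_100_000  # upper bound (Babson, from the text)

low_share = ipeds_count / total_students
high_share = babson_count / total_students
print(f"share range: {low_share:.0%} to {high_share:.0%}")  # roughly 26% to 34%
```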

By contrast, in Canada we have no national data on either online or distance education students. It’s hard to see how Canadian governments or institutions can take evidence-based policy decisions about online or distance education without such basic information.

Lastly, thank you, Phil and Russ, for a very helpful analysis of the IPEDs report.


For a more detailed analysis, see also:

Haynie, D. (2014) New Government Data Sheds Light on Online Learners US News, June 13


WCET’s WOW awards for innovative uses of educational technologies


The WICHE Cooperative for Educational Technologies (WCET) has announced the recipients of the 2013 WCET Outstanding Work (WOW) award, a competition that recognizes innovative uses of educational technologies in higher education.

1. The Open Educational Resource (OER) Faculty Fellowship at Lane Community College (Eugene, Oregon).

This program was born out of a student need for textbook affordability.  The College has made a minor annual investment over the past few years that – as of today – saves students $326,400 per year in textbook costs. Additionally, the program has a plan to recruit more faculty each year, extending the benefit to the college and the students. This is an excellent example of a college developing a strategy for OERs that has led to valuable, pragmatic and measurable outcomes.

2. Obojobo, University of Central Florida

Obojobo is a Learning Object system designed, crafted, and maintained by the University of Central Florida. It provides a platform for the collaborative design, sharing, and distribution of instructional components in a variety of academic areas. Obojobo has allowed key departments such as the Library and Student Development and Enrollment Services to create and share learning resources that faculty can incorporate into their courses. The system also collects valuable data related to student performance, providing feedback on student learning. Obojobo has also allowed UCF to offer different self-paced faculty development programs. This is another example of why the University of Central Florida is a world leader in blended and distributed learning.

3. University of North Carolina: The Online Proctoring Network

The University of North Carolina’s Exam Proctoring Network promotes academic integrity by providing a standardized and streamlined proctoring process for students, faculty and proctors. The UNC network is the only “one-stop” proctoring solution across a state-wide system. The secure system allows students, faculty and proctors to schedule appointments, securely transfer documents, and receive automated reminders when an action is required. The system is in use at six campuses, and three of the remaining campuses will go live in fall 2013. The focus is on ensuring academic integrity in online student assessment.

Videos about each of these projects can be viewed from WCET’s media release

The WOW awardees will be recognized by WCET’s national community of higher education innovators during the WCET 25th Annual Meeting in Denver, CO, November 13-15, 2013.