November 20, 2017

Latest data on USA distance education enrolments

An extract from the Digital Learning Compass infographic

Digital Learning Compass (2017) Distance Education Enrolment Report 2017 Wellesley MA

A new partnership for the analysis of distance education data in the USA

First, a little background. Most readers of this blog will be familiar with the reports from the Babson Survey Research Group (BSRG) on the state of online learning in the USA. When the U.S. Department of Education’s Integrated Postsecondary Education Data System (IPEDS) began collecting data on distance learning enrolments in the fall of 2012, BSRG stopped collecting its own data and instead formed a partnership with e-Literate and WCET to create Digital Learning Compass, with the following goal:

To be the definitive source of information on the patterns and trends of U.S. postsecondary distance learning.

The Distance Education Enrolment Report 2017 is Digital Learning Compass’s analysis of the data collected by IPEDS for the fall of 2015.

Main results

In brief, in the USA in 2015:

  • distance education enrolments increased by almost 4% 
  • almost 30% of all post-secondary students in the USA are taking at least one DE course
  • 14% of all students are taking only DE courses
  • 83% of DE enrolments are in undergraduate courses
  • just over two-thirds of DE enrolments are in public universities or colleges
  • although DE enrolments continued to grow in public and in non-profit private universities, DE enrolments in for-profit institutions declined in 2015 for the third year in a row, driven by substantial decreases at just a few for-profit institutions
  • almost half of all DE enrolments are concentrated in less than 5% of all institutions, with the top 47 institutions accounting for almost a third of all DE students
  • the following institutions saw the greatest year-on-year growth in DE enrolments:
    • Southern New Hampshire University (from 11,286 to 56,371 in one year)
    • Western Governors University
    • Brigham Young University-Idaho
    • University of Central Florida
    • Grand Canyon University
  • the number of students studying on campus dropped by almost one million (931,317) between 2012 and 2015.

More detailed analysis can be found in the full report.

Comment

First, a declaration of interest: I am working closely with both Jeff Seaman of Babson and Russ Poulin of WCET on a national survey of online and distance education in Canada.

Despite a small drop in overall post-secondary enrolments in the USA in 2015, DE enrolments continued to grow, although the pace of growth slowed over the three years from 2012 to 2015. The main change was in the for-profit sector, probably affected by federal pressure on the use of student loans and congressional pressure for greater regulation of for-profit institutions under the Obama administration.

Indeed, there has been a considerable shake-up in the for-profit sector in the USA, the purchase of Kaplan by Purdue, a state-funded university, being the latest example. It will be interesting to watch what happens to for-profit DE enrolments under the more relaxed regulatory environment being introduced by the Trump administration. Will they rebound?

However, perhaps the most shocking result is the drop in campus-based enrolments of almost one million, no doubt due to the increased cost of attending college in the USA – or is this in fact due to the impact of six million enrolments in distance education courses?

Once again, here in Canada we are peering over the wall at our much larger and richer neighbours, wondering what's going on, but at least it is now a well-lit property, thanks to these reports.

 

Lies, Damned Lies and Statistics: WCET’s analysis of distance education enrolments in the USA

Russell Poulin and Terri Straut have done an invaluable analysis of recent data on distance education enrolments in the USA in the following three blog posts:

Straut, T. and Poulin, R. (2015) IPEDS Fall 2013: Higher Ed Sectors Vary Greatly in Distance Ed Enrollments Boulder CO: Western Co-operative for Educational Technologies

Straut, T. and Poulin, R. (2015) IPEDS Fall 2013: Distance Education Data Reveals More Than Overall Flat Growth Boulder CO: Western Co-operative for Educational Technologies

Straut, T. and Poulin, R. (2015) IPEDS Fall 2013: Less than Half of Fully Distant Students Come from Other States Boulder CO: Western Co-operative for Educational Technologies

These reports should be read in conjunction with these equally valuable posts:

Hill, P. and Poulin, R. (2014) Investigation of IPEDS Distance Education Data: System Not Ready for Modern Trends Boulder CO: Western Co-operative for Educational Technologies/e-Literate

Allen, I.E. and Seaman, J. (2013) Changing Course: Ten Years of Tracking Online Education in the United States  Wellesley MA: Babson College/Quahog Research Group

I am pulling this together in this one post for convenience, but I strongly recommend that you read carefully the original reports.

There are serious methodological issues in the USA data

Over the last ten years or so, the most consistent analyses of enrolments in online learning have been the annual Babson College surveys conducted by Elaine Allen and Jeff Seaman, with support from the Sloan Foundation. However, this was a voluntary survey, based on a carefully drawn sample of chief academic officers across the USA. The Babson surveys showed consistent growth in online course enrolments of the order of 10-20 per cent per annum over the last ten years, compared with around 2-3 per cent growth in on-campus enrolments; by 2013, approximately one third of all higher education students in the USA were taking at least one fully online course.

However, since the Babson surveys were voluntary, sample-based and dependent on the goodwill of participating institutions, there was always a concern about the reliability of the data, and especially that the returns might be somewhat biased towards institutions actively engaged in online learning, thus suggesting more online enrolments than there were in reality. Despite these possible limitations, the Babson surveys were invaluable because they provided a comparable set of national data across several years. So while the actual numbers may be a little shaky, the trends were consistent.

Then in 2012 the U.S. Integrated Postsecondary Education Data System (IPEDS), administered by the National Center for Education Statistics, a division of the U.S. Department of Education, included distance education for the first time in its compulsory annual collection of data on enrolments in higher education. (One might ask why it took until 2012 to ask for data on distance education, but hey, it's a start.) Since this is a census rather than a sample survey, and since it is obligatory, one would expect the IPEDS data to be more reliable than the Babson surveys.

However, it turns out that there are also major problems with the IPEDS data. Phil Hill (of the blog e-Literate) and Russell Poulin have identified the following limitations of IPEDS:

  • problems of definition: Babson focused only on students enrolled in fully online courses, whereas IPEDS asks for enrolments in distance education. Although many institutions have moved their print-based courses online, there are still many print-based distance education courses out there. How many? We don't know. The IPEDS definition also rules out reporting on blended or hybrid courses, and is not precise enough to ensure that different institutions decide whom to include and whom to exclude on a consistent basis
  • under-reporting: IPEDS collected data on the assumption that all students enrolled through continuing education departments were taking non-credit distance education courses, and that these enrolments should therefore be excluded. However, in many institutions continuing education departments have continued to administer for-credit online courses, which those institutions see as just another form of distance education. (In other institutions, distance education departments have been integrated with central learning technology units, and their students are thus included in the enrolment counts.)
  • the IPEDS survey does not work for innovative programs such as those with continuous enrolments, competency-based learning, or hybrid courses.

Hill and Poulin come to the following conclusions about the 2012 survey:

  • we don't know the numbers – there are too many flaws in the data collection methods
  • thus the 2012 numbers are not a credible baseline for future comparisons
  • there are hundreds of thousands of students who have never been reported in any IPEDS survey.

It is against this background that we should now examine the recent analyses by Straut and Poulin of the IPEDS data for 2013. However, note their caveat:

Given the errors that we found in colleges reporting to IPEDS, the Fall 2012 distance education reported enrollments create a very unstable base for comparisons.

Main results for 2013

1. Most DE enrolments are in public universities

For those outside the USA: American higher education institutions differ according to whether they are publicly or privately funded, and whether they operate for profit or not for profit. Distance education in the USA is often associated with diploma mills, or with for-profit private institutions such as the University of Phoenix or Kaplan. As it turns out, this is a fundamental misconception. Nearly three-quarters of all DE enrolments are in publicly funded universities, and less than 10% of all DE enrolments are in for-profit private institutions.

2. Students studying exclusively at a distance

Students studying exclusively at a distance constitute about 13% of all enrolments. However, the for-profits rely much more heavily on distance students, who make up about half of their enrolments, whereas less than 10% of students in public universities are studying exclusively at a distance. The significance of this is that for most students in public universities, DE is a relatively small part of their studies, an option they exercise occasionally and as needed, rather than a replacement for campus-based study. On the other hand, there is a small but significant minority for whom DE is the only option, and for many of these students the for-profits are the only choice if their local public universities do not offer programs in the discipline they want.

3. DE enrolments were down slightly in 2013

IPEDS shows an overall decrease in DE enrolments of 4% from 2012 to 2013. The biggest decline was in the for-profits, which dropped by 17%, while the drop in public universities for those taking fully online courses was a marginal 2%. However, this is a major difference from the trends identified by the Babson surveys.

This is probably the most contentious of the conclusions, because the differences are relatively small and probably within the margin of error, given the unreliability of the data. The for-profit sector has been particularly badly hit by changes to federal financial aid for students.

However, I have been predicting that the rate of growth in students taking fully online courses in the USA (and Canada) is likely to slow in the future, for two reasons:

  • there is a limit to the market for fully online study, and after ten years of fairly large gains it is not surprising that the rate of growth now appears to be slowing down
  • as more and more courses are offered in a hybrid mode, students have another option besides fully online for flexible study.

The counter-trend is that public universities still have much more scope for increasing enrolments in fully online professional master's programs, as well as in certificates, diplomas and badges.

4. Students studying fully online are still more likely to opt for a local university

Just over half of all students enrolled exclusively in DE courses take their courses from within their own state. This figure jumps to between 75% and 90% for those enrolled in a public university. On the other hand, 70% of students enrolled in DE courses at a for-profit take their courses from out-of-state. This is not surprising: although the for-profits have to have their headquarters somewhere, they operate on a national basis.

The proportion of students reported as being located outside the U.S. remains small, no more than 2% in any sector. This again may be a reporting anomaly, as 21% of institutions reported that they have students located outside the U.S. Probably of more concern is that many institutions did not report data on the location of their DE students at all. This may have something to do with the need for institutions to obtain authorization to operate outside their home state, a uniquely American can of worms that I don't intend to open.

Not good, but it’s better than nothing

I have an uncomfortable feeling about the IPEDS data. It needs to be better, and it is hard to draw any conclusions or make policy decisions on the basis of what we have seen so far.

It is easy for someone outside the USA to criticise the IPEDS data, but at least it is an attempt to measure what is an increasingly significant – and highly complex – area of higher education. We have nothing similar in Canada. The IPEDS data is also likely to improve over time, as institutions press for clearer definitions and are required to collect better and more consistent data.

Also, I cannot praise too highly Elaine Allen and Jeff Seaman for their pioneering efforts to collect data in this area, and Phil Hill, Russell Poulin and Terri Straut for guiding us through the minefield of the IPEDS data.

For a nice infographic on this topic from WCET, click on the image below:

[WCET infographic]

Tracking online learning in the USA – and Ontario

Allen, I. and Seaman, J. (2014) Grade Change: Tracking Online Learning in the United States Wellesley MA: Babson College/Sloan Foundation

This is the eleventh annual report in this invaluable series tracking online education in the United States of America. It is invaluable because, through the consistent support of the Sloan Foundation, the Babson College annual survey has used a consistent methodology that allows the growth and development of online learning in the USA to be tracked over more than a decade.

There is nothing comparable in Canada, but I will nevertheless use this post to try to draw some comparisons between the development of online learning in the USA and at least the largest system in Canada, that of Ontario, which does have at least some data. Also, Ontario has just established Ontario Online, a system-wide initiative aimed at strengthening Ontario's online learning activities. The Sloan/Babson surveys have important lessons for Ontario's new initiative.

Methodology

The survey is sent to the Chief Academic Officer (CAO) of every higher education institution in the USA (private and public, universities and two-year colleges), over 4,600 in all. Over 2,800 responses were received, from institutions accounting for just over 80% of all higher education enrollments in the USA (most non-responses came from small institutions, i.e. those with 1,500 students or fewer, a sector far less likely to offer online courses).

An online course is defined in this report as one in which at least 80 percent of the course content is delivered online as a normal part of an institution’s program. MOOCs are therefore considered a completely different category from the ‘normal’ credit-based online courses in this report.

What is the report about?

The scope of the report can best be described from the questions the report seeks to answer:

  • What is Online Learning, what is a MOOC?
  • Is Online Learning Strategic?
  • Are Learning Outcomes in Online Comparable to Face-to-Face?
  • Schools Without Online Offerings
  • How Many Students are Learning Online?
  • Do Students Require More Discipline to Complete Online Courses?
  • Is Retention of Students Harder in Online Courses?
  • What is the Future of Online Learning?
  • Who offers MOOCs?
  • Objectives for MOOCs
  • Role of MOOCs

Main findings

This relatively short report (40 pages, including tables) is so stuffed with data that it is somewhat invidious to pick and choose results. Because it is short and simply written, you are strongly recommended to read it in full. However, here are the main points I take away:

Growth of credit-based online learning continues but is slowing

Sounds a bit like an economic report on China, doesn't it? Allen and Seaman claim that a total of 7.1 million students are now taking at least one online course, or roughly 34% of all students. (Note: '% taking at least one course' is not the same as '% of all course enrollments', which would be a better measure.) Online learning enrollments were up 6.5% in 2013, a slowing of the rate of growth, which had been in the 10-15% range per annum in recent years. Nevertheless, online enrollments are still growing five times faster than enrollments in general in the USA, and most CAOs anticipate that this growth in online learning enrollments will continue.
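To make the distinction in the note above concrete, here is a minimal sketch in Python using only the figures from this paragraph; the five-course student at the end is purely hypothetical.

```python
# Sketch illustrating why '% taking at least one online course' (a headcount
# measure) differs from '% of all course enrollments' (a registration measure).

students_with_an_online_course = 7_100_000   # headcount reported by Allen and Seaman
share_of_all_students = 0.34                 # 'roughly 34%' from the report

# Implied total student headcount behind the 34% figure
implied_total = students_with_an_online_course / share_of_all_students
print(f"Implied total headcount: {implied_total:,.0f}")   # roughly 20.9 million

# Hypothetical student: 5 courses taken, 1 of them online. They count fully
# toward the headcount measure, but only 20% of their registrations are online.
courses_taken, online_courses = 5, 1
print("Counted in headcount measure:", online_courses >= 1)
print(f"Share of registrations online: {online_courses / courses_taken:.0%}")
```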

MOOCs are still a very small component of online learning

The number of institutions offering MOOCs rose from 2.6% in 2012 to 5% in 2013. The majority of institutions offering MOOCs are doctoral/research universities, and a high proportion are in the private, not-for-profit sector, which has historically been less involved in credit-based online learning.

Less than a quarter of CAOs believe that MOOCs represent a sustainable method for offering online courses, down from 28 percent in 2012, and a majority of academic leaders (64%) have concerns that credentials for MOOC completion will cause confusion about higher education degrees.

Sector differences

The report identifies some clear differences between the different sectors in the USA's very diverse post-secondary education system. Small institutions (fewer than 1,500 students) and doctoral/research institutions are far less likely to offer online courses. CAOs from institutions not offering online learning tend to be more critical of the quality of online learning and far less likely to think it essential to their future.

Of the CAOs from institutions offering online courses, nearly one-quarter believe online learning outcomes to be superior, slightly under 20 percent think them inferior, and the remainder (57%) report that the learning outcomes are the same as for classroom delivery.

What about Canada – and Ontario in particular?

I have long lamented that we have no comparable data on online learning in Canada. The government of Ontario did carry out a census of all its universities and colleges in 2010, and found just under 500,000 online course registrations, or 11% of all university and college enrollments, with online enrollments in universities (13%) higher than in two-year colleges (7%). If we extrapolate from the USA figures (highly dubious, I know), which showed a 16% increase in online enrollments between fall 2010 and fall 2012, this would put Ontario's online enrollments in 2012 at approximately 563,000.
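The arithmetic behind that extrapolation is simple; a rough sketch follows, assuming a 2010 base of about 485,000 registrations (the 'just under 500,000' figure) and, dubiously, that Ontario grew at the same 16% rate as the USA.

```python
# Rough extrapolation sketch (assumptions flagged inline; not official data).

ontario_2010_registrations = 485_000   # assumed approximation of 'just under 500,000'
us_growth_2010_to_2012 = 0.16          # 16% US growth in online enrollments, fall 2010 to fall 2012

# Apply the US growth rate to the Ontario base (a dubious assumption, as noted above)
estimate_2012 = ontario_2010_registrations * (1 + us_growth_2010_to_2012)
print(f"Estimated Ontario online course registrations, 2012: {estimate_2012:,.0f}")  # ~562,600
```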

More significantly, the Ontario government survey provided hard data on course completion rates:

  • the median in the college sector for the 20 colleges that responded to the question was 76.1% with most institutions reporting results between 70% and 79%.
  • the median in the university sector for the 15 universities that responded was 89% with most universities reporting results from 85% to 95%.

Contact North did a ‘cross-country check-up’ in 2012. It concluded (p.14):

Using proxy data (estimates provided by a variety of different organizations and a standard measure of full-time equivalent student set at 9.5 course registrations per FTE), we can estimate that there are between 875,000 and 950,000 registered online students in Canada (approximately 92,105 – 100,000 full-time students) at college and universities studying a purely online course at any one time.
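The full-time-equivalent conversion behind that range is straightforward; here is a minimal sketch of the arithmetic, using only the figures quoted above.

```python
# FTE conversion used in the Contact North estimate quoted above.

registrations_per_fte = 9.5                      # 'standard measure' quoted in the report
low_estimate, high_estimate = 875_000, 950_000   # quoted range of online registrations

# Converting course registrations to full-time-equivalent (FTE) students
fte_low = low_estimate / registrations_per_fte    # about 92,105
fte_high = high_estimate / registrations_per_fte  # 100,000
print(f"Estimated FTE range: {fte_low:,.0f} to {fte_high:,.0f}")
```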

The problem, though, is that these are one-off studies. While the government of Ontario is to be congratulated on doing the 2010 survey, it decided not to continue it in the following years (or, more accurately, it did not decide to repeat it). The Contact North data is at best a rough estimate, again valuable in itself, but such data collection needs to be done on a more systematic and regular basis across the country (Canada's higher education system is devolved to its 13 provinces and territories, with no federal responsibility or office for post-secondary education, and Statistics Canada has been cut back in recent years by the current Conservative government).

However, there is now hope. The government of Ontario has just established Ontario Online, a collaborative Centre of Excellence that will be governed and operated by the province's colleges and universities, with a start-up budget of $42 million. One of the first things it should do is repeat and expand the 2010 survey, to establish a baseline for measuring the province's progress in online learning. The expansion should also include measurement of hybrid/blended learning (preferably using the same definitions as the Babson survey, for comparative purposes). To do this accurately, institutions will need to categorize the type of courses they offer in their course databases, if they have not already done so. Without such a baseline of data, it will be almost impossible to assess the success not just of Ontario Online, but of online learning in general in Ontario.
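As a purely hypothetical illustration of what such categorization might look like in a course database, here is a small Python sketch. Only the 'at least 80 per cent of content online' threshold comes from the Babson definition quoted earlier; the other labels and cut-offs are illustrative assumptions, not prescribed categories.

```python
# Hypothetical sketch: tagging courses by delivery mode in an institutional database.
# Only the >=80% 'online' threshold comes from the Babson definition; the other
# categories and cut-offs below are assumptions for illustration only.

def classify_delivery_mode(online_content_share: float) -> str:
    """Classify a course by the share of its content delivered online (0.0 to 1.0)."""
    if online_content_share >= 0.8:
        return "online"           # Babson: at least 80% of content delivered online
    if online_content_share >= 0.3:
        return "hybrid/blended"   # assumed cut-off, for illustration only
    if online_content_share > 0.0:
        return "web-facilitated"  # assumed label, for illustration only
    return "classroom"

# Example: a few hypothetical course records
courses = {"HIST 101": 0.0, "NURS 210": 0.45, "BUSI 330": 0.9}
for code, share in courses.items():
    print(code, "->", classify_delivery_mode(share))
```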

I would also hope that as the country’s largest province, with probably the greatest number of online courses and enrollments, Ontario will take leadership at the national Council of Ministers of Education, Canada (CMEC) to get the survey it has developed adopted and administered by all provinces across Canada. Politicians and experts can huff and puff all they like about the importance of online learning, but if they don’t measure it, it’s all just hot air.

In summary, many thanks to Sloan and Babson College for their invaluable work. Ontario has done far more than any other province in Canada to identify the extent of online learning, and is poised to go even further with its new Ontario Online initiative. However, systematic data collection is essential for measuring the success of any online learning initiative or strategy.

Survey of the digital lives of professors

Allen, I.E. and Seaman, J. (2012) Digital Faculty: Professors, Teaching and Technology 2012  Inside Higher Ed, Babson Survey Research Group and Quahog Research Group, LLC.

Kolowich, S. (2012) Digital Faculty: Professors and Technology 2012, Inside Higher Education, August 24

This is a report on a survey of 4,564 faculty members, comprising a nationally representative sample spanning various types of institution, and of 591 administrators responsible for academic technology at their institutions. An earlier report focused on faculty views of online education; this survey focuses on how digital technology is affecting the lives of faculty in more general terms. The Kolowich article is a fairly extensive summary of the report.

The report suggests that, in general, faculty are fairly positive about many of the digital developments in academia, such as 'flipped' classrooms, which allow for more in-class discussion, and the growth of learning analytics (although it is not described as such in this report). There was also general support for the move towards e-publishing and e-textbooks.

One finding that struck me is that administrators consistently over-estimate faculty engagement with digital technologies such as an LMS.

Another finding that struck me is how relatively few e-mails faculty received from students, even when teaching online courses – rarely more than 25 a day.

There’s a lot of data in the original report and it is worth reading in full.