January 22, 2018

Corruption in higher education: a wake-up call

Staff at Pavol Jozef Safarik University, Kosice, Slovakia were accused of taking bribes to admit students to its Medical School

Daniel, J. (2016) Combatting Corruption and Enhancing Integrity: A Contemporary Challenge for the Quality and Integrity of Higher Education: Advisory Statement for Effective International Practice: Washington DC/Paris: CHEA/UNESCO

Daniel, J. (2016) Lutter contre la corruption et renforcer l’intégrité : un défi contemporain pour la qualité et la crédibilité de l’enseignement supérieur: Déclaration consultative pour des pratiques internationales efficaces Washington DC/Paris: CHEA/UNESCO

Those of us working in online learning are often berated by academic colleagues about the possible lack of integrity in online learning due to issues such as plagiarism, diploma mills, or ‘easy’ qualifications lacking rigorous academic process. Such cases do occur, but having read this document, it seems that the more traditional areas of higher education are prone to far more egregious forms of corruption.

Where do we find corruption?

At the end of this report, there is a list of references chronicling corruption in higher education in Australia, China, the Czech Republic, Egypt, France, Germany, India, Kenya, Nigeria, Russia, Slovakia, South Africa, and the USA. And those are just the ones who have been recently caught.

The report puts it bluntly:

This Advisory Statement is a wake-up call to higher education worldwide – particularly to quality assurance bodies. HEIs [higher education institutions], governments, employers and societies generally, in both developed and developing countries, are far too complacent about the growth of corrupt practices, either assuming that these vices occur somewhere else or turning a deaf ear to rumours of malpractice in their own organizations.

What kinds of corruption?

You name it, it’s in this report. In fact, the report describes 29 different kinds of corrupt practices. Here are just a few examples:

  • giving institutions licenses, granting degree-awarding powers, or accrediting programmes in return for bribes or favours.
  • altering student marks in return for sexual or other favours.
  • administrative pressure on academics to alter marks for institutional convenience.
  • publishing false recruitment advertising.
  • impersonation of candidates and ghost writing of assignments.
  • political pressures on higher education institutions to award degrees to public figures.
  • publication by supervisors of research by graduate students without acknowledgement.
  • higher education institutions publishing misleading news releases or suppressing inconvenient news.

Who is sounding the alarm?

Although the writer of the report is Sir John Daniel, a fellow Research Associate at Contact North and former Vice-Chancellor of the Open University, Assistant Director-General for Education at UNESCO, and President of the Commonwealth of Learning, the report draws on meetings of expert groups from the following organizations:

  • UNESCO’s International Institute for Educational Planning (IIEP)
  • the International Quality Group of the US Council for Higher Education Accreditation (CHEA/CIQG).

What’s causing this?

Corruption is as much about lack of ethical behaviour and rampant self-interest as about policies and practices. The report, though, points to two key factors that are contributing to corruption:

  • the huge appetite for higher education among the young populations of the developing world puts great pressures on admissions processes;
  • the steadily developing sophistication and borderless nature of information and communications technology (ICT) has expanded the opportunities for fraudsters in all walks of life.

What are the recommended solutions?

There are of course no easy solutions here. The report points out that there are both ‘upstream’ possibilities for corruption at the level of government and accrediting agencies, and downstream, from individuals desperate to get into and succeed within an increasingly competitive higher education system. In the middle are the institutions themselves.

The report then separates its recommendations for combatting corruption into several target areas:

  1. the regulation of higher education systems
  2. the teaching role of higher education institutions
  3. student admissions and recruitment
  4. student assessment
  5. credentials and qualifications
  6. research theses and publications
  7. increased public awareness

It is interesting that while the report emphasizes the importance of internal quality assurance processes within HEIs, it also notes that the more ‘mature’ an HE system becomes, the more external quality assurance agencies, such as accreditation boards and government ministries, tend to pass quality assurance responsibilities back to the institutions. The report notes that students themselves have a very important role to play in demanding transparency and whistle-blowing.

A call to action

The report ends with the following:

  • governments, quality assurance agencies and HEIs worldwide must become more aware of the threat that corruption poses to the credibility, effectiveness and quality of higher education at a time when its importance as a driver of global development has never been higher.

  • external quality assurance agencies should do more to review the risks of corruption in their work and HEIs must ensure that their IQA [internal quality assurance] frameworks are also fit for the purpose of combatting corruption.

  • training and supporting staff in identifying and exposing corrupt practices should be stepped up.

  • creating networks of organizations that are fighting corruption and greater North-South collaboration in capacity building for this purpose are highly desirable.

So next time some sanctimonious academic sneers at the academic integrity of online learning, just point them in the direction of this report.

Online learning in 2012: a retrospective

© The Greening of Gavin, 2012

Well, 2012 was certainly the year of the MOOC. Audrey Watters provides a comprehensive overview of what happened with MOOCs in 2012, so I won’t repeat what she has done. Instead, in this post I will focus mainly on trying to explain what appears to me to be highly irrational organizational behaviour around MOOCs, more akin to lemmings than to pillars of higher learning.

Why MOOCs?

For those of us who work mainly in universities and colleges, the hype around MOOCs is like living in two parallel universes: what we do every day in online learning, and what we read or hear about in the media. (I leave you to judge which is the true reality.) Even organizations that should know better think that online learning started at MIT in 2002 with OpenCourseWare. So why have MOOCs in particular got so much press?

This is an exercise in social anthropology.

To quote from Wikipedia:

It is unknown why lemming populations fluctuate with such variance roughly every four years, before plummeting to near extinction.

Now some evidence suggests their predators’ populations, particularly the stoat, may be more closely involved in changing the lemming population.

Lemmings can swim and may choose to cross a body of water in search of a new habitat. In such cases, many may drown if the body of water is so wide as to stretch their physical capability to the limit.

I believe there are several themes that have led to MOOC hysteria in 2012:

  • they appear to be free. The direct costs of higher education, especially but not only in the USA and the UK, have been systematically transferred from the taxpayer to the individual student or parents through cuts in government funding and increases in tuition fees. In other words, the cost of higher education has become more transparent. It’s really expensive. Free, of course, is better than expensive. MOOCs have been promoted as being free. However, there are no free services: all services have a true cost. At least to date, MOOCs are the opposite of transparent about their true cost. We do know that over a hundred million dollars have been invested this year alone in MOOCs, but what are the costs of the professors’ time, the cost of managing large numbers of students, and above all, the cost of ensuring student learning (however it is measured)? We just don’t know. Until we do, it’s a shell game.
  • it’s also a numbers game: input matters more than output. The focus of the media has been on the massive numbers enrolling. However, there has been little focus on what students are actually learning. All we know is that completion rates are pathetic (less than 10%), and many of those who do complete are already well educated. Nevertheless, it is argued that from a global perspective the completion numbers are still large. However, so are the numbers in traditional higher education, and also in credit-based online learning. Sloan and Babson have been tracking the online credit numbers for years: they have been growing at a steady rate of between 12% and 20% a year. Ontario alone has over 500,000 online course registrations in its public universities and colleges, with completion rates in the 75-85% range, matching completion rates in face-to-face classes. Millions are taking online courses for credit in Asia. But does this get mass coverage in the media? No.
  • technology triumphs over teaching: MOOCs in general have been driven by computer scientists who believe that just ‘delivering’ content over the Internet equates to learning. It doesn’t, but broadcast content delivery is something that lazy reporters can easily understand.
  • it’s all about the elite institutions. The media love to focus on the Ivy League universities to the almost total neglect of the rest of the system (the cult of the superstar). Here is an appalling irony. The top-tier research universities have by and large ignored online learning for the last 15 years. Suddenly, though, when MIT, Stanford and Harvard jump in, all the rest follow like lemmings. MOOCs are seen as an easy, low-risk way for these universities not only to catch up, but to jump into the front line. But they are hugely wrong. Moving from broadcasting to learning is not going to be easy. More importantly, MOOCs are a side issue, a distraction. The real change for universities is going to come from hybrid learning – a mix of on-campus and online learning. Those top-tier research universities, though, are going to miss out on this, by sidelining their online learning to a peripheral, continuing education activity.
  • don’t forget the politics: There’s just been a presidential election in the USA. A number of corporate leaders and some in the Republican party want to privatize the US higher education system. Anything that will undermine it is heavily promoted. MOOCs to some extent have been a tool in the hands of the media for suggesting that education need not be expensive and could be ‘free’, or at least much lower cost, if left to business. This fits the agenda of the right.

Having said all this, I believe that there is a future for MOOCs, but that’s for another post, my outlook for 2013, which comes in January.

In the meantime there were, believe it or not, several other interesting developments in online learning, but before exploring those, let’s see how right I was in my outlook for 2012.

What I predicted

  1. The year of the tablet: 99% probability
  2. Learning analytics: 90% probability
  3. Growth of open education: 70% probability (depending on definition of open education)
  4. Disruption of the LMS market: 60% probability
  5. Integration of social media into formal learning: 66% probability
  6. The digital university: 10% probability
  7. Watch India
  8. The great unknown: 10% probability

Well, not a great record at prediction. I suppose you could include MOOCs within ‘growth of open education’. But look at what I actually wrote:

open access to high quality (all right, highly qualified) instructors is likely to be limited to idealistic volunteers, or to limited events (e.g. a MOOC), mainly because of a mis-match between supply and demand. Too many people want access to what they may incorrectly assume to be high quality instructors at elite institutions, for instance. This is partly an institutional barrier, as institutions try to protect their ‘star’ faculty, which is why this form of openness depends largely on individual volunteers.

Not actually wrong, but it certainly didn’t capture the mania that would develop around MOOCs in 2012.

Although there have been lots of interesting individual uses of tablets, particularly in K-12, they certainly haven’t taken off to the extent I predicted, at least in post-secondary education. However, so much in prediction depends on timing – maybe it will happen this year. For instance, mobile learning, one of my predictions for 2011, certainly expanded in many institutions in 2012, and will continue to grow in 2013. The use of data analytics definitely increased in 2012, but learning analytics are still being used by only a very tiny minority of institutions. The technology isn’t quite ready yet. (Again, this depends on definition – I’m talking about the hope that learning analytics will help instructors to achieve better learning outcomes, or, put another way, will help students to improve their learning.)

What you read

Another way of looking at 2012 is to see what you chose to read. There are just over 1,800 posts on the site. Here are the top 14 posts of 2012.

  1. Recommended graduate programs in e-learning
  2. What’s right and what’s wrong about Coursera-style MOOCs
  3. e-learning outlook for 2012: will it be a rough ride?
  4. New technologies for e-learning in 2012 (and a little beyond)
  5. A short critique of the Khan Academy
  6. Can you teach ‘real’ engineering at a distance?
  7. What Is Distance Education?
  8. Why learning management systems are not going away
  9. E-learning quality assurance standards, organizations and research
  10. A personal view of e-learning in Saudi Arabia
  11. A student guide to studying online
  12. 10 types of plagiarism (and why I’m pleading guilty to at least one charge)
  13. Daniel’s comprehensive review of MOOC developments
  14. Designing online learning for the 21st century

The numbers of course are skewed by their date of posting. Those posted early in the year have more chance of being accessed than those posted later. Timing also matters in terms of external events. Despite all the hype about MOOCs, only two of the top 14 posts were specifically on MOOCs (although there were several others posted). I am though surprised at the amount of interest in prediction, especially given how bad I am at it!

The inclusion of ‘Can you teach real engineering at a distance?’ at no. 6 is really interesting. This was posted originally on July 5, 2009, but it has sustained a long discussion that is still active today. I was also pleased to see that ‘Designing online learning for the 21st century’ squeezed in, as I’m glad there’s still at least some interest in the design of online learning. There is also evidence that the site is being used by a lot of online students (or potential students), which is very gratifying. I need to do more posts targeted at students next year.

What I did

Since I’m not free and open (except here), this is some indication of what institutions were interested in this year (at least enough to pay me for it).

Site visits for consultancies or discussions with faculty/staff on strategies or designs for online learning

  • Mexico City: to develop a business plan for a national Mexican virtual university
  • Edmonton: Campus St-Jean, University of Alberta: informal review of online learning activities
  • Université de Sherbrooke, Université Laval, and Université du Québec en Abitibi-Témiscamingue, Québec
  • Vancouver Community College, Kwantlen Polytechnic University, and University of British Columbia, BC
  • University of Manitoba, Winnipeg
  • EFQUEL conference, Granada, Spain
  • COHERE conference, Calgary, Alberta

Online consultancies

MOOCs and Webinars

  • planning and managing online learning: participant in the #Change11 cMOOC
  • costs of online learning: guest instructor for University of Maryland University College/University of Oldenburg, Germany
  • Elections Canada: online course design

Institutional site visits and reports on game-changing institutions

  • Western Governors University
  • Open University, UK
  • Open University of Catalonia, Spain
  • London Knowledge Lab, Institute of Education, London, UK

It can be seen that there was a great deal of interest in:

  • strategies and management,
  • new course designs,
  • design and organization of online institutions,
  • the costs of online learning

during 2012. These issues are not likely to disappear next year, either.

Politics and economics

In 2012, there were major developments in both the politics and economics of online learning. Governments in the USA and Europe accelerated cost cutting in post-secondary education. Nearly one billion dollars has been cut from the community college system in California alone since 2008. Student tuition fees have risen dramatically over the last five years in both the USA and the U.K. Even in Canada, provincial governments are facing the need to constrain public funding.

In Ontario, Canada’s largest province, the government threw down a challenge to the post-secondary institutions. Enrollments will need to increase, quality must be maintained, but there will be no new money. What can the institutions do to increase productivity through innovation? It’s a good question. Business cannot go on as usual. There is surely room for improvement and change in our institutions.

This theme is likely to continue into 2013. Governments, parents and increasingly students will be looking to online learning to increase productivity: better learning outcomes for less money. Are we up to the challenge?

Goodbye, 2012

I asked the question last year: will it be a rough ride? It’s certainly been a fast ride and quite bumpy at the same time. I don’t know how you feel, but I feel I’m hanging on, but only just. It’s good though that it’s exciting, stimulating, infuriating, and frustrating. It means that online learning is alive and well, growing in both breadth and more importantly depth.

So to all my readers, thank you for coming along for the ride. Have a great break, merry Christmas, happy Hanukah, or just have a good time, whatever your religion or beliefs. And I look forward to sharing my outlook for 2013 in the new year.


1. What pleased, surprised or disappointed you in 2012 with regard to online learning?

2. What do you think was the most important development in 2012 for online learning? Obama’s re-election? MOOCs? New course designs? Or something else?

3. Are we up to the challenge of using online learning to increase productivity through innovation? If so, what would that look like?

Massive growth of online learning in Asia

Aakash 2: already 3.5 million ordered

Adkins, S. S. (2012) The Asia Market for Self-paced eLearning Products and Services: 2011-2016 Forecast and Analysis Ambient Insight, October

In all the hoopla about MOOCs, it is worth noting that in Asia, credit-based online learning is already reaching many millions of learners. This report from Ambient Insight, targeted mainly at the corporate e-learning market, provides a host of fascinating statistics about the Asian market for online learning.

Several countries, for instance, are putting their entire K-12 curriculum online. China’s goal is to have its entire K-12 population of over 200 million students online by 2020. In South Korea, all primary and secondary schools must be entirely digital by 2015, and every child will have a personal learning device. In India, the Aakash 2 tablet, which launched this month, already has 3.5 million orders.

The report also highlights ‘explosive growth of online higher education enrollments’ in Asia. One institution alone in China, ChinaEdu, has nearly 200,000 students taking degree programs wholly online, and over 100,000 South Koreans are enrolled in cyber universities.

Perhaps most interesting of all though is the author’s comment on how the digitization is occurring:

The content digitization tends to start with converting print-based textbooks to eTextbooks. Yet, once the infrastructure and learning technology is in place, the buyers are increasingly opting for interactive, self-paced multimedia content. Several of the newer initiatives are leapfrogging eTextbooks altogether and building out interactive media as a core component.

If you want to pay for a full copy of the report, contact: info@ambientinsight.com

Who has the richest professors? Canada!?

© Higher Ed Morning, 2011

Jaschik, S. (2012) Faculty Pay, Around the World, Inside Higher Education, March 22

Philip Altbach, Liz Reisberg, Maria Yudkevich, Gregory Androushchak, Iván Pacheco (in press) Paying the Professoriate: A Global Comparison of Compensation and Contracts London/New York: Routledge

Inside Higher Education reports on a fascinating study conducted by researchers from the USA and Russia that compares academic salaries around the world, after adjusting for the cost of living in each country (but not for taxes, for some reason).

Canada has the highest paid faculty, both at entry and as an overall average, followed by Italy, South Africa, India and the United States, in that order.
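The cost-of-living adjustment in a study like this works along the lines of a purchasing power parity (PPP) conversion: each salary is divided by a factor expressing how many local currency units buy what one ‘international dollar’ buys. As a minimal sketch, with invented countries and figures (not the study’s data):

```python
# Illustrative PPP-style salary comparison. The countries, salaries and
# conversion factors below are invented, not the study's actual data.

def ppp_adjusted(nominal: float, ppp_factor: float) -> float:
    """Convert a local-currency salary into PPP 'international dollars'.

    ppp_factor = local currency units per international dollar.
    """
    return nominal / ppp_factor

# Nominal average monthly salaries in local currency units.
salaries = {
    "Country A": {"nominal": 9_000, "ppp_factor": 1.2},
    "Country B": {"nominal": 400_000, "ppp_factor": 60.0},
}

for country, s in salaries.items():
    adjusted = ppp_adjusted(s["nominal"], s["ppp_factor"])
    print(f"{country}: {adjusted:,.0f} international dollars/month")
```

Note how a much larger nominal salary can rank lower once adjusted: Country B’s 400,000 units come out below Country A’s 9,000 (roughly 6,667 versus 7,500 international dollars), which is why rankings like this depend heavily on the conversion factors chosen.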

The web site for the project has several very interesting maps showing relative salaries by country.


Such comparative studies are always open to criticism (see the comments after the Inside Higher Education article), but having travelled to many of the countries, the results make sense to me. It’s how you interpret them and what they may result in that matters.

First, one result of globalization is that there is an ‘arms race’ in salaries to attract the best faculty. However, there are other factors too, such as the availability of research grants; working conditions, especially teaching load; other extras, such as the availability of consultancies and other extra work; and the proportion of much lower paid contract instructors. Nevertheless, the need for knowledge workers, and the importance given by governments in many countries to their universities in producing such graduates, mean that there will be continued pressure to keep salaries increasing in a ‘free’ global labour market. In this sense, it’s probably good for Canada that it’s ‘top’.

Second, the report points out that despite the trends in salaries, most university faculty are well outside the top 1% of the wealthy.

Third, the USA is such a diverse system, with many ‘universities’ that barely deserve the name, as well as top rank elite universities, that an average for that country doesn’t mean a lot. Canada’s high salaries generally result from making comparisons with the elite universities just south of the border.

Fourth, expect Canadian salaries to come under pressure (or rather, not to continue to increase at the same rate as in the other top five countries) over the next few years as some provincial governments grapple with budget deficits, and students (and politicians) try to limit increases to tuition fees. Canadian universities slid relatively unscathed through the 2008 economic recession compared with the state universities in the USA, which are currently undergoing massive cuts to their budgets, but that comparative advantage is not going to continue forever as the US economy slowly recovers.

Fifth, India and China make interesting comparisons. Despite a massive increase in student numbers in both countries, India has managed to protect the salaries of its academics across the board, while China has had to rely on paying high salaries to a small elite of professors, but the general faculty salaries are still low in China even when the relatively low cost of living is taken into account. I leave it to you to speculate on what that ‘gap’ in salaries means for the future.

Lastly, what does all this mean in terms of value for money? Faculty salaries are by far the biggest single item in university budgets, accounting for at least 70% of all teaching costs. They have increased, at least in North America, at a much faster rate than inflation over the last 20 years, while teaching loads have actually dropped. Do higher salaries though lead to better teaching? I doubt it. What universities are looking for are top researchers rather than top teachers. Is there any way to tie increased salaries to teaching performance? Probably not, except perhaps for internal promotions. It looks like students and/or taxpayers in Canada will continue to pay more without any expectation that the teaching will get any better or more productive. How depressing.

Key points from OECD’s 2011 ‘Education at a Glance’

© Wired Magazine, 2011

OECD (2011) Education at a Glance: OECD Indicators Paris: OECD

For those of you interested in educational statistics (and for masochists in general), here’s your early Christmas treat. Any attempt at summarizing this 490-page report is futile, but I’ve picked out some data that may be of interest to readers of this site. Italics denote direct quotes from the report – important, because I may have misinterpreted some of the data not in italics, so check with the original!

81% with upper secondary education and 37% with tertiary (post-secondary) degrees across the OECD in 2009

The proportion of those with tertiary qualifications has risen from 13% in 1933 to 37% in 2009 [for the 34 countries that are full OECD members].

Korea: from 21st to 1st in tertiary education attainment; Germany has the least progress; USA in the middle; Canada second in tertiary attainment; China has 12% of all graduates worldwide

The growth rate at the tertiary level has been relatively slow in the United States, where attainment was originally relatively high, and in Germany, which had lower levels of attainment. In contrast, Japan and Korea have made higher education dramatically more accessible. In both countries, among the cohort born 1933-42, only about one in ten had tertiary qualifications by late in their working lives. Among younger Japanese and Koreans, who reached graduation age around the turn of the millennium, most now have tertiary degrees. On this measure, Korea has moved from the 21st to the first rank among 25 OECD countries with comparable data. The United States now shows just over the average proportion of tertiary-level graduates at age 25-34. In Europe, Germany stands out as the country that has made the least progress: it has a population of tertiary graduates only around half the size, relative to its total population, of many of its neighbours.

Canada is just behind Korea in terms of the proportion of 25-34 year olds with a tertiary qualification (just under 60% – Chart A1.1, p. 30).

While the level of tertiary attainment in China is still low, because of the size of its population, China still holds some 12% of all tertiary graduates, compared with 11% in Japan and 26% in the USA.

How important are OECD indicators?

At one level, indicators are no more than a metric for gauging progress towards goals. Yet increasingly, they are performing a more influential role. Indicators can prompt change by raising national concern over weak educational outcomes compared to international benchmarks; sometimes, they can even encourage stronger countries to consolidate their positions. When indicators build a profile of high-performing education systems, they can also inform the design of improvements for weaker systems. 

Other stuff

Tons of it. There is a lot of data on upper secondary education graduation, on vocational education, and on the relationship between graduates and salaries.

What does it mean?

It would really help if the OECD produced a simple ‘key facts’ document or brief executive summary that highlighted the main, clear points to come from the study. Why am I having to do this? Surely the OECD should be confident in its own data. However, I suspect that producing a simple fact sheet would be difficult because of the rather abstruse and complicated methods used to produce comparable figures across different countries. That in turn raises the question of why all this is being done if it is not possible to come to clear conclusions for many of the statistics generated.

A lot of the data in this report are difficult to understand or interpret (and yes, I did study statistics at university, though I didn’t specialize in it). For instance, from what I can judge, undergraduate completion rates look very low, averaging around 39% for ‘abstract’ Tertiary-type A programs (and Canada is slightly below average). Are our completion rates really so low? Do barely two in five students graduate? If not, what is this table comparing?

I suspect that these figures represent the way the OECD statisticians have chosen to measure the graduation rate, but it is not at all clear how this is done without spending ages checking the original methodology, and even then I’m still not clear on the actual basis for this figure. (Surely the figure should represent the proportion of students who entered an undergraduate program and graduated within four, five or six years, or more; but the data isn’t collected that way.) It’s also complicated by the fact that in many countries a bachelor’s degree program is three years, whereas in others it’s four or even five. Rather than taking general national statistics from survey data and then trying to make them ‘fit’ questions the data were never intended to answer, these questions would be better addressed through specific studies. Canada in particular is a nightmare for such statistical comparisons, as we don’t have a national system, so the data have to be collected from each province, which again introduces more variables and inconsistency. And how meaningful is the distinction between type A universities and type B universities? Who decides?
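The cohort-based measure suggested above is easy enough to sketch in a few lines of code; the student records and the six-year threshold here are invented purely for illustration:

```python
# Sketch of a cohort-based completion rate: the share of an entering
# cohort that graduates within a given number of years. The student
# records below are invented for illustration.

def completion_rate(students, cohort_year, within_years=6):
    """Fraction of the cohort entering in cohort_year that graduated
    within within_years of entry."""
    cohort = [s for s in students if s["entered"] == cohort_year]
    completed = [
        s for s in cohort
        if s["graduated"] is not None
        and s["graduated"] - s["entered"] <= within_years
    ]
    return len(completed) / len(cohort)

students = [
    {"entered": 2004, "graduated": 2008},  # finished in four years
    {"entered": 2004, "graduated": 2010},  # finished in six years
    {"entered": 2004, "graduated": None},  # never graduated
    {"entered": 2005, "graduated": 2009},  # different cohort
]

print(completion_rate(students, 2004))  # two of three -> 0.666...
```

The point of the sketch is that this calculation needs longitudinal records that follow individual students from entry to graduation, which is exactly the data the OECD surveys don’t collect.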

Even more worrying is the comment in the report on how indicators have been driving government behaviour. If I were a minister, I’d like to have my own statistician with me before making any decisions based on these data, which really shouldn’t be necessary. For instance, is a degree from Germany the same as one from Slovakia? The OECD indicators suggest that it would be better to have more degrees at a lower standard than fewer degrees at a higher standard. Of course, no-one at the OECD would make that argument, but it is strongly implicit in the rankings. Never mind the quality, feel the width.

The attempt to gather cross-national data in a comparable way is commendable, and certainly the trends over a long period of time become more discernible, but it does sometimes feel as if the OECD is trying to do the impossible: to standardize results from very different systems by pushing them all through the same sausage machine.