August 14, 2018

Zuckerberg’s Frankenstein

© The Mind Reels

Prosecutor: Dr. Frankenberg, are you aware that there is a monster roaming the countryside, stealing all the villagers’ personal information?

Dr. Frankenberg: Yes, sir, I am.

Prosecutor: And is it true, Dr. Frankenberg, that you invented this monster, in your dorm room at Harvard?

Dr. Frankenberg (proudly): Yes, sir.

Prosecutor: And are you aware that your monster is going around selling the villagers’ personal information to any Tom, Dick or Harry who will buy it?

Dr. Frankenberg: Yes, sir, that’s why I invented the monster – it’s my business model.

Prosecutor: Has your business model been successful?

Dr. Frankenberg (smugly): Oh, yes, sir, it’s made me and my friends very rich. You see the monster sends all the money to me. I only need a few engineers to make sure the monster doesn’t break down – and of course some very good lawyers – so there’s a lot left over afterwards.

Prosecutor: And are you aware that the monster helped our new Emperor, Donald the Terrible, to become emperor?

Dr. Frankenberg: I was made aware of that only recently, though of course I had heard the rumours much earlier.

Prosecutor: So it was not your intent then that the monster should help Donald the Terrible?

Dr. Frankenberg: Absolutely not.

Prosecutor: And are you aware that hostile tribes outside the kingdom have used the monster to attack us?

Dr. Frankenberg: Yes, of course, that’s why I’m here – but honestly, I didn’t know about this until you did. And I made the monster get them to promise not to do that – but they are hostiles and didn’t keep their promise. 

Prosecutor: It seems to me that you don’t have much control over your monster.

Dr. Frankenberg (sighs): Look, you don’t understand how this works. You design something, you throw it out into the world, then wait to see what happens. Sometimes it’s good. Sometimes it’s bad. But there would be no way to make lots of money if you didn’t do this. If you tried to control it, you wouldn’t know what it could do.

Prosecutor: So you agree that your monster is now out of your control?

Dr. Frankenberg (frowns, drinks water): Not entirely. We tried using chains recently, but the monster is too strong – he keeps breaking them. But our engineers are working on it, believe me.

Prosecutor: Let me put this to you: you created the monster, so you are responsible for it, but you’ve not done enough to control it.

Dr. Frankenberg: That’s a bit unfair. How was I to know it would become so dangerous? I realise it now, but anyone can be smart after the event.

Prosecutor: Some of the Emperor’s advisers are suggesting that the government should try to control the monster. What are your views on that?

Dr. Frankenberg (shrugs): Well, good luck with that. You realise the monster is not just stealing from our villagers, but from everyone now – he’s all over the place. But if you think you can do it, don’t let me stop you.

Judge intervenes: Thank you, Prosecutor, Dr. Frankenberg. We’ll adjourn for today, but we’ll be back in court tomorrow. Dr. Frankenberg, I hope you will use this time to give some thought to how we can control your monster, because you should be aware that neither I nor the government has the slightest clue about how to do this.

Court adjourns.


Our responsibility in protecting institutional, student and personal data in online learning

Image: © Tony Bates, 2018

WCET (2018) Data Protection and Privacy Boulder CO: WCET

United States Attorney’s Office (2018) Nine Iranians Charged With Conducting Massive Cyber Theft Campaign On Behalf Of The Islamic Revolutionary Guard Corps New York: U.S. Department of Justice

With the recent publicity about unauthorised use of personal data on Facebook to manipulate elections in the USA and the U.K., and the above report about Iranians hacking universities for research results and intellectual property, everyone now has to take as much responsibility as possible for making sure personal data is secure and used only for authorised purposes.

This is particularly true for those of us working in online learning, where most of our interaction with students is online. Most institutions using learning management systems provide a secure area for student-instructor interactions – security is one reason why universities and colleges pay big bucks for IT systems, and making sure our student data and interactions are kept secure is a major reason for using a learning management system.

However, there are increasing reasons for working outside secure LMSs. Faculty and students now have blogs and wikis that are more open, although most require a password to allow for content to be added or comments to be made. ‘Good’ institutions ensure that student and faculty blogs and wikis are also protected from hacking. For instance, the University of British Columbia offers web and wiki facilities free of charge for all students and faculty and provides the security to support this. This blog is hosted by Contact North, which provides stronger security than I could as an individual or through an affordable commercial agency.

The problem comes when instructors and students start using unrestricted social media tools for instructional purposes. This all becomes ‘product’ for the social media companies and their advertisers (and very valuable product, given that university and college students are more likely to be high income earners after graduation).

I was an early adopter of Facebook, back in 2005, but within 12 months I became inactive. It was not a company I felt I could trust, even back in 2005. I have good news for Facebook addicts who want to get off Facebook – life, even within the online world, is perfectly manageable, enjoyable and effective without it. I still keep in touch with my family and friends perfectly well, and my professional life has if anything improved without Facebook.

Here I admit to being conflicted, as I am still a heavy user of Google Search (although I prefer to use Firefox rather than Chrome). I was influenced by Google’s corporate motto in its early days, ‘Don’t be evil’. Now Google Search is just one part of the umbrella company Alphabet, whose corporate motto is currently ‘Do the right thing’ – but for whom? It comes down more to pragmatics than ethics in the end. I can manage quite happily and easily without Facebook – I can’t without Google Search.

This points to the problem we have as individuals in a digital society. Our power to control the use of our personal data is quite limited. We are now at the point where government regulation unfortunately becomes a necessity. (I say unfortunately because regulation is likely to limit innovation and change to some extent – but then so do the semi-monopolies of Amazon, Alphabet, Apple and Facebook, at least for change outside their systems.)

In the meantime, WCET has come to our rescue with a very useful site which really contains all you need to know about privacy and security. As their site says:

This is not just an IT problem! A breach could occur from an unintentional action by non-technical staff or student that could expose personal or institutional data to criminals and place the institution at risk by merely using weak passwords, connecting to dangerous networks, or opening suspicious emails. All members of an academic community must be trained with data protection best practices to preserve the security of the institution.
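To make the ‘weak passwords’ point concrete, here is a minimal sketch – my own illustration, not taken from the WCET materials – of the kind of basic check an institution might run before accepting a password. The length threshold, character-class rule and the tiny breached-password list are all invented for the example:

```python
import re

# A real deployment would check against a large breached-password list
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein", "admin"}

def password_is_weak(password: str) -> bool:
    """Flag passwords that fail very basic strength heuristics."""
    if len(password) < 12:                      # too short
        return True
    if password.lower() in COMMON_PASSWORDS:    # on a known-breached list
        return True
    # Require a mix of character classes: lower case, upper case, digit
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]"]
    if sum(bool(re.search(c, password)) for c in classes) < 3:
        return True
    return False
```

Trivial as it is, even a check like this would catch most of the passwords involved in routine breaches; the harder part, as WCET stresses, is training people.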

The WCET site contains links to the following:

  • their Frontiers blog posts on privacy and security issues
  • links to relevant recorded webcasts
  • links to a number of tools and reports on improving/protecting cybersecurity.

Essential reading for us all.

Now forgive me while I go and change all my passwords. 

Developing a next generation online learning assessment system

Facial recognition

Universitat Oberta de Catalunya (2016) An Adaptive Trust-based e-assessment system for learning (@TeSLA) Barcelona: UOC

This paper describes a large, collaborative European Commission project headed by the Open University of Catalonia, called TeSLA – no, not a European electric car, but a state-of-the-art online assessment system intended to be accepted as equal to, if not better than, traditional face-to-face assessment in higher education.

The challenge

The project argues that at the moment there is no (European?) online assessment system that:

  • has the same level of trust as face-to-face assessment systems
  • is universally accepted by educational institutions, accreditation agencies and employers
  • incorporates pedagogical as well as technical features
  • integrates with other aspects of teaching and learning
  • provides true and secure ‘authentication’ of authorship.

I added the ‘European’, as I think this claim might come as a surprise to Western Governors University, which has been successfully using online proctoring for some time. It is also why I used the term ‘next generation’ in the heading, as the TeSLA project is aiming at something much more technologically advanced than the current WGU system, which consists mainly of a set of web cameras observing learners taking an assessment (click here for a demonstration).

Also, the TeSLA proposal makes a good point when it says any comprehensive online assessment system must also be able to handle formative as well as summative assessment, and that this can be a challenge as formative assessment is often embedded in the day-to-day teaching and learning activities.

But the main reason for this project is that online learning assessment currently lacks the credibility of face-to-face assessment.

The solution

A non-invasive system that is able to provide a quality continuous assessment model, using proportionate and necessary controls that will ensure student identity and authorship [in a way that offers] accrediting agencies and society unambiguous proof of academic progression….

Any solution must work fully online and take into account ‘academic requirements’ for assessment, including enriched feedback, adaptive learning, formative assessment and personalized learning.

This will require the use of technologies that provide reliable and accurate user authentication and identification of authorship, face and voice recognition, and keystroke dynamics recognition (see here for video examples of the proposed techniques).
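To give a flavour of what keystroke dynamics recognition involves, here is a deliberately crude sketch – my own illustration, not TeSLA code – that compares a typist’s inter-key timing profile against an enrolled profile. The tolerance value and the single-mean comparison are invented for the example; real systems model the timing of individual key pairs:

```python
from statistics import mean

def timing_profile(key_times):
    """Mean interval (in seconds) between successive keystrokes."""
    intervals = [b - a for a, b in zip(key_times, key_times[1:])]
    return mean(intervals)

def same_typist(enrolled_times, observed_times, tolerance=0.05):
    """Crude check: profiles 'match' if mean intervals fall within a
    tolerance. Real systems model per-digraph timings, not one mean."""
    return abs(timing_profile(enrolled_times)
               - timing_profile(observed_times)) <= tolerance

# Example: timestamps (in seconds) of each key press in a typed phrase
enrolled = [0.00, 0.18, 0.35, 0.55, 0.71]
observed = [0.00, 0.20, 0.37, 0.56, 0.75]
```

Even this toy version shows why the approach is attractive for continuous authentication: the signal is collected passively, with no extra hardware, while the student simply types.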

The solution must result in

a system based on demonstrable trust between the institution and its students. Student trust is continuously updated according to their interaction with the institution, such as analysis of their exercises, peer feedback in cooperative activities or teacher confidence information. Evidence is continuously collected and contrasted in order to provide such unambiguous proof.

The players

The participants in this project include

  • eight universities,
  • four research centres,
  • three educational quality assurance agencies,
  • three technology companies,
  • drawn from twelve different countries.

In total the project will have a team of about 80 professionals and will use large-scale pilots involving over 14,000 European students.

Comment

I think this is a very interesting project and is likely to grab a lot of attention. At the end of the day, there could well be some significant improvements to online assessment that will actually transfer to multiple online courses and programs.

However, I spent many years working on large European Commission projects and I am certainly glad I don’t have to do that any more. Quite apart from the truly mindless bureaucracy that always accompanies such projects (the form-filling is vast and endless), there are real challenges in getting together participants who can truly contribute to such a project. Participants are determined more by political considerations, such as regional representation, than by technical competence. Such projects in the end are largely driven by two or three key players; the remaining participants are more likely to slow down or inhibit the project, and they certainly divert essential funding away from those most able to make the project succeed. However, these projects are as much about raising the level of all European countries in terms of learning technologies as about becoming a world leader in this field.

These criticisms apply to any of the many European Commission projects, but there are some issues that are particular to this project:

  1. I am not convinced that there is a real problem here, or at least a problem that requires better technology as a solution. Assessment for online learning has been successfully implemented now for more than 20 years, and while it mostly depends on some form of face-to-face invigilation, this has not proved a major acceptability problem or a barrier to online enrolments. There will always be those who do not accept the equivalence of online learning, and the claimed shortcomings of online assessment are just another excuse for non-acceptance of online learning in general.
  2. Many of the problems of authenticity and authorship are the same for face-to-face assessment. Cheating is not exclusive to online learning, nor is there any evidence that it is more prevalent in online learning where it is provided by properly accredited higher education institutions. Such a study is just as likely to reduce rather than increase trust in online learning by focusing attention on an issue that has not been a big problem to date.
  3. Even if this project does result in more ‘trustworthy’ online assessment, there are huge issues of privacy and security of data involved, not to mention the likely cost to institutions. Perhaps the most useful outcome from this project will be a better understanding of these risks, and development of protocols for protecting student privacy and the security of the data collected for this purpose. I wish though that a privacy commissioner was among the eighteen different participants in this project. I fail to see how such a project could be anything but invasive for students, most of whom will be assessed from home.

For all these reasons, this project is well worth tracking. It has the potential to radically change the way we not only assess online learners, but also how we teach them, because assessment always drives learner behaviour. Whether such changes will be on balance beneficial though remains to be seen.

Keyboard dynamics

Appropriate interventions following the application of learning analytics

Humble Pie 2

SAIDE (2015) Siyaphumelela Inaugural Conference May 14th – 15th 2015 SAIDE Newsletter, Vol. 21, No.3

Reading sources in the right order can save you from having to eat humble pie. Immediately after posting Privacy and the Use of Learning Analytics, in which I questioned the ability of learning analytics to suggest appropriate interventions, I came across this article in the South African Institute of Distance Education’s (SAIDE) newsletter about a conference in South Africa on Exploring the potential of data analytics to inform improved practice in higher education: connecting data and people.

At this conference, Professor Tim Renick, Vice-President of Georgia State University in the USA, reported on his institution’s accomplishment of eliminating race and income as a predictor of student success.

This has been achieved through implementing various initiatives based on data mining of twelve years’ worth of student data. The university’s early warning system, based on predictive analysis, has spawned a number of tested and refined low cost, scalable, innovative programmes such as:

  • supplemental instruction by former successful students;
  • formation of freshman learning communities, which entail groups of 25 students enrolled in “meta-majors”;
  • block scheduling of courses;
  • re-tooled pedagogies involving adaptive learning software;
  • small, prudent financial retention grants.

The combination of the above has resulted in phenomenally reduced student attrition.

I have no further comment (for once!). I would though be interested in yours.

Incidentally, there were other interesting articles in the SAIDE newsletter, each with important lessons that go far beyond the individual cases themselves. Well worth reading.

 

Privacy and the use of learning analytics

Image: from Michael Radford’s movie, 1984 – Big Brother is watching you!

Warrell, H. (2015) Students under surveillance Financial Times, July 24

Applications of learning analytics

This is a thoughtful article in the Financial Times about the pros and cons of using learning analytics, drawing on applications from the U.K. Open University, Dartmouth College in the USA, student monitoring service Skyfactor, and CourseSmart, a Silicon Valley start-up that gives universities a window into exactly how e-textbooks are being read.

The UK Open University is using learning analytics to identify students at risk as early as a week into a course.

An algorithm monitoring how much the new recruits have read of their online textbooks, and how keenly they have engaged with web learning forums, will cross-reference this information against data on each person’s socio-economic background. It will identify those likely to founder and pinpoint when they will start struggling. Throughout the course, the university will know how hard students are working by continuing to scrutinise their online reading habits and test scores.
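The kind of early-warning algorithm the FT describes can be sketched very simply. The following is my own toy illustration, not the OU’s actual model: a logistic-style score combining a few engagement signals into a probability-like risk number. The weights and threshold are invented for the example, and the socio-economic cross-referencing the article mentions is omitted for simplicity:

```python
import math

def risk_score(pages_read_frac, forum_posts, weeks_inactive):
    """Toy logistic model combining engagement signals into a 0-1
    risk score. Weights are invented for illustration only."""
    # More reading and forum activity lower risk; inactivity raises it
    z = 1.5 - 3.0 * pages_read_frac - 0.4 * forum_posts + 1.2 * weeks_inactive
    return 1 / (1 + math.exp(-z))

def flag_at_risk(students, threshold=0.5):
    """Return the ids of students whose risk score exceeds the threshold.
    `students` maps id -> (pages_read_frac, forum_posts, weeks_inactive)."""
    return [sid for sid, feats in students.items()
            if risk_score(*feats) > threshold]
```

Note that everything interesting here sits in the weights, which a real system would fit to years of historical outcome data – and, as I argue below, even a well-fitted score only tells you *who* may struggle, not *what to do* about it.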

The article also discusses Dartmouth College’s mobile phone app which:

tracks how long students spend working, socialising, exercising and sleeping. The information is used to understand how behaviour affects grades, and to tailor feedback on how students can improve their results.

The article also tries to get a handle on student attitudes to this form of monitoring or surveillance. Not surprisingly, students appear to be somewhat ambivalent about learning analytics and differ in their acceptance of being monitored.

Rationalisations

What was particularly interesting is the range of justifications given in this article for monitoring student behaviour through data analysis:

  • the most obvious is to identify students at risk, so that appropriate interventions can be made. However, no examples of appropriate interventions were given in the article, highlighting the fact that it is one thing to identify a problem and quite another to know what to do about it. For instance, we know from previous research that students from particular socio-economic or ethnic backgrounds are potentially more at risk than others. What does this mean though in terms of teaching and learning? If you know this is a challenge before students start studying, why wait for learning analytics to identify it as a problem?
  • the next argument is the need to ensure that the high investment each student (or their parents) makes in higher education is not wasted by a failure to complete a program. Because of the high cost, fear of failure is increasing student stress. At Dartmouth, a third of the undergraduate student body saw mental health counsellors last year. However, the solution to that may not be better learning analytics, but finding ways to finance students that don’t lead to such stress in the first place;
  • another rationale is to reduce the financial risk to an institution. The Chief Technology Officer at Skyfactor argues that with revenues from tuition fees of around $25,000+ per student per annum in the USA, avoiding student drop-out is a financial necessity for many U.S. institutions. However, surely there is a moral necessity as well in ensuring that your students don’t fail.

Making sense of learning analytics

The Open University has collected data on students ever since it started. In fact, McIntosh, Calder and Swift (1976) found that statistically the best predictor of success was whether a student returned a questionnaire in the first week of a course, as this indicated their commitment. It still didn’t tell you what to do about the students who didn’t return the questionnaire. (In fact, the OU’s solution at the time was not to count anyone as an enrolment until they had completed an assignment two weeks into the course – advice that MOOC proponents might pay attention to).

As with so many technology developments, the issue is not so much the technology but how the technology is used, and for what purposes. Conscientious instructors have always tried to track or monitor the progress of individual students and learning analytics merely provides a more quantitative and measurable way of tracking progress. The issue though is whether the data you can track and measure can offer solutions when students do run into trouble.

My fear is that learning analytics will replace the qualitative assessment that an instructor gets from, for instance, participating in a live student discussion, monitoring an online discussion forum, or marking assignments. This is more likely to identify the actual conceptual or learning problems that students are having, and is more likely to provide clues to the instructor about what needs to be done to address them. Indeed, in a discussion the instructor may be able to deal with a problem on the spot, without waiting for the data analysis. Whether a student chooses to study late at night, for instance, or reads only part of a textbook, might correlate weakly with poorer performance, but recommending that students not stay up late, or that they read the whole textbook, may not be the appropriate response for any individual student – and, more importantly, may well fail to identify key problems with the teaching or learning.

Who gets to use the data?

Which brings me to my last point. Ruth Tudor, president of the Open University’s Students’ Association, reported that:

when the data analytics programme was first mooted, participants were “naturally” anxious about the university selling the information it collected to a third party.

The OU has given strong assurances that it will not do this, but there is growing concern that as higher education institutions come to rely more on direct funding and less on government support, they will be tempted to raise revenues by selling data to third parties such as advertisers. As Andrew Keen has argued, this is a particular concern with MOOCs, which rely on means other than direct fees for financial support.

Thus it is incumbent on institutions using learning analytics to have very strong and well enforced policies about student privacy and the use of student data. The problem then is that this can easily lead to instructors being denied access to the very data that is of most value in identifying student learning difficulties and possible solutions. Finding the right balance, or applying common sense, is not going to be easy in this area.

Reference

McIntosh, N., Calder, J. and Swift, B. (1976) A Degree of Difference New York: Praeger