May 29, 2017

Developing a next generation online learning assessment system

Facial recognition

Universitat Oberta de Catalunya (2016) An Adaptive Trust-based e-assessment system for learning (@TeSLA) Barcelona: UOC

This paper describes a large, collaborative European Commission project headed by the Open University of Catalonia, called TeSLA, (no, not to develop a European electric car, but) a state-of-the-art online assessment system that will be accepted as equal to if not better than traditional face-to-face assessment in higher education.

The challenge

The project argues that at the moment there is no (European?) online assessment system that:

  • has the same level of trust as face-to-face assessment systems
  • is universally accepted by educational institutions, accreditation agencies and employers
  • incorporates pedagogical as well as technical features
  • integrates with other aspects of teaching and learning
  • provides true and secure ‘authentication’ of authorship.

I added the ‘European’, as I think this claim might come as a surprise to Western Governors University, which has been successfully using online proctoring for some time. It is also why I used the term ‘next generation’ in the heading, as the TeSLA project is aiming at something much more technologically advanced than the current WGU system, which consists mainly of a set of web cameras observing learners taking an assessment (click here for a demonstration).

Also, the TeSLA proposal makes a good point when it says any comprehensive online assessment system must also be able to handle formative as well as summative assessment, and that this can be a challenge as formative assessment is often embedded in the day-to-day teaching and learning activities.

But the main reason for this project is that online learning assessment currently lacks the credibility of face-to-face assessment.

The solution

A non-invasive system that is able to provide a quality continuous assessment model, using proportionate and necessary controls that will ensure student identity and authorship [in a way that offers] accrediting agencies and society unambiguous proof of academic progression….

Any solution must work fully online and take into account ‘academic requirements’ for assessment, including enriched feedback, adaptive learning, formative assessment and personalized learning.

This will require the use of technologies that provide reliable and accurate user authentication and identification of authorship, face and voice recognition, and keystroke dynamics recognition (see here for video examples of the proposed techniques).
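Keystroke dynamics is probably the least familiar of these techniques, so a brief illustration may help. The sketch below is a minimal, invented example of the general idea (enrol a user's typing rhythm, then compare a fresh sample against it); the function names, timings and tolerance value are all my own assumptions, not TeSLA's actual implementation.

```python
# Illustrative sketch of keystroke-dynamics verification (not TeSLA's actual code).
# A user's typing rhythm is enrolled as mean "flight times" (gaps between key
# presses); a new sample is accepted if its timings stay close to the profile.

def flight_times(key_press_timestamps):
    """Gaps (in seconds) between successive key presses."""
    return [b - a for a, b in zip(key_press_timestamps, key_press_timestamps[1:])]

def enroll(samples):
    """Average the flight times of several typing samples into a profile."""
    all_times = [flight_times(s) for s in samples]
    n = min(len(t) for t in all_times)
    return [sum(t[i] for t in all_times) / len(all_times) for i in range(n)]

def verify(profile, sample, tolerance=0.05):
    """Accept the sample if mean absolute deviation from the profile is small."""
    times = flight_times(sample)
    n = min(len(profile), len(times))
    deviation = sum(abs(profile[i] - times[i]) for i in range(n)) / n
    return deviation <= tolerance

# Two enrolment samples of the same passphrase, then a genuine and an impostor attempt
enrolled = enroll([[0.0, 0.12, 0.25, 0.40], [0.0, 0.11, 0.24, 0.41]])
print(verify(enrolled, [0.0, 0.13, 0.26, 0.39]))  # similar rhythm: accepted
print(verify(enrolled, [0.0, 0.40, 0.90, 1.60]))  # very different rhythm: rejected
```

Real systems combine many more features (dwell times, digraph statistics, error rates) and learned, per-user thresholds, but the comparison-against-an-enrolled-profile structure is the same.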

The solution must result in

a system based on demonstrable trust between the institution and its students. Student trust is continuously updated according to their interaction with the institution, such as analysis of their exercises, peer feedback in cooperative activities or teacher confidence information. Evidence is continuously collected and contrasted in order to provide such unambiguous proof.

The players

The participants in this project, drawn from twelve different countries, include:

  • eight universities
  • four research centres
  • three educational quality assurance agencies
  • three technology companies

In total the project will have a team of about 80 professionals and will use large-scale pilots involving over 14,000 European students.

Comment

I think this is a very interesting project and is likely to grab a lot of attention. At the end of the day, there could well be some significant improvements to online assessment that will actually transfer to multiple online courses and programs.

However, I spent many years working on large European Commission projects and I am certainly glad I don’t have to do that any more. Quite apart from the truly mindless bureaucracy that always accompanies such projects (the form-filling is vast and endless), there are real challenges in getting together participants who can truly contribute to such a project. Participants are determined more by political considerations, such as regional representation, than by technical competence. Such projects in the end are largely driven by two or three key players; the remaining participants are more likely to slow down or inhibit the project, and they certainly divert essential funding away from those most able to make the project succeed. However, these projects are as much about raising the level of all European countries in terms of learning technologies as about becoming a world leader in this field.

These criticisms apply to any of the many European Commission projects, but there are some issues that are particular to this project:

  1. I am not convinced that there is a real problem here, or at least a problem that requires better technology as a solution. Assessment for online learning has been successfully implemented now for more than 20 years, and while it mostly depends on some form of face-to-face invigilation, this has not proved a major acceptability problem or a barrier to online enrolments. There will always be those who do not accept the equivalence of online learning, and the claimed shortcomings of online assessment are just another excuse for non-acceptance of online learning in general.
  2. Many of the problems of authenticity and authorship are the same for face-to-face assessment. Cheating is not exclusive to online learning, nor is there any evidence that it is more prevalent in online learning where it is provided by properly accredited higher education institutions. Such a study is just as likely to reduce rather than increase trust in online learning by focusing attention on an issue that has not been a big problem to date.
  3. Even if this project does result in more ‘trustworthy’ online assessment, there are huge issues of privacy and security of data involved, not to mention the likely cost to institutions. Perhaps the most useful outcome from this project will be a better understanding of these risks, and development of protocols for protecting student privacy and the security of the data collected for this purpose. I wish though that a privacy commissioner was among the eighteen different participants in this project. I fail to see how such a project could be anything but invasive for students, most of whom will be assessed from home.

For all these reasons, this project is well worth tracking. It has the potential to radically change the way we not only assess online learners, but also how we teach them, because assessment always drives learner behaviour. Whether such changes will be on balance beneficial though remains to be seen.

Keystroke dynamics

Appropriate interventions following the application of learning analytics

Humble Pie 2

SAIDE (2015) Siyaphumelela Inaugural Conference May 14th – 15th 2015 SAIDE Newsletter, Vol. 21, No.3

Reading sources in the right order can save you from having to eat humble pie. Immediately after posting Privacy and the Use of Learning Analytics, in which I questioned the ability of learning analytics to suggest appropriate interventions, I came across this article in the South African Institute of Distance Education’s (SAIDE) newsletter about a conference in South Africa on Exploring the potential of data analytics to inform improved practice in higher education: connecting data and people.

At this conference, Professor Tim Renick, Vice-President of Georgia State University in the USA, reported on his institution’s accomplishment of eliminating race and income as a predictor of student success.

This has been achieved through implementing various initiatives based on data mining of twelve years’ worth of student data. The university’s early warning system, based on predictive analysis, has spawned a number of tested and refined low cost, scalable, innovative programmes such as:

  • supplemental instruction by former successful students;
  • formation of freshman learning communities, which entail groups of 25 students enrolled in “meta-majors”;
  • block scheduling of courses;
  • re-tooled pedagogies involving adaptive learning software;
  • small, prudent financial retention grants.

The combination of the above has resulted in phenomenally reduced student attrition.

I have no further comment (for once!). I would though be interested in yours.

Incidentally, there were other interesting articles in the SAIDE newsletter. Each of these reports has important lessons for those interested in these issues that go far beyond the individual cases themselves. Well worth reading.

 

Privacy and the use of learning analytics

Image: from Michael Radford’s movie, 1984 – Big Brother is watching you!

Warrell, H. (2015) Students under surveillance Financial Times, July 24

Applications of learning analytics

This is a thoughtful article in the Financial Times about the pros and cons of using learning analytics, drawing on applications from the U.K. Open University, Dartmouth College in the USA, student monitoring service Skyfactor, and CourseSmart, a Silicon Valley start-up that gives universities a window into exactly how e-textbooks are being read.

The UK Open University is using learning analytics to identify students at risk as early as a week into a course.

An algorithm monitoring how much the new recruits have read of their online textbooks, and how keenly they have engaged with web learning forums, will cross-reference this information against data on each person’s socio-economic background. It will identify those likely to founder and pinpoint when they will start struggling. Throughout the course, the university will know how hard students are working by continuing to scrutinise their online reading habits and test scores.
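For readers curious what such an algorithm might look like under the hood, here is a deliberately toy sketch of a logistic early-warning score. The feature names, weights and bias are all invented for illustration; a real system of the kind the OU describes would estimate them from years of historical student data.

```python
import math

# Toy early-warning score of the kind described above. Every feature name,
# weight and the bias are invented; a production system would learn them
# from historical student records.

WEIGHTS = {
    "fraction_of_textbook_read": -2.5,   # more reading, lower risk
    "forum_posts_per_week": -0.8,        # more engagement, lower risk
    "low_income_background": 0.9,        # flags where extra support may be needed
}
BIAS = 1.0

def risk_of_struggling(student):
    """Logistic model: probability in [0, 1] that the student will struggle."""
    z = BIAS + sum(WEIGHTS[f] * student.get(f, 0.0) for f in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

engaged = {"fraction_of_textbook_read": 0.9, "forum_posts_per_week": 3.0}
disengaged = {"fraction_of_textbook_read": 0.1, "forum_posts_per_week": 0.0,
              "low_income_background": 1.0}
print(round(risk_of_struggling(engaged), 2))     # low risk
print(round(risk_of_struggling(disengaged), 2))  # high risk
```

Note what the sketch makes obvious: the model outputs a probability, nothing more. It says who might struggle and roughly when, but nothing about what intervention would help, which is precisely the gap discussed below.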

The article also discusses Dartmouth College’s mobile phone app which:

tracks how long students spend working, socialising, exercising and sleeping. The information is used to understand how behaviour affects grades, and to tailor feedback on how students can improve their results.

The article also tries to get a handle on student attitudes to this form of monitoring or surveillance. Not surprisingly, students appear to be somewhat ambivalent about learning analytics and differ in their acceptance of being monitored.

Rationalisations

What was particularly interesting is the range of justifications given in this article for monitoring student behaviour through data analysis:

  • the most obvious is to identify students at risk, so that appropriate interventions can be made. However, there weren’t any examples given in the article of appropriate interventions, highlighting the fact that it is one thing to identify a problem and quite another to know what to do about it. For instance, we know from previous research that students from particular socio-economic backgrounds or students from particular ethnic backgrounds are potentially more at risk than others. What does this mean though in terms of teaching and learning? If you know this is a challenge before students start studying, why wait for learning analytics to identify it as a problem?
  • the next argument is the need to ensure that the high investment each student (or their parents) makes in higher education is not wasted by a failure to complete a program. Because of the high cost, fear of failure is increasing student stress. At Dartmouth, a third of the undergraduate student body saw mental health counsellors last year. However, the solution to that may not be better learning analytics, but finding ways to finance students that don’t lead to such stress in the first place;
  • another rationale is to reduce the financial risk to an institution. The Chief Technology Officer at Skyfactor argues that with revenues from tuition fees of around $25,000+ per student per annum in the USA, avoiding student drop-out is a financial necessity for many U.S. institutions. However, surely there is a moral necessity as well in ensuring that your students don’t fail.

Making sense of learning analytics

The Open University has always collected data on students since it started. In fact, McIntosh, Calder and Swift (1976) found that statistically, the best predictor of success was whether a student returned a questionnaire in the first week of a course, as this indicated their commitment. It still didn’t tell you what to do about the students who didn’t return the questionnaire. (In fact, the OU’s solution at the time was not to count anyone as an enrolment until they had completed an assignment two weeks into the course – advice that MOOC proponents might pay attention to).

As with so many technology developments, the issue is not so much the technology but how the technology is used, and for what purposes. Conscientious instructors have always tried to track or monitor the progress of individual students and learning analytics merely provides a more quantitative and measurable way of tracking progress. The issue though is whether the data you can track and measure can offer solutions when students do run into trouble.

My fear is that learning analytics will replace the qualitative assessment that an instructor gets from, for instance, participating in a live student discussion, monitoring an online discussion forum, or marking assignments. This is more likely to identify the actual conceptual or learning problems that students are having and is more likely to provide clues to the instructor about what needs to be done to address the learning issues. Indeed in a discussion the instructor may be able to deal with it on the spot and not wait for the data analysis. Whether a student chooses to study late at night, for instance, or only reads part of a textbook, might provide a relatively weak correlation with poorer student performance, but recommending students not to stay up late or to read all the textbook may not be the appropriate response for any individual student, and more importantly may well fail to identify key problems with the teaching or learning.
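To make the ‘weak correlation’ point concrete, here is a small invented example: Pearson’s r computed over fabricated late-night study hours and exam scores. The data are made up purely to show how little such a signal can say about any individual student.

```python
# Illustration of the "weak correlation" point: hours studied after midnight vs
# exam score for a handful of invented students. Pearson's r quantifies the
# strength of the linear association between the two.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

late_night_hours = [0, 1, 2, 3, 4, 5, 6, 7]
exam_scores =      [72, 68, 75, 70, 66, 74, 69, 71]
r = pearson_r(late_night_hours, exam_scores)
print(round(r, 2))  # close to zero: a weak association
```

An r this close to zero would tell an instructor almost nothing actionable about any one student, which is exactly why a quantitative dashboard is no substitute for the qualitative judgement described above.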

Who gets to use the data?

Which brings me to my last point. Ruth Tudor, president of the Open University’s Students’ Association, reported that:

when the data analytics programme was first mooted, participants were “naturally” anxious about the university selling the information it collected to a third party.

The OU has given strong assurances that it will not do this, but there is growing concern that as higher education institutions come to rely more on direct funding and less on government support, they will be tempted to raise revenues by selling data to third parties such as advertisers. As Andrew Keen has argued, this is a particular concern about MOOCs, which rely on means other than direct fees for financial support.

Thus it is incumbent on institutions using learning analytics to have very strong and well enforced policies about student privacy and use of student data. The problem though is that this can easily lead to instructors being denied access to the very data which is of most value in identifying student learning difficulties and possible solutions. Finding the right balance, or applying common sense, is not going to be easy in this area.

Reference

McIntosh, N., Calder, J. and Swift, B. (1976) A Degree of Difference New York: Praeger

 

39 questions to ask when choosing media for teaching and learning

© bewareofimages.com, 2011

Yeah, 39 questions is a lot, but then there are a lot of things to take into consideration. I pulled together the key questions for consideration from Chapter 9 (just published) of my open textbook, ‘Teaching in a Digital Age.’

Take a look at them, then tell me:

(a) what have I missed

(b) what you would leave out

(c) if this is a futile exercise

These questions should be used in conjunction with Chapter 9, and address a real context that you may be facing, such as designing a new course.

It is recommended you work through each question one by one, possibly making notes of your answers. It is also recommended that you do this in a fairly systematic manner the first two or three times you are faced with a possible choice of media for a whole course or program. This could take a few days, allowing time for thinking. Some questions may need to wait until other questions have been answered. It is likely to be an iterative process.

After you have worked through the questions, give yourself a day or two if possible before thinking about what media or technology will best fit with your course or program. Discuss your thoughts about media use with other instructors and with any professionals such as an instructional designer or media designer before the design of the course. Leave yourself open to revising decisions as you start designing, developing and delivering the course, with the option of checking back with your notes and the more detailed discussion in Chapter 9.

After the first two or three times of working through the questions, you will be able to be less systematic and quicker in making decisions, but the questions and answers to the questions should always be in your head when making decisions about media for teaching.

Students

1. What is the mandate or policy of your institution, department or program with respect to access? How will students who do not have access to a chosen technology be supported?

2. What are the likely demographics of the students you will be teaching? How appropriate is the technology you are thinking of using for these students?

3. If your students are to be taught at least partly off campus, to which technologies are they likely to have convenient and regular access at home or work?

4. If they are to be taught at least partly on campus, what is – or should be – your or your department’s policy with regard to students’ access to learning technologies in class?

5. What digital skills do you expect your students to have before they start the program?

6. If students are expected to provide their own access to technology, will you be able to provide unique teaching experiences that will justify the purchase or use of such technology?

7. What prior approaches to learning are the students likely to bring to your program? How suitable are such prior approaches to learning likely to be to the way you need to teach the course? How could technology be used to cater for student differences in learning?

Ease of use

8. How intuitively easy to use is the technology you are considering, both by students and by yourself?

9. How reliable is the technology?

10. How easy is it to maintain and upgrade the technology?

11. The company that is providing the critical hardware or software you are using: is it a stable company that is not likely to go out of business in the next year or two, or is it a new start-up? What strategies are in place to secure any digital teaching materials you create should the organisation providing the software or service cease to exist?

12. Do you have adequate technical and professional support, both in terms of the technology and with respect to the design of materials?

13. How fast developing is this subject area? How important is it to regularly change the teaching materials? Which technology will best support this?

14. To what extent can the changes be handed over to someone else to do, and/or how essential is it for you to do them yourself?

15. What rewards are you likely to get for using new technology in your teaching? Will use of a new technology be the only innovation, or can you also change your way of teaching with this technology to get better results?

16. What are the risks in using this technology?

Cost/your time

17. Which media are likely to take a lot of your time to develop? Which could you do quickly and easily?

18. How much time do you spend preparing lectures? Could that time be better spent preparing learning materials, then using the time saved from delivering lectures on interaction with students (online and/or face-to-face)?

19. Is there a possibility of extra funding for innovative teaching or technology applications? How could you best use that funding?

20. What kind of help can you get in your institution from instructional designers and media professionals for media design and development?

21. What open educational resources could be used for this course? Could you use an open textbook, thereby saving students the cost of buying textbooks? Can the library or your learning technology support group help identify potential OERs for your course?

Teaching/educational factors

22. What are the desired learning outcomes from the teaching in terms of content and skills?

23. What instructional strategies will be employed to facilitate the learning outcomes?

24. What unique pedagogical characteristics of text will be appropriate for this course, in terms of content presentation and skills development?

25. What unique pedagogical characteristics of audio will be appropriate for this course, in terms of content presentation and skills development?

26. What unique pedagogical characteristics of video will be appropriate for this course, in terms of content presentation and skills development?

27. What unique pedagogical characteristics of computing will be appropriate for this course, in terms of content presentation and skills development?

28. What unique pedagogical characteristics of social media will be appropriate for this course, in terms of content presentation and skills development?

29. What really must be done face-to-face on this course? (Are you sure? Think about it!)

Interaction

30. In terms of the skills you are trying to develop, what kinds of interaction will be most useful? What media or technology could you use to facilitate that kind of interaction?

31. In terms of the effective use of your time, what kinds of interaction will produce a good balance between student comprehension and student skills development, and the amount of time you will be interacting personally or online with students?

Organisational issues

32. How much and what kind of help can you get from the institution in choosing and using media for teaching? Is help easily accessible? How good is the help? Do they have the media professionalism you will need? Are they up to date in the use of new technologies for teaching?

33. Is there possible funding available to ‘buy you out’ for a semester and/or to fund a teaching assistant so you can concentrate on designing a new course or revising an existing course? Is there funding for media production?

34. To what extent will you have to follow ‘standard’ technologies, practices and procedures, such as using a learning management system, or lecture capture system, or will you be encouraged and supported to try something new?

Networking

35. How important is it to enable learners to network beyond a course, with others such as subject specialists, professionals in the field, and relevant people in the community? Can the course, or student learning, benefit from such external connections?

36. If this is important, what’s the best way to do this? Use social media exclusively? Integrate it with other standard course technology? Delegate responsibility for its design and/or administration to students or learners?

Security and privacy

37. What student information are you obliged to keep private and secure? What are your institution’s policies on this? Who would know?

38. What is the risk that by using a particular technology your institution’s policies concerning privacy could easily be breached? Who in your institution could advise you on this?

39. What areas of teaching and learning, if any, need you keep behind closed doors, available only to students registered in your course? Which technologies will best allow you to do this?

 

Balancing the use of social media and privacy protection in online learning

Figure 9.9 Privacy ranking by Privacy International, 2007
Red: Endemic surveillance societies
Strong yellow: Systemic failure to uphold safeguards
Pale yellow: Some safeguards but weakened protections
http://en.wikipedia.org/wiki/Privacy#mediaviewer/File:Privacy_International_2007_privacy_ranking_map.png

This is the last of the SECTIONS criteria for selecting media for my online open textbook, Teaching in a Digital Age. The last ‘S’ stands for Security and Privacy.

This is a change from earlier versions of the SECTIONS model, where ‘S’ stood for speed, in terms of how quickly a technology enabled a course to be developed. However, the issues that I previously raised under speed have been included in Section 9.3, ‘Ease of Use’. This has allowed me to replace ‘Speed’ with ‘Security and privacy’, which have become increasingly important issues for education in a digital age.

9.9.1 The need for privacy and security when teaching

Instructors and students need a private place to work online. Instructors want to be able to criticize politicians or corporations without fear of reprisal; students may want to keep rash or radical comments from going public or will want to try out perhaps controversial ideas without having them spread all over Facebook. Institutions want to protect students from personal data collection for commercial purposes by private companies, tracking of their online learning activities by government agencies, or marketing and other unrequested commercial or political interruption to their studies. In particular, institutions want to protect students, as far as possible, from online harassment or bullying. Creating a strictly controlled environment enables institutions to manage privacy and security more effectively.

Learning management systems provide password-protected access to registered students and authorised instructors. They were originally housed on servers managed by the institution itself, and password-protected LMSs on secure servers have provided that protection. Institutional policies regarding appropriate online behaviour can also be managed more easily if communications are handled ‘in-house.’

9.9.2 Cloud based services and privacy

However, in recent years, more and more online services have moved ‘to the cloud’, hosted on massive servers whose physical location is often unknown even to the institution’s IT services department. Contract agreements between an educational institution and the cloud service provider are meant to ensure security and back-ups.

Nevertheless, Canadian institutions and privacy commissioners have been particularly wary of data being hosted out of country, where it may be accessed through the laws of another country. There has been concern that Canadian student information and communications held on cloud servers in the USA may be accessible via the U.S. Patriot Act. For instance, Klassen (2011) writes:

Social media companies are almost exclusively based in the United States, where the provisions of the Patriot Act apply no matter where the information originates. The Patriot Act allows the U.S. government to access the social media content and the personally identifying information without the end users’ knowledge or consent.
The government of British Columbia, concerned with both the privacy and security of personal information, enacted a stringent piece of legislation to protect the personal information of British Columbians. The Freedom of Information and Protection of Privacy Act (FIPPA) mandates that no personally identifying information of British Columbians can be collected without their knowledge and consent, and that such information not be used for anything other than the purpose for which it was originally collected.

Concerns about student privacy increased even further when it became known that countries were sharing intelligence information, so there remains a risk that even student data on Canadian-based servers may well be shared with foreign countries.

Perhaps of more concern though is that as instructors and students increasingly use social media, academic communication becomes public and ‘exposed’. Bishop (2011) discusses the risks to institutions in using Facebook:

  • privacy is different from security, in that security is primarily a technical, and hence mainly an IT, issue; privacy needs a different set of policies involving a much wider range of stakeholders within an institution, and hence a different (and more complex) governance approach from security;
  • many institutions do not have a simple, transparent set of policies for privacy, but different policies set by different parts of the institution, which inevitably leads to confusion and difficulties in compliance;
  • there is a whole range of laws and regulations that aim to protect privacy; these cover not only students but also staff, so privacy policy needs to be consistent across the institution and compliant with such laws and regulations;
  • Facebook’s current privacy policy (2011) leaves many institutions using Facebook at a high level of risk of infringing or violating privacy laws – merely writing some kind of disclaimer will in many cases not be sufficient to avoid breaking the law.

The controversy at Dalhousie University where dental students used Facebook for violent sexist remarks about their fellow women students is an example of the risks endemic in the use of social media.

9.9.3 The need for balance

Although there may well be some areas of teaching and learning where it is essential to operate behind closed doors, such as some areas of medicine, areas related to public security, or discussion of sensitive political or moral issues, in general there have been relatively few privacy or security problems when teachers and instructors have opened up their courses, have followed institutional privacy policies, and above all where students and instructors have used common sense and behaved ethically. Nevertheless, as teaching and learning become more open and public, the level of risk does increase.

9.9.4 Questions for consideration

1. What student information am I obliged to keep private and secure? What are my institution’s policies on this?

2. What is the risk that by using a particular technology my institution’s policies concerning privacy could easily be breached? Who in my institution could advise me on this?

3. What areas of teaching and learning, if any, need I keep behind closed doors, available only to students registered in my course? Which technologies will best allow me to do this?

Over to you

1. I couldn’t find more recent references on this issue than 2011, when it seemed to be a hot topic. Has anything significantly changed with regard to privacy and social media in education since 2011 that I should be aware of? Or have our institutions nailed it regarding sensible policies and practices? (Did I hear guffaws?) References would be particularly welcome.

2. If anyone would like to share their experiences regarding privacy issues as a result of using social media for teaching, please either send me an e-mail (for privacy reasons) or share a comment on this post.

Up next

The final section of Chapter 9: Making decisions about what media to use. This will suggest a relatively simple approach to what is in effect a highly complex topic.

Yes, I know, you just can’t wait for this final episode. Keep tuned to this station.

References

Bishop, J. (2011) Facebook Privacy Policy: Will Changes End Facebook for Colleges? The Higher Ed CIO, October 4

Klassen, V. (2011) Privacy and Cloud-Based Educational Technology in British Columbia Vancouver BC: BCCampus

See also:

Bates, T. (2011) Cloud-based educational technology and privacy: a Canadian perspective, Online Learning and Distance Education Resources, March 25