April 26, 2018

Zuckerberg’s Frankenstein

© The Mind Reels

Prosecutor: Dr. Frankenberg, are you aware that there is a monster roaming the countryside, stealing all the villagers’ personal information?

Dr. Frankenberg: Yes, sir, I am.

Prosecutor: And is it true, Dr. Frankenberg, that you invented this monster, in your dorm room at Harvard?

Dr. Frankenberg (proudly): Yes, sir.

Prosecutor: And are you aware that your monster is going around selling the villagers’ personal information to any Tom, Dick or Harry who will buy it?

Dr. Frankenberg: Yes, sir, that’s why I invented the monster – it’s my business model.

Prosecutor: Has your business model been successful?

Dr. Frankenberg (smugly): Oh, yes, sir, it’s made me and my friends very rich. You see the monster sends all the money to me. I only need a few engineers to make sure the monster doesn’t break down – and of course some very good lawyers – so there’s a lot left over afterwards.

Prosecutor: And are you aware that the monster helped our new Emperor, Donald the Terrible, to become emperor?

Dr. Frankenberg: I was made aware of that only very recently, though of course I had heard the rumours much earlier.

Prosecutor: So it was not your intent then that the monster should help Donald the Terrible?

Dr. Frankenberg: Absolutely not.

Prosecutor: And are you aware that hostile tribes outside the kingdom have used the monster to attack us?

Dr. Frankenberg: Yes, of course, that’s why I’m here – but honestly, I didn’t know about this until you did. And I made the monster get them to promise not to do that – but they are hostiles and didn’t keep their promise. 

Prosecutor: It seems to me that you don’t have much control over your monster.

Dr. Frankenberg (sighs): Look, you don’t understand how this works. You design something, you throw it out into the world, then wait to see what happens. Sometimes it’s good. Sometimes it’s bad. But there would be no way to make lots of money if you didn’t do this. If you tried to control it, you wouldn’t know what it could do.

Prosecutor: So you agree that your monster is now out of your control?

Dr. Frankenberg (frowns, drinks water): Not entirely. We tried using chains recently, but the monster is too strong – he keeps breaking them. But our engineers are working on it, believe me.

Prosecutor: Let me put this to you: you created the monster, so you are responsible for it, but you’ve not done enough to control it.

Dr. Frankenberg: That’s a bit unfair. How was I to know it would become so dangerous? I realise it now, but anyone can be smart after the event.

Prosecutor: Some of the Emperor’s advisers are suggesting that the government should try to control the monster. What are your views on that?

Dr. Frankenberg (shrugs): Well, good luck with that. You realise the monster is not just stealing from our villagers now, but from everyone – he’s all over the place. But if you think you can do it, don’t let me stop you.

Judge intervenes: Thank you, Prosecutor, Dr. Frankenberg. We’ll adjourn for today, but we’ll be back in court tomorrow. Dr. Frankenberg, I hope you will use this time to give some thought to how we can control your monster, because, you should be aware, neither I nor the government has the slightest clue how to do this.

Court adjourns.

‘Humans Wanted’: online learning and skills development

Royal Bank of Canada (2018) Humans Wanted, Toronto ON: Royal Bank of Canada

I have at last got hold of a full copy of this report that came out a couple of weeks ago. Much to my surprise, I found the report essential reading for anyone in education, mainly because it is relatively specific about the skills that the Canadian job market will need between 2018 and 2021, and the results were not quite what I expected to see.

Conclusions from the report

I can’t better the summary in the report itself:

1. More than 25% of Canadian jobs will be heavily disrupted by technology in the coming decade. Fully half will go through a significant overhaul of the skills required.

2. An assessment of 20,000 skills rankings across 300 occupations and 2.4 million expected job openings shows an increasing demand for foundational skills such as critical thinking, co-ordination, social perceptiveness, active listening and complex problem solving.

3. Despite projected heavy job displacement in many sectors and occupations, the Canadian economy is expected to add 2.4 million jobs over the next four years, all of which will require this new mix of skills.

4. Canada’s education system, training programs and labour market initiatives are inadequately designed to help Canadian youth navigate this new skills economy.

5. Canadian employers are generally not prepared, through hiring, training or retraining, to recruit and develop the skills needed to make their organizations more competitive in a digital economy.

6. Our researchers identified a new way of grouping jobs into six “clusters,” based on essential skills by occupation rather than by industry.

7. By focusing on the foundational skills required within each of these clusters, a high degree of mobility is possible between jobs.

8. Digital fluency will be essential to all new jobs. This does not mean we need a nation of coders, but a nation that is digitally literate.

9. Global competencies like cultural awareness, language, and adaptability will be in demand.

10. Virtually all job openings will place significant importance on judgment and decision making and more than two thirds will value an ability to manage people and resources.

So, no, automation is not going to remove all work for humans, but it is going to change the nature of that work profoundly, and it is in this sense that technology will be disruptive. Workers will still be needed in the future, but they will need to be very different workers from those of the past.

This has massive implications for teaching and learning and the bank is in my view correct in arguing that Canada’s education system is inadequately designed to help Canadian youth navigate this new skills economy.

What skills will be in demand?

Not the ones most of us would have thought that a bank would identify:

© Royal Bank of Canada, 2018

You will see that the most in-demand skills will be active listening, speaking, critical thinking and reading comprehension, while the least important include science, programming and technology design.

In other words, ‘soft skills’ will be most needed for human work. While this may seem obvious to many educators, it is refreshing to hear this from a business perspective as well.

Methodology

How did the Royal Bank not only identify these skills and their importance, but also attach actual numbers of workers to them?

The data were derived from an interesting application of big data: an analysis of the skills listed on the web in ‘future-oriented’ job advertisements through media such as LinkedIn, combined with more qualitative interviews with employers, policy-makers, educators and young people.
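The report’s actual pipeline is not published, but the core idea – counting how often each skill appears across a large set of job postings – can be sketched in a few lines. Everything below (the sample postings and the `rank_skills` helper) is an illustrative assumption of mine, not the bank’s method:

```python
from collections import Counter

# Hypothetical illustration only: each posting is represented as the list of
# skills extracted from its advertisement text.
postings = [
    ["critical thinking", "active listening", "programming"],
    ["critical thinking", "co-ordination", "complex problem solving"],
    ["active listening", "critical thinking", "social perceptiveness"],
]

def rank_skills(postings):
    """Count how many postings mention each skill, most-demanded first."""
    counts = Counter(skill for posting in postings for skill in set(posting))
    return counts.most_common()

for skill, n in rank_skills(postings):
    print(f"{skill}: {n}")
```

The qualitative interviews the bank describes would then be needed to interpret such counts, since frequency in advertisements is only a proxy for real demand.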

What does this mean for teaching and learning?

There are several challenges I see:

  • first, getting teachers and instructors to accept that these (and other) skills need to be taught within any subject domain;
  • second, as these skills are not likely to be developed within a single course, identifying how best to teach them at different ages, throughout a program of study, and indeed throughout life;
  • third, codifying these skills in terms of appropriate teaching and assessment methods; too often educators claim they are teaching these skills, but if so, it is often implicit, and not clear how or even whether students acquire them;
  • fourth, determining how best digital technology/e-learning can support the development of these skills. For instance, well-designed digital learning can enable skills practice and feedback at scale, freeing teachers and instructors to focus on what needs to be done face-to-face.

It’s not just about work

The Royal Bank has done a very good job in identifying work-force skills, but these are not the only skills needed in a digital age. Equally if not more important are the skills we need as humans in handling everyday life in a digital age. Examples would be:

  • a wide range of non-work oriented digital literacy skills, such as managing our digital identities (see UBC’s Digital Tattoo as an excellent example) so we as individuals have at least some control over the technology and how it is used
  • understanding the organization and power structures of digital companies and digital technologies: one example might be understanding how to identify and challenge algorithmic decision-making, for instance
  • teaching the important non-digital skills necessary in a digital society (for instance, mindfulness, or social awareness and conduct in both real and online environments).

Identifying such skills and finding ways to integrate the development of such skills within the curriculum is a major challenge but essential if we are to not only survive but thrive as humans in a digital world. We are just getting started on this, but it’s none too soon. In the meantime, the Royal Bank has done a good job in making the discussion about 21st century skills more concrete and practical.

Stanford University to be fully online by 2025?

A Stanford sophomore experiences the virtual world at its Virtual Human Interaction Lab

Today I have received a tip from a close colleague that Stanford University is planning to build a partnership with Alphabet Inc., the owner of Google, to enable Stanford to become a fully online global university by 2025. 

Because the university is on an Easter break, it was difficult to find anyone at Stanford to verify this rumour, but the planning seems to be quite advanced. Apparently a highly confidential strategic planning committee has been working for some time on a plan to convert all programs at Stanford into a fully online format, using advanced technologies such as Artificial Intelligence (AI), Virtual and Augmented Reality (VR and AR), and data analytics (DA), technologies in which both Stanford and Google are world leaders.

This will enable Stanford to offer fully accredited degrees to many thousands of students worldwide at a fraction of the current tuition fees, which are just under $50,000 a year. Once fully online, the low tuition fees, estimated to be around $1,000 a year, will be made possible by a highly innovative business plan being worked out jointly by Stanford and Google. Stanford plans to sell the part of the campus that will no longer be needed for teaching purposes. The Farm, as it is affectionately known, is over 8,000 acres, located close to Silicon Valley. With real estate currently selling at approximately $65 million an acre in Stanford, selling off just half the land will provide sufficient capital for the investment needed to convert all programs into an online mode, leaving the other half for research and administrative purposes. The partnership will allow Google to use data analytics from student online activity for commercial purposes, which will more or less cover the operational costs of online delivery.

I did manage to get hold of a couple of the committee members, who asked not to be named as they are not authorised to give information on this project. However, both were very excited. ‘We won’t have to sack any of the current professorial staff, as we still need their subject expertise’, said one. The other said he was really looking forward to developing the first fully augmented reality engineering degree. ‘This could have huge implications,’ he said. ‘Imagine designing a whole bridge without actually having to physically test it! It’s only ever been tried once before without VR and it didn’t work.’ The Director of Stanford University’s Division of Continuing Studies said, ‘You know, it’s not such a big deal. We’ve been delivering online courses in our division for nearly 20 years, so we do know what we’re doing.’

Others I talked to outside the university, though, were not quite so sanguine. A spokesperson from WCET was concerned about how accreditation and professional bodies would react. ‘It’s one thing for the university to give degrees; it’s quite another to get recognized by the Accreditation Board for Engineering and Technology, which in the past has not accepted any online qualifications. But, hey, it’s Stanford, so who knows?’

My personal view is that it still has to get through Stanford’s Senate and Board of Governors. This will be the real test. However, if it is successful, this model will be totally disruptive of the rest of post-secondary education worldwide. If Stanford can scale its model, it could be not just a global university, but THE one university for the whole world. How cool would that be? 

In the meantime, enjoy April the first.

Virtual reality for midwives: an Australian example

Connolly, B. (2018) How virtual reality is transforming learning at the University of Newcastle, CIO, 8 March

This article includes a couple of nice, short videos demonstrating the use of AR and VR in a University of Newcastle nurses’ program in Australia.

The first one, below, demonstrates the use for breech positioning and placenta replacement (click image to play):

University of Newcastle, NSW, Australia

The second demonstrates a neonatal resuscitation scenario when a newborn baby stops breathing.

University of Newcastle, NSW, Australia

These are very good examples of the power of AR and VR to enable students to practise and learn in a safe environment without danger to patients. The technology is accessible via mobile phones or tablets, so students can practise in their own time as well as in the VR studio with an instructor.

What would be useful to know is the cost of producing such VR applications and the number of students making use of the equipment over the length of a course – in other words, what is the return on investment compared with, for example, traditional video? What are the added benefits? Do learning outcomes improve? We need much more research into these questions.

Assessing the dangers of AI applications in education

Image: CaspionReport

Lynch, J. (2017) How AI will destroy education, buZZrobot, 13 November

I’m a bit slow catching up on this (I have a large backlog of articles and books to review), but this is the best critique I have seen of the potential dangers of AI applications in education.

Don’t be put off by the title – it’s not totally anti-AI but thoughtfully criticises some of the current thinking about AI applications in education.

It’s worth reading in full (an 8 minute read) but here’s a quick summary to encourage you to have the full meal rather than a snack, with my bits of flavouring on top:

Measuring the wrong things

Most data collected about student learning is indirect, inauthentic, lacking demonstrable reliability or validity, and reflecting unrealistic retention timelines. And current examples of AIEd often rely on these poor proxies for learning, using data that is easily collectable rather than educationally meaningful.

Yes, but don’t educators do that too?

(re)Discovering bad ways to teach

AIEd solutions frequently incorporate false and/or unsupported educational ideas reflecting the biases of their developers….If AIEd is going to benefit education, it will require strengthening the connection between AI developers and experts in the learning sciences. Otherwise, AIEd will simply ‘discover’ new ways to teach poorly and perpetuate erroneous ideas about teaching and learning.

I hope the good folks at MIT are reading this because this is exactly what happened with their early MOOCs.

Prioritising adaptivity over quality

The ubiquity of poor quality content means AIEd technologies often simply recommend the ‘best’ piece of (crappy) content or identify students at risk of failing a (crappy) online course early in the semester….Improving and evaluating the quality of instructional content is neither easy nor cheap, it also isn’t something any AIEd solution is going to do. 

This comes down to the criteria that AI uses to make recommendations: replacing criteria such as the number of hits or likes with more educational ones, such as clarity and reliability. Not easy, but not impossible. And we still need to improve the quality of the content itself, whether we use AI or not.
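As a toy sketch of what that reweighting might look like – the field names, weights and sample items below are purely hypothetical assumptions of mine, not any real recommender’s API:

```python
# Hypothetical sketch: re-ranking content by educational criteria instead of
# popularity. Attributes not present in the weights table count as zero.
def score(item, weights):
    """Weighted sum of an item's numeric attributes."""
    return sum(weights.get(k, 0) * v for k, v in item.items() if k != "title")

popularity_weights = {"likes": 1.0}
educational_weights = {"clarity": 0.5, "reliability": 0.5}

items = [
    {"title": "Viral but shallow", "likes": 900, "clarity": 0.2, "reliability": 0.3},
    {"title": "Clear and well-sourced", "likes": 40, "clarity": 0.9, "reliability": 0.95},
]

# The top recommendation flips once educational criteria replace popularity.
by_popularity = max(items, key=lambda i: score(i, popularity_weights))
by_education = max(items, key=lambda i: score(i, educational_weights))
```

The hard part, of course, is not the arithmetic but producing trustworthy clarity and reliability ratings in the first place – which is exactly Lynch’s point about content quality.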

Swapping affect for efficiency

Maybe one day AIEd will be capable of effectively identifying and nurturing student emotions during learning, but until then we must be careful not to offload educational tasks that, on the surface, may appear menial or routine, but critically depend on emotion and meaningful human connections to be optimally beneficial.

AI advocates often argue that they are not trying to replace teachers but to make their life easier or more efficient. Don’t believe them: the key driver of AI applications is cost-reduction, which means reducing the number of teachers, as this is the main cost in education. In fact, the key lesson from all AI developments is that we will need to pay increased attention to the affective and emotional aspects of life in a robot-heavy society, so teachers will become even more important. 

Comment

One problem with being old is that you keep seeing the same old hype going round and round. I remember the same arguments back in the 1980s over artificial intelligence. Millions of dollars went into AI research at the time, including into educational applications, with absolutely no payoff.

There have been some significant developments in AI since then, in particular pattern recognition, access to and analysis of big data sets, and formalized decision-making within limited boundaries. The trick, though, is to recognise exactly what kinds of applications these new AI developments are good for, and what they cannot do well. In other words, the context in which AI is used matters and needs to be taken into account. Thus the importance of Lynch’s comment about involving learning scientists/educators in the design of AI applications in education.

I believe there will be some useful applications of AI in education, but only if there is continuing dialogue between AI developers and ‘learning scientists’/educators as new developments in AI become available. But that will require being very clear about the purpose of AI applications in education and being wide awake to the unintended consequences.