August 22, 2014

Game-based and immersive courses in a community college system


Bradley, P. (2014) Getting in the game: Colorado colleges develop game-based, immersive courses, Community College Weekly, March 3

The Colorado Community College System (CCCS) is one of the leading community college systems in exploring new online technologies. I have already reported on their use of remote labs for teaching introductory science courses at a distance. This article looks at the extensive use of immersion and game-based learning in the CCCS:

CCCS set aside $3 million through its Faculty Challenge Grant Program to encourage the development of courses and curriculum focusing on immersion and game-based learning (IGBL). Grants were awarded to 15 projects. The intent was that they would be “lighthouse projects,” illuminating the way for others to follow. Each solution would be scalable and shared with other institutions throughout the 13-college system.

Some of the 15 projects

Projects from this investment include the following:

  • CSI Aurora (Aurora CC) teaches the reality of forensic work through an immersive learning exercise involving a mock crime scene and mock criminal trial, with student participation from the archaeology, forensic anthropology, criminal justice, paralegal and science departments.
  • The Auto Collision Repair program at Morgan Community College purchased a SimSpray immersive virtual reality painting simulation unit, designed to assist in the teaching of spray painting and coating fundamentals. Using SimSpray decreases the expense of paint used to teach spray painting and prevents exposure to potentially dangerous fumes. The 3D SimSpray experience allows students to practice painting before ever stepping into the paint bay. (I think in this case the real thing would be more fun!)
  • At Front Range Community College, Project Outbreak is a series of augmented reality scenarios in which microbiology students track and follow a potential epidemic in their local area to its source across international borders. Students use their mobile devices, the TagWhat geolocation app, Google Hangout and Google maps. Scenarios are designed to meet core competencies, promote global connectedness and give students a global perspective in solving real-world problems.
  • The Community College of Aurora’s film school is in the process of using a $100,000 grant to create a virtual economy designed to mirror the reality of the studio system, from writing scripts to luring investors to screening the film in front of a real-life audience. Over the past seven years, the film school has developed proprietary software that allows students to experience — virtually — every aspect of the filmmaking experience. The cost of rental housing in Los Angeles, New York and Denver can be accessed with a few clicks of a mouse. The cost of obtaining equipment can easily be calculated. Students working within a set budget can see how much to devote to paying actors and directors, producers and key grips.
  • An instructor at the Community College of Denver is using ACCESS, a web-based game modelled after the board game “Life”, which simulates a person’s travels through his or her life, from college to retirement, with jobs, marriage, and possible children along the way. The course is taught in a flipped format, allowing students to receive information through videos, podcasts, downloadable lectures and social media, and then discuss the materials in class. The course is designed to help students successfully complete remedial coursework.

Results

The article offers the following results from a ‘consultant’s report’ but I couldn’t find any corroboration:

  • where the ACCESS game was used, scores on quizzes jumped 14 percent, and 71 percent of students completed the course, compared to 60 percent in sections without the gaming component
  • IGBL courses exhibited nearly identical pass/fail rates to non-IGBL courses.
  • 69 percent of students across semesters indicated that they were either more or much more satisfied with their IGBL course, as compared to other courses; 85 percent of students indicated that they were either more or much more satisfied with their IGBL instructor, as compared to other instructors.
  • students indicated that their IGBL course did a better or much better job (as compared to non-IGBL courses) of helping them achieve a variety of learning outcomes, including: having fun while learning (83 percent/73 percent); applying learning to new situations (81 percent/72 percent); staying engaged in learning (79 percent/73 percent); feeling involved in the college (69 percent/60 percent); working well with other students (67 percent/61 percent).

Over to you

Contact North has descriptions of a number of immersive learning projects under its ‘Pockets of Innovation’, such as Loyalist College’s Border Simulation in Second Life.

See also:

Games and learning in digital worlds (en français)

More news of video games

Games to defeat obesity, Napoleon, and students’ learning, and other games’ news

I’d be interested to hear from others who are using game-based immersive learning in the two year college system.

Game-based learning: special edition of the ETS journal


Forge FX’s Heifer Village: Nepal

Bellotti, F. et al. (2014) Guest editorial: Game-based learning for 21st century transferable skills: Challenges and opportunities, Educational Technology and Society, Vol. 17, No. 1

The Journal of Educational Technology and Society has a special issue on ‘Game-based learning for 21st century transferable skills: Challenges and opportunities’.

This special issue focuses on analysing how digital SGs [serious games] can contribute to the knowledge society’s higher demand towards acquiring transferable, transversal skills, that can be applied in different contexts, dealing with various scientific disciplines and subjects. Examples of such skills, often referred to as 21st century transferable skills, include, for example, collaboration, critical thinking, creative thinking, problem solving, reasoning abilities, learning to learn, decision taking, digital literacy (Voogt & Pareja Roblin, 2010).

Five papers have been selected covering the following topics:

  • a study that identifies a relationship between learning outcomes and physiological measurements of mental workload
  • an evidence model for assessing persistence
  • two studies on pedagogical models … developed to support the effective use of serious games in formal education settings
  • an empirical investigation aimed at examining the interplay between learners’ motivation, engagement, and complex problem-solving outcomes in game-based learning
  • a large case-study of four formal education programs exploiting serious games based on multiuser virtual environments.

There are also a large number of papers on other topics in this edition. The focus is mainly on the K-12 sector, but the papers on serious games also have implications and potential for post-secondary education.

e-learning trends from South Africa



Chadwick, K. (2014) e-Learning Trends for 2014, Bizcommunity.com

This is an interesting perspective on corporate e-learning trends from Kirsty Chadwick in South Africa. I’ve focused on this, because trends in Africa are likely to be somewhat different from those here in North America, due to differences in access to the Internet and mobile phones. Here are her 10 picks:

  1. From textbook to tablet: the government of South Africa has launched a tablet program for high schools. ‘In 2014, 88,000 Huawei tablets will be distributed to 2200 public schools in Gauteng as part of a new e-learning initiative.’
  2. The shift to mobile: ‘Smartphone growth in Africa has increased by 43% annually since 2000, and experts predict that 69% of mobiles in Africa will have internet access by 2014.’
  3. More gaming
  4. MOOCs: ‘While MOOCs currently don’t have standardised quality assurance in place, this will likely change in the near future.’
  5. Social media: students’ success relies heavily on their ability to participate in study groups; those who engage in these groups learn significantly more than students who don’t.
  6. Classes online: ‘2014 is likely to see a large number of businesses moving over to online training. Recent studies have projected that by 2019, 50% of all classes taught, will be delivered online.’
  7. Trading desktop for mobile: ‘2014 will be the year in which the number of mobile users will exceed the number of desktop users.’
  8. More learning for everyone: 47% of online learners are over the age of 26, a significantly older profile than a few years ago
  9. HTML5: ‘improved JavaScript performance will begin to push HTML5 and the browser as a mainstream enterprise application development environment.’
  10. More interactivity: ‘courseware is likely to be more immersive and interactive ….the use of animations and games within learning environments keeps the tech-savvy generation engaged and entertained, leading to increased knowledge retention.’

Comment

How can I argue with someone in Africa on this? It looks pretty good to me from the other side of the world. However, I think there are some unique developments in online learning that will come out of Africa. So here are my very tentative suggestions for e-learning in Africa in 2014.

I agree that in Africa generally, mobile learning, cheap tablets and open textbooks will become driving forces, saving on expensive and often hard to get foreign textbooks, and ensuring more locally adaptable learning materials.

The big growth though will be in non-formal education, where major strides have already been made in supporting small farmers and small business development for women, the development of entrepreneurs, and of IT competencies and skills, using mobile phones, social networking, and direct links to university and government agencies in the field.

Corporate education will not be far behind, but e-learning will be focused mainly in large and/or multinational companies.

Unfortunately, in many African countries the penetration of online learning into formal education will be much slower, due to government bureaucratic barriers, lack of investment, failure by established institutions to recognize the importance of technology in education, and governments giving less consideration to teacher training in technology use than to investment in the technology itself.

One or two African universities though will become world leaders in online learning through the use of local wi-fi networks and becoming commercial ‘hubs’ for global connections to the Internet, enabling them to cross-subsidize their online teaching activities.

Whatever the eventual outcome, what strikes me about Africa is the hope and the potential for major breakthroughs in online learning and e-learning. Necessity is the mother of invention.


Are we right to fear computers in education – or in life?


In this post, I’m going to look at some fun fiction about computers, then raise some questions about whether our fears are rational, or whether we really do need to question much more closely our addiction to technology, especially in education. This is not so much focused on specific new developments such as MOOCs (see: My Summer Paranoia) but on what it is reasonable to expect computers to do in education, and what we should not be trying to do with them.

Computers in film and print

There was an interesting article in the Globe and Mail on October 20 about IBM’s supercomputer, WATSON, being used to ‘help conquer business world challenges.’ Dr. Eric Brown of IBM described how WATSON was being used to help with medical diagnosis, or what he called ‘clinical-decision support,’ and how this approach could be extended to other areas in business, such as call-centre support, or financial services, to identify ‘problems’ where large amounts of data need to be crunched (did he mean derivatives?).

Just after reading the article, I accidentally came across an old 1970 movie on TVO last night, called ‘Colossus: The Forbin Project‘. It was based on the 1966 novel Colossus, by Dennis Feltham Jones, about a massive American defense computer, named Colossus, becoming sentient and deciding to assume control of the world. It does not have a good ending (at least for mankind’s freedom).

Colossus was the name given to the first large electronic computer, used to break the German Lorenz cipher (not Enigma, which was broken by earlier electromechanical machines) in the Second World War. It was located at Bletchley Park, England, not far from where the Open University's headquarters are located.

The date of the movie is interesting: made at the height of the Cold War, it shows the Americans and the Communists, when challenged by the power of two supercomputers (Colossus in the USA and Guardian in the Soviet Union) that decide to communicate with each other and combine their power, coming together to fight, unsuccessfully, the mutual threat from the computers, suggesting there is more in common across humanity than there is between humanity and machines.

Of course, this movie came two years after Stanley Kubrick’s masterful 2001: A Space Odyssey, where HAL, the spaceship’s computer, begins to malfunction, kills nearly all the crew, and is finally shut down by the last remaining crew member, Dave Bowman. So we now have a score: humans 1, computers 1.

Then there is my personal favourite, The Matrix (1999). The film depicts a future in which reality as perceived by most humans is actually a simulated reality, or cyberspace, created by sentient machines to pacify and subdue the human population while their bodies’ heat and electrical activity are used as an energy source. Upon learning this, computer programmer “Neo” joins a rebellion against the machines, alongside other people who have been freed from the “dream world” into reality. I put this one down to a draw, since there have been two sequels and the battle continues.

Lastly, a new film is coming out in March, 2013, based on Orson Scott Card’s wonderful book ‘Ender’s Game‘, first published in 1985 and slightly updated in 1991. (If you have teenage boys, this is a must for a Christmas present, especially if they generally hate reading.) In preparation for an anticipated third invasion from an insectoid alien species, an international fleet maintains a school to find and train future fleet commanders. The world’s most talented children, including the novel’s protagonist, Ender Wiggin, are taken at a very young age to a training center known as the Battle School. There, teachers train them in the arts of war through increasingly difficult games, including ones undertaken in zero gravity in the Battle Room, where Ender’s tactical genius is revealed. Again, the book explores the intersection between virtuality and reality.

Computers: promise and reality

It is interesting to look at these old science fiction movies and novels alongside today’s computer world, and see where progress has been made, and where it hasn’t. Colossus in some ways anticipated the Internet, as the two computers searched for ‘pathways’ through which to communicate with each other. We certainly have much more remote surveillance, especially in the United Kingdom, where almost every public space is now under video surveillance, and where governments are increasingly monitoring the Internet, both for protecting individuals, such as monitoring sexual exploitation of minors, and for more insidious purposes, such as industrial and political espionage. Claims have been made that 2001: A Space Odyssey predicted the iPad. Ender’s Game comes very close to representing the complexity and depth of many computer games today, and conspiracy theorists will tell you that the first moon landing was filmed in Hollywood, so close do movies come to presenting fiction as reality.

However, despite WATSON and distributed computing, many of the developments in this early science fiction have proved to be much more difficult to implement. In particular, although all these early movies assumed voice recognition, we are still a long way from the fluency depicted in these movies, even after more than 40 years of research and development. For instance, try communicating with WestJet’s or Telus’s automated answering systems (in WestJet’s case, it frequently fails to recognize the spoken language of even native English speakers, such as myself!). These ‘voice recognition’ systems manage simple algorithmic decisions (yes or no; options 1-5) but cannot deal with anything that is not predictable, which is often the very reason why you need to communicate with these organizations. In addition to the difficulties of voice recognition, these systems are clearly designed by computer specialists who do not take into account how humans behave, or the reasons they are likely to use the phone to communicate, rather than the Internet.
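The kind of simple algorithmic branching these phone systems manage can be sketched in a few lines. This is a hypothetical illustration (the menu text and departments are invented, not any airline's actual system), but it shows why anything outside the predicted options must fall through to a human agent:

```python
# A toy interactive voice response (IVR) menu: purely algorithmic branching.
# Any keypress outside the predicted options escapes the algorithm and is
# routed to a human agent -- exactly the limitation described above.

MENU = {
    "prompt": "Press 1 for bookings, 2 for cancellations",
    "options": {
        "1": {"prompt": "Press 1 for a new booking, 2 to change a booking",
              "options": {"1": "new-booking", "2": "change-booking"}},
        "2": "cancellation",
    },
}

def route(keypresses):
    """Walk the menu tree; return a department, or 'agent' for unpredicted input."""
    node = MENU
    for key in keypresses:
        if not isinstance(node, dict):
            return "agent"          # extra input after reaching a leaf: give up
        node = node["options"].get(key, "agent")
        if node == "agent":
            return "agent"          # unrecognized option: give up
    return node if isinstance(node, str) else "agent"

print(route(["1", "2"]))   # change-booking
print(route(["9"]))        # agent
```

Everything the caller might do has to be enumerated in advance; the unpredictable cases, which are usually why people phone in the first place, all end at `"agent"`.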

As Dr. Eric Brown of IBM admits, ‘When you try to create computer systems that can understand natural language, given all the nuance and ambiguity, it becomes a very significant problem.’ As he rightly says, human language is often implicit and tacit, using signs and meanings which humans have learned to interpret almost automatically and usually correctly, but which are very difficult for computers to interpret. Indeed, in recent years more progress seems to have been made on face recognition than on voice recognition, no doubt driven by security concerns.


The biggest challenge that computers face, though, is in the field of artificial intelligence, and in particular how humans think and make decisions. As already noted, computers can handle algorithms very well, but this is a comparatively small component of human decision-making. Humans tend to be inductive or intuitive thinkers, rather than deductive or algorithmic thinkers. Computers tend to operate in absolute terms: if part of the algorithm fails, the computer is likely to crash. Humans, however, are more qualitative and probabilistic in their thinking. They handle ambiguity better, are willing to make decisions on less than perfect information, and continue to operate even though they may be wrong in their thinking or actions: they tend to be much more self-correcting than computers.

Can we and should we?

This raises two important questions:

  • will it be possible to design machines that can think like humans?
  • And more importantly, if we can do this, should we?

These questions have particular significance for education, because as Dr. Brown of IBM said, ‘to build these kinds of systems you actually need to leverage learning, automatic learning and machine learning in a variety of ways.’

At the moment, even though IBM computers have beaten experts at chess (Deep Blue) and WATSON has outperformed humans in quiz games such as Jeopardy, and can support certain kinds of decision-making, such as medical diagnosis, computers still struggle with non-algorithmic thinking. One human brain has many more nodes and networks than the largest computers today. According to Dharmendra Modha, director of cognitive computing at the IBM Almaden Research Center:

We have no computers today that can begin to approach the awesome power of the human mind. A computer comparable to the human brain would need to be able to perform more than 38 thousand trillion operations per second and hold about 3,584 terabytes of memory. (IBM’s BlueGene supercomputer, one of the world’s most powerful, has a computational capability of 92 trillion operations per second and 8 terabytes of storage.)
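Taking Modha's figures above at face value, the gap he describes is a few hundred-fold on both axes. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the figures quoted above (Modha/IBM).
brain_ops = 38_000e12       # 38 thousand trillion operations per second
brain_mem_tb = 3584         # terabytes of memory
bluegene_ops = 92e12        # 92 trillion operations per second
bluegene_mem_tb = 8         # terabytes of storage

print(brain_ops / bluegene_ops)         # ~413x gap in compute
print(brain_mem_tb / bluegene_mem_tb)   # 448x gap in memory
```

So on these numbers, BlueGene in 2012 was roughly two to three orders of magnitude short of the brain on both compute and memory, which puts the 2018 prediction discussed below in perspective.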

However, research and development in psychology will probably lead to developments in artificial intelligence that enable very powerful computers, probably using networked distributed computing, to eventually outperform humans in more intuitive and less certain forms of thinking. Dr. Modha went on to predict that we’ll be able to simulate the workings of the brain by 2018. I’m not so sure. If we still haven’t satisfactorily cracked voice recognition after 40 years, it may take a little more than six years to tackle intuitive thinking. Nevertheless, I do believe it will eventually be possible to replicate in machines much of what is now performed by human brains. The issue then becomes whether this is practical or cost-efficient, compared with using humans for similar tasks, who in turn often have to be educated or trained at high cost to do these activities well.

Answering the second question – whether we should replace human thinking with computers – though is much more difficult. Machines have been replacing human activity since at least the Renaissance. The printing press put a lot of monks out of business. So won’t computers start making teachers redundant?

This assumes though that teaching and learning is purely about logic and reasoning. If only it were. So much of learning requires understanding of emotion and feelings, the ability of students to relate to their teachers and their fellow students, and above all, is about fostering, developing and supporting values, especially freedom, security, and well-being. Indeed, even some computer scientists such as Dr. Brown argue that computers are most valuable when they are used to support rather than replace human activities: ‘It’s technology to help humans do their jobs better, faster, more effectively, more efficiently‘. And, as in films such as Colossus and the Matrix, it’s about computers supporting humanity, not the other way round.

The implications for teaching and learning

Thus my belief (how will a computer handle that?) is that computers are wonderful tools for supporting teaching and learning, and as cognitive and computer scientists become more knowledgeable, computers will only increase in value for this purpose as time goes on. However, this means that these scientists need to work collaboratively, and more importantly as equals, with teachers and indeed learners, to ensure that computers are used in ways that respect not only the complexity of teaching and learning, but also the value systems that underpin a liberal education.

And it is here that I have the most concerns. There is, especially in the United States of America, a growing ideology that considers teachers to be ineffective or redundant and which seeks means to replace teachers with computers. Coursera-style MOOCs are just one example. Multiple-choice testing and open educational resources in the format of iTunes and OpenCourseWare are other examples. Once it’s ‘up there’, there are some who believe that the recorded lecture is the ‘teacher.’ It is not: it is a transmitter of content, which is not the same as a teacher.

Another concern for us, as humans, is to be continually aware of the difference between virtuality and reality. This is not to criticize the use of virtual reality for teaching, but it is to ensure that learners understand the significance of their actions when they transfer skills from a virtual to a real world, and to be able to distinguish which world they are in. This is not yet a major problem because virtual reality is disappointingly under-used in education, but it is increasingly a feature of the lives of young people. This sensitivity to the difference between virtuality and reality will become an increasingly important life skill, as we begin to merge them, for instance in the remote control of robot welders in pipelines. It’s important to know the difference between training (virtual reality) and life, when a mistake can lead to an explosion or an oil leak, which has very real consequences.

Lastly, I also have some concerns about the ‘open culture’ of web 2.0. In general, as readers will know, I am a great supporter of web 2.0 tools in education, and of open access in particular. However, this does not apply to all web 2.0 tools, or all ways in which they are used. Jaron Lanier, one of the founders of virtual reality, says:

 “I know quite a few people … who are proud to say that they have accumulated thousands of friends on Facebook. Obviously, this statement can only be true if the idea of friendship is reduced.”

Also, while in general Lanier supports the use of crowdsourcing and the ‘wisdom of the crowd’ that underlies moves towards cMOOCs and Siemens’ theory of connectivism, he criticizes:

‘the odd lack of curiosity about the limits of crowd wisdom. This is an indication of the faith-based motivations behind such schemes. Numerous projects have looked at how to improve specific markets and other crowd wisdom systems, but too few projects have framed the question in more general terms or tested general hypotheses about how crowd systems work.’

None of these concerns undermine my belief that computers, when used appropriately, can and do bring enormous benefits to teaching and learning. We shouldn’t anthropomorphize computers (they don’t like it) but, as I learned from ‘Downton Abbey’, like all good servants, they need to know their place.

Questions

1. Do you believe that ‘we’ll be able to simulate the workings of the brain by 2018’? I’d like to hear from brain scientists if they agree – too often what’s reported in science is not what the majority of scientists think.

2. If we could ‘simulate the workings of the brain’, what impact would it have on teaching and learning?

3. Do you believe that there is a desire in some countries to replace teachers with computers? Do you see Coursera and xMOOCs as part of this conspiracy?

4. Do you think I am being irrational in my concerns about computers in teaching?

Further reading

HAL 9000 (2012) Wikipedia

Houpt, S. (2012) IBM hones Watson the supercomputer’s skills to help conquer business world challenges The Globe and Mail, October 20

Lanier, J. (2010) You Are Not a Gadget New York: Alfred A. Knopf

Card, O.S. (1994) Ender’s Game New York: Tor

Colossus: The Forbin Project 

Online game to help self-management of finances


AP (2012) NH launches online money management game, Vanguard, February 12

Extracts

The U.S. Treasury Department recently awarded grants to five states to expand financial education and counseling services for prospective homebuyers. While the other states set up more traditional face-to-face counseling programs, the New Hampshire Housing Finance Authority created an online program that includes an educational game aimed at making the process more enjoyable.

The game is set up as a “financial freedom island cruise.” Each island represents lessons on budgeting and credit management. Participants earn money by answering questions correctly or by “spinning” the cruise ship’s wheel for bonus prizes — “You won $400 at bingo!” — though it’s all just part of the game….

The idea for the program stemmed from a family self-sufficiency program the finance authority already offers for people trying to build their assets and get out of poverty.

While there are other websites dedicated to the same topic, they often feature advertising. About 100 people have signed up for the New Hampshire program so far, and the goal is to register at least 600 in the next year and a half.

Comment

This won’t of itself solve the US housing mess created by the banks, but it does provide an interesting way of helping people manage their money after they’ve been ripped off by the banks.

But will it also help Americans realise that they cannot get quality public services such as schools and universities if they don’t pay taxes? Now that would be a financial education.