August 27, 2014

Developing intellectual and practical skills in a digital age


 


The story so far

Chapter 5 of my open textbook, ‘Teaching in a Digital Age’, which I am currently writing and publishing as I go, is about the design of teaching and learning.

I started Chapter 5 by suggesting that instructors should think about design through the lens of constructing a comprehensive learning environment in which teaching and learning will take place. I have started to work through the various components of a learning environment, focusing particularly on how the digital age affects the way we need to look at some of these components.

I started by looking at how the characteristics of our learners are changing, and followed that by examining how our perspectives on content are being influenced by the digital age. In this post, I look at how both intellectual and practical skills can be developed to meet the needs of a digital age. The following posts will do the same for learner support, resources and assessment respectively.

This will then lead to a discussion of different models for designing teaching and learning. These models aim to provide a structure for and integration of these various components of a learning environment.

Scenario: Developing historical thinking

© Wenxue City: China During the Early Days of the Reform


Ralph Goodyear is a professor of history in a public Tier 1 research university in the central United States. He has a class of 120 undergraduate students taking HIST 305, ‘Historiography’.

For the first three weeks of the course, Goodyear had recorded a series of short 15-minute video lectures that covered the following topics:

  • the various sources used by historians (e.g. earlier writings; empirical records, including registries of birth, marriage and death; eyewitness accounts; artifacts such as paintings and photographs; and physical evidence such as ruins)
  • the themes around which historical analyses tend to be written
  • some of the techniques used by historians, such as narrative, analysis and interpretation
  • three different positions or theories about history (objectivist, Marxist, post-modernist).

Students downloaded the videos according to a schedule suggested by Goodyear. Students attended two one-hour classes a week, where specific topics covered in the videos were discussed. Students also had an online discussion forum in the course space on the university’s learning management system, where Goodyear had posted similar topics for discussion. Students were expected to make at least one substantive contribution to each online topic, for which they received a grade that counted towards their final grade.

Students also had to read a major textbook on historiography over this three week period.

In the fourth week, Goodyear divided the class into twelve groups of six, and asked each group to research the history of any city outside the United States over the last 50 years or so. They could use whatever sources they could find, including online sources such as newspaper reports, images, research publications, and so on, as well as the university’s own library collection. In writing their report, they had to do the following:

  • pick a particular theme that covered the 50 years and write a narrative based around the theme
  • identify the sources they finally used in their report, and discuss why they selected some sources and dismissed others
  • compare their approach to the three positions covered in the lectures
  • post their report in the form of an online e-portfolio in the course space on the university’s learning management system

They had five weeks to do this.

The last three weeks of the course were devoted to presentations by each of the groups, with comments, discussion and questions, both in class and online (the in-class presentations were recorded and made available online). At the end of the course, students assigned grades to each of the other groups’ work. Goodyear took these student gradings into consideration, but reserved the right to adjust the grades, explaining any adjustments he made. Goodyear also gave each student an individual grade, based on both their group’s grade and their personal contribution to the online and class discussions.

Goodyear commented that he was surprised and delighted at the quality of the students’ work. He said: ‘What I liked was that the students weren’t learning about history; they were doing it.’

Based on an actual case, but with some embellishments.

Skills in a digital age

In Chapter 1, Section 1.4, I listed some of the skills that graduates need in a digital age, and argued that this requires a greater focus on developing such skills at all levels of education, but particularly at a post-secondary level, where the focus is often on specialised content. Although skills such as critical thinking, problem solving and creative thinking have always been valued in higher education, the identification and development of such skills is often implicit and almost accidental, as if students will somehow pick up these skills by observing faculty demonstrating them, or through some form of osmosis resulting from the study of content. I also pointed out in the same section, though, that although there is substantial research on skills development, the knowledge deriving from that research is at best applied haphazardly, if at all, to the development of intellectual skills.

Furthermore, the skills required in a digital age are broader and more wide-ranging than the abstract academic skills traditionally developed in higher education. For instance, they need to be grounded just as much in digital communications media as in traditional writing or lecturing, and they include the development of digital competence and expertise within a subject domain, as well as skills such as independent learning and knowledge management. These are not so much new skills as a different emphasis, focus or direction.

It is somewhat artificial to separate content from skills, because content is the fuel that drives the development of intellectual skills. At the same time, in traditionally vocational training we see the reverse trend in a digital age, with much more focus on developing high-level conceptual thinking alongside manual skills. My aim here is not to downplay the importance of content, but to ensure that skills development receives as much focus and attention from instructors as content does, and that we approach intellectual skills development in the same rigorous and explicit way as apprentices are trained in manual skills.

Setting goals for skills development

Thus a critical step is to be explicit about what skills a particular course or program is trying to develop, and to define these goals in such a way that they can be implemented and assessed. In other words, it is not enough to say that a course aims to develop critical thinking; instructors need to state clearly what this would look like in the context of the particular course or content area, in ways that are clear to students. In particular, the ‘skills’ goals should be capable of assessment, and students should be aware of the criteria or rubrics that will be used for assessment.

Thinking activities

A skill is not binary, in the sense that you either have it or you don’t. There is a tendency to talk about skills and competencies in terms of novice, intermediate, expert, and master, but in reality skills require constant practice and application and there is, at least with regard to intellectual skills, no final destination. So it is critically important when designing a course or program to design activities that require students to develop, practice and apply thinking skills on a continuous basis, preferably in a way that starts with small steps and leads eventually to larger ones. There are many ways in which this can be done, such as written assignments, project work, and focused discussion, but these thinking activities need to be thought about, planned and implemented on a consistent basis by the instructor.

Practical activities

It is a given in vocational programs that students need lots of practical activities to develop their manual skills. This, though, is equally true for intellectual skills. Students need to be able to demonstrate where they are along the road to mastery, get feedback, and try again as a result. This means doing work that enables them to practice specific skills.

In the scenario above, students had to cover and understand the essential content in the first three weeks, do research in a group, develop an agreed project report, in the form of an e-portfolio, share it with other students and the instructor for comments, feedback and assessment, and present their report orally and online. Ideally, they will have the opportunity to carry over many of these skills into other courses where the skills can be further refined and developed. Thus, with skills development, a longer term horizon than a single course will be necessary, so integrated program as well as course planning is important.

Discussion as a tool for developing intellectual skills

Discussion is a very important tool for developing thinking skills. However, not just any kind of discussion will do. It was argued in Chapter 2 that academic knowledge requires a different kind of thinking from everyday thinking. It usually requires students to see the world differently, in terms of underlying principles, abstractions and ideas. Thus discussion needs to be carefully managed by the instructor, so that it focuses on the development of skills in thinking that are integral to the area of study. This requires the instructor to plan, structure and support discussion within the class, keeping the discussions in focus, providing opportunities to demonstrate how experts in the field approach topics under discussion, and comparing students’ efforts with those approaches.

Figure 5.3: Online threaded discussion forums provide students with opportunities for developing intellectual skills, but the instructor needs to design and manage such forums carefully for this to happen


In conclusion

There are many opportunities in even the most academic courses to develop intellectual and practical skills that will carry over into work and life activities in a digital age, without corrupting the values or standards of academia. Even in vocational courses, students need opportunities to practice intellectual or conceptual skills such as problem-solving, communication skills, and collaborative learning. However, this won’t happen merely through the delivery of content. Instructors need to:

  • think carefully about exactly what skills their students need,
  • consider how these fit with the nature of the subject matter,
  • design the kind of activities that will allow students to develop and improve their intellectual skills, and
  • decide how to give feedback on and assess those skills, within the time and resources available.

This is a very brief discussion of how and why skills development should be an integral part of any learning environment. We will be discussing skills and skill development in more depth in later chapters.

Over to you

Your views, comments and criticisms are always welcome. In particular:

  • how does the history scenario work for you? Does it demonstrate adequately the points I’m making about skills development?
  • are the skills being developed by students in the history scenario relevant to a digital age?
  • is this post likely to change the way you think about teaching your subject, or do you already cover skills development adequately? If you feel you do cover skills development well, does your approach differ from mine?

Love to hear from you.

Next up

Learner support in a digital age

 

Kuali Foundation goes commercial

"No, you idiot, Kuali, not Koalas" 'But isn't kuali a Malaysian way of cooking?"

“No, you idiot, Kuali, not koalas!” “But isn’t kuali a Malaysian way of cooking?”

Straumsheim, C. (2014) Kuali Foundation: If you can’t beat them….., Inside Higher Education, August 25

While there are several providers of open source learning management systems for education, Kuali is the only provider of free, open source administrative software specifically built for higher education. In a blog post on August 22, it announced that while its software will continue to be developed as free and open source, it will be creating a commercial company to provide for-profit services, such as hosting and contracted software development.

What is Kuali?

Kuali started as a consortium of mainly U.S. research universities which paid to join the Kuali Foundation, with the aim of developing free administrative software systems designed specifically to meet the needs of higher education/post-secondary institutions.

What does Kuali do?

So far it has developed several software systems, including financial, student, research administration and human resources (HR) systems.

How is it doing?

So far nearly 60 HE institutions are using Kuali products. However, each product is at a different stage of development and usefulness. The financial system is the most advanced and most stable.

Why does it matter?

Although the days when PeopleSoft nearly bankrupted several major HE institutions are now long gone, commercial administrative systems such as Oracle and SAS are extremely expensive, designed primarily for a business rather than an educational environment, and as a consequence are often financially risky to adapt and implement within a higher education context. The development of administrative systems for higher education by higher education is a worthy goal, if it can be accomplished.

The ‘if’, though, is still in some doubt. The financial system seems to be a success, but the student system is described as a ‘monster’ development project, and the HR system lacks sufficient investment. So Kuali as a whole is still very much a work in progress.

What are the changes? How is Kuali 2.0 different from the Kuali Foundation?

Kuali is now essentially a for-profit company rather than a community consortium, although its governance is actually more complex than that. Universities and colleges paid to join the Foundation and contributed investment towards product development. The Foundation will continue to exist, but members will not have votes or shares in the new company, although they can continue to contribute to projects that they want done. Other revenue will come from charging for cloud-based software-as-a-service offerings.

Comment

I’m not in any way involved with Kuali, so it is difficult to give an informed comment. I thought it was a good idea when it started, but making a consortium approach to sustainable software development and services work is a major challenge. It requires dedication, goodwill, and continuity from a large number of institutions. In these circumstances, any benefits for the participating organizations need to be direct and substantive.

Changing it to a commercial organization is a major disruption to this model. In particular, even if the same people are involved in product development, investment, governance and operation, it radically changes the culture of the organization. I’m not a governance expert, but I don’t understand why full members who invest substantially in product development have neither shares nor voting rights on the board.

I do hope it succeeds in its goal of providing reliable, sustainable open source solutions for administrative software for HE institutions. I wouldn’t bet my own money on it now, though.

For more on Kuali, see:

A student information system monopoly?

Open source software for research administration

Open source software for administrative systems

 

Special edition on research on MOOCs in the journal ‘Distance Education’


The University of Toronto is one of a number of institutions conducting research on MOOCs; their results are still to come

The August 2014 edition of the Australian-based journal Distance Education (Vol. 35, No. 2) is devoted to new research on MOOCs. There is a guest editor, Kemi Jona, from Northwestern University, Illinois, as well as the regular editor, Som Naidu.

The six articles in this edition are fascinating, both in terms of their content and, even more so, in their diversity. There are also three commentaries, by Jon Baggaley, Gerhard Fischer and myself.

My commentary provides my personal analysis of the six articles.

MOOCs are a changing concept

In most of the literature and discussion about MOOCs, there is a tendency to talk about ‘instructionist’ MOOCs (e.g. Coursera, edX, Udacity: the xMOOCs) or ‘connectivist’ MOOCs (e.g. Downes, Siemens, Cormier: the cMOOCs). Although this is still a useful distinction, representing very different pedagogies and approaches, the articles in this edition show that MOOCs come in all sizes and varieties.

Indeed, it is clear that the design of MOOCs is undergoing rapid development. This is partly a result of more players coming into the market, partly a result of the kinds of research now being conducted on MOOCs themselves, and, sadly much more slowly, a result of a recognition by some of the newer players that much is already known about open and online education that needs to be applied to the design of MOOCs, while accepting that certain aspects, in particular their scale, make MOOCs unique.

The diversity of MOOC designs

These articles clearly illustrate such developments. The MOOCs covered by the articles include:

  • recorded MOOC video lectures watched in isolation by learners (Adams et al.)
  • MOOC video lectures watched in co-located groups in a flipped classroom mode without instructor or tutorial support (Nan Li et al.)
  • MOOCs integrated into regular campus-based programs with some learner support (Firmin et al.)
  • MOOCs using participatory and/or connectivist pedagogy (Anderson, Knox)

The size of the MOOC populations studied also differed enormously, from 54 students per course to 42,000.

It is also clear that MOOC material is increasingly being extracted from its ‘massive’, open context and used in very specific ‘closed’ contexts, such as flipped classrooms. At that point one has to ask how such use of MOOCs differs from regular for-credit online programming, which in many cases also uses recorded video lectures, online discussion and, increasingly, other sources of open educational materials. In such campus-based contexts I would expect the same quality standards to apply to these MOOC-based course designs as are already applied to credit-based online learning. Some of the research findings in these articles indirectly support the need for this.

The diversity of research questions on MOOCs

Almost as interesting is the range of questions covered by these articles, which include:

  • capturing the lived experience of being in a MOOC (Adams et al.; Knox)
  • the extent to which learners can/should create their own content, and the challenges around that (Knox; Andersen)
  • how watching video lectures in a group affects learner satisfaction (Nan Li et al.)
  • what ‘massive’ means in terms of a unique pedagogy (Knox)
  • the ethical implications of MOOCs (Marshall)
  • reasons for academic success and failure in ‘flipped’ MOOCs (Firmin et al.; Knox)

What is clear from the articles is that MOOCs raise some fundamental questions about the nature of learning in digital environments. In particular, the questions of how much guidance and support learners need in MOOCs, and how this can best be provided, were common themes across several of the papers, with no definitive answers.

The diversity of methodology in MOOC research

Not surprisingly, given the range of research questions, there is also a very wide range of methodologies used in the articles in this edition, including:

  • phenomenology (Adams)
  • heuristics (Marshall)
  • virtual ethnography (Knox; Andersen)
  • quasi-experimental comparisons (Nan Li et al.)
  • data and learning analytics (Firmin et al.)

The massiveness of MOOCs, their accessibility, and the wide range of questions they raise make the topic a very fertile area for research, and this is likely to generate new methods of research and analysis in the educational field.

Lessons learned

Readers are likely to draw a variety of conclusions from these studies. Here are mine:

  • the social aspect of learning is extremely important, and MOOCs offer great potential for exploiting this kind of learning, but organizing and managing social learning on a massive scale, without losing the potential advantages of collaboration at scale, is a major challenge that remains to be adequately addressed. The Knox article in particular describes in graphic detail the sense of being overwhelmed by information in open connectivist MOOCs. We still lack methods or designs that properly support participants in such environments. This is a critical area for further research and development.
  • a lecture on video is still a lecture, whether watched in isolation or in groups. The more we attempt to support this transmissive model through organized group work, ‘facilitators’, or ‘advisors’, the closer we move towards conventional (and traditional) education and the further away from the core concept of a MOOC.
  • MOOCs have a unique place in the educational ecology. MOOCs are primarily instruments for non-formal learning. Trying to adapt MOOCs to the campus not only undermines their primary purpose, but risks moving institutions in the wrong direction. We would be better re-designing our large lecture classes from scratch, using criteria, methods and standards appropriate to the goals of formal higher education. My view is that in the long run, we will learn more from MOOCs about handling social learning at scale than about transmitting information at scale. We already know about that. It’s called broadcasting.
  • lastly, there was surprisingly little in the articles about what actual learning took place. In some cases, it was a deliberate research strategy not to enquire into this, relying more on student or instructor feelings and perceptions. While other potential benefits, such as institutional branding, stimulating interest, providing a network of connections, and so on, are important, the question remains: what are participants actually learning from MOOCs, and does this justify the hype and investment (both institutionally and in participants’ time) that surrounds them?

Cultural and ethical issues

The Marshall paper provides an excellent overview of ethical issues, but there is almost no representation of perspectives on MOOCs from outside Western contexts. I would have liked to see more on cultural and ethical issues arising from the globalization of MOOCs, based on actual cases or examples. Given the global implications of MOOCs, other perspectives are needed. Perhaps this is a topic for another issue.

Happy reading

I am sure you will be as fascinated and stimulated by these articles as I am. I am also sure you will come away with different conclusions from mine. I am sure we will see a flood of other articles soon on this topic. Nevertheless, these articles are important in setting the research agenda, and should be essential reading for MOOC designers as well as future researchers on this topic.

How to get the articles

To obtain access to these articles, go to: http://www.tandfonline.com/toc/cdie20/current#.U-1WqrxdWh1

WCET’s analysis of U.S. statistics on distance education



U.S. Department of Education (2014) Web Tables: Enrollment in Distance Education Courses, by State: Fall 2012, Washington DC: U.S. Department of Education, National Center for Education Statistics

Hill, P. and Poulin, R. (2014) A response to new NCES report on distance education e-Literate, June 11

The U.S. Department of Education’s Institute of Education Sciences operates a National Center for Education Statistics which in turn runs the Integrated Postsecondary Education Data System (IPEDS). IPEDS is:

a system of interrelated surveys conducted annually by the U.S. Department’s National Center for Education Statistics (NCES). IPEDS gathers information from every college, university, and technical and vocational institution that participates in the federal student financial aid programs. The Higher Education Act of 1965, as amended, requires that institutions that participate in federal student aid programs report data on enrollments, program completions, graduation rates, faculty and staff, finances, institutional prices, and student financial aid. These data are made available to students and parents through the College Navigator college search Web site and to researchers and others through the IPEDS Data Center

Recently IPEDS released “Web Tables” containing results from their Fall Enrollment 2012 survey. This was the first survey in over a decade to include institutional enrollment counts for distance education students. In the article above, Phil Hill of e-Literate and Russell Poulin of WCET have co-written a short analysis of the Web Tables released by IPEDS.

The Hill and Poulin analysis

The main points they make are as follows:

  • overall the publication of the web tables as a PDF is most welcome, in particular because it provides a breakdown of IPEDS data by different variables such as state jurisdiction, control of institution, sector and student level
  • according to the IPEDS report there were just over 5.4 million students enrolled in distance education courses in the fall semester 2012 (NOTE: this number refers to students, NOT course enrollments).
  • roughly a quarter of all post-secondary students in the USA are enrolled in a distance education course.
  • the bulk of students in the USA taking distance education courses are in publicly funded institutions (85% of those taking at least some DE courses), although about one third of those taking all their classes at a distance are in private, for-profit institutions (e.g. University of Phoenix)
  • these figures do NOT include MOOC enrollments
  • as previously identified by Phil Hill in e-Literate, there is a major discrepancy in the number of students taking at least one online course between the IPEDS study and the regular annual surveys conducted by Allen and Seaman at Babson College – 7.1 million for Babson and 5.5 million for IPEDS. Jeff Seaman, one of the two Babson authors, is also quoted in e-Literate on his interpretation of the differences. Hill and Poulin comment that the NCES report would have done well to at least refer to these significant differences.
  • Hill and Poulin claim that there has been confusion over which students get counted in IPEDS reporting and which do not. They suspect that there is undercounting in the hundreds of thousands, independent of distance education status.

Comment

There are lies, damned lies and statistics. Nevertheless, although the IPEDS data may not be perfect, it does a pretty good job of collecting data on distance education students across the whole of the USA. However, it does not distinguish between modes of delivery of distance education (are there still mainly print-based courses around?).

So we now have two totally independent analyses of distance education students in the USA, with a minimum of 5.5 million and a maximum of 7.1 million, i.e. between roughly a quarter and a third of all post-secondary students. From the Allen and Seaman longitudinal studies, we can also reasonably safely assume that online enrollments have been increasing by 10 to 20 per cent per annum over the last 10 years, compared with overall enrollment growth of 2 to 5 per cent per annum.
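To see what these figures imply, here is a minimal, illustrative Python sketch. Only the 5.5 and 7.1 million counts, the ‘quarter to a third’ share, and the quoted growth ranges come from the surveys discussed above; the starting index of 1.0 and the choice of end-points within each range are my own assumptions for illustration.

```python
# Illustrative arithmetic only; the 5.5M/7.1M counts and the 10-20% vs 2-5%
# annual growth rates are the figures quoted above, everything else is assumed.

def compound(start, annual_rate, years):
    """Enrollment index after `years` of steady growth at `annual_rate`."""
    return start * (1 + annual_rate) ** years

# Implied total post-secondary population, if 5.5M is ~a quarter and 7.1M ~a third
print(f"Implied total (IPEDS):  {5.5 / 0.25:.1f} million")
print(f"Implied total (Babson): {7.1 / (1 / 3):.1f} million")

# Ten years of growth at the low and high ends of each quoted range
years = 10
for label, low, high in [("online enrollments", 0.10, 0.20),
                         ("overall enrollments", 0.02, 0.05)]:
    print(f"{label}: x{compound(1.0, low, years):.1f} to "
          f"x{compound(1.0, high, years):.1f} over {years} years")
```

On those assumptions the two surveys imply a broadly consistent total of around 21 to 22 million post-secondary students, with online enrollments growing several times faster than enrollments overall.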

By contrast, in Canada we have no national data on either online or distance education students. It’s hard to see how Canadian governments or institutions can take evidence-based policy decisions about online or distance education without such basic information.

Lastly, thank you, Phil and Russ, for a very helpful analysis of the IPEDS report.

Update

For a more detailed analysis, see also:

Haynie, D. (2014) New Government Data Sheds Light on Online Learners, US News, June 13

 

The success or otherwise of online students in the California Community College system


Online offerings vary widely across subjects

Johnson, H. and Mejia, M. (2014) Online learning and student outcomes in California’s community colleges San Francisco CA: Public Policy Institute of California, 20 pp

I’m not a great fan of studies into completion rates in online learning, because most studies fail to take into account the whole range of factors, beyond the mode of delivery, that influence student outcomes. However, this study is an exception. Conducted by researchers at the highly influential PPIC, it takes a very careful look at how well students across the whole California community college system (CCCS) do in online learning, and there are some very interesting findings that may not come as a surprise to experienced observers, but will certainly provide fodder for both supporters and skeptics of online learning.

Why the study is important

Several reasons:

  • California’s community colleges offer more online credit courses than any other public higher education institution in the country. By 2012, online course enrollment in the state’s community colleges totaled almost one million, representing about 11 percent of total enrollment
  • Over the past ten years, online course enrollment has increased by almost 850,000, while traditional course enrollment has declined by almost 285,000.
  • Community colleges are more likely than other institutions of higher education [in the USA] to serve nontraditional students. These students often have employment and family obligations and therefore may potentially benefit the most from online learning.
  • The state of California is investing $57 million over the next five to six years for online learning initiatives within the California Community College system
  • The California Community Colleges Chancellor’s Office (CCCCO) provided … access to unique longitudinal student- and course-level data from all of the state’s 112 community colleges

Main findings

  • Between 2008–09 and 2011–12, total credit enrollment at California’s community colleges declined by almost a million. The scarcity of traditional courses has been a factor in the huge increase in online enrollments. With the state cutting support to community colleges by more than $1.5 billion between 2007–08 and 2011–12, community colleges experienced an unprecedented falloff in enrollment 
  • online course success rates are between 11 and 14 percentage points lower than traditional course success rates.
  • in the long term, students who take online classes tend to be more successful than those who enroll only in traditional courses…students who take at least some online courses are more likely than those who take only traditional courses to earn an associate’s degree or to transfer to a four-year institution.
  • for students juggling school, family and work obligations, the ability to maintain a full-time load by mixing in one or two online courses per term may outweigh the lower chances of succeeding in each particular online course (see the rough illustration after this list).
  • if a student’s choice is between taking an online course or waiting for the course to be offered in a classroom at a convenient time, taking the online course can help expedite completion or transfer
  • participation in online courses has increased for each of the state’s largest ethnic groups—and online enrollment rates for African American students, an underrepresented group in higher education in California, are particularly high. However, these rates are much lower among Latino students.
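To make that trade-off concrete, here is a minimal, hypothetical sketch in Python. Only the 11 to 14 percentage point success-rate gap comes from the report; the 70 per cent baseline success rate and the course loads are invented purely for illustration.

```python
# Hypothetical illustration of the load-vs-success trade-off described above.
# Only the 11-14 point success-rate gap is taken from the PPIC report; the
# baseline rate and the course loads below are assumptions.

f2f_success = 0.70                    # assumed face-to-face per-course success rate
online_success = f2f_success - 0.13   # roughly mid-way in the 11-14 point gap

# Option A: only the 4 traditional courses that fit the student's schedule
expected_a = 4 * f2f_success

# Option B: the same 4 traditional courses plus 1 online course
expected_b = 4 * f2f_success + 1 * online_success

print(f"Expected courses completed per term, campus only: {expected_a:.2f}")
print(f"Expected courses completed per term, mixed load:  {expected_b:.2f}")
```

Even though each online course is individually less likely to be passed, the mixed load yields more expected completed courses per term, which is essentially the report’s point about expediting completion or transfer.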

Main recommendations

  • move from ad hoc offerings to more strategic planning of online courses
  • improve the ability to transfer credits between community colleges and between colleges and the state’s universities
  • improve the design and provide more consistency in the quality of online courses between institutions
  • adopt a standardized learning management system across all colleges
  • collect systematic information on the cost of developing and maintaining online courses

My comments

This is another excellent and succinct research report on online learning, with a very strong methodology and important results, even if I am not at all surprised by the outcomes. I would expect online completion rates for individual courses to be lower than for traditional courses, as students taking online courses often have a wider range of other commitments to manage than full-time, on-campus students.

Similarly, I’m not surprised that online course success is slightly lower for community colleges than for universities (if we take both the figures from Ontario and my own experience as a DE director), and for certain ethnic groups who suffer from a range of socio-economic disadvantages. Online learning is more demanding and requires more experience in studying. Post-graduate students tend to do better at online learning than undergraduates, and final-year undergraduates tend to do better than first-year undergraduates. Nevertheless, as the study clearly indicates, over the long term online learning provides not only increased access but also a greater chance of success for certain kinds of students.

I am worried, though, that online learning in California has ‘succeeded’ because of the massive cuts to campus-based education. It is better than nothing, but online learning deserves to be considered in its own right, not as a cheaper alternative to campus-based education. Online learning is not a panacea. Different students have different needs, and a successful public post-secondary education system should cater to all of them. In the meantime, this is one of the most useful studies on online completion rates.