Ferdig, R. et al. (2014) Findings and reflections from the 'K-12 Teaching in the 21st Century' MOOC. Lansing, MI: Michigan Virtual Learning Research Institute.
We are now beginning to get some in-depth research on and evaluations of MOOCs. This one is from a team at Kent State University that developed a five-week 'connectivist' MOOC aimed principally at three distinct audiences: high school students interested in becoming teachers, preservice teachers, and inservice teachers in the K-12 system.
I provide here a very brief summary of the report (as always, you should read the report for yourself if my summary gets you interested). Italics are direct quotes from the report.
Goal of the MOOC
How can we get teachers to think more deeply about reinventing education?
facilitators take on the role of connecting people around an idea for the purpose of bettering our understanding of the idea. A connectivist-based MOOC draws on the extensive number of participants as well as the existing open repository of content to develop an experience. Participants are both teachers and learners in a process – not a product.
The course was designed around four principles often associated with teaching in the 21st century: connected learning, personalization, collaboration, and reflection.
Coursesites by Blackboard provided the basic platform for content and discussion, supplemented by the use of participants' social media networks and technologies. In addition, participants were asked to create an 'artifact' to represent their learning.
Use of partners/co-facilitators
Kent State provided core facilitators for the MOOC, but they also invited other co-facilitators from schools, colleges and universities both in Michigan and from several other states.
Badges and continuing education units were given for successful participation.
Participants (data at time of enrollment, i.e. all participants)
Start of course: 673; end of course: 848; mainly from Michigan and surrounding states, although 12 were international
School teachers: 42%; K-12 students: 23%; post-secondary students: 16%; other (including school administrators and university faculty): 19%; 80% female.
Participants’ response to the MOOC (168 participants who completed a post-course survey)
Most participants who responded enjoyed the MOOC, with in-service teachers enjoying it the most. The main criticism (especially from the K-12 students) was the amount of work involved in following the MOOC.
Very active participation in the online discussion forums (within the Coursesites LMS)
There were over 6,000 actual posts (comments) and over 65,000 'hits'/looks over a five-week period, from just over 300 of the participants – but almost two-thirds did not participate at all.
Types of participation
Lurkers (i.e. did not participate in LMS discussion forums – they may have participated through social media): 63%. There were accounts created in Facebook, Twitter, Delicious and blogs related to the course which indicated active social media connections, both for registered participants and for those who had not registered for the course but were interested. However, these numbers were relatively small, and hard to measure.
Passive participation was defined as doing the minimum amount of work required to complete the course. Some of the passive participants were K-12 students forced to complete the MOOC for a class requirement.
There were also preservice teachers and inservice teachers who could be described as passive participants. These participants often completed the course; however, much like the high school students, their posts were limited to one or two sentences per post. Their comments were also superficial, for example, "Nice job" or "I like what you did."
Active participants participated in four ways:
- informing personal practice
- sharing the MOOC with their communities
- leadership within the MOOC community
- critical colleagues
The authors’ main conclusions
The seeking and sharing of digital media highlights that people want to form and engage in communities, and the growing interest in MOOCs shows this is true of educational communities as well….
Learning takes place in communities; depending on the implementation, technology has the capability to create and sustain the communities’ learning and practice….. Evidence in this report suggests that such activities can lead to positive outcomes, particularly as they relate to getting teachers to think more deeply about teaching and learning in the 21st century.
Even though (or perhaps because) this is a self-evaluation, this is a very useful report. I was fascinated for instance that this course ended with more participants than when it started, due to the ‘publicity’ of social media connections during the course itself. It was interesting too that some of the participants in this MOOC were not necessarily willing participants – being forced to participate as part of a formal credit program. This seems to me to go against the whole purpose of a connectivist MOOC.
More importantly for me, the report highlights some of the ways research can be conducted on MOOCs and also some of the challenges. The study identifies the importance, from a research perspective, of having some kind of platform that can gather student data and track student behaviour, such as levels or types of participation. However, given the importance of social media for connectivist MOOCs, some way of accurately tracking related social media activity is critical. It seems to me that this is a problem that appropriate software could solve (further development of gRSShopper?), although privacy issues would need to be addressed as well. (Perhaps the spy agencies can help here – just joking!)
I agree completely with the authors when they write:
Researchers have already provided ample evidence that asking if a technology works is the wrong question. A more appropriate question is: under what conditions do certain types of MOOCs work?
Another even more pertinent question is: what prior research into credit-based online learning applies – and what does not apply – to different kinds of MOOCs? Answering this might save a lot of time re-inventing the wheel, particularly for xMOOCs. I am getting tired of hearing from research on xMOOCs that immediate feedback helps retention – we have known that for nearly 100 years. We do need, though, to assess the importance and most useful roles, if any, of instructors/facilitators/subject matter experts in MOOCs, and whether MOOCs can succeed with reduced 'expert' participation. This report suggests almost the opposite – connectivist MOOCs work best with a wide range of facilitators – but what are the hidden costs of this?
Finally, I also agree with the authors that completion rates are not the best measure of success for MOOCs. This MOOC does seem to have raised some interesting questions for participants. I'm just curious about their answers. Despite the very good work done by the instructors/researchers of this MOOC, I am still left with the question: what did the participants actually learn from this MOOC? For instance, what would an analysis of the student 'artifacts' have told us about their learning? Unless we try to answer questions about what actual learning took place, it will remain difficult if not impossible to measure the true value of different kinds of MOOC, and I think that would be a pity.
In the meantime, this report is definitely recommended reading for anyone interested in doing research on or evaluating MOOCs.