Baker, R. et al. (2018) Bias in Online Classes: Evidence from a Field Experiment, Stanford, CA: Stanford Center for Education Policy Analysis, CEPA Working Paper No. 18-03.
Yesterday I ranted about the high costs in the UK of online programs aimed at part-time, working people. Today I want to look at a recent study from researchers at Stanford University reporting racial bias in online discussion forums.
First, let's report the facts: what did the researchers say? (Please read the report for yourself if you are uncomfortable with my comments about their conclusions.)
this study provides what we believe is the first evidence of the possible presence of racial and gender biases among students and instructors in online courses.
First, it provides novel and fundamentally important insights into a rapidly proliferating type of learning environment. In 2013, 25 percent of all postsecondary students took some or all of their courses online. This fact has equity implications given that students enrolling in less selective colleges make up a larger fraction of the online student body. Even in K-12 education, more than 300,000 students exclusively attend online schools, with as many as 5 million students having taken at least one online course….
Because our study relies on fictive student identities, it cleanly isolates behavioral effects due to instructors and unequivocally rules out mechanisms related to student reactions to a particular instructor.
…a comment from a White male is a statistically significant 5.8 percentage points more likely to receive a response from an instructor than non-White male students. The magnitude of this effect is striking. Given the instructor reply rate of 6.2 percent for non-White male posters, the White male effect represents a 94 percent increase in the likelihood of instructor response.
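The 94 percent figure follows directly from the two numbers given in the quote: a 5.8 percentage-point effect on top of a 6.2 percent base reply rate. A quick sketch of that arithmetic (the variable names are mine, not the paper's):

```python
# Relative-increase arithmetic behind the quoted "94 percent" figure.
base_rate = 0.062  # instructor reply rate for non-White male posters (from the paper)
effect = 0.058     # White-male effect in percentage points (from the paper)

# An extra 5.8 points on a 6.2% base is a ~94% relative increase.
relative_increase = effect / base_rate
print(f"{relative_increase:.0%}")  # prints 94%
```

In other words, the headline number is a relative increase over a small baseline, which is why a modest absolute difference reads as such a dramatic effect.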
This is a pretty damning criticism of online learning. How did they come to this conclusion?
We tested for the presence of racial and gender biases in these settings by creating fictional student identities with racial- and gender-connotative names, having these fictional students place randomly assigned comments in the discussion forums, and observing the engagement of other students and instructors with these comments.
We situated our study within 124 Massive Open Online Courses… Critically, we also believe there is credible external validity to conducting this study within MOOCs because their basic design features (e.g., asynchronous engagement, recorded lectures, discussion forums) and their postsecondary content are widely used in other online courses.
Using fictive student identities, we placed eight discussion-forum comments in each of the 124 MOOCs. Within each course, eight student accounts were used to place one comment each. The eight student accounts each had a name that was connotative of a specific race and gender (i.e., White, Black, Indian, Chinese, each by gender); each race-gender combination was used once per class… By observing the responses to our comments by instructors and by students in the course, we can identify any difference in the number of responses received by our student accounts that were assigned different race and gender identities.
Fifty-eight percent of the courses in our sample were taught by either one White male instructor or a teaching team of exclusively White men… White students were 5.9 percentage points more likely than non-White students to respond to one of our comments when that comment was assigned a White name… We find that White women were over 10 percentage points more likely to respond to a post with a White female name than non-White women.
This study has received a lot of attention and has been reported in many different outlets. The main reporting suggests that discussions in online learning are strongly biased: instructors pay more attention to white male students, and white female students are more likely to correspond with or respond to other white female students.
I don’t dispute these findings, as far as they apply to the 124 MOOCs that the researchers studied.
Where the madness comes in is generalising this to all online courses. This is like finding that members of drug gangs in Mexico are likely to kill each other, then concluding that the probability of death by gunfire is the same for all Mexicans.
MOOCs are one specific type of online learning, offered mainly by elitist institutions whose MOOC instructors are predominantly white and male.
Furthermore, the student:instructor ratio in MOOCs is far higher than in credit-based online learning, which remains the main form of online learning, despite the nonsense spouted by Stanford, MIT and Harvard about MOOCs. In an edX or Coursera MOOC with very many students, it is impossible for an instructor to respond to every student; some form of selection has to take place.
In most credit-based online courses, discussion forums are much more tightly managed by instructors. Many instructors, following best practices, try to ensure that all students in their online discussion forum are as fully engaged as possible in the discussions. This is just not possible for an instructor to ensure in very large MOOC discussion forums. To imply that their findings will also apply to K-12 online courses is even more ridiculous. Their statement that the basic design features of MOOCs are widely used in other online courses is just not correct.
So yes, because of the very nature of most MOOCs, I am not surprised to find racial and gender bias in the discussion forums. I suspect that if one looked closely enough, one would find some instructors in credit-based online courses showing either conscious or unconscious bias, but I would need to see evidence drawn from that context, not from a completely different context such as MOOCs.
Once again, we see faculty from Stanford assuming that MOOCs are the standard for online learning, when all along they have been a mutant, and so it is not surprising to find mutant behaviour in them.