Human Rights Watch (2022) How Dare They Peep into My Private Life? New York: Human Rights Watch, May 25

The promise and the ideal

I remember very clearly the first time I saw and used the Internet. It was in a colleague’s basement office in his home in Vancouver in 1986. I was at the time working at the UK Open University and was visiting Canada on a British Council grant to research distance education developments in Canada.

I didn’t know my colleague, David Kaufman, very well. I had met him earlier on the trip in Québec, when I had struggled to present in French and he had intervened to help me out, so I was a bit cautious when he said, ‘Come down to my basement – I’ve got something to show you.’ There was a screen, and a computer attached to a black box (a modem), which in turn was plugged into the telephone socket.

He fired it up and showed me how he sent out a message saying he was online and asking if anyone was willing to talk (via text messaging). We had a reply immediately. ‘Where are you?’, David typed. ‘New York’, came back the reply. After a couple of inconsequential questions, such as what the weather was like, I said to David: ‘Ask him how old he is.’ ‘12’ came back the reply. (It would have been 1.00 am in New York at the time.)

I went back to the OU, where we were about to develop a Social Science course on Technology and Society (DT200), fired up with the idea of using computer mediated communication to link students and instructors. (We eventually used CoSy, a platform developed at the University of Guelph.)

I tell this story to show the hope and idealism of those early days. Here was a free tool, developed originally for defence purposes with public money by the U.S. government, that could be used to link everyone in the world for educational purposes. Indeed, for about 10 years, that was exactly how it was used. I was at UBC when WebCT, the first LMS designed specifically for education, was developed and used.

Now, nearly 40 years later, where are we?

The Human Rights Watch report

This report is a global investigation of the education technology (EdTech) endorsed by 49 governments for children’s education during the pandemic. Based on technical and policy analysis of 164 EdTech products, Human Rights Watch finds that governments’ endorsements of the majority of these online learning platforms put at risk or directly violated children’s privacy and other children’s rights, for purposes unrelated to their education.

“‘How Dare They Peep into My Private Life?’: Children’s Rights Violations by Governments that Endorsed Online Learning during the Covid-19 Pandemic,” is grounded in technical and policy analysis conducted by Human Rights Watch on 164 education technology (EdTech) products endorsed by 49 countries. It includes an examination of 290 companies found to have collected, processed, or received children’s data since March 2021, and calls on governments to adopt modern child data protection laws to protect children online.

The report was released simultaneously with publications by media organizations around the world that had early access to the Human Rights Watch findings and engaged in an independent collaborative investigation.

My reaction

I’ll be frank. I was shocked and disgusted when I read the media reports of this study. I felt much the same about these allegations as I would about an accusation of child pornography. It would certainly appear on the surface that children’s (and parents’) privacy rights are being violated on a massive scale, for purely commercial purposes.

The report came out five days ago, but I wanted to read the actual report and see how governments and IT professionals responded to the accusations. However, although coverage of the report has been pretty widespread in national media, I have not found any official response yet, so here’s my own. I should point out, though, that I am neither a lawyer nor an IT specialist, and we need both professions to participate in this discussion.

What the report claims

Of the 164 EdTech products reviewed, 146 (89 percent) appeared to engage in data practices that put children’s rights at risk, contributed to undermining them, or actively infringed on these rights. These products monitored or had the capacity to monitor children, in most cases secretly and without the consent of children or their parents, in many cases harvesting data on who they are, where they are, what they do in the classroom, who their family and friends are, and what kind of device their families could afford for them to use.

Most online learning platforms sent or granted access to children’s data to third-party companies, usually advertising technology (AdTech) companies. In doing so, they appear to have permitted the sophisticated algorithms of AdTech companies the opportunity to stitch together and analyze these data to guess at a child’s personal characteristics and interests, and to predict what a child might do next and how they might be influenced. Access to these insights could then be sold to anyone—advertisers, data brokers, and others—who sought to target a defined group of people with similar characteristics online.

With the exception of a single government—Morocco—all governments reviewed in this report endorsed [during the pandemic] at least one EdTech product that risked or undermined children’s rights. Most EdTech products were offered to governments at no direct financial cost to them; in the process of endorsing and ensuring their wide adoption during Covid-19 school closures, governments offloaded the true costs of providing online education onto children, who were unknowingly forced to pay for their learning with their rights to privacy, access to information, and potentially freedom of thought.

Some of these governments made it compulsory for students and teachers to use their EdTech product, not only subjecting them to the risks of misuse or exploitation of their data, but also making it impossible for children to protect themselves by opting for alternatives to access their education…Most EdTech companies did not allow their students to decline to be tracked; most of this monitoring happened secretly, without the child’s knowledge or consent. In most instances, it was impossible for children to opt out of such surveillance and data collection without opting out of compulsory education and giving up on formal learning altogether during the pandemic.

It should be noted that the report included data from Canada’s two largest provinces, Ontario and Quebec.

Why am I shocked?

Probably because I work mainly in the post-secondary sector, I am not familiar with the tools and processes used in K-12/school education. Maybe I am being naive, but in the post-secondary sector most institutions use a learning management system that requires students to log in with a password. There was a scare about Zoom when it was hacked during the pandemic, but the company quickly fixed that. Most IT departments in universities and colleges are responsible for ensuring the tools used for teaching are secure and protect student privacy. I had assumed that most school boards, at least in Canada, would operate in a similar manner. Apparently, according to the Human Rights Watch report, this ain’t so. It’s a free-for-all, a jungle, out there. I hope someone will use the comment box and say that in their school system, there are defences and policies to protect children’s privacy when using online tools. I am concerned, though, that I have not yet seen any statements issued by authorities in Ontario and Quebec that refute the claims by Human Rights Watch.

I had in an earlier post criticised many school boards in Canada for not providing teachers and students with a common, tested and secure learning management system during the pandemic. It was understandable that some school boards and Ministries were unprepared in March 2020, but this should have been fixed by now. If not, the Human Rights Watch report is a clarion call to all school authorities in Canada to get their house in order. This is not a difficult thing to do.

Are we also to blame?

Lastly, having read the report in detail, I think an element of caution is needed. Just because it is possible to scrape data and sell it to advertising agencies, it doesn’t necessarily mean that all the companies listed do so. For instance, it is possible for me to drive my car at 200 kilometres an hour, but I don’t, for two reasons: I don’t want to die, and someone will try to stop me because it’s against the law.

Some EdTech companies, such as LEARN, opted for the first reason – they have tried to make it difficult for third parties to use their data, because they want to be seen as an ethical company, and because in the long run it’s in their interest. What seems to be missing at the moment is the second reason. We don’t have laws that protect children’s rights – or anyone else’s – when it comes to companies scraping data from our Internet activities. There is no risk to them in doing this.

And for that, I think we have only ourselves to blame. We all want free access and use of the Internet, yet services do need to be paid for. Most of us accept that advertising is one way to pay for this, but we all play the game of trying to get as much for free as we can. The problem is that in this game the odds are really stacked against us. We need some rules or regulations that even up the odds, without destroying the game itself.

In the meantime, school boards and governments, get to work. Ensure that safe and secure tools are used by teachers and students. This is not a difficult problem, but it does need to be fixed. The larger issue of privacy and the power of tech companies is a much bigger problem, but we do not need to wait for that to be resolved to ensure that children’s use of EdTech tools is secure and private.

For parents reading this article, Porch’s Children’s Online Safety guide offers some useful tips and guidance.

I’d love to hear from lawyers and IT specialists on this issue, as well, of course, from educators and students.

2 COMMENTS

  1. Surprised that this post didn’t generate comments… and that I’m only finding out about this now, months after HRW released their report.

    Interesting that Ontario was excluded from the report, after the fact.
    > Canada’s Ontario, and the single EdTech product that it recommended, was originally included in our analysis. After additional rounds of data verification and analysis of the EdTech product, which yielded an inconclusive assessment, Human Rights Watch removed Ontario from its list.

    For context: I work in Quebec Higher Education, mostly in the province’s Cegep system. Part of my work revolves around Open Educational Resources (and other forms of no-cost online material). I’m also on the board for a small volunteer organization about Free/Libre Open Source Software in Quebec Higher Ed.

    The report’s key topic is of deep interest to me for a simple reason: I find that it’s not discussed nearly enough among learning professionals.
    And while the report focuses on government recommendations, I’d focus on choices made at the institutional level.

    Educators and others frequently choose no-cost tools and resources without paying much attention to their impact on learners’ privacy. In a way, it’s as though people were choosing tools for themselves yet they “impose” them on students. To my mind, there’s a huge difference between adopting technology and requiring it. Don’t we have a duty, in Higher Ed, to think critically about such choices? And involve learners in the process? And/or provide a variety of options?

    LMS choice is an interesting case in point. (As a disclaimer: I’ve been a power user of Moodle for a while and my employer hosts a multitude of Moodle instances.) The decision to focus on a given LMS goes through institutions’ typical IT processes. Yes, including all the testing required for compliance (cybersecurity threat assessment, accessibility scores, language support, etc.).
    Depending on the institution, this can be a thorough vetting process. In government (I’ve worked at the Canada School of Public Service), it’s quite involved as a procurement process. Once a platform has been selected, there’s an expectation that people will use it and, in fact, focus on it. That’s a longstanding model.
    Thing is, learning pros often find those institution-vetted tools to be lacking, in some ways. Perhaps because, during a conference or webinar, some rep from an EdTech vendor convinced them that they should use something else. Or they have a distinct use case, not considered in the IT-based process. There’s a split between pedagogues and IT about this.
    (By the by, while I was working at CSPS, the Government of Canada banned Zoom and, at least as late as Fall 2021, I was hearing from cybersecurity experts who maintained that Zoom wasn’t safe to use. Long before the issue you mentioned, the company had proven itself to be less than diligent about security. The Mac version of its software even installed a web server with an open port, without letting users know. The official reaction that it was a feature and not a bug should have given people pause. These days, it mostly sounds like they’ve been adept at avoiding conversations about their product’s security… or accessibility.)

    Having taught at Concordia from 2006 to 2016, I find it interesting that their Centre for Teaching & Learning would offer a list of university-approved tools: https://www.concordia.ca/ctl/digital-teaching/supported-tech.html
    Don’t get me wrong: I understand people on both sides! IT people have a duty to keep things safe. Learning pros have a duty to appropriate technology. Part of the problem lies in a mismatch between the two.
    My sense is that this whole situation could be the basis for collaborative work among diverse people. Yes, the typical “How Might We” exercise. My hunch is that we could include learners’ privacy alongside a variety of other factors to consider in selecting platforms and other tools: pedagogical efficacy, pedagogical inclusiveness, accessibility, cybersecurity, cultural sensitivity, usability, sustainability, flexibility… even “epistemic justice”.

    As for rules, regulations, and policy… Privacy laws are an important part of this. And they’re insufficient. Regardless of what people think about GDPR, there’s a lot of ground that it doesn’t cover.

    All this to say: can we all work together to make thoughtful decisions about the tools we use with learners at any level?
    Or, at least, raise awareness that no-cost tools and resources often come with more strings attached than one might realize.

    • Hi, Alex
      Many thanks for a great comment. There was some reaction (mainly ‘likes’) from my Twitter followers to this post, but people often don’t find the comment box, or they read the post, note it, then move on.
      I do think the risks are higher in the K-12 sector, mainly because teachers and particularly school administrators are more removed from the IT specialists who worry about these issues. But you are right: we all need to ask questions about who has access to data collected through these tools.
      Regards
