On September 9 and 10, scholars across Canada are participating in Scholar Strike Canada, in solidarity with the Scholar Strike happening in the US on September 8 and 9. Scholar Strike Canada, like its US counterpart, addresses systemic anti-Black violence and specifically anti-Blackness in the academy, and adds an additional focus on settler colonialism and anti-Indigeneity. More resources on Scholar Strike Canada, including an open letter of support that you can sign and teach-ins you can attend virtually, are available here.

Educational technologies play a unique role in our current teaching climate, and they also — like all aspects of academia — have a history of policing Black and Indigenous bodies differently and more stringently than white bodies. Technologies are not neutral. For Scholar Strike Canada, we invite you to think about how a post-secondary system that increasingly relies on artificial intelligence and digital surveillance works against larger goals of equity and inclusion.

As Ruha Benjamin argues in Race After Technology: Abolitionist Tools for the New Jim Code:

Far from coming upon a sinister story of racist programmers scheming in the dark corners of the web, we will find that the desire for objectivity, efficiency, profitability, and progress fuels the pursuit of technical fixes across many different social arenas. Oh, if only there were a way to slay centuries of racial demons with a social justice bot! But, as we will see, the road to inequity is paved with technical fixes.

[…]

The animating force of the New Jim Code is that tech designers encode judgments into technical systems but claim that the racist results of their designs are entirely exterior to the encoding process. Racism thus becomes doubled — magnified and buried under layers of digital denial.

As scholars, then, when we learn that digital proctoring tools use AI face recognition that cannot recognize dark-skinned students, that those systems log IDs with citizenship status on servers subject to the PATRIOT Act, that their proctors ask students to remove headscarves and hair wraps, what do we do? Do we resist, even when it’s uncomfortable or difficult to do so, or do we remain complicit within a system that undertakes, as Safiya Umoja Noble says in Algorithms of Oppression: How Search Engines Reinforce Racism, “technological redlining” that makes explicit — and explicitly uncomfortable — “what values are prioritized”? Do our tools and systems make space for, as Jennifer Wemigwans suggests in A Digital Bundle: Protecting and Promoting Indigenous Knowledge Online, “speak[ing] back to dominant colonial systems of knowledge in Canada,” or do we reinforce those colonial systems of knowledge with plagiarism detection software that recognizes only a single, settler-derived conception of scholarship and research?

We must answer these questions collectively and in community, and we must answer them directly. The now-commonplace tools of artificial intelligence and digital surveillance certainly exist to solve real problems. We cannot shirk our responsibility to build classrooms that are equitable and inclusive, whether face-to-face or online, just because the problems we use technology to solve seem intractable. We must recognize where the tools of convenience that we employ in our teaching and learning are, in fact, tools of oppression for Black and Indigenous students, and we must recognize the cost of these priorities to Black and Indigenous colleagues.

In solidarity.