What is the future of teaching and learning? Part 2 of my contribution to EDU8213.

I want to respond to something that Liz said about our focus on computers as calculating machines. She said she preferred to see them as communication machines, and I do think that an emphasis on communication rather than calculation could help us to think about and to really foster those pedagogical practices that recognize and value affect, not just those practices that privilege quantification.

But I’m not sure that seeing computers as not merely calculating machines gets us out of the quandary of our “computational culture.” The ideological underpinnings of computers coincide with a longstanding privileging of rationality in Western culture. For a couple of centuries now, modern societies have been built on the belief that more rationality and more technology and more capital are the solutions to all the problems we face.

This makes it challenging, I think, to talk about “the future of teaching and learning” without seeing “teaching and learning” as a problem to be solved. And specifically as a problem to be solved with more data, more machines, more analytics.

This really stands in stark opposition to affect. Reason and rationality versus emotion – we know that story. The former privileged as the realm of men. Men of science. The latter scorned as the realm of women. Weak. Soft.

A sidenote: it’s so ironic that the women who worked in the field of pre- or proto-computing were called “computers” and “calculators.” But once the work became mechanized, computerized, they were largely ousted from the field and their contributions erased.

In all things, all tasks, all jobs, women are expected to perform affective labor – caring, listening, smiling, reassuring, comforting, supporting. This work is not valued; often it is unpaid. But affective labor has become a core part of the teaching profession – even though it is, no doubt, “inefficient.” It is what we expect – stereotypically, perhaps – teachers to do. (We can debate, I think, if it’s what we reward professors for doing. We can interrogate too whether all students receive care and support; some get “no excuses,” depending on race and class.)

What happens to affective teaching labor when it runs up against machines, against robots, against automation? Politically. Economically. Culturally. I think most of us would admit that even the tasks that education technology purports to now be able to automate – tutoring, testing, grading – are shot through with emotion when done by humans, or at least when done by a person who’s supposed to have a caring, supportive relationship with their students. Grading essays isn’t necessarily burdensome because it’s menial, for example; grading essays is burdensome because it is affective labor; it is emotionally and intellectually exhausting.

This is part of our conundrum, and again I think this is a deep cultural conundrum that we cannot just wave away by calling computers “communication machines”: teaching labor is affective, not simply intellectual. Affective labor is not valued. Intellectual labor is valued in research. It is viewed as rational and reasonable. But at both the K–12 and college levels, teaching is often seen as menial, routine, and as such replaceable by machines. Intelligent machines will soon handle the task of cultivating human intellect, or so we’re told. And because we already privilege a certain kind of intellect as rational and reasonable, I think culturally we are sort of prepped for intelligent machines handling the tasks of research and decision-making.

Artificial intelligence sees the human mind as a computer. This is a powerful metaphor that underpins the whole field. Intelligence is rational, so they say. It is about calculation. It is mechanical. It is replicable. It is measurable. Think of all the words in the language of artificial intelligence that are drawn from humans’ mental capacities: memory, learning. The benefit of artificial intelligence, so we’re told, is that it can surpass the capabilities of humans. It can be faster. It can store more data. It can process more data. It is computational.

What does it mean for the future of teaching and learning if – culturally – we are being told that the future of intelligence is machine intelligence?

Where does affect fit into this?

Rather than finding that machines are becoming more intelligent, I fear we will find that humans are becoming more machine-like. But if we bury affect, I do wonder – and one need only look at this US Presidential election for an example – what happens when we have these emotional outbursts. Anxiety. Irrationality.

I think I said in the last recording that I often turn to Antonio Gramsci: “I am a pessimist because of intelligence, but an optimist because of will.”

I’ve been thinking a lot lately about irrationality and the Internet, about what seems to be an embrace of conspiracy theories, factlessness, a rejection of expertise. I’m not sure I’ve ever been more pessimistic about the Internet’s potential for participatory democracy or for networked learning. “Don’t read newspapers,” Trump told his supporters recently. “Read the Internet.” As such, the Internet feels like a weapon of war – and war has always relied on calculation, hasn’t it – a weapon of hate – and there’s the affect that culturally we seem to be embracing right now.

Audrey Watters

