
I was supposed to speak to a reporter today about iPhones and addiction, but the interview fell through. I jotted down some of my thoughts in preparation for the call, and I thought I’d post them here in case it’s a topic I decide to return to and flesh out more in the future…

I am hesitant to make any clinical diagnosis about technology and addiction – I’m not a medical professional. But I’ll readily make some cultural observations, first and foremost, about how our notions of “addiction” have changed over time. “Addiction” is a medical concept, but it’s also a cultural one, and it’s long been tied up in condemning addicts for some sort of moral failure. That is to say, we have labeled certain behaviors as “addictive” when they involve things society doesn’t condone. Watching TV. Using opium. Reading novels. And I think some of what we hear in discussions today about technology usage – particularly about usage among children and teens – is that we don’t like how people act with their phones. They’re on them all the time. They don’t make eye contact. They don’t talk at the dinner table. They eat while staring at their phones. They sleep with their phones. They’re constantly checking them.

Now, this “constantly checking their phones” behavior certainly looks like a compulsive behavior. Compulsive behavior, says the armchair psychologist, is a symptom of addiction. (Maybe. Maybe not.) What is important to recognize, I’d argue, is that this compulsive behavior is encouraged by design.

Apps are being engineered for “engagement” and built for “clicks” – behavioral design. They are purposefully designed to demand our attention. Apps are designed to elicit certain responses and to shape and alter our behaviors. Notifications – we know how these beckon to us. “Nudges” – that’s the way behavioral economist Richard Thaler has described this. But these notifications and nudges are less about socially “better decision making,” as Thaler would frame it, than about decisions and behaviors that benefit the app-maker: getting us to download an app, to register, to complete our profile (to hand over more personal data, that is), to respond to notifications, to open the app, to stay in the app, to scroll, to click, to share, to buy. These are the actions that tech entrepreneurs and investors value, because these are the metrics the industry uses to judge the success of a product, of a company.

I think we’re starting to realize – or I hope we’re starting to realize – that those metrics might conflict with other values. Privacy, sure. But also etiquette. Autonomy. Personal agency. Free will.

I bring up “free will” here because one of the best-known behaviorists, Harvard psychology professor B. F. Skinner, did not believe in it. He thought that behaviors could be shaped through what he called “contingencies of reinforcement” – behavior could be shaped through a series of rewards. (He was interested in positive reinforcement and discouraged a reliance, particularly in educational settings, on punishment and negative reinforcement.) Skinner was well known for training pigeons and rats – the famous “Skinner Box.” But he also built early “teaching machines” that aimed to apply his behaviorist theories to instruction. Learning, to Skinner, was a behavioral (not a cognitive) process.

His theories have fallen out of favor, but I think there are remnants of this behaviorist bent (perhaps unexamined, unattributed), particularly among those who are still designing “teaching machines” in Silicon Valley and elsewhere – the Persuasive Technology Lab at Stanford, for example.

How do you design apps that nudge and persuade users to do certain things, that reward them when they do so? How do you encourage people to repeatedly check their phones, to feel like they always need their phones? There is a huge asymmetry here in terms of power and information, and the tech industry has spent a lot of time and energy trying to encourage and reinforce these sorts of behaviors in its users. I’m not sure consumers have necessarily stopped and asked in response: Why am I responding this way? What am I being trained to do or trained to feel?

I don’t want to suggest that this is something the consumer alone is responsible for – blaming consumers, for example, for looking at their phone when it vibrates or beeps or for downloading Candy Crush and trying to get all their friends to play along. The whole modus operandi of the tech industry has been to create apps that are as engaging and compelling and viral as Candy Crush. The industry views its users as highly manipulable, their behaviors as something that can be easily shaped and nudged and controlled. Maybe it’s time to rethink and regulate and restrict how that happens?

Indeed, I think that a concern about regulation – government regulation, not self-regulation – is probably what prompted the Apple investors last week to suggest that iPhones might be hurting users and that Apple should address the problem. (That is, can Apple please address this internally before the government steps in?) There were several high-profile stories in the last year or so arguing, for example, that social media causes depression, that smartphones are making kids miserable, and so on. Again, those diagnoses are up for debate. But I think these stories are important to consider alongside others about the ways in which we have all found ourselves manipulated – by “fake news” most obviously. How are our minds – our sense of well-being, our knowledge of the world – being shaped and mis-shaped by technology? Is “addiction” really the right framework for this discussion? What steps are we going to take to resist the nudges of the tech industry – individually and socially and, yes, maybe even politically? (Because unlike Skinner, I happen to believe we do have free will, that we can make different choices, and that we can refuse these particular “contingencies of reinforcement.”)

Audrey Watters

