Tristan Harris: What Is The Cost Of Infinite Distractions?

Originally published on May 25, 2018, 9:54 am

Part 4 of the TED Radio Hour episode Attention Please.

About Tristan Harris's TED Talk

Designer Tristan Harris says attention is at the core of human experience. He argues that our addiction to technology has the power to threaten our very capacity to think, reason and problem solve.

About Tristan Harris

Designer Tristan Harris has been called "the closest thing Silicon Valley has to a conscience" by The Atlantic magazine. He has spent a decade understanding the invisible influences that hijack human thinking and action — including what has been coined "the attention economy": technology companies caught in a zero-sum race for our finite attention.

After working as a design ethicist at Google, Tristan left to work full-time on reforming the attention economy with the non-profit initiative Time Well Spent. He now runs the Center for Humane Technology, which aims to push for more ethical technology.

Copyright 2018 NPR. To see more, visit http://www.npr.org/.

(SOUNDBITE OF MUSIC)

GUY RAZ, HOST:

Is it strange to think of attention as a commodity?

TRISTAN HARRIS: You know, it's interesting to think of it as a commodity, given how important it is. I mean, attention is just consciousness. It's experience. And to think of it as a commodity is itself so dehumanizing. It's like taking the - you know, whatever it is that's unexplainable about this experience, and saying, yeah, let's sell that. Let's sell the core thing to our human experience.

RAZ: This is Tristan Harris. He runs an organization called the Center for Humane Technology that's trying to free us from our addiction to tech. But before that, Tristan actually used to work in the tech industry at Google, creating the very things that try to compete for our attention.

HARRIS: People have been competing for attention for a long time. But, you know, I think we have a very new environment for that competition that's unprecedented. We're not built for this. And I think we have to start by looking in the mirror at how human instincts really work. Right? We have this, actually, this image on our website of, you know, the ape turning into the Neanderthal turning into the Homo sapiens, the man. And then the next thing is this human who's sort of turning around, like, looking in the mirror and seeing what we are, where all those instincts come from and how they're configured. Because we're not going to get out of the mind-body meat suit that's configured and calibrated the way that it was for, you know, thousands of years ago on the savannah.

RAZ: No.

HARRIS: So we're either going to build environments that are humane, that work with that architecture, or we're going to build environments that do not respect that architecture. And I think that it's really possible to do that, but it really just requires a science of ourselves, like, looking in the mirror and saying, how do we make stuff work with us?

(SOUNDBITE OF MUSIC)

RAZ: Here's more from Tristan Harris on the TED stage.

(SOUNDBITE OF TED TALK)

HARRIS: What do you think makes more money in the United States than movies, theme parks and baseball combined? Slot machines. How can slot machines make all this money when we play with such small amounts of money? We play with coins. How is this possible? Well, the thing is, my phone is a slot machine. Every time I check my phone, I'm playing the slot machine to see, what am I going to get? Every time I check my email, I'm playing the slot machine to see, what am I going to get? Every time I scroll a news feed, I'm playing the slot machine to see, what am I going to get next, right? And the thing is that, again, knowing exactly how this works - and I'm a designer. I know exactly how the psychology of this works. I know exactly what's going on. But it doesn't leave me with any choice. I still just get sucked into it. So what are we going to do? Because it leaves us with this all-or-nothing relationship with technology, right? You're either on and you're connected and distracted all the time, or you're off but then you're wondering, am I missing something important? In other words, you're either distracted or you have a fear of missing out. Right?

(SOUNDBITE OF MUSIC)

RAZ: In just a moment, how we can break free from our addiction to technology and take back the right to choose. Stay with us. I'm Guy Raz, and you're listening to the TED Radio Hour from NPR.

(SOUNDBITE OF MUSIC)

RAZ: It's the TED Radio Hour from NPR. I'm Guy Raz. And on the show today - ideas about the consequences of living in an age of constant information and infinite distractions. And just before the break, we were hearing from Tristan Harris. After working at Google, Tristan went on to found an organization that's trying to rethink and improve our relationship with technology because right now, he says, we don't have much of a choice.

HARRIS: I think we have meta choices. I mean, I think - what I mean by that is if you knew that there is some other environment that's really distracting, like a casino, and it's all the way in Vegas and you have to get in a car and drive 60 miles to get there or something like that, you could say, OK, distraction's a choice. I just won't go to Vegas. It's another thing when you wake up in the morning and you turn off your alarm, and right next to your alarm clock on your phone, when you open the app switcher, are, like, a hundred different, invisible, crazy things that all want your attention - all casinos.

And so one of the problems is we don't have the ability to separate the casinos and the crazy distractions from the things that are not the distractions. Like, when you do that alarm clock, you - what's different about that, right? I mean, you chose for that thing to interrupt your attention. You tied your hands behind your back and said, I want you to interrupt my entire attentional cycle at 6:30 in the morning tomorrow. But then all these other things you think you chose because you downloaded that app and you think you hit allow notifications, but did you really allow for all of those supercomputers and engineers on the other side of the screen to try and, you know, figure out what's going to keep you coming back? You know, that's not what we signed up for. It's not quite the same thing.

(SOUNDBITE OF TED TALK)

HARRIS: And that's what we're doing all the time. We're bulldozing each other's attention left and right. And there's serious cost to this because every time we interrupt each other, it takes us about 23 minutes on average to refocus our attention. We actually cycle through two different projects before we come back to the original thing we were doing. This is Gloria Mark's research combined with Microsoft research that showed this. And her research also shows that it actually trains bad habits. The more interruptions we get externally, it's conditioning and training us to interrupt ourselves. We actually self-interrupt about every 3 1/2 minutes.

(SOUNDBITE OF MUSIC)

HARRIS: And just imagine that, you know, that study I think was done, you know, in a very specific context. It was done in a workplace setting. I think it was done at NASA. And, you know, that was done, I think, in 2008 - 10 years ago. And so imagine how much more often we hear our buzz, dings, new emails, you know, pouncing at us from a thousand different directions that are completely unrelated. I think something that's really undervalued is the cost of unrelated, different things coming at you all the time so your mind has to switch and switch and switch.

(SOUNDBITE OF MUSIC)

RAZ: So how do we solve this? How do we even begin to fix this?

HARRIS: Well, we've been advocating for five years for a different kind of design, our humane design, that pays attention to the way that the human mind and the human instincts really work. And you design to accommodate them.

RAZ: Yeah, so the idea is through, I mean, design. The solution is in design.

HARRIS: Design is one part of it. But it's an important part. I mean, a simple example is - a lot of people have been doing this recently - you turn your phone to grayscale. Like, why would you do that?

RAZ: Yeah, yeah.

HARRIS: Like, why does that do anything? Like, well, that's a crazy idea.

RAZ: Yes. I've done that. I've done it.

HARRIS: (Laughter) Well, in a way, it's an experiment because you get to see the difference: What happens when I look at my phone and all those colors light up my brain? I didn't choose for those colors to light up my brain. It just did that.

RAZ: Yeah.

HARRIS: When it's gray, it has a different effect on me, and I didn't choose that either. It just - I feel a little bit calmer. It feels a little bit less appealing. Imagine if all home screens were gray. It would reduce some of that addictiveness by probably about 15 percent. But that's one tiny, little example. I mean, another example is interruptions. How often should you get interrupted? Should we - you know, if you had to choose, if you had a lever for two billion people - this is kind of what I was thinking about at Google. It's like you have this two billion person ant colony called humanity. And then you've got this phone in their pocket that's going to steer their attention. Now, if you're the operator, you're the one in the control room and you get to choose, do two billion people get interrupted, you know, on a random schedule, every one to 15 minutes with little buzzes? Or would you prefer to have a world where they get interrupted once a day at 5 p.m., except if there's something extraordinarily important? You'd probably pick the latter if you're defining a default setting for two billion people.

RAZ: What are the consequences if we do nothing, if nothing changes or it just continues to get worse?

HARRIS: I think the consequences are incredibly serious because there's a temptation to believe - I remember I had this moment. I was at a conference, and I met some people who worked for a security - what are they called? - three-letter-acronym security agencies. And I gave a talk about the manipulative design stuff, you know, the way the companies use these techniques to hook people. Someone came up to me and said, yeah, totally, I'm so hooked by this stuff. And I said, oh, no.

Like, I always thought there was this other special group of people who work on climate change or some special class of people who work on, you know, national security or some special class of people who work on inequality, and they're immune to all this stuff, and they're making really good decisions to get us out of all these messes that we're in as a global system. And then what strikes me and what terrifies me is that the fabric through which all human thinking and choices happen is your mind. And if that capacity for thinking and choosing is corrupted, that disables our ability to do everything - climate change, solving inequality, agreeing on what the truth is, our political process, elections.

Everything comes down to the fabric of our mind. What are we thinking and believing? And when you turn your phone over in the morning and you see a bunch of stuff, thoughts start streaming into our heads that we didn't choose, and that is the environment in which we have immersed 2 billion human animals. And that's why I look at this and say, we need to change course right now. You have to see the structure of this, that this is too much of a threat, and we can't survive it if we don't.

RAZ: Tristan Harris - he's the co-founder of the Center for Humane Technology. You can see all of his talks at ted.com. Transcript provided by NPR, Copyright NPR.