It’s Time To Treat Social Media As A Public Health Emergency

The following is a rush transcript of my recent interview with Tristan Harris, co-founder and president of the Center for Humane Technology and co-host of the “Your Undivided Attention” podcast. Tristan, a former design ethicist at Google whom you may remember from Netflix’s “The Social Dilemma,” joined Federalist Radio Hour on Thursday to discuss the urgent threats posed by social media.

We discussed everything from TikTok and national security to the metaverse to the psychology of social networks and their incentive structures. Our personal, professional, and political lives are now largely filtered through these poisonous platforms, the largest of which is owned by a company that operates out of a hostile foreign country.

We are in the midst of a tech crisis and a public health emergency. Tristan’s insights make that abundantly clear. It’s time for consumers, lawmakers, media, and business leaders to start recognizing that.

You can read the rush transcript and listen to the full episode below.

Emily Jashinsky: We’re back with another edition of The Federalist Radio Hour. I’m Emily Jashinsky, culture editor here at The Federalist. As always, you can email the show at radio at the Federalist dot com, follow us on Twitter at FDRLST. Make sure to subscribe wherever you download your podcasts as well. Today, I’m joined by Tristan Harris. He’s the co-founder and president of the Center for Humane Technology, which is catalyzing a comprehensive shift toward humane technology that operates for the common good.

He’s also the co-host of Your Undivided Attention, which consistently ranks among the top ten technology podcasts on Apple Podcasts and explores how social media’s race for attention is destabilizing society, and the vital insights we need to envision solutions. Tristan you probably remember from “The Social Dilemma,” which we talked about here, and we talk about the Center for Humane Tech all the time; I cite it all the time here. So our listeners are certainly familiar with that. He was formerly a design ethicist at Google. Tristan, thank you so much for joining Federalist Radio Hour.

Tristan Harris: My pleasure to be here, Emily.

EJ: So many, many, many people have seen The Social Dilemma. Just incredibly successful documentary that aired on Netflix and it covers your background pretty extensively. But for anybody who may have missed that, could you just brief us on your career and how you ended up at the Center for Humane Technology?

TH: Yeah, so, let’s see, I grew up in the Bay Area of California, and for the beginning of my life and career I was very optimistic about technology. I actually thought I wanted to work at Apple and be part of the next Macintosh team. Actually, since I was about 11 years old, I thought I wanted to do that.

So I was really early on the positive, optimistic, and inspiring case for what technology can really, you know, be in our lives. And I was really affected by, I think, the people and the culture that built the Macintosh. And so I knew I wanted to go into technology early on. I went to Stanford, got a degree in computer science, not a degree in psychology, but kind of studied at the intersection of psychology, persuasion, and sociology.

What influences human beings outside their awareness? I was a magician as a kid, and I took a class that was connected to a lab at Stanford called the Persuasive Technology Lab with Professor B.J. Fogg. And it was really the combination of that with the background in magic that got me interested in how technology is restructuring our attention, our feelings, our thoughts, the choices that we make.

And that might sound like a conspiracy theory to many people, hopefully less so now that The Social Dilemma has come out on Netflix and made it clear. But that was really where my attention went: how is technology shaping society? Not just shaping it in a minor way, like rearranging which item comes first on a restaurant menu, which might affect your choices by like 2 or 5 percent, but really reshaping the entire basis of how 3 billion people make sense of the world. You know, so anyway, we can get into all that. But that’s a bit of my background. And let’s see, I can quickly summarize the rest of my bio that might be relevant for folks.

I was a tech entrepreneur. I started a small company called Apture. It was acquired by Google. I landed at Google and then became concerned about the issues of the attention economy, this race for attention, and how I thought it was going to create perverse side effects, starting early on in 2012, 2013. And so I then became a design ethicist at Google, studying how you ethically hold the position of responsibility for 2 billion people’s attention, starting with things like Gmail and notifications, and then going all the way up to structuring the competition of different social media platforms for our attention, which then leads to news and sense-making in the way that newsfeeds rank things. All of that led me to leave Google, because I was not able to get anything done there, and start this nonprofit organization called the Center for Humane Technology, which is how we enact the change in the world that we’re trying to see.

EJ: Google is such a good example because if I’m remembering correctly, in The Social Dilemma, you talk about Gmail, and Gmail is not Facebook, it’s not Twitter. It’s not Instagram or TikTok. It’s not something people often think about in terms of this context of the attention economy. It’s, you know, just the sort of blunt device that just delivers your mail in the same way that the USPS does.

But when you’re acting as a design ethicist at Google and you’re looking at Gmail, you’re basically trying to capture as much of any human being’s attention as you possibly can. Can you explain a little bit about the work of a design ethicist? And another question I have is how common those positions are in Silicon Valley, how they’ve changed or shifted or proliferated even over the last decade?

TH: Yeah, my other position name was head of the Ministry of Truth. No, I’m kidding. It’s funny actually watching that conversation go that direction, because people think, OK, you know, there’s this person at Google called Tristan who’s thinking about the ethics of design choices, and they’re deciding what I’m going to get and what I’m going to see.

And that’s kind of alarming to people, like, who are they to choose how my life is structured or designed? But as you said, with Gmail, it’s more subtle, right? Because with Gmail, you think it looks like a neutral platform: I only get an email if someone sent me an email. But if you think about it, I mean, what drew me into Gmail first?

I was addicted to email. Many of us are; some people laughed at that in the film, actually. But, you know, you refresh your email, and like 4 seconds later you might pull to refresh again. I think it’s really important for listeners to just think about that. There’s a million behaviors we do on our smartphones every day that make no rational sense.

You’ll refresh to see how many likes you got or what new email you got, like a slot machine, and then you do something else for about 30 seconds and you go right back to refreshing again. And notice that makes no rational sense. And I would watch myself doing this, and I knew from the Persuasive Technology class at Stanford, in which we read a book called Don’t Shoot the Dog about clicker training for dogs and studied the way that Las Vegas slot machines work, that this was exactly what was going on for me. I knew exactly how it worked.
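To make the slot-machine comparison concrete, here is a minimal Python sketch of a variable-ratio reward schedule, the mechanic behind pull-to-refresh; the 30 percent “hit” probability is invented and purely illustrative, not any app’s real behavior.

```python
import random

# Variable-ratio reward: sometimes refreshing pays off (new mail, a new like),
# usually it doesn't. The unpredictability of the payout is what makes
# checking compulsive, just like a slot machine.
def pull_to_refresh(hit_probability: float = 0.3) -> bool:
    return random.random() < hit_probability

pulls = ["new mail!" if pull_to_refresh() else "nothing" for _ in range(10)]
print(pulls)  # e.g. ['nothing', 'new mail!', 'nothing', ...]
```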

I literally read the books on it. I studied with the people who built it, and I was addicted to my email. Now, an example, like you’re saying, of how Gmail actually makes design choices that affect what we see. Well, I’ll give you a real example. I think it was 2013, and I was with the handful of people designing Gmail, which had a billion users at the time. In any cafe I would go into, anywhere around the world, half the laptops open had Gmail open, right? Like it was where people lived, not just an app to use. It’s a digital habitat. People do their work there and they leave it open all day.

And so, for example, when a new email comes in, your browser tab adds the parentheses, you know, the number goes four, five, six, ten, right? That’s attracting your attention. So whatever else you were doing, suddenly a design choice about whether we should update the tab title with the number of unread emails is going to draw people’s attention there.

So even that, which seems like, well, that’s just a rational thing to do, of course I’m going to let the user know how many messages they have, just that tiny design choice is causing people to switch tabs and basically not get anything done all day, because they’re constantly distracted by checking the latest email. A concrete example of how Gmail, at the time with the mobile app, was trying to get more aggressive in this arms race for attention: they were competing with the regular email app on everyone’s phone, and they needed to get you to switch from the regular email app into their custom Gmail app.

And so I was in a meeting, and an engineer said to me, like, why don’t we make it buzz your phone every time you get an email, because then you’ll never miss one. We’ll turn email into text messaging, basically. And that’s how we’ll outcompete the regular mail app. And you know, it seems like such an innocuous decision.

But there I was in that moment when he said that, and in my mind I visualized these puppet strings going out into the sky, into the cloud, and coming down and interrupting a billion dinner table conversations, you know, couples who are out on a date, or with their families or something. And then their phone buzzes and it makes their attention just like 15 percent less.

They’re less available, less present for their family members. And so in a subtle way, even something as tiny as that would affect and restructure these billions of moments. But then, of course, that was way before we got to social media driving polarization, breaking down a shared reality, making democracies, you know, dysfunctional.

We can get into more of that later, but I think that’s helpful context for your listeners.

EJ: Well, and it does highlight what feels like an irreconcilable tension between profit and consumer demand, in that, you know, if you’re conditioning consumers to demand something very particular, you get into cigarette territory. You get into the territory where, if you’re intentionally taking away or eroding people’s free will and you’re profiting off of it, the question of ethics is huge.

And, you know, this is something you think about all the time, obviously. But is it possible to reconcile that? Like, is it possible to ethically design a cigarette? Or, you know, you can shift it and say, is it possible to ethically design Doritos? Well, probably. But, you know, it’s different degrees and different scales.

And we’ve seen Apple, for instance, you know, tell you how much time you spent on your phone every week, and TikTok, that’s a whole other can of worms, now allows users to track their time. Like, we have seen some efforts in this direction. They don’t seem particularly meaningful to me.

And if they’re not, I’m curious how to resolve those tensions, if that’s possible.

TH: I love your question of can you design an ethical cigarette, or can you design an ethical Ben and Jerry’s that isn’t just diabetes in a spoon? You know, many people say, well, that’s just what people want; we’re giving people what they want. But that’s according to the logic that what people point their attention at is what they want.

If you use that logic, if you put your attention on something, then I’m going to give you more of the thing you’re putting your attention on, we’re not making a distinction between what we want versus what we can’t help but look at. And I think this is really important, because there are a lot of things we can’t help but look at: conflict, the most extreme version of what every political tribe is doing, you know, and also things that we’re traumatized by.

Like, if you’re Asian-American and you have had an experience of being, I don’t know, hated or belittled in the street, some kind of violence against you, that’s a thing that you’re traumatized by. So you’re going to really look at it when it shows up in the news feed.

And then you click on one of those examples on, say, Twitter. And then Twitter’s like, oh, I’m going to show you a million more examples like that, because that’s the thing that worked for getting your attention. Now, meanwhile, you have a bunch of other people who are not clicking on that first example, and they’re not getting reinforced with a thousand other examples of anti-Asian hate.

So when they say there’s this thing happening, it’s a really big deal, everybody else is like, I don’t see that happening. I mean, I’m, you know, a white guy living in San Francisco. I don’t have that background. I’m not seeing any of those examples. So they sound kind of like they’re overreaching or crazy to me if I don’t see those same examples in my newsfeed.

But that’s true for every side. Every side has a set of things that they’re seeing, like, you know, the pro-CRT people, the anti-CRT people, the pro-Black Lives Matter people, the anti-Black Lives Matter people. Everyone is seeing a different reality, because you click on two videos and then it gives you infinite evidence that confirms what you already feel.
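To make that feedback loop concrete, here is a minimal Python sketch of an engagement-driven ranker. It is only an illustration of the dynamic described above, not any platform’s actual ranking code; the topics, weights, and doubling rule are all invented.

```python
import random

# A ranker that reinforces whatever "worked" for attention. It cannot tell
# "I chose this" apart from "I couldn't help but look at this."
class EngagementRanker:
    def __init__(self, topics):
        # Every topic starts with the same small prior score.
        self.scores = {topic: 1.0 for topic in topics}

    def pick_next(self):
        # Sample the next item proportionally to accumulated engagement.
        topics = list(self.scores)
        weights = [self.scores[t] for t in topics]
        return random.choices(topics, weights=weights, k=1)[0]

    def record_click(self, topic):
        # Each click doubles a topic's score, so early clicks compound
        # into a feed dominated by that topic.
        self.scores[topic] *= 2.0

ranker = EngagementRanker(["gardening", "sports", "outrage"])
for _ in range(5):                      # a few clicks on one charged topic...
    ranker.record_click("outrage")
feed = [ranker.pick_next() for _ in range(20)]
print(feed.count("outrage"), "of 20 items are now the clicked topic")
```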

And so I’m not trying to say, well, there actually is this one absolute reality that’s true. One of the problems of the modern world is, how big a problem is, I don’t know, systemic racism? I mean, where do we see that? How could you know? My eyeballs can’t see what 3 billion people are experiencing.

So how do I size up where that problem lives? We’re only left with media that represents it to us. And so I’m not saying what is or isn’t true. I just want to really point people’s attention at the degree to which we’re missing each other. And there’s a great group called More in Common. We actually did a podcast episode with their director, Dan Vallone, on what they call perception gaps, which is: can one political tribe estimate the beliefs of another political tribe?

So can Democrats accurately assess what Republicans believe, as measured by what percent of them say they believe it? And can Republicans accurately assess what Democrats believe? And what they found is that the more people use social media, the worse they were at estimating what other people believe, meaning the bigger the gap between what they perceive the other side as believing and what the other side actually believes.

And so I think that’s just a really important point, because no matter what you feel or believe or think is true or real, the important thing is that we have an exaggerated view of what the other side believes, the most extreme version instead of the reality.
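To make the perception-gap metric concrete, here is a minimal Python sketch of the arithmetic as described above; the numbers are invented for illustration, not More in Common’s survey data.

```python
# Perception gap: the distance between what one group estimates the other
# side believes and what that side actually reports believing.
actual_belief = {        # percent of Group B who say they hold each view
    "view_1": 30.0,
    "view_2": 55.0,
}
perceived_belief = {     # Group A's estimate of those same percentages
    "view_1": 75.0,
    "view_2": 80.0,
}

# Average absolute error of the estimates. The finding described above is
# that heavier social media use widens this number.
gap = sum(
    abs(perceived_belief[view] - actual_belief[view]) for view in actual_belief
) / len(actual_belief)
print(f"average perception gap: {gap:.1f} percentage points")  # 35.0
```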

EJ: And it’s interesting, because I have a libertarian friend who will use the example of the wristwatch. He’ll say, you know, there was a tech panic. We wouldn’t think of it as tech, but at the time it was technology, the wristwatch. And people at the time were becoming sort of obsessed with looking at their wrists and checking the time.

And people were concerned about this. And there are similar reports about the mirror. I mean, you can go back to basically any aspect of technology and find some sort of moral panic about it. But what those technologies, with maybe the exception of the printing press, didn’t do was transfer vast swaths of our personal and professional lives to those devices, or to the smartphone.

And the smartphone has the social media on it. And as somebody who’s been around for a huge part of the history, the young history, of the modern tech industry, how much of this is smartphones? How much of this is social media, and how much is the combination of both? Because there was AOL Instant Messenger, which could be very addicting, but you had to have a desktop, and you had to go to a special room in your house where the desktop was and dial up the Internet to get on AIM.

TH: I remember those days.

EJ: Right. The smartphone seems to play a big role in all of this as well.

TH: So I’m so glad you bring this up because let’s steelman instead of strawmanning the opposing view here. OK, so the opposing view is I saw The Social Dilemma. It sounds like a moral panic where we’ve always been worried about new technologies when they arrive on the scene, you know, television, radio, rock and roll. Elvis was shaking his hips.

Reefer madness, right. And there have been moral panics about all sorts of new technologies every time a new one arrives. So I want to name and acknowledge the full history of that phenomenon of culture having a moral panic about something new. But think about the difference between, let’s say, a wristwatch and social media. Take the example of how it’s affecting children. If I’m like a 15-year-old and I’m in high school, all my friends use Instagram, not just to post photos of their lives, which they do, but they also do their communication through Instagram.

Meaning, I think a lot of adults who might be listening to your podcast might not get this, that teenagers actually might use Instagram as their primary way of messaging their friends. Making plans, like, I can’t participate in my social life without using Instagram as a messaging app. But Instagram bundles the messaging with the infinite feed, with we’ve got an AI pointed at your brain stem to figure out and reverse-engineer what’s going to resonate with your nervous system.

And that’s the next video, photo, or thing we’re going to show you, based on what was most engaging for you. A wristwatch that you put on your wrist didn’t control your social life, whether you were excluded from your friends’ social group. Whereas social media and Instagram have controlled and taken over what it means to be participating in a society.

If I’m a politician, can I win an election without having an Instagram profile or a Facebook profile or a Twitter profile? Well, maybe a while ago you could do it, but now you can’t. Right? And the ones that do use these platforms are going to get an advantage. But the advantage that they get is in a bad game, because they have to basically say the most extreme things that their follower base is going to respond to.

And so they’re not actually responding to their real constituents proportionately. They’re responding to the constituents that are resonating on social media. But social media has this amplification factor where the more you say an extreme thing about a cultural fault line, the more you add inflammation to that cultural fault line, it will pay you in more likes and followers. That’s the conversation with Elon, by the way, you know, how do we fix Twitter?

It’s like, it’s not just about speech versus censorship. That’s an important conversation. But the real problem with Twitter is it is a video game that I log into every day, and I’m paid in more likes and followers. I get paid ten points if I add inflammation to a cultural fault line, versus I get paid one point

if I say, you know, here’s what I think the other side might be thinking. Actually, I get paid negative points if I say, here’s something I think the other side is thinking about that we should consider. We don’t get paid for acting in a good way. You get paid for being, you know, provocative. And people will say that’s no different than any other media.

But that changed when you got an algorithm like Twitter’s that reinforces things you’ve already clicked on, and that’s different and unprecedented in the history of media. So hopefully that speaks to some of the concerns about whether this is a new and distinct issue.

EJ: I mean, this is one of the things that I’ve found myself explaining to just slightly younger people, because the rate of change has gotten so quick that, you know, you can have a vastly different life experience just by being a little bit younger. But anywhere you go with your phone, what’s waiting for you when you do look at your phone could be the best news of your life. It could be the worst news of your life. And if you don’t look at your phone at that given moment, you might be missing it.

And that’s very different than even the beeper or the telephone. There’s just no precedent for that in human history. And I guess I’m curious, in your experience, how aware are these companies? You know, Google might say, listen, we’re working on this. We’ve hired design ethicists. We have, you know, put a portion of our profits into ethical design.

TH: I need to answer that question because you asked it earlier, actually: how does one get that job? Are there many design ethicists? Just to briefly clarify, there was no job for design ethicist at Google or Apple or Twitter or Facebook or anything like that. I didn’t apply for a job. Again, people may remember this from The Social Dilemma, which I recommend anybody who hasn’t seen it watch.

I go through the history of my own trajectory there, which is that I became concerned about this arms race for attention and then made a slide deck that I sent to just 15 friends and colleagues at Google, wanting to get feedback, saying, hey, I’m worried that this arms race for attention is going to value humans more when they’re addicted, distracted, outraged, polarized, misinformed, and narcissistic, and that that’s going to kind of break the way society works.

And then instead of it just being seen by those 15 people, they started sending it to other people. And within basically 48 hours it had spread to, you know, 40,000 people at Google. And then through the success of that presentation, and some executives seeing it, I either was going to quit, or I was going to be in a position where I could maybe research, what do we do about this problem?

And to the generosity, maybe you could say, of Google, they allowed me to research how to think about this question: how do you ethically design the environment in which applications compete for attention and news competes for attention? So I did that for a few years. But unfortunately, even though I tried very hard to change Android and Chrome and some of the core interfaces that could be changed, they would say, actually, if we do this, it will look paternalistic for Google to make these top-down changes.

And it was clear to me that we needed to create a public awareness movement, and that’s why it came out on 60 Minutes, that’s why The Social Dilemma came out. All that work coming out to the public was to create more awareness that this is a massive problem and we have to do something about it.

EJ: You know, I’ve known people who have changed their phones to grayscale, as I’m sure you have, and that might even be mentioned in The Social Dilemma. But how aware are people when they are engineering these products, when they’re coming up with the next thing? Facebook, for instance, just announced they’re doing more of what TikTok does, and instead of circulating content from people already in your network, they’re going to be circulating and pushing more content from outside your network.

How aware are people in these positions down to the colors, down to all of this in terms of designing it to capture our attention, designing these things like slot machines? In terms of the psychology, how aware are people in those positions of kind of exactly what they’re doing on the granular detail level?

TH: It’s a great question. So first of all, I think there’s a lot of justification. If you work at Facebook or Twitter or TikTok today, you point to the thousand positive examples of things that happened today in the world because of what you were doing. So on Facebook, I point to blood donors who got to meet with other blood donors for rare diseases.

I see cancer support groups. I see the people who lost their horse, and then, because someone on Facebook tagged it, they found the horse. And I see the examples of, you know, high school sweethearts reconnecting because they found each other. So I can point to a list of a thousand amazing things that happened today because of the existence of social media.

And if I’m TikTok, I’d make a similar list about all the creativity. I’d say, here are all these creators that now have an entire lifestyle and economic well-being. You know, they basically have a job where they can make money from just being creative and making videos. So I can point to all those examples of the good.

Now, you asked a different question: are they aware of the design choices that they make that are manipulative? I think this is a very interesting question. The people at the very top of the company, I think, are aware. Like, I forget his name, the guy who runs TikTok, and also Zuckerberg, I think both are very aware, because anybody who was there in the early days knows the way that Facebook got successful was by manipulating human social-psychological vulnerabilities.

The clearest example at Facebook was photo tagging. I don’t know if you were in college when that came out; I was a sophomore at Stanford when that feature launched. The Facebook was like months old. And for those who don’t remember or know this history, they created the feature where you don’t just upload a photo of some stuff that’s in your life.

And, you know, this is back when you had to do it on a desktop computer, manually, from a digital camera.

EJ: You had to plug the camera in. Yeah.

TH: It’s funny how that’s like history now, right? But they added a feature called photo tagging, so you could tag, you know, Tristan, and then in front of all of his friends he would know that he was tagged in a photo by Emily, and then he’d say, oh my God, Emily tagged me in a photo, I should, like, respond to that.

Or, I have to make sure it’s a good photo, otherwise I have to take it down. And that was the best way to just get people sucked into Facebook for hours and hours a day. Because now all these photos are linked to people’s faces and names, and you’re clicking through photos, and you’re tapping into people’s vanity and sense of social validation.

And Zuckerberg and his team knew exactly what they were doing when they designed that feature. And, you know, in different cases, when they invented the like button, they thought they were spreading love and positivity in the world, but they also knew they were creating a feedback mechanism. If you post something, and instead of getting no feedback you suddenly, within minutes or seconds, get ten likes or something like that, that would be an addictive feature, because suddenly you have this reason to post, and, you know, you can get some dopamine and some oxytocin and serotonin from just dosing people with likes. So they knew what they were doing. When TikTok invented, you know, the infinite scroll, the short-video interface that you just infinitely go through, they knew that if they remove the stopping cues, meaning they make it so that it automatically scrolls to the next video, they hide the clock, for example, they know exactly what they’re doing. Actually, a metaphor for listeners is the casino.

Many of you may know that the designers of casinos actually hide clocks, because they want you to lose track of time while you’re in the environment. They don’t want you thinking about time. I didn’t realize it until recently, but if you’re using TikTok on an iPhone that doesn’t have the notch at the top but a flat top, TikTok will hide the clock on your phone, because they don’t want you thinking about what time it is and how much time has passed since you’ve been scrolling.

These are deliberate design choices, and they do optimize down to the color, down to, you know, the animations, down to the speed and time it takes for the new video to load. This is all optimized like a machine, and the result of it is that, when we’re the product and not the customer, we are worth more, again, when we are addicted, distracted, outraged, narcissistic, validation-seeking, and polarized than we are as healthy democratic citizens or as growing children. Right?

These are the people who designed streaks on Snapchat, which show the number of days in a row that kids have sent a message to their friend, and they did it just to addict people. Because now, if you’ve been sending a message to your best friend for 30 days in a row, there’s a number 30 and a fireball next to your name, and I don’t want that number 30 to go away. Just like, if I’ve been working out at the gym for ten days in a row, it makes it harder to not go the next day. But they’re using that not to help me go to the gym more often, which I might independently choose. They’re using it to support their goal of driving you to keep engaging. And that’s the problem.

EJ: I had a 700 day snap streak once.

TH: Wow. OK, so tell me then, and I might sound like I’m spinning a conspiracy theory, but it was probably pretty hard to let go of that 700-day streak, right?

EJ: It was accidental. Yeah, it was really a traumatic moment when the streak went away, because my friend that I had it with was in a different time zone, and we just lost track and it got messed up. But I’m curious what the R&D looks like in that space. Like, is this informed by scientists? Is it A/B tested? Is it focus-grouped? I mean, how do they optimize these platforms to function in that machine-like way, optimized for this addiction and for keeping you engaged?

TH: The word is engagement. That’s the word everybody should know. What is engagement? It means seven-day actives, meaning users who have been active in the last seven days; that number should go up. The number of sessions you engage in per day, that should go up. The number of minutes per session should go up.

The number of total users should go up. The number of users who invite other users should go up. That’s all engagement. That was captured in The Social Dilemma, where you had the artificial intelligence robot guys sitting behind the control screen, basically in a control room, trying to increase, you know, time spent, advertising per minute spent, and also how many users you invite, because all of those drive up engagement.
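To make those definitions concrete, here is a minimal Python sketch that computes the three metrics just named from a toy session log; the data and layout are invented for illustration, not any platform’s real schema.

```python
from datetime import datetime, timedelta

# Toy session log of (user_id, session_start, minutes).
now = datetime(2022, 7, 1)
sessions = [
    ("alice", now - timedelta(days=1), 12),
    ("alice", now - timedelta(days=1, hours=5), 8),
    ("bob",   now - timedelta(days=3), 25),
    ("carol", now - timedelta(days=10), 40),   # outside the 7-day window
]

# Keep only sessions inside the last seven days.
window = [s for s in sessions if now - s[1] <= timedelta(days=7)]

seven_day_actives = len({user for user, _, _ in window})         # distinct users
sessions_per_day = len(window) / 7                               # session frequency
minutes_per_session = sum(m for _, _, m in window) / len(window)

print(seven_day_actives, round(sessions_per_day, 2), round(minutes_per_session, 1))
# 2 active users, 0.43 sessions/day, 15.0 minutes/session
```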

So at this point, and even back then, it was pretty well known how to manipulate people’s social psychology. And again, I mentioned there is a field called persuasive technology, and I want to make sure I’m very clear: B.J. Fogg, the professor who created this field and wrote the book on persuasive technology, whom I studied with, warned the FTC, I think in the late nineties, about the ethics of persuasive technology and the need to regulate the space, because it was going to turn into a huge problem.

And he’s not the evil guy with the mustache, you know, twirling it and saying, how can I ruin the world? He really was trying to use persuasive technology for good. He actually created a whole initiative: how do you use persuasive technology to create more peace in the world? So he did a lot of things there. But there’s a whole space in which there’s this increasing encyclopedia of techniques for how to manipulate people.

And again, I was a magician as a kid, so I knew that there are a hundred techniques to manipulate people’s sense of cause and effect, their attention, their reasoning, their memory of what happened. There are a thousand techniques, and most people don’t want to admit this is true, but it is totally true. It’s just built into how our minds work. And behavioral economics kind of reinforced that.

And there are a bunch of conferences in Silicon Valley, and books. There’s a guy named Nir Eyal who had a conference, called Hooked, and I spoke at it once. And I remember distinctly he had the head of growth at Pinterest there giving a talk, and someone asked, now, what would happen if a user gets burnt out and they stop using the service? Like a drug lord:

it’s like, you stopped using, so what do they do then? He said, oh, that’s no problem. You just do the following ten techniques and it gets them using it again, right back. And that’s what people should know: there is an entire discipline about how to influence people’s psychology, and it wasn’t done because they wanted to wreck the world.

It’s that they wanted to be successful in their business. But the underlying terrain on top of which that success was built was the degraded soil out of which our society grows. Right? It’s like, you’re tilling the soil for, you know, children’s development, how their minds work, and instead of saying, how do we cultivate that soil

so it grows the best children, and, you know, maturity and healthy development in the world, they’re just saying, how do we keep kids addicted to Snapchat? How do we keep kids addicted to TikTok? And it works really, really well. And I recommend for any parents out there, you have to know, this is not built for the health of your kids, and simply limiting the time that they spend is like saying, well, I’ll just limit the amount of heroin that I get every day to a few milligrams.

So let’s just not give the kids heroin, and let’s remove their social communication from an environment which is dominated by that kind of heroin. Like, how do we get communication out of that space? There’s a great organization called Fairplay, which used to be called the Campaign for a Commercial-Free Childhood, that has been working on that for a long time.

And we’re all allies trying to liberate, essentially, humanity from the shackles of this manipulative environment.

EJ: Yeah, and you know, it’s your personal life and your politics. It’s everything; the heroin is informing every conversation that we have. And The Social Dilemma came out, I think, right on the cusp of TikTok becoming really big. I know you’ve been doing some work in the national security space too, which is a huge conversation, obviously, when it comes to TikTok, which is now, I think, the number one website in America. You just sort of talked about it a little bit, but what about TikTok stands out to you in this context of the attention economy?

TH: OK, I’m so glad you asked this question, and I think it’s just the most critical thing, also from a national security perspective. If I could get the U.S. national security community to ban TikTok right now, I would. And we’ll get into that. Let’s quickly talk about how TikTok, as you mentioned, recently overtook YouTube and Facebook and Instagram in, I think, engagement time.

I can’t remember whether that’s globally or in certain markets, but they have basically eaten the lunch of all the other social media companies. In fact, I was at an event a couple of months ago and a woman came up to me. She said, oh, I saw The Social Dilemma. Thank you so much, I stopped using all social media. Now I just use TikTok. And I was really saddened by that, because the real message of the film is not that you should be angry at one or two of these apps. It’s that the race to the bottom of the brain stem to get attention, that thing, is going to hurt humanity. And the way TikTok beat Facebook and YouTube in the attention wars, in the race to the bottom of the brain stem, is they optimized for one thing: you’re watching a video, and I’m going to point the biggest supercomputer in the world, sitting in Beijing, at your brain stem, and your kid’s brain stem, and your uncle’s, and I’m going to calculate the perfect next video to show you. And it will know you better than you know yourself. It will know what will keep you there better than your prefrontal cortex, the part of your brain that basically has your willpower and says, oh, I have to do something else. It will calculate how to defeat your prefrontal cortex.

People should be very concerned because the Chinese Communist Party has influence over ByteDance, which is the company that owns TikTok.

TikTok has basically become the number one social media app around the world. Would you have allowed the Soviet Union during the Cold War to basically control television programming for all Western democracies and pick what they see and don’t see? I mean, it is absolutely insane that we’ve allowed this to happen.

And by the way, for those who think that I’m being, you know, a conspiracy theorist or xenophobic: China does not allow any U.S. social media platforms in their country. They don’t have Facebook or Twitter, because they think the U.S. would be running an influence operation on their citizens. Given the fact that that’s their worldview, why would we assume anything less than that they would intend to use this platform for psychological influence?

And most people, when they think about TikTok, they’ve heard in the news about the U.S. government’s concerns about data, that TikTok will basically siphon up all this data on Americans. That’s true. But I’m even more worried about them being able to remotely influence, basically, the next president of the United States. They can say, yeah, we want it to be this person.

They can look at all the voting districts in the swing states, and they can say, per zip code, I mean, I don’t need Frank Luntz, the political pollster, because I can basically look at people’s sentiments with an AI that calculates what people’s opinions are in all the key voting areas. And then I can strategically up-regulate everybody who starts to say, you know, China’s really not so bad;

we should look more at what they’re doing. If China were to invade Taiwan tomorrow, they could up-regulate all the American voices who are saying that Taiwan was always a part of China. And I was actually talking to a U.S. senator who asked, who do you think the Chinese government considers the biggest rival to its power? And of course, you would think it’s, like, the United States, but they said, no, no, no.

They consider their own technology companies to be the biggest rival to the CCP. And for those who don’t know, China actually massively regulates its own technology companies. So if you use TikTok in China, when you scroll, you don’t get influencer videos. They actually feature science experiments you can do at home, educational content, patriotism videos. They have closing hours and opening hours.

It actually goes lights-out at 10 p.m. They limit you to 40 minutes a day, because they want their kids to be scientists, astronauts, and engineers, and they want their military to be successful in the future. And yet they’re happy to ship the version of TikTok that’s essentially opium for the masses to the entire West. I think it’s like a reversal of, you know, what they call the century of humiliation, which is when their society fell behind because of opium. This is kind of a reversal of that. And I don’t think it’s because of one diabolical plan. I think there’s this Chinese company that was just super successful at playing this game called the race to the bottom of the brain stem.

And now the West has to respond.

EJ: Unbelievable. And there was a Sara Fischer report in Axios just yesterday about how Oracle is now auditing TikTok data, and that was on two fronts: privacy and algorithmic control. And, you know, it’s sort of like, look, Oracle’s doing this, and they’re going to, you know, disincentivize any sort of nefarious behavior on behalf of the CCP in the United States.

But it’s also incredibly hard to trust that process. So what do you think? Is that a solution going forward for TikTok? Is that really a meaningful step in a better direction?

TH: Well, I think on the Oracle side, that’s about data. So where is the data going? Where is it getting stored? Who has access to it? Oracle could provide somewhat more trust in that area. But how would we know that TikTok is not manipulating which content people are seeing in all the swing states? TikTok, for example, does not release to researchers what’s called an API, where researchers can basically look at the full firehose of all the videos. Twitter, for example, does offer that firehose, where researchers can actually, you know, do computation and analytics and math on what is actually moving through Twitter, what is getting amplified, what the networks are, where themes are traveling. TikTok doesn’t offer that.

And they could manipulate it, and they’ve already done this, by the way. There’s been research by the Tech Observatory at UC Berkeley that has been studying, especially, how TikTok was influenced during this war in Ukraine. At the beginning of the war, all the videos that were posted on TikTok, most of them from around the world, were against Putin. Then Russia passed what they call the, quote, fake news law, and because they have a lot of citizens who use TikTok, I was told just this morning that Russia is the fifth-biggest market for TikTok in the world, so, you know, this is a big market, and because they complied, TikTok basically no longer pushes any pro-Ukraine content or anti-Russia content. And all the videos that you see are, you know, why Putin’s great and why the war is justified and everything else. And again, that’s slightly more visible. But they never announced that they were making that change.

So with TikTok, we actually know that they have the controls to flip a switch, and in certain countries they can basically make people feel one way or the other, and they can do that per country.

If I were China and the CCP, I would take TikTok and remotely influence, you know, all of Africa. So along with the debt traps that I’ve laid for all the African nations, building airports and then putting them in debt to have influence over them in the future, I would also use TikTok to make people in Africa think, hey, look, this is just Putin actually fighting back against NATO.

This is totally justified. And I could basically control and gerrymander the future votes of other countries and use my soft power to coordinate how the world works. So we’ve gone all the way from email addiction and the pull-to-refresh slot machine all the way up to a geopolitical power game. But I really, really want people to get just how critical this is.

And I think it’s something that everyone can be aligned on, it’s not a partisan issue.

EJ: No, it’s not at all. And you’ve been having these conversations increasingly, I know, with people in politics, with people in business. And, you know, you just mentioned having this conversation with a U.S. senator. Are you finding that some of this is completely uncharted territory, to the point where people haven’t put two and two together, because it’s happening so quickly?

And it’s so new that when you do explain it to them, do they get the gravity of it? Or is it still just sort of hazy?

TH: So the meta problem, as it were, in media, I don’t mean Facebook Meta, I mean the problem above the problems: to answer your question, we go back to E.O. Wilson, which is that the fundamental problem of humanity is we have Paleolithic brains and emotions, medieval institutions, and godlike accelerating technology. And it’s so important to come back to that, because, you know, it is always the case that technology moves faster than institutions.

You know, we get cars before we get seatbelts and traffic lights; we get railroads before we get the regulation of railroads; we get dynamite before we realize, hey, we should actually not sell dynamite in every, you know, CVS on the corner, which, by the way, you used to be able to do. So we don’t usually realize that we’ve created a godlike technology until afterwards.

And it needs some kind of guardrails, and guardrails are what we’re looking for here, not a ministry of truth policing speech. We’re looking for, how do we set up the incentives in some safe way for this to work? But to answer your question about talking to U.S. senators or national security leaders, you’re sort of asking, hey, do they put two and two together?

I think slowly some are, but you have to paint the whole picture. Because if I’m a general who went to West Point and trained in all this military theory, and, you know, I know about the positions of the whole U.S. Navy and where the nukes are, and I know how to think about the power calculus,

and I worked at the Office of Net Assessment at the Pentagon and all that stuff, why would I know what my 14-year-old daughter is doing on TikTok, or that China might be influencing certain zip codes? Like, how would that reach my dashboard? And there are all sorts of cases, even after 9/11, where there’s an institutional gap: we have some institutions, we have a Space Force and we have a Navy and we have an Air Force, but we don’t have a metaverse force.

We don’t have a social media defense force. And I do think that population-centric psychological warfare is one of the new modes of warfare, along with cyber warfare. I don’t have to have F-35s to match you in kinetic military capacity if I can do a cyber attack and, you know, hack the Colonial Pipeline, which, it was later released, was apparently two days away from having major cascading effects that could have shut down the U.S. economy, if it had just been two days longer than it was. And a cyber weapon costs a lot less than an F-35. And a social media weapon makes your population angry at each other, so that all the energy goes into … because everyone’s just fighting with each other all the time, and meanwhile I’m just playing chess and beating you by routing around all your decisions. That’s the thing I want to shake people out of. It’s like, the U.S. has to reboot its society a bit into a 21st-century society that also has a positive view of how tech can be embraced.

I think the last thing is, you know, China is consciously employing the full suite of 21st-century technologies, from big data to AI surveillance to quantum computing, to make a stronger form of authoritarian society, a society that we wouldn’t want with Western values. They’re consciously employing tech plus authoritarianism equals stronger, better authoritarianism. Meanwhile, democracies are not consciously employing the full suite of tech to make stronger democracies.

Instead, we have allowed private tech companies to profit from degrading democracies into not working as well as they need to. So we can’t just be satisfied with slightly less toxic social media. I want to move to a world where tech plus democracy equals stronger democracy that has an embrace of tech, not an opposition to tech. And that’s a nuance people might not expect from me because they think Social Dilemma is kind of a Luddite film or something.

EJ: Well, the Gmail example that you gave earlier is really sticking with me, because, speaking of Meta, when you described looking around a cafe and seeing how more than half of the people were on Gmail, people keep it open all day, and in some sense it is this very physical workspace. That’s become even more true after the pandemic, and going forward even more so. And I have an Oculus, and I see the programs they have for people to work virtually, to work in the metaverse. They’re kind of crude right now. But we know that one of the most powerful companies in the world, Meta, is investing really heavily in making those spaces, well, addictive, in the same way that Gmail was being made addictive.

But despite the fact that this was on everybody’s laptops at the cafe, that company didn’t have a team working on the ethics of it, period. Even though it was consuming so much of the world’s time, it was being controlled by a fairly small team, you know, compared to the rest of the world, and in Northern California. So I’m wondering what you think is at stake in the metaverse.

I remember Sheryl Sandberg during the pandemic basically bragging about talking to faith leaders, about getting their churches into the metaverse. And that might sound great, in the same way that a lot of other steps in tech sounded really great. But what ends up happening is you have control of your worship spaces, control of your political spaces, in the hands of one company.

So what worries you? Or maybe what gives you optimism about the metaverse going forward?

TH: Well, I think we’ve already seen how positively framed mission statements, like we’re going to connect the whole world together and bring the world closer together, which was the new mission statement for Facebook, or, you know, we’re going to host the public square, for Twitter, each of those positive mission statements and narratives was not what the underlying machines they built were designed to optimize for.

Like Facebook was not designed to make the world more open and connected. It was designed for engagement. What can I show you? What can I get you invited to? What can I tag you with that keeps you coming back to this system over and over again? That’s what they’re designing for. And there’s a gap between the mission statement and what the machine is optimized for.

Same thing with Twitter. It is also a place where people can publicly post their thoughts to the whole world. But the actual machine, like, if you talked to the engineers, how are they incentivized? What, literally, are their performance bonuses? So if I’m an employee at Twitter and I work on the newsfeed, how do I, like, make a bonus?

Well, I showed that engagement went up by 10 percent while I was working on this feature, and that’s what gets me my bonus next year. And so long as that’s how the incentives are structured, we’re going to see those results. So when you ask about VR and the metaverse, we’re going to hear the same thing. We’re going to hear the positive narrative story: we’ll be able to connect the world and connect church groups together, and people are going to spend time together.
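To make that incentive concrete, here is a toy Python sketch of the kind of A/B readout an engagement bonus might hinge on; the numbers are invented and purely illustrative, not any company’s actual metrics.

```python
# Did the new feature lift minutes per user versus control?
control_minutes   = [31, 28, 35, 30, 29, 33]   # per-user minutes, old feed
treatment_minutes = [34, 33, 37, 35, 31, 36]   # per-user minutes, new feature

mean_control = sum(control_minutes) / len(control_minutes)
mean_treatment = sum(treatment_minutes) / len(treatment_minutes)
uplift_pct = (mean_treatment - mean_control) / mean_control * 100

# A team paid on engagement ships whatever maximizes this number, regardless
# of whether the extra minutes were good for the user.
print(f"engagement uplift: {uplift_pct:+.1f}%")  # about +10.8%
```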

All those things could happen, and many of them will be positive. But that’s not the machine they have built. The 10,000 engineers they want to hire to build VR and the metaverse are not going to be asked, how do we, you know, liberate the best of humanity? They’re going to be tasked with, how do we make our VR platform the most engaging and addictive it can be, so that it outcompetes the other VR platforms?

So it’s another race to the bottom of the brain stem. Now, to leave people with some hope, you know, in the social media world, before we get to VR, in the regular, today’s world: Apple is one of the interesting actors that I think needs to have far more pressure put upon them, because they often get away with, they’re definitely not one of the bad guys, and that makes them a good guy, because they’re not a social media company and they don’t optimize for engagement.

And their business model is selling you a new phone every two years. And I also want to applaud the things that they have done: the privacy tracking features, the Screen Time features, the Do Not Disturb features. You know, these are all in the direction of more humane technology. And when Tim Cook announced, by the way, that they were doing these privacy tracking features, he said out loud, we cannot allow a social dilemma to become a social catastrophe.

So they’re on board. But Apple could do so much more to change, if you will, almost the digital constitution for how all apps can and cannot behave. If we want a Geneva Convention in this arms race for attention, Apple could implement it, because all the apps have to live in their App Store. And, you know, people would say, well, then Apple has too much power.

We shouldn’t have a private entity with that much power, which is exactly true. You know, which is the point you were also making: is the VR world going to get owned by one-trillion, two-trillion-dollar tech companies? That’s not a good environment. It needs to be in the democratic interest, so we have to figure out some different governing structure. But I do think that, short term, if you wanted to change all the perverse stuff that we’ve laid out in the last hour, Apple could, within less than a year from now, ship on the next iPhone a different set of rules that says you can’t do auto-playing videos for underage kids, no endless feeds for underage kids, and suddenly you’ve solved the TikTok problem. And they’re already making changes that are in this direction. We just need to demand so much more.

EJ: Yeah. I mean, it’s amazing. That’s kind of my next question: I was going to ask you about the Farhad Manjoo column “I Was Wrong About Facebook,” in that long series of mea culpas that The New York Times tasked its editorial staff to do.

TH: To be honest, I actually haven’t seen those. Could you say more about them?

EJ: Yeah, no, absolutely. So they all had this challenge where they issued a mea culpa, something they were wrong about. Some of them were, you know, very political; some of them were more cultural. And Farhad’s was “I Was Wrong About Facebook.” And he wrote, “The site has crossed a threshold. It is now so widely trafficked that it’s fast becoming a routine aid to social interaction, like email and antiperspirant.”

And it’s sort of a clever way of putting it, but a very true one. And the big question here is whether there’s really a way to put the toothpaste back in the tube. There are some even bigger questions about what it means that people on one side of the world, I mean, I think about TV: Walter Cronkite broadcasting from Vietnam.

Betty White was on an experimental broadcast of TV, and she passed away in the age of TikTok. The stuff is just moving so quickly and connecting people from different cultures so quickly, in ways that were never envisioned, and we’re clearly not keeping up with the rate of change. So is there a way to sort of put the toothpaste back in the tube and have a healthier attention economy? Or should we be more defeatist and practice some sort of Benedict option, to just get out while we can?

TH: Yeah, well, I think it’s important to note, and again, I don’t want to say let’s copy China, but notice it’s almost like they saw The Social Dilemma and enacted a whole horizontal, you know, slate of reforms right there. They restrict 14-year-olds, they limit, you know, time, they do opening hours and closing hours so that no one has to use it after 10 p.m., because there’s no social pressure, your friends are not continuing to tag you and stuff at 11 p.m. and midnight and one in the morning so that now you’re not getting sleep. Like, they’re enacting a whole suite of reforms.

I think that tech plus democracy can equal stronger democracy. I think the question is, what is adequate to accomplish that goal? Is privacy legislation that just limits the amount of data they can collect going to get us to a world where the personalized, engagement-driven newsfeeds and the videos that auto-play 3 seconds later are not going to continue to ruin your kids’ attention and mental health? Privacy doesn’t get you all the way there. If you have antitrust and you just break them up, now instead of two or three companies pursuing engagement you have 100 companies that are even more aggressive in the race to the bottom of the brain stem. That also doesn’t get you to a world where they’re not racing for that engagement.

So we have to change the underlying North Star from engagement to something that actually has the public interest at heart. You can’t be affecting children without caring that children and families get better through the technology, not worse. Tech plus attention should equal better attention. Tech plus family should equal stronger families. Tech plus kids should equal healthier, better-educated kids.

Now, that might sound uncomfortable to some people, but when you realize just how central it is, it’s like roads plus cities should equal better transportation for the whole city. The city should be humming along as a better, healthier, more functional city with higher bandwidth and more throughput. And what we need is for social media to be like that.

But for the way our society works at a more comprehensive level, and like I said, you’re kind of asking how to put the toothpaste back in the tube, we might need to kill a few cancer cells, because there are some bad cancer cells out there. And I want to name that if a cancer cell has a social responsibility council or an advisory board or something like that, it’s still a cancer cell. If a cancer cell has a research department that puts out transparency reports about how it’s affecting various organs, it’s still a cancer cell while it’s giving you that transparency. We have to change the DNA of Facebook, Twitter, and TikTok from the cancer cells that they are for our society into something that’s actually healthy.

Now, we can do that through regulation, actually forcing and deepening a change that comprehensively hits the way the employees and the incentives are structured. And we also probably need some better, newer platforms that are optimized from the ground up to strengthen the way society works.

I know that might sound bold and aggressive, but I also think that’s the scale of the 21st-century digital democracy we should be living in, which is the only way we’re going to compete with China anyway.

EJ: No, absolutely. I mean, on the right, people are constantly pitching their new social networks. I remember talking to the CEO of Parler a couple of years ago and saying: the speech stuff, great, but what are you going to do differently than Twitter in terms of addictiveness? Because if you can’t answer that question, I’m not interested in this product as a solution to these problems.

So let’s end on that. Tristan, is there reason for optimism? Is the mood in Silicon Valley one of innovation right now? Are people trying to come up with these ideas and these products, or is there still a brick wall? Or is it maybe a combination of all of those things?

TH: So, I’ve been working on this for close to ten years now, and there’s been a lot of patience. I’m sure a lot of people might be feeling pretty daunted by what we’ve laid out, if it is so comprehensively toxic to many of the core support structures of the way our society works.

And I’m sure it’s a hard thing to look at every day. But back in 2013, if you had told me that Facebook’s stock price would be cut in half, which it recently was through the Apple privacy changes, or that the CEO of Apple, Tim Cook, would be saying we can’t allow a social dilemma to become a social catastrophe, I wouldn’t have believed you.

Just like the attorneys general did for Big Tobacco, with a lawsuit that increased the costs and changed tobacco from being a social norm to being a defunct industry, there is now an emerging lawsuit over consumer harms to kids. We have employees and whistleblowers coming out, giving transparency to some of the externalities that have been hidden for a long time. We have Apple making changes in the right, positive direction. We have culture and regulators starting to catch on.

We have the EU passing the Digital Services Act and the Digital Markets Act. So we’re seeing way more movement than we’ve ever seen in the ten years that I, at least, have been working on this. And I want people to feel the momentum, because it is picking up speed. Do I think that Facebook or Twitter or TikTok have taken in this critique and said, you know what, you’re right, we have to totally transform everything we’re doing? No, and I also don’t expect them to.

There’s a great line by Upton Sinclair, the American author, to the effect that you cannot get someone to question something their salary depends on them not seeing. And I think that Zuckerberg, just like the head of TikTok and just like anyone at these companies, has built his identity and his wealth around the success of something that does do some benefit in the world.

But the optimization for engagement produces externalities and costs that live on the balance sheet of society, costs that break that society. And we can no longer afford those things. So whether they recognize it or not, I don’t want to wait for their consciousness to emerge. I want to make sure we get there through all the means possible: through national security, through litigation, through employees coming out, through Apple making the relevant changes, through creating something like a Geneva Convention for this space.

There are ways this could change within a year from now in a pretty dramatic way. But we would need all those sectors acting at the level that’s required. We would need national security saying, hey, this is going to be a tier one priority in the national security strategy.

You know, you could have these kinds of changes happen really quickly, but we need a focused effort from all these sectors acting at once.

EJ: Move fast and put things back together. Maybe that’s the new motto going forward.

TH: You said it, not me.

EJ: Well, Tristan, thank you so much for your time and your insights and all of the work that you do in this space. I think it is some of the most important work in American politics and culture and honestly in the world right now. So thank you for joining Federalist Radio Hour.

TH: So good to be with you. I really hope everyone takes this to heart, and wherever you are listening to this, in whatever ways you can contribute to this whole system moving, please do. Because I think a lot of people look at this and assume it’s somehow going to change on its own, and we just do need everybody talking about it and making it a central issue.

We found, even talking to the national security community, that it was their children, who were affected by it, saw The Social Dilemma, and recommended it to their parents in top positions, that led to some of the change. So change can happen from a lot of places. Just keep that in mind.

EJ: Yeah, I’ve heard that, too, actually. I’ve heard similar stories about The Social Dilemma’s impact on people. And, you know, on an optimistic note, Sinclair, whom you just cited, was also reacting to technology, and technology that we have discovered better ways to deal with in all of the years since then. And parents, by the way, should go to the Center for Humane Technology’s website and look at their Ledger of Harms.

If there are any members of Congress listening, have your staff go to the Ledger of Harms. It’s incredibly useful. You and your team do a great job with that, Tristan.

TH: Thank you so much. Yeah, we try to put out as many resources as we can. We also have a Foundations of Humane Technology course for technologists, and we have people from Apple, Facebook, Google, the United Nations, all sorts of places taking it. That’s also to try to train technologists to think about these problems in a different way.

And that’s also helpful for regulators and other folks working on these issues. So we’re all doing the best we can.

EJ: Yeah, so if folks are interested, they can check out the Center for Humane Technology and listen to the Your Undivided Attention podcast that Tristan co-hosts. You have been listening to another edition of The Federalist Radio Hour. I’m Emily Jashinsky, culture editor here at The Federalist. We’ll be back soon with more. Until then, be lovers of freedom and anxious for the fray.


Emily Jashinsky is culture editor at The Federalist and host of Federalist Radio Hour. She previously covered politics as a commentary writer for the Washington Examiner. Prior to joining the Examiner, Emily was the spokeswoman for Young America’s Foundation. She’s interviewed leading politicians and entertainers and appeared regularly as a guest on major television news programs, including “Fox News Sunday,” “Media Buzz,” and “The McLaughlin Group.” Her work has been featured in the Wall Street Journal, the New York Post, Real Clear Politics, and more. Emily also serves as director of the National Journalism Center, host of The Hill’s weekly show “Rising Fridays,” and a visiting fellow at Independent Women’s Forum. Originally from Wisconsin, she is a graduate of George Washington University.

It’s Time To Treat Social Media As A Public Health Emergency

The following is a rush transcript of my recent interview with Tristan Harris, co-founder and president of the Center for Humane Technology and co-host of the “Your Undivided Attention” podcast. Tristan, a former design ethicist at Google whom you may remember from Netflix’s “The Social Dilemma,” joined Federalist Radio Hour on Thursday to discuss the urgent threats posed by social media.

We discussed everything from TikTok and national security to the metaverse to the psychology of social networks and their incentive structures. Our personal, professional, and political lives are now largely filtered through these poisonous platforms, the largest of which is owned by a company that operates out of a hostile foreign country.

We are in the midst of a tech crisis and a public health emergency. Tristan’s insights make that abundantly clear. It’s time for consumers, lawmakers, media, and business leaders to start recognizing that.

You can read the rush transcript and listen to the full episode below.

Emily Jashinsky: We’re back with another edition of The Federalist Radio Hour. I’m Emily Jashinsky, culture editor here at The Federalist. As always, you can email the show at radio at the Federalist dot com, follow us on Twitter at FDRLST. Make sure to subscribe wherever you download your podcasts as well. Today, I’m joined by Tristan Harris. He’s the co-founder and president of the Center for Humane Technology, which is catalyzing a comprehensive shift toward humane technology that operates for the common good.

He’s also the co-host of Your Undivided Attention, which consistently ranks among the top ten technology podcasts on Apple Podcasts and explores how social media’s race for attention is destabilizing society and the vital insights we need to envision solutions. Tristan, you probably remember from “The Social Dilemma”, which we talked about here, and we talk about the Center for Humane Tech, I cite it all the time here. So our listeners are certainly familiar with that. Formerly a design ethicist at Google. Tristan, thank you so much for joining Federalist Radio Hour.

Tristan Harris: My pleasure to be here, Emily.

EJ: So many, many, many people have seen The Social Dilemma. Just incredibly successful documentary that aired on Netflix and it covers your background pretty extensively. But for anybody who may have missed that, could you just brief us on your career and how you ended up at the Center for Humane Technology?

TH: Yeah, so I let’s see, I grew up in the Bay Area of California, and for the beginning of my life and career, I used to be very optimistic on technology. I actually thought I wanted to work at Apple and be part of the next Macintosh team. And actually since I was about 11 years old, I thought I wanted to do that.

So I was really early on what the positive and optimistic and inspiring cases that technology can really, you know, be in our lives. And I was really affected by, I think, the people and the culture that built the Macintosh. And so I knew I wanted to go into technology early on. I went to Stanford, got a degree in computer science and psychology not a degree in psychology, but kind of studied at the intersection of psychology, persuasion, sociology.

What influences human beings outside their awareness? I was a magician as a kid, and I took a class that was connected to this lab at Stanford called the Persuasive Technology Lab with Professor B.J. Fogg. And it was really the combination with the background with magic that I got interested in, how is technology restructuring our attention, our feelings, our thoughts, the choices that we make?

And that might sound like a conspiracy theory to many people, hopefully less so now, now that The Social Dilemma has kind of come out on Netflix and made that clear. But that was really where my attention went is how is technology shaping society, not just shaping in like a minor way, like rearranging on a restaurant menu, which item comes first which might affect your choices by like 2% or 5%, but really reshaping the entire basis of how 3 billion people make sense of the world. You know, so anyway, we can get into all that. But that’s kind of a bit of my background. And today we have let’s see, I could quickly summarize it to the best of my bio that might be relevant for folks. 

I was a tech entrepreneur. I started a small company called Aperture. It was acquired by Google. I landed at Google and then became concerned about the issues of the attention economy in this race for attention and how I thought that was going to create perverse side effects starting in early on in 2012, 2013. And so I then became a design ethicist at Google studying how do you ethically be in the position of responsibility for 2 billion people’s attention starting with things like Gmail and notifications and then also going all the way up to structuring the competition of different social media platforms for our attention which would then lead to news and sense making in the way that newsfeeds rank things. So all of that then led us, led me to leave Google because I was not able to get anything done there and started this organization nonprofit called the Center for Human Technology, which is how we actuate the change in the world that we’re trying to see. 

EJ: Google is such a good example because if I’m remembering correctly, in The Social Dilemma, you talk about Gmail, and Gmail is not Facebook, it’s not Twitter. It’s not Instagram or TikTok. It’s not something people often think about in terms of this context of the attention economy. It’s, you know, just the sort of blunt device that just delivers your mail in the same way that the USPS does.

But when you’re acting as a design ethicist at Google and you’re looking at Gmail, you’re basically trying to capture as much of any human being’s attention as you possibly can. Can you explain a little bit about the work of a design ethicist? And another question I have is how common those positions are in Silicon Valley, how they’ve changed or shifted or proliferated even over the last decade?

TH: Yeah, my other position name was head of the Ministry of Truth. No, I’m kidding. It’s funny actually watching that conversation go that direction, actually, because people think, OK, you know, Google is making there’s— this person called Tristan who’s thinking about the ethics of design choices, and they’re deciding what I’m going to get and what I’m going to see.

And that’s kind of alarming to people like who are they to choose how my life is structured or designed. But as you said, with Gmail, it’s more subtle, right? Because you think, Gmail, you think it looks like a neutral platform like I only get an email if someone sent me an email. But if you think about it, I mean, what drove me interested in into Gmail first?

I was addicted to email. I don’t know. Many of us, some people laughed at that in the film, actually. But, you know, you refresh your email, like 4 seconds later,I might pull to refresh again. I think it’s really important for listeners, just think about that. There’s a million behaviors we do on our smartphone every day that make no rational sense.

You’ll refresh to see how many likes you got or what new email you got, like a slot machine and then you do something else for about 30 seconds and you go right back to refreshing again. And notice that makes no rational sense. And I watch myself doing this, and I knew from the Persuasive Technology Class at Stanford in which we read a book called Don’t Shoot the Dog about clicker training for dogs, and also the way that Las Vegas slot machines work and I just noticed that this was what was going on for me, and I knew exactly how this worked.

I literally read the books on it. I studied with the people who built it, and I was addicted to my email. Now, an example like you’re saying, OK, so how does Gmail actually make design choices that affect what we see? Well, I’ll give you a real example. It was 20, I think it was 2013, And I was with the handful of people designing Gmail, which has a billion users at the time, and new cafe. I would go into anywhere around the world, have the laptops open, had Gmail open, right. Like it was where people live, not just an app to use. It’s a digital habitat. People do their work there and they leave it open all day.

And so, for example, if when a new email comes in, in your browser tab, it adds the parentheses, you know, the number goes four, five, six, ten, right? That’s attracting your attention. So whatever else you were doing, suddenly a design choice about should we update the title in the tab that says the number of unread emails that’s going to draw people’s attention there.

So even that which seems this like, well, that’s just a rational thing to do. Of course, I’m going to let the user know how many messages they have, but just that tiny design choice is causing people to switch tabs and basically not get anything done all day because they’re constantly distracted by checking the latest email. A concrete example of how Gmail at the time at the mobile app was trying to get more aggressive in this arms race for attention was they were competing with the regular email app on everyone’s phone and they needed to get you to switch from the regular email app into their custom Gmail app.

And so they, and I was in a meeting and an engineer said to me, like, Why don’t we make it buzz your phone every time you get an email because then you’ll never miss. We’ll turn email into text messaging basically. And that’s how we’ll outcompete the regular mail app. And you know, it seems like such an innocuous decision.

But there I was in that moment, he said that in my mind, visualized like these puppet strings going out into the sky, into the cloud and coming down and interrupting a billion dinner table conversations, you know, couples who are out on a date and, you know, or with, you know, with their families or something. And then their phone buzzes and it makes their attention just like 15% less.

They’re less available, less present for their family members. And so in a subtle way, even something as tiny as that would affect and restructure kind of these these billions of moments. But then, of course, that was way before we got to social media driving polarization, making us, you know, breaking down a shared reality, making democracies you know, dysfunctional.

We can get into more of that later, but I think that’s helpful context for your listeners.

EJ: Well, and it does highlight the irreconcilable, what feels like an irreconcilable tension between profit and consumer demand in that, you know, if you’re conditioning consumers to demand something very particular, you get into cigarette territory, you get into the territory of, you know, if you’re taking away or you’re eroding people’s free will intentionally, you want you’re profiting off of the question of ethics is is huge.

And, you know, this is something you think about all the time, obviously. But how does that is it possible to to reconcile that like is it possible to ethically design a cigarette? Is it possible? Great question. You know, or, you know, you can shift it. You say is it possible to ethically design Doritos? Well, probably. But, you know, it’s different degrees in different scales.

And we’ve seen Apple, for instance, you know, they tell you how much time you spent on your phone every week and TikTok, that’s a whole other can of worms, allows users now to track their time. Like we have seen some efforts in this direction. They don’t seem particularly meaningful to me.

And I’m curious, they’re not how to resolve those tensions if it’s possible.

TH: I love your question of can you design an ethical cigarette or can you design an ethical Ben and Jerry’s that isn’t just diabetes in a spoon? You know, I mean, it’s and many people say, well, that’s just what people want. We’re giving people what they want. But I think and so according to the logic of what people point their attention at is what they want.

If you use that logic. So if you put your attention on something, then I’m going to give you more of the thing you’re putting your attention on. We’re not making a distinction between what we want versus what we can’t help but look at. And I think this is really important because there’s a lot of things we can’t help but look at conflict, the most extreme version of what every political tribe is doing, you know, and also things that we’re traumatized by.

Like if you are a person who, if you’re Asian-American and you have had an experience of being, I don’t know, hate or belittled in the street or something, some kind of violence or something against you, that’s like a thing that you’re traumatized by. So you’re going to really look at it when it shows up in the news feed.

And then you click on one of those examples on, say, Twitter. And then Twitter’s like, oh, I’m going to show you a million more examples like that, because that’s the thing that worked for getting your attention. Now, meanwhile, you have a bunch of other people who are not clicking on that first example, and then they’re not getting reinforced with a thousand other examples of Asian-American hate.

So when they say there’s this thing that’s happening, it’s a really big deal and everybody is like I don’t see that happening. I mean, I’m, you know, a guy living in San Francisco, white. I don’t have that background. I’m not seeing any of those examples. So they sound kind of like they’re overreaching or crazy to me if I don’t see those same examples in my newsfeed.

So but that’s true for every side. Every side has a set of things that they’re seeing, like the you know, the pro CRT people, the anti people, the pro Black Lives Matter people, the anti-Black Lives Matter people. Everyone is seeing a different reality because you click on two videos and then it gives you infinite evidence that confirms what you already feel.

And so I’m not trying to say, well, there actually is this one absolute reality. That’s true. One of the problems of the modern world is like how big a problem is I don’t know, systemic racism. I mean, where do where do we where do we see that? How could you know, my eyeballs can’t see what 3 billion people are experiencing.

So how do I size up where that problem lives? And we’re only left to media that represents that to us. And so I’m not saying what is or isn’t true. I’m just, I want to really point people’s attention at the degree to which we’re missing each other. And there’s a great group called More in Common. We did actually a podcast episode with their director, Dan Vallone, on what they call perception gaps, which is can one political tribe estimate the beliefs of another political tribe?

So can Democrats accurately assess what they think Republicans believe as measured by what = percent of them say they believe that? And can Republicans accurately assess what Democrats believe? And what they found is that the more people use social media, the worse that they were at estimating what other people believe, meaning the bigger the gap between what they perceive the other side believing and what they actually believe.

And so I think that’s just a really important point, because no matter what you feel or believe or think is true or real, the important thing is that we have an exaggerated view of the gap between things the other side believes, the most extreme version instead of the reality.

EJ: And it’s interesting because I have a libertarian friend who will use the example of the wristwatch. He’ll say, you know, there is a tech panic. We wouldn’t think of it as tech. But at the time it was technology, the wristwatch. And people at the time were becoming sort of obsessed with looking at their wrist and checking the time.

And people were concerned about this. And there’s similar reports from the mirror. I mean, you can go back to basically any aspect of technology. Absolutely. There is some sort of moral panic about it. But what those technologies, with maybe the exception of the printing press didn’t do, was transfer vast swaths of our personal and professional lives to those devices or to the smartphone.

And the smartphone has the social media on it. And as somebody who’s been around for a huge part of the history, the young history of the sort of common tech or the modern tech industry, how much of this is smartphones? How much of this is social media and how much is the combination of both? Because there was AOL Instant Messenger that could be very addicting, but you had to go dial up and yeah, a desktop and you had to go to a special room in your house where the desktop was and dial up the Internet to get on air.

TH: I remember those days.

EJ: Right. The smartphone seems to play a big role in all of this as well.

TH: So I’m so glad you bring this up because let’s steelman instead of strawmanning the opposing view here. OK, so the opposing view is I saw The Social Dilemma. It sounds like a moral panic where we’ve always been worried about new technologies when they arrive on the scene, you know, television, radio, rock and roll. Elvis was shaking his hips.

Reefer madness, right. And there have been moral panics about all sorts of new technologies every time one new one arrives. So I want to name and acknowledge the full history of that phenomena of culture, having a moral panic about something new. I think the difference between, let’s say, a wristwatch and say social media. Now, take the example of how it’s affecting children if I’m like a 15 year old and I’m in high school and all my friends use Instagram, not just to post photos of their lives, which they do, but they also do their communication through Instagram.

Meaning I think a lot of adults who might be listening to your podcast might not get this, that teenagers actually might use Instagram as their primary way of messaging their friends. So there’s kind of been making plans like I can’t participate in my social life without using Instagram as a messaging app. But Instagram bundles the messaging with the infinite feed of We got an eye pointed at your brain stem to figure out and reverse engineer what’s going to resonate with your nervous system.

And that’s the next video, photo or thing we’re going to show you based on what was most engaging for you. And so a wristwatch that you put on your wrist didn’t control your social, whether you were excluded from your friends social group. Whereas social media and Instagram have controlled and taken over what it means to be participating in a society.

If I’m a politician, can I win an election without having an Instagram profile or a Facebook profile or a Twitter profile? Well, a few, maybe a while ago, you could do it, but now you can’t. Right. And the ones that do use these platforms are going to get an advantage. But the advantage that they get is in a bad game because they have to basically say the most extreme things that their follower base are going to respond to.

And so they’re not actually responding to their real constituents proportionately. They’re responding to the constituents that are resonating on social media. But social media has this amplification factor on the more you say an extreme thing about a cultural fault line, you add inflammation to that cultural fault line will pay you in more likes and followers that that’s the conversation with Elon, by the way, is you know, how do we fix Twitter?

It’s like it’s not just about speech versus censorship. That’s an important conversation. But the real problem with Twitter is it is a video game that I log into every day and I’m paid in more likes and followers. I get paid only ten points if I add information to a cultural fault line versus I get paid one point.

If I say, you know, here’s what I think the other side might be. Actually, I get paid negative points. If I say, here’s something I think the other side is thinking about that we should consider. We don’t get paid by acting in a good way. You get paid by being, you know, provocative. And people will say that’s no different than any other media.

But that has changed when you have an algorithm of Twitter that’s reinforcing things that you’ve already clicked on, and that’s different and unprecedented in the time of media. So hopefully that is true. Some of the concerns on this is a new and distinct issue no.

EJ: I mean, and this is one of the things that I found myself explaining to just slightly younger people, because the rate of change has gotten so quick that, you know, you can be a vastly different have vastly different life experience just by being a little bit younger. But anywhere you go with your phone, the difference between what you’re not looking at when you’re not looking at your phone, when you are looking at your phone could be the best news of your life. It could be the worst news of your life. And if you don’t look at your phone at that given moment, you might be missing it. 

And that’s very different than even the beeper, than the telephone. There’s just no there’s nothing there’s no precedent for that in human history. And I guess I’m curious as to in your experience how aware these companies are. You know, Google might say, listen, we’re working on this. We’ve hired design ethicists. We have, you know, put a portion of our profits into ethical design.

TH: I need to answer that question because you asked that earlier, actually, about how does one get that job? Are there many design ethicists? Actually, I mean, but just to briefly clarify, there was no job for design ethicist at Google or Apple or Twitter or Facebook or anything like that. I didn’t apply for a job. I again, the people remember The Social Dilemma, which I recommend anybody who hasn’t seen watch The Social Dilemma. 

I go through the history of my own trajectory, which is that I became concerned about this arms race for attention and then made up a slide deck basically that I sent to just 15 friends and colleagues at Google wanting to get feedback, saying, hey, I’m worried that this arms race for attention is going to value humans more when they’re addicted, distracted, outraged, polarized and misinformed and narcissistic and that that’s going to kind of break the way society works.

And then instead of it just being sent to those 15 people, they started sending it to other people. And within basically 48 hours it had spread to, you know, 40,000 people at Google. And then through the success of that presentation and some executives seeing it, I either was going to quit or I was going to be in this position where I can maybe research what do we do about this problem?

And to the generosity, maybe you could say of Google, they allowed me to research, how do you think about this question? How do you ethically design the environment in which applications compete for attention and news competes for attention? So I did that for a few years. But unfortunately, even though I tried very hard to change Android and Chrome and some of the core interfaces that could be changed, they would say, actually, if we do this, it will look paternalistic for Google to make these top down changes.

And it was clear to me that we needed to create a public awareness movement, and that’s why it came out on 60 Minutes. That’s when The Social Dilemma came out all that work coming out to the public was to create more public awareness that this is a massive problem and we have to do something about it.

EJ: You know, I’ve known people who have changed their phones to grayscale as I’m sure you have, and that might even be mentioned in Social Dilemma. But how aware are people when they are engineering these products, when they’re coming up with the new way for Facebook, for instance, they just announced they’re doing more of what TikTok does. And instead of circulating content from people already in your network, they’re going to be circulating and pushing more content from outside your network.

How aware are people in these positions down to the colors, down to all of this in terms of designing it to capture our attention, designing these things like slot machines? In terms of the psychology, how aware are people in those positions of kind of exactly what they’re doing on the granular detail level?

TH: It’s a great question. So first of all, I think there’s a lot of justification. If you work at Facebook or Twitter or TikTok today, you point to the thousand positive examples of things that happened today in the world because of what you were doing. So on Facebook, I point to blood donors who got to meet with other blood donors for rare diseases.

I see cancer support groups. I see the people who lost their horse. And then because of someone on Facebook tagged on, they found the horse. And I see the examples of, you know, Hollywood. I mean, high school sweethearts reconnecting because they found each other. So I can point to a list of a thousand amazing things that happened today because of the existence of social media.

And if I’m TikTok, I’d make a similar list about all the creative findings. I say, here’s all these these creators that now have an entire lifestyle and economic well-being. You know, they basically have a job where they can make money from just being creative and making videos. So I can point to all those examples of the goods.

Now you see a different question. Are they aware of the design choices that they make that are manipulative? I think this is a very interesting question. The people at the very top of the company, I think are aware like I forgot his name, the guy who runs TikTok, and also Zuckerberg, I think are both very aware because anybody who was there in the early days, the way that Facebook got successful was by manipulating human, social, psychological vulnerabilities.

The clear example of Facebook was photo tagging. I don’t know if you were in college when that, I was a sophomore at Stanford when that feature came out. The Facebook was like months old. And for those who don’t remember or know this history, they created the feature where you don’t just upload a photo of some stuff that’s in your life.

And, you know, this is back when you had to do on a desktop computer and do it manually from a digital camera.

EJ:  You had to plug the camera in. Yeah.

TH: It’s funny. How that’s like history now, right? So but they added a feature called photo tagging so I could tag, you know, Tristan, and then in front of all of his friends, he would know that he was tagged in a photo by Emma, and then he’d say, Oh my God, Emma tagged me in the photo, I should, like, respond to that.

Or I have to, I have to make sure it’s a good photo. Otherwise I have to take it down and that was the best way to just get people getting sucked into Facebook for hours and hours a day. Because now all these photos are linked to people’s faces and names and you’re clicking through photos and you’re tapping into people’s vanity and sense of social validation.

And Zuckerberg and his team knew exactly what they were doing when they designed that feature. And they, you know, in different cases when they invented the like button, they thought they were spreading love and positivity in the world, but they also knew they were creating a feedback mechanism. And if you post something and instead of getting no feedback, you suddenly within minutes or seconds you would get ten likes or something like that, that that would be an addictive feature because you suddenly you have this reason to post and you know, you can get some dopamine and some oxytocin and serotonin from just dosing people with likes so they knew what they were doing when TikTok invented the, you know, the infinite scroll for, you know, for second video interface that you just infinitely go through, they know that if they remove the stopping cues, meaning they make it so that it automatically scrolls to the next video. They hide the clock, for example. They know exactly what they’re doing. Actually, a metaphor for listeners isin the casino.

Many of you may know that the designers of the casino actually hide clocks because they want you to lose track of time while you’re in the environment. They want you thinking about time. I didn’t realize it until recently, but if you’re using a TikTok on an iPhone that doesn’t have the notch at the top, but a flat top, TikTok will hide the clock on your phone because they don’t want you thinking about what time it is and how much time has passed since you’ve been scrolling.

These are deliberate design choices, and they do optimize down to the color, down to the, you know, the animations, down to the speed and time it takes for the new video to load. This is all optimized like a machine, and the results of which is that when we’re the product and not the customer, we are worth mor again, when we are addicted, distracted, outraged, narcissistic, validation, seeking and polarized than we are as healthy democratic citizens or as growing children. Right? 

These people who are designing streaks on Snapchat, which show the number of days in a row that kids have sent a message to their friend, which they did just to addict people. Because now if you if you’ve been sending a message to your best friend for 30 days, in a row, now if there’s 30, there’s a number 30 and a fireball next to your name, I don’t want that number 30 to go away just like that I’ve been working at the gym for ten days in a row. It makes it harder to not go the next day, but they’re using that not to help me go to the gym more often, which I might independently choose. They’re using that to support their goal of driving you to keep engaging. And that’s the problem.

EJ: I had a 700 day snap streak once.

TH: Wow. OK, so tell me then, so you have it for share. I mean, I might sound like I’m spinning a conspiracy theory, but it was probably pretty hard to to let go of that 700 day streak, right?

EJ: It was accidental. Yeah, it was. it was really a traumatic moment when the streak went away because my friend that I had it with was in a different time zone. And we just we lost track and it got messed up. But I’m curious what the R&D looks like in that space. Like, is this informed by scientists? Is it AB tested? Is it focus grouped? I mean, how do they optimize these platforms to function in that machine-like way, they’re optimized for this addiction and for keeping you engaged? 

TH: The word is engagement. That’s the word everybody should know is what is engagement. It means seven day actives, meaning users who have been active in the last seven days. That number should go up. The number of sessions that you’ve engaged with per day that should go up, the number of amount of minutes per procession should go up.

The number of total users should go up. The number of users who invite other users should go up. That’s all engagement that was captured in The Social Dilemma where you had the artificial intelligence robot guy sitting behind the control screen that’s basically in a control room that’s trying to increase, you know, time spent advertising per minute spent and then also how many users you invite because all those drive up engagement.

So at this point and even back then, it was pretty well known how to manipulate people’s social psychology. And again, I mentioned, you know, there is a field called Persuasive Technology, which I want to make sure I’m very clear. B.J Fogg, the professor who created this field and wrote the book on Persuasive Technology, who I studied with. He wanted he warned the FTC and I think the late nineties about the ethics of persuasive technology and the need to regulate the space because it’s going to turn into a huge problem.

And he’s not the evil guy with the mustache, you know, twisting it to say, how can I ruin the world? He really was trying to actually use persuasive technology for good. He actually created a whole initiative. How do you use persuasive technology to create more peace in the world. So he did a lot of things there, but there’s a whole space in which once you show people, once there’s this increasing encyclopedia of techniques of how to manipulate people.

And again, I was a magician as a kid, so I knew that there’s a hundred techniques to manipulate people, sense of cause and effect attention, their reasoning, their memory of what happened. There’s a thousand techniques and most people don’t want to admit this is true, but it is totally true. It’s just built into how our minds work. And behavioral economics kind of reinforced that.

And there’s a bunch of conferences in Silicon Valley, in books. There’s a guy named Ariel who had a conference called Hooked, and I spoke at it once. And I remember distinctly he had the head of growth at Pinterest. They’re giving a talk and he said, Now, what would happen if a user gets burnt out and they stopped using the service like a drug lord?

It’s like you stopped using what do they do then? What? He said, Oh, that’s no problem. You just do the following ten techniques and it gets them using it right again, right back. And that’s what people should know is that there is an entire discipline to about how to kind of influence people psychology and it wasn’t done because they wanted to wreck the world.

It’s they wanted to be successful for their business. But the underlying terrain, on top of which that success would be built, was by degrading the soil out of which our society grows. Right? It’s like you’re telling the soil for, you know, for children’s development, how their minds work and you instead of saying, how do we cultivate that soil?

So it grows the best children and, you know, you know, maturity and, you know, healthy development in the world, they’re just saying, how do we keep kids addicted to Snapchat? How do we keep kids addicted to TikTok? And it works really, really well. And I recommend for any parents out there, you have to note, this is not built for the health of your kids and simply limiting the time that they spend is like saying, well, I’ll just limit the amount of heroin that I get every day to like a few milligrams.

So let’s just not give the kids heroin and let’s remove their social communication from an environment which is dominated by that kind of heroin. Like how do we get communication out of that space? There’s a great organization called Fairplay that used to be called the Campaign for Commercial Free Childhood that has been working on that for a long time.

And we’re all allies trying to liberate essentially humidity from the shackles of this manipulative environment.

EJ: Yeah, and you know, it’s your personal lives and your politics. It’s everything that’s in the heroin is informing every conversation that we have. And Social Dilemma came out. I think it was like right on the cusp of TikTok becoming really big. I know you’ve been doing some work in the national security space, too, which is a huge conversation, obviously, when it comes to TikTok which, is now, I think the number one website in America. What stands out to you? You just sort of talked about it a little bit, but what about TikTok stands out to you and in this context of the attention economy?

TH: OK, I’m so glad you asked this question. And I think it’s just the most critical thing. And also from a national security perspective, if I could get the U.S. national security community to ban TikTok right now, I would. And we’ll get into that. Let’s quickly talk about how to TikTok, as you mentioned recently, overtook YouTube and Facebook and Instagram and I think engagement time.

I can’t remember which place it might be globally. Or in certain markets, but they have basically eaten the lunch of all the other social media companies. In fact, I actually I was at a events theater a couple of months ago and a woman came up to me. She said, oh, I saw The Social Dilemma. Thank you so much, I stopped using all social media. Now I just use Ti​​kTok. And I was really sad by that. Because the real message of the film is not You should be angry at one or two of these apps. It’s the race to the bottom of the brain stem to get attention that thing is going to hurt human humanity and TikTok, the way to beat Facebook and YouTube in the attention wars and the race to the bottom of the brain stem is they optimized for one thing, which is you’re watching a video and I’m going to point the biggest supercomputer in the world sitting in Beijing at your brain stem and your kid’s brain stem and at your uncle who you know, and I’m going to calculate the perfect next video to show that and it will know you better than you know yourself. It will know what will keep you there better than your prefrontal cortex or the part of your brain that basically has your willpower and says, oh, I have to do something else. It will calculate how to defeat your prefrontal cortex.

People should be very concerned because the Chinese Communist Party has influence over ByteDance, which is the company that owns TikTok.

TikTok has basically become the number one social media app around the world. Would you have allowed the Soviet Union during the Cold War to basically control television programing for all Western democracies and pick what they see and they don’t see? I mean, it is absolutely insane that we’ve allowed this to happen.

And by the way, for those who think that I’m being, you know, a conspiracy theorist, or xenophobic, China does not allow any U.S. social media platforms in their country. So they don’t have Facebook or Twitter because they think the U.S. would be running an influence operation on their citizens. Why? Given the fact that that’s their worldview, would we assume anything less than they would intend to use this platform for psychological?

And for most people, they think about TikTok. They’ve heard in the news about the U.S. government’s concerns about data and that they’re going to, TikTok will basically siphon up all this data on Americans. That’s true. But I’m even more worried about them being able to remotely influence basically the next president, the United States. They can say, yeah, we want it to be this person.

They can look at all the voting districts in the swing states and they can say per zip code. I mean, I don’t need Frank Luntz, the political pollster, because I can basically look at people’s sentiments with an AI that calculates what people’s opinions are in all the key voting areas. And then I can strategically up regulate everybody who starts to say, you know, China’s really not so bad.

We should look more what they’re doing. If China were to invade Taiwan tomorrow, they could up regulate all the American voices who are saying that Taiwan was always a part of China. And I was actually talking to a U.S. senator who said, who do you think the Chinese government considers the biggest rival to its power? And of course, you would think it’s like the United States but they said, no, no, no.

They consider their own technology companies to be the biggest rival to the CCP. And for those who don’t know, China actually massively regulates its own technology company. So if you use TikTok in China, when you scroll, you don’t get influencer videos. They actually feature science experiments you can do at home, educational content, patriotism videos. They have closing hours and opening hours.

It actually goes lights out at 10 p.m. They limit you to 40 minutes a day because they want their kids to be scientists, astronauts and engineers, and they want their military to be successful in the future. And where they’re happy to ship the version of TikTok, that’s essentially opium for the masses to the entire West and I think it’s like a reversal of those, you know, history. You know, what they call the great humiliation, which is when their society fell behind because of opium. This is kind of a reversal of that and I don’t think it’s because of one diabolical plan. I think that there’s this Chinese company that was just super successful at playing this game called The Race to the Bottom of the Brainstem.

And now the West has to respond.

EJ: Unbelievable. And I guess there was a Sarah Fisher report in Axios just yesterday about how Oracle is now auditing TikTok data and that was both on two fronts. It was on the front of privacy, and it’s on the front of algorithmic control. And, you know, it’s sort of like, look at Oracle’s doing this and they’re going to, you know, disincentivize any sort of nefarious behavior on behalf of the CCP in the United States.

But it’s also incredibly hard to trust that process. So just on what do you think about is that a solution going forward of TikTok, is that something that’s really a meaningful step in a better direction?

TH: Well, I think on the Oracle side, that’s about data. So where is the data going? Where is it getting stored? Who has access to it? Oracle could provide some slightly more trust in that area, but how would we know that TikTok is not manipulating which content people are seeing in all the swing states? TikTok, for example, does not release to researchers a what’s called an API or something where you can researchers can basically look at the full firehose of all the videos. Twitter, for example, does offer that firehose or researchers can actually, you know, do computation and analytics and math on what is actually moving through through Twitter and what is getting amplified, what, you know, what are the networks of, you know, where teams are traveling TikTok doesn’t offer that. 

So they could and they’ve already done this, by the way, there’s been research by the Tech Observatory at UC Berkeley that has been studying, especially how TikTok was influenced during this war in Ukraine, where because Russia passed what they call the, quote, fake news law in which basically if you said it’s at the beginning of the war, all the videos that were posted on TikTok, most of them from around the world were against Putin. And after Russia passed the fake news law, because they have a lot of citizens who use a TikTok. I was told just today, this morning that Russia is the fifth biggest market for TikTok in the world. So, you know, this is a big market but because they’re allied, TikTok basically no longer pushes any pro-Ukraine content and anti Russia content. And all the videos that you see are, you know, why Putin’s great and why the war is justified and everything else. And again, that’s slightly more visible. But they never announced that they were making that change. 

So TikTok can we actually know that they have the controls to flip a switch. And in certain countries, they can basically make people feel one way or the other and they can do that per country.

If I was China and the CCP, I would take TikTok and remotely influence, you know, all of Africa. So along with my debt traps that I’ve laid for all the African nations and building airports and then putting them in debt to have influence over them in the future. I would also use TikTok to make people think in Africa that, hey, look, this is Putin just actually fighting back against Nato.

This is totally justified and I could basically control and gerrymander the future votes of other countries and my soft power to coordinate how the world works. So we’re going all the way from email addiction and pulling to refresh in the slot machine all the way up to a geopolitical power game. But I really, really want people to get just how critical this is.

And I think it’s something that everyone can be aligned on, it’s not a partisan issue.

EJ: No, it’s not at all. And you’ve been having these conversations increasingly, I know, with people in politics, with people in business. And, you know, you just mentioned having this conversation with a U.S. senator. Are you finding that some of this is like completely unchartered territory to the point where people don’t, haven’t this put two and two together because it’s happening so quickly?

And it’s so new that when you do explain it to them, they get the gravity of it or is it still just sort of hazy?

EJ: So the meta problem, as it were in media, I don’t mean Facebook Meta, I mean the Meta of the problem, but that’s about the problems. We go back to E.O. Wilson to answer your question, which is that the fundamental problem of humanity is we have Paleolithic brains and emotions, medieval institutions and God like accelerating technology. And it’s so important to come back to that because you know, it is always the case that technology moves faster than institutions.

You know, we get cars before we get seatbelts in traffic lights, we get railroads before we get the regulation of railroads, we get dynamite before we realize, hey, we should actually not sell dynamite in every, you know, CVS down the corner, which, by the way, you used to be able to get. So we don’t usually realize that we’ve created a God like technology until afterwards.

And it needs some kind of guardrails. And guardrails is what we’re looking for here, not free speech, ministry of truth. We’re looking for. How do we set up the incentives in some safe ways for this to work? But I think the problem to answer your question, talking to U.S. senators or national security leaders, you’re sort of saying, hey, do they do they put two and two together?

I think slowly some are. But you have to paint the whole picture. Because if I’m a general that went to West Point and trained in all this military theory and, you know, I know about, you know, the position of all the U.S. Navy and, you know, where the nukes are. And I know how to think about the power calculus.

And I worked at the office of net assessment at the Pentagon or all that stuff. Why would I know what my 14 year old daughter is doing on TikTok or that China might be just influencing certain zip codes? Like how would that reach my dashboard? And there’s all sorts of cases, even after 9-11, where there’s an institutional gap where we have some institutions, we have a Space Force and we have a Navy and we have an Air Force, we don’t have a Metaverse force.

We don’t have a social media defense force. And I do think that population centric psychological warfare is the new is one of the new modes of warfare along with cyber warfare. I don’t have to actually have F-35 to match you in kinetic military capacity if I can actually do a cyber attack and, you know, hack the Colonial Pipeline, which later was released, apparently that the Colonial Pipeline was two days away from basically having major cascading effects that could have shut down the U.S. economy if it’s just been two days longer than it was and a cyber weapon cost a lot less than an F-35 and a social media weapon that makes your population angry at each other so that all the energy goes into …because everyone’s just fighting with each other all the time. And meanwhile, I’m just playing chess and beating you by routing around all your decisions. That’s the thing I want to shake people out of. It’s like the US has to reboot its society a bit into a 21st century society that also has a positive view of how tech can be embraced.

I think the last thing is, China is consciously employing the full suite of 21st-century technologies, from big data to A.I. surveillance to quantum computing, to make a stronger form of authoritarian society, a society that we wouldn’t want and that is incompatible with Western values. They’re consciously employing tech plus authoritarianism equals stronger, better authoritarianism. Meanwhile, democracies are not consciously employing the full suite of tech to make stronger democracies.

Instead, we have allowed private tech companies to profit from degrading democracies into not working as well as they need to. So we can’t just be satisfied with slightly less toxic social media. I want to move to a world where tech plus democracy equals stronger democracy, one that embraces tech rather than opposing it. And that’s a nuance people might not expect from me, because they think The Social Dilemma is kind of a Luddite film or something.

EJ: Well, the Gmail example that you gave earlier is really sticking with me, because, speaking of Meta, when you described looking around a cafe and seeing how more than half of the people were on Gmail, people keep it open all day, and in some sense it is this very physical workspace. It has become even more of one after the pandemic, and will be going forward. And I have an Oculus, and I see the programs they have for people to work virtually, to work in the metaverse. They’re kind of crude right now. But we know that one of the most powerful companies in the world, Meta, is investing really heavily in making those spaces, well, addictive in the same way that Gmail was being made addictive.

But despite the fact that this was on everybody’s laptops at the cafe, that company didn’t have a team working on the ethics of it, period. Even though it was consuming so much of the world’s time, and it was controlled by a fairly small team in Northern California compared to the rest of the world. So I’m wondering what you think is at stake in the metaverse.

I remember Sheryl Sandberg during the pandemic basically bragging about talking to faith leaders about getting their churches into the metaverse. And that might sound great, in the same way that a lot of other steps in tech sounded really great. But what ends up happening is that control of your worship spaces and control of your political spaces land in the hands of one company.

So what worries you? Or maybe what gives you optimism about the metaverse going forward?

TH: Well, I think we’ve already seen how positively framed mission statements, like we’re going to connect the whole world together and bring the world closer together, which was the new mission statement for Facebook, or we’re going to host the public square, for Twitter. Each of those positive mission statements and narratives was not what the underlying machine they built was designed to optimize for.

Like Facebook was not designed to make the world more open and connected. It was designed for engagement. What can I show you? What can I get you invited to? What can I tag you with that keeps you coming back to this system over and over again? That’s what they’re designing for. And there’s a gap between the mission statement and what the machine is optimized for.

Same thing with Twitter. It is also a place where people can publicly post their thoughts to the whole world. But the actual machine? If you talked to the engineers, how are they incentivized? What are their performance bonuses literally tied to? If I’m an employee at Twitter and I work on the news feed, how do I make a bonus?

Well, I show that engagement went up by 10 percent while I was working on this feature, and that’s what gets me my bonus next year. And so long as that’s how the incentives are structured, we’re going to see those results. So when you ask about VR and the metaverse, we’re going to hear the same thing. We’re going to hear the positive narrative: we’ll be able to connect the world and connect church groups together, and people are going to spend time together.

All those things could happen, and many of them will be positive. But that’s not the machine that they have built. The 10,000 engineers that they want to hire to build VR and the metaverse are not going to be asked, how do we liberate the best of humanity? They’re going to be tasked with, how do we make our VR platform the most engaging and addictive it can be, especially so long as it has to outcompete the other VR platforms?

So it’s another race to the bottom of the brainstem. Now, to leave people with some hope in today’s social media world, before we get to VR: Apple is one of the interesting actors that I think needs to have far more pressure put upon them, because they often get a pass. They’re definitely not one of the bad guys, and that makes them a good guy, because they’re not a social media company and they don’t optimize for engagement.

And their business model is selling you a new phone every two years. And I also want to applaud the things that they have done: the privacy and tracking features, the Screen Time features, the Do Not Disturb features. These are all in the direction of more humane technology. And when Tim Cook announced, by the way, that they were doing those privacy features, he said out loud, we cannot allow a social dilemma to become a social catastrophe.

So they’re on board. But Apple could do so much more to change, if you will, the digital constitution for how all apps can and cannot behave. So if we want a Geneva Convention in this arms race for attention, Apple could implement it, because all the apps have to live in their App Store. And people would say, well, then Apple has too much power.

We shouldn’t have a private entity with that much power, which is exactly true. And that’s the point you were also making: is the VR world going to get owned by one- and two-trillion-dollar tech companies? That’s not a good environment. It needs to be run in the democratic interest, so we have to figure out some different governing structure. But I do think that, short term, if you wanted to change all the perverse stuff that we’ve laid out in the last hour, Apple could, within less than a year from now, ship on the next iPhone a different set of rules that says you can’t do auto-playing videos or endless feeds for underage kids, and suddenly you’ve solved the TikTok problem. And they’re already making changes that are in this direction. We just need to demand so much more.

EJ: Yeah. I mean, it’s amazing. That’s kind of my next question: I was going to ask you about the Farhad Manjoo column “I Was Wrong About Facebook,” in that long series of mea culpas that The New York Times tasked its opinion columnists to do.

TH: To be honest, I actually haven’t seen those. Could you say more about them?

EJ: Yeah, no, absolutely. So they all had this challenge where they issued a mea culpa, something they were wrong about. Some of them were very political, some of them were more cultural. And Farhad’s was “I Was Wrong About Facebook.” And he wrote, “The site has crossed a threshold. It is now so widely trafficked that it’s fast becoming a routine aid to social interaction, like email and antiperspirant.”

And it’s a sort of clever way of putting it, but a very true one. And the big question here is whether there’s really a way to put the toothpaste back in the tube. There are some even bigger questions about what it means that people on one side of the world… I mean, I think about TV, Walter Cronkite broadcasting from Vietnam.

Betty White was on an experimental broadcast of TV, and she passed away in the age of TikTok. This stuff is just moving so quickly, and connecting people from different cultures so quickly, in ways that were never envisioned, and we’re clearly not keeping up with the rate of change. So is there a way to put the toothpaste back in the tube and have a healthier economy? Or should we be more defeatist and practice some sort of Benedict Option and just get out while we can?

TH: Yeah, well, I think it’s important to note, and again, I don’t want to say let’s copy China, but notice it’s almost like they saw The Social Dilemma and enacted a whole horizontal slate of reforms right there. They limit time for 14-year-olds; they set opening hours and closing hours, so that no one has to use it after 10 p.m. That removes the social pressure, because your friends are not continuing to tag you at 11 p.m. and midnight and one in the morning, so now you’re not losing sleep. They’re enacting a whole suite of reforms.

I think that tech plus democracy can equal stronger democracy. I think the question is, what is adequate to accomplish that goal? Is privacy legislation alone, limiting the amount of data they can collect, going to get us to a world where personalized, engagement-driven news feeds and videos that auto-play after three seconds aren’t going to continue to ruin your kids’ attention and mental health? Privacy doesn’t get you all the way there. And if you do antitrust and just break them up, from two or three companies pursuing engagement into 100 companies that are even more aggressive in the race to the bottom of the brainstem, that also doesn’t get you to a world where they’re not racing for that engagement.

So we have to change the underlying North Star from engagement to something that actually has some public interest at heart. You can’t be affecting children without caring that children and families get better through the technology, not worse. Tech plus attention should equal better attention. Tech plus family should equal stronger families. Tech plus kids should equal healthier, better-educated kids.

Now, that might sound uncomfortable to some people, but when you realize just how central it is, it’s like roads plus cities should equal better transportation for the whole city. The city should be humming along as a healthier, more functional city, with higher bandwidth and more throughput. And what we need is for social media to be like that,

but for the way our society works at a more comprehensive level. And I do think, like I said, that when you ask about putting the toothpaste back in the tube, we might need to kill a few cancer cells. There are some bad cancer cells out there, and I want to name that: if a cancer cell has a social responsibility council or an advisory board or something like that, it’s still a cancer cell. If a cancer cell has a research department giving transparency reports about how it’s affecting various organs, it’s still a cancer cell while it’s giving you that transparency. We have to change the DNA of Facebook, Twitter, and TikTok from the cancer cells that they are for our society into something that’s actually healthy.

Now, we can either do that through regulation, actually forcing a deep change that comprehensively hits the way the employees and the incentives are structured, or we probably also need some better, newer platforms that are optimized from the ground up to strengthen the way society works.

I know that might sound bold and aggressive, but I also think that’s the size of the 21st-century digital democracy we should be living in, which is the only way we’re going to compete with China anyway.

EJ: No, absolutely. I mean, on the right, people come along all the time pitching their new social networks. I remember talking to the CEO of Parler a couple of years ago and saying, well, the speech stuff is great, but what are you going to do differently than Twitter in terms of addictiveness? Because if you can’t answer that question, I’m not interested in this product as a solution to the problems.

So let’s end on that. Tristan, is there reason for optimism? Is the mood in Silicon Valley one of innovation right now? Are people trying to come up with these ideas and these products, or is there still a brick wall? Or is it maybe a combination of all of those things?

TH: So I’ve been working on this for close to ten years now, and it has taken a lot of patience. And I’m sure a lot of people might be feeling pretty daunted by what we’ve laid out, since it is so comprehensively toxic to many of the core support structures of the way our society works.

I’m sure it’s a hard thing to look at every day. But back in 2013, if you had told me that Facebook’s stock price would be cut in half, which it recently was, in part through the Apple privacy changes; that the CEO of Apple, Tim Cook, would be saying we can’t allow a social dilemma to become a social catastrophe; that the attorneys general would do what they did for Big Tobacco, when their lawsuit increased the costs and changed tobacco from being a social norm into a defunct industry…

They’re doing the same thing here: a lawsuit over consumer harms to kids is now emerging. We have employees and whistleblowers coming out, giving transparency to some of the externalities that have been hidden for a long time. We have Apple making changes in the right direction. We have culture and regulators starting to catch on.

We have the EU, at least, passing the Digital Services Act and the Digital Markets Act. So we’re seeing way more movement than we’ve ever seen in the ten years that I’ve been working on this, and I want people to feel the momentum, because it is picking up speed. Do I think that Facebook or Twitter or TikTok have taken in this critique and said, you know what, you’re right, we have to totally transform everything we’re doing? No, and I don’t expect them to.

There’s a great line by Upton Sinclair, the American author: you cannot get someone to question something when their salary depends on them not seeing it. And I think that Zuckerberg has built his identity, and, just like the guy from TikTok and just like any of these companies, your identity and your wealth, around the success of something that does do some benefit in the world.

But the optimization for engagement produces externalities and costs that live on the balance sheet of society and break that society, and we can no longer afford those things. So whether they recognize it or not, I don’t want to wait for their consciousness to emerge. I want to make sure we get there through all the means possible: through national security, through litigation, through employees coming forward, through Apple making the relevant changes and creating that Geneva Convention.

There are ways that this could change within a year from now in a pretty dramatic way, but we would need all those sectors acting at the level that we need. We would need national security saying, hey, this is going to be a tier-one priority in the national security strategy.

You could have these kinds of changes happen really quickly, but we need a focused effort from all these sectors acting at once.

EJ: Move fast and put things back together. Maybe that’s the new motto going forward.

TH: You said it, not me.

EJ: Well, Tristan, thank you so much for your time and your insights and all of the work that you do in this space. I think it is some of the most important work in American politics and culture and honestly in the world right now. So thank you for joining Federalist Radio Hour.

TH: So good to be with you. I really hope everyone takes this to heart, and wherever you are listening to this, if there are ways you can contribute to this whole system moving, please do. Because I think a lot of people look at this and assume it’s somehow going to change on its own, and we really do need everybody talking about it and making it a central issue.

We found, even talking to the national security community, that it was their children, who were affected by it and saw The Social Dilemma, who recommended it to their parents in top positions, and that led to some of the change. So change can happen from a lot of places. Just keep that in mind.

EJ: Yeah, I’ve heard that, too, actually. I’ve heard similar stories about The Social Dilemma’s impact on people. And, you know, on an optimistic note, Sinclair, whom you just cited, was also reacting to technology, technology that we have discovered better ways to deal with in all of the years since then. And parents, by the way, should go to the Center for Humane Technology’s website and look at their Ledger of Harms.

If there are any members of Congress listening, have your staff go to the Ledger of Harms. It’s incredibly useful. You and your team do a great job with that, Tristan.

TH: Thank you so much. Yeah, we try to put out as many resources as we can. We also have a Foundations of Humane Technology course for technologists, and we have people from Apple, Facebook, Google, the United Nations, all sorts of places taking it. That’s also an effort to train technologists to think about these problems in a different way.

And that’s also helpful for regulators and other folks working on these issues. So we’re all doing the best we can.

EJ: Yeah, so if folks are interested, they can check out the Center for Humane Technology and listen to the Your Undivided Attention podcast that Tristan co-hosts. You have been listening to another edition of The Federalist Radio Hour. I’m Emily Jashinsky, culture editor here at The Federalist. We’ll be back soon with more. Until then, be lovers of freedom and anxious for the fray.


Emily Jashinsky is culture editor at The Federalist and host of Federalist Radio Hour. She previously covered politics as a commentary writer for the Washington Examiner. Prior to joining the Examiner, Emily was the spokeswoman for Young America’s Foundation. She’s interviewed leading politicians and entertainers and appeared regularly as a guest on major television news programs, including “Fox News Sunday,” “Media Buzz,” and “The McLaughlin Group.” Her work has been featured in the Wall Street Journal, the New York Post, Real Clear Politics, and more. Emily also serves as director of the National Journalism Center, host of The Hill’s weekly show “Rising Fridays,” and a visiting fellow at Independent Women’s Forum. Originally from Wisconsin, she is a graduate of George Washington University.
