
The Problem Podcast

Jon Talks Misinfo With Ev Williams and Dr. Joan Donovan

Well, Jon has gone and stepped in it again! So this week Jon is joined by Harvard professor and misinformation expert Dr. Joan Donovan to discuss RoganGate and how to combat misinformation. Later, Jon speaks with Ev Williams, co-founder of Twitter, the website where misinformation goes to superspread alongside Wordle scores.

THE PROBLEM WITH JON STEWART PODCAST

Episode 15 Final Transcript

Jon: We’re ready to do this thing. Should I — do I have to wait or no? We’re good? Already rolling. Here we go. Here we go.

[INTRO MUSIC CUE]

Jon: Hello, everybody, and welcome to The Problem With Jon Stewart podcast. We’re joining you live with Jay Jurden and Kris Acimovic, where, you know, last week we did a podcast as well. I don’t even know if you know this, but we do this every week now.

Kris: We do.

Jon: We do it every week.

Jay: It’s required.

Jon: Jay Jurden was on the podcast last week.

Jay: I was?

Jon: Along with Chelsea Devantez, yes, who was our head writer. And now we have our new head writer, Kris Acimovic.

Kris: That’s right.

Jay: That’s right. That’s right. The podcast last week was so crazy we had to change head writers.

[JON LAUGHS]

Kris: We had to switch. [KRIS LAUGHS] She was like, I’m leaving. I’m done with this.

Jay: Chelsea-

Jon: She said you know what? [JAY LAUGHS] I’m out of this joint. I am out of here.

Jay: She heard the fallout and she said, “You know what? Back to California.”

Jon: I’m going to miss that woman dearly, but Kris, we’re delighted to have you step up. And the reason, Kris, you know, we have, obviously, there’s a few writers on the show, but Kris, we elevated because of her fierce disciplinarian nature. She is —

Jay: Jon. We —

Kris: That’s right.

Jay: — already have summer reading from Kris. [KRIS LAUGHS] We already have homework.

Jon: It’s the physical intimidation for me. It’s not even the way that she forces.

Kris: Yeah. Jon’s like, if we’re going to bring someone up, she’s got to be just a huge b****.

[JAY LAUGHS]

Kris: And I was there. It was an easy call.

Jon: So we’re very excited, Kris, and delighted as we move forward. Solid podcast today. We’re going to be talking about misinformation, which I’m not sure if it’s a problem or not. I haven’t heard much about it recently. But first, we are going to talk about last week’s episode, which was a big pig f***

[KRIS LAUGHS]

Jay: Y’all I —

Jon: I’m a professional.

Kris: Pig f*** is a great term.

Jon: Last week we were talking about Spotify and Joe Rogan and some other things, and I had some comments on it. I thought, Well, geez, I’m always someone who prefers engagement, and generally the commentary coming back my way, I thought, was very measured.

[LAUGHS]

Kris: We essentially just sat back and waited for it to explode all over.

Jon: Amongst the “f*** you, I’m done with you, Stewart. I’ll never forgive you for x, y, z. You’re off the rails, old man. Go away.” I thought there was some interesting stuff, if you sifted through it, that was constructive. And I think some people made the point that economic pressure is also just another pressure point that you can apply to misinformation. You know, I think one of the greatest issues that we will face is that nexus of misinformation and disinformation and how do you deal with it on a corporate level, on a personal level? So I thought to myself, Well, geez, what if I get somebody who’s studied this sort of thing who is a doctor a –

Kris: Who knows something about it?

Interview with Dr. Joan Donovan

Jon: That’s right. And so that is Jay Jurden and [LAUGHS] so Jay is a doctor. No, we are delighted to be joined by Dr. Joan Donovan. Doctor, thank you so much for joining us.

Dr. Donovan: Thank you.

Jon: Please, please tell me what is-what is your role at the Harvard Kennedy School Shorenstein Center?

Dr. Donovan: So. Yeah, the Shorenstein Center, I’m the research director and I’m also the director of the Technology and Social Change Project. So I’ve been researching the internet for about a decade. I’ve been at Harvard about three years now and I look at misinformation, disinformation, media manipulation campaigns, and I look at how the internet is a tool just like any other, for bringing about the world you want to live in.

Jon: Doctor, you are the Rosetta Stone. And we appreciate you being here because this is the key to the future. So, did you have a chance to listen at all to the podcast from last week?

Dr. Donovan: I did.

Jon: Oh, OK.

Dr. Donovan: I did.

Jon: What, in your opinion, did you feel like we got right? And what did we get wrong? And was the reaction to it, in your mind, expected, knowing what you know of how the internet is where nuance goes to die?

Dr. Donovan: Yes, and the ivory tower is also where knowledge goes to die. So —

Jon: Aw boy.

Dr. Donovan: — we are very much.

Jon: Nicely done professor.

Dr. Donovan: I mean-

Kris: All right.

Dr. Donovan: Well, the problem is this, which is that, you know, people who are experts, doctors, have a really hard time communicating and getting their message out there. And so in this moment, when I was listening to what you were struggling with, there were a couple of key things that I think some definitions might help with. Right?

Jon: Boom.

Kris: Right.

Dr. Donovan: Not to sound too pedantic here, but when we talk about misinformation in this field, we really are talking about information that is shared where people don’t know its veracity or its accuracy. So Joe Rogan really falls into the misinformation camp of someone who’s just asking questions, has some ideas. He wants to hear from a range of different people. But the misinformation is never error corrected, right? Like a good editorial magazine or even a newspaper is going to have a way to do corrections that the next day you’re going to hear, “Hey, you know, we printed this thing and it was totally wrong.” But the internet has really facilitated this flow of information where error correction just never happens. And the way in which —

Jon: How does it differ from disinformation?

Dr. Donovan: That’s a campaign or an operation where you have —

Jon: Purposeful.

Dr. Donovan: — people who —

Kris: OK.

Dr. Donovan: — are purposeful. There is intent there, which gets all the lawyers really rankled because they’re like, “How can you know what’s in a man’s heart?” But we know Rudy Giuliani was really out here to try to, you know, upset the outcome of the election.

Jon: OK.

Dr. Donovan: He said as much. And then there’s a lot of background information that helps us make sense of it. But disinformation, to put it simply, is sharing inaccurate information for mostly political ends and sometimes financial ends.

Jon: And with purpose?

Dr. Donovan: With very real purpose and veracity and planning. But when we talk about digital disinformation, this is different than when you would say something like, you know, the way in which we handled the question around weapons of mass destruction, where, you know, politicians really had hoaxed some of the journalists into believing a state-sponsored campaign.

Jon: Right?

Dr. Donovan: Digital disinformation.

Jon: By the way, I just want to point out very quickly.

Dr. Donovan: Yeah?

Jon: When you started talking about disinformation, doctor, Jay Jurden disappeared [LAUGHS] from the program. He’s there audio wise, but all of a sudden his video has gone out, which is uh very convenient.

Dr. Donovan: They don’t want us to talk about this, Jon. [JAY LAUGHS] It’s what they don’t want us to know.

Jon: That is exactly my point. I don’t want to and again.

Dr. Donovan: It’s in that conjuring of the “they.” That is where we start to look around.

Jon: Look who’s back.

Dr. Donovan: For evidence of who the “they” is.

Jay: Dr. Donovan.

Dr. Donovan: Hi Jay.

Jay: The reason this is so funny is because I didn’t know that I disappeared.

[KRIS LAUGHS]

Jay: So I might be a bad actor, but not even know that I’m spreading misinformation. I’m just here to retweet.

Dr. Donovan: Yeah and that’s sort of like, but that’s the like, “My uncle said it on Facebook” kind of excuse right? [JAY LAUGHS] He’s spreading misinformation.

Jon: Right.

Kris: Right. Right. Right.

Dr. Donovan: What changes it for someone like Rogan is it’s his brand. Controversy is his brand. Controversial conversations —

Jon: Right.

Dr. Donovan: — is what Spotify paid $100 million for.

Jon: Right.

Dr. Donovan: And Spotify wants to reject the fact that they’re a publisher. And in this moment of the digital revolution that we’re going through, to call yourself a platform actually means a very specific thing, where Spotify wants all the benefits of being called a platform where there’s a lot of user generated content, which creates a lot of chaos and opportunities for —

Jon: Right.

Dr. Donovan: — disinformation and misinformation, but they really knew what they were getting. All of these problems were there when Rogan was primarily using YouTube —

Jon: Right.

Dr. Donovan: — as a means to gather an audience.

Jon: So here’s where I run into trouble, and here’s why I believe — well, first of all, I know Joe, so I think you always grant more understanding and nuance to people you know, because you know them more three-dimensionally than what that appearance is. And we always demonize those that maybe feel alien to us. So that goes right in there. I’m already guilty of a bias, right?

Dr. Donovan: Mm hmm.

Jon: But the second part of it, where I run into trouble, is the thing you just said. You talked about how they want the benefit, but they don’t want the accountability. And you mentioned weapons of mass destruction, and it reminded me: the New York Times, right, was a giant purveyor of misinformation and disinformation. I don’t know that the Times was purposeful. But misinformation. And that’s as vaunted a media organization as you can find. But there was no accountability for them. And I think where I get nervous is, in the run-up to the Iraq War and in the prosecution of the Iraq War, I was very vocal [JAY LAUGHS] and sometimes cursed about that. But the mainstream view, of the New York Times, was, “They have weapons of mass destruction, they have these tubes that can only be used for nuclear war. Saddam Hussein is this, he’s that.” Couldn’t I have gone down and fallen down this — if Viacom or Comedy Central had wanted to censor me or had wanted to take me out? Look, I’m not owed a platform. Nobody is. So it’s not a First Amendment issue. It’s not any of that. Once you’re in bed with a corporation, the deal is they have to sell enough beer and you get to do what you want. [JAY LAUGHS]

Kris: Right.

Jon: But my point is this: these are shifting sands. And I think I get concerned with, well, who gets to decide what that — I mean, in the Iraq War, I was on the side of what you would think, in the mainstream, is misinformation. I was promoting what they would call misinformation. But it turned out to be right years later, and the establishment media was wrong. And not only were they wrong. In some respects, you could make the case that they enabled a war that killed hundreds and hundreds of thousands of people, and never paid a price for it and never had accountability. And just having an ombudsman print a retraction, to me, isn’t accountability. So it’s very easy to attack Rogan, and I’m not saying that that’s not your right and that there aren’t things there to talk about. But what I’m saying is, let’s be careful. Because the sands can shift.

Dr. Donovan: Yeah, people are in a new information ecosystem and they’re trying their best. But that’s the thing about these platforms over the years —

Jon: Yeah.

Dr. Donovan: — is that we’ve asked no accountability from them.

Jon: Right.

Dr. Donovan: They’re not built by librarians who are actual stewards of information. And so it’s been, it’s taken us a long time, at least the last decade, to get into the moment where we ask more from these companies. We’re asking, we want access to the truth. We want it to come up first. We want more public interest content. And so someone like Rogan really straddles the line because he reaches so many people. And he described it as a juggernaut. It’s out of his control now, but really, it’s not. It is in his control. It’s well within Spotify’s control.

Jon: So that’s kind of the idea then, because what I was saying is I’m generally more concerned about the algorithm than I am about the individual. Because if the algorithm can earn your trust, it will place you into places that, you know, you assume that there’s a gatekeeper. It’s sort of like the news. When it’s in the New York Times, you assume that there is a gatekeeper there that has vetted this. But in reality our modern media is kind of an information laundering system, where the information comes from somewhere and gets laundered through the aggregation process or the clickbait process or any of that. And so gossip and rumor become truth, and fact becomes canon, very quickly.

Dr. Donovan: Well, this is interesting because if you think about the history of the internet, the early web sites that were really popular, Perez Hilton, right, people were here for gossip and rumor.

[JAY LAUGHS]

Kris: Yeah.

Dr. Donovan: They weren’t here for the truth.

Jon: Right, right, right.

Dr. Donovan: And news was actually really slow to get online, especially.

Jay: Dr. Donovan.

Dr. Donovan: It’s part of the infrastructure of the internet itself. So to be in this moment where we’re demanding more truth means that these platforms are becoming institutions, right, like the New York Times, in a way, where people are asking them to be more accountable to the audiences that they claim to serve. And the Neil Young part of this is really interesting, because tied up within it, and very little talked about, is a labor dispute about Rogan getting $100 million and musicians getting, you know, a penny for three hundred and fifty plays.

Jon: Oh wow.

Dr. Donovan: And so within Neil Young’s twist –

Jon: — so there’s also an economic aspect to this that’s very different.

Dr. Donovan: Exactly. And a lot of it gets brought up, you know, in the moment where Neil Young is just like a catalyst for a bunch of different grievances that have been happening in the background, particularly about Rogan, but then also about Spotify having to stand on its own two feet and say, “This is what we are.”

Jon: Right.

Dr. Donovan: So there have been allegations of racism with Rogan. COVID misinfo, of course, that Neil Young talks about. There’s also lots of people that believe Rogan is anti-transgender, right, and has his own opinions about trans people in sports. But where is the platform at equal volume that allows trans people to counterweight what Rogan is saying? Right? And so there’s something happening in the information ecosystem where we went from platforms that were supposed to serve people —

Jon: — don’t you think that ended years and years ago? Because I would point to —

Dr. Donovan: — I mean.

Jon: Let’s use this as an example, let’s say you’re The Simpsons, right?

Kris: All of The Simpsons. [KRIS LAUGHS]

Jon: All of The Simpsons. You’re Lisa, Bart, Homer.

Dr. Donovan: OK, I’m Lisa though. [JAY LAUGHS] I’m Lisa.

Jon: All right. I’m Homer.

Dr. Donovan: You can be Homer. OK, yeah, I knew what was coming.

Jon: You’re on Fox, right? The same people that pay you, pay Hannity. What is your responsibility, you know, when you say, where’s the platform of equal voice? That’s the fairness doctrine, not for politics, but for social issues, right? But we’ve never had that. And as an artist, what’s your responsibility to the tube that you’re in and the company that you’re on, and I struggle with that. Like, I don’t know what you do with that. Like, do we now expect The Simpsons to say we will no longer be a part of this company? Or like, how far do we go with that?

Dr. Donovan: It’s a big question right now, and I think we’re moving from culture wars to content wars, in the sense that the way in which these fragmented, fringy opinions start to bubble up and coalesce online makes it seem like there are a lot more people with the same ideas.

Jay: Yeah, like there were a lot of people that thought Lady Gaga should have gotten an Oscar nom, and that’s not the case.

Kris: It’s not.

Jay: Not the case at all. And that’s that’s innocent.

Dr. Donovan: Leave Lady Gaga out of this.

[LAUGHS]

Jon: I just want to point out very quickly, that’s @JayJurden.

Dr. Donovan: I would rather think about it, not from the perspective of being an artist or a comedian, but from the perspective of owning a technology company. Well, what do you do? How do you protect and provide safety for your audiences or your customers, which, by and large, the customers are advertisers. The users are like the rows of cabbage that just get harvested, right? Like me and you are inconsequential individually.

Jon: Doctor, did you just refer to me as a row of cabbage?

Dr. Donovan: I mean, you just be, you know, you’re being cultivated.

Kris: I think just a single cabbage.

Jay: A single head of cabbage, not a row.

Dr. Donovan: Listen, listen. I’m a sociologist, and it’s an old reference to sociological theory by Weber, where it’s, you know, like.

Jon: You don’t have to explain to us, doctor. We all know. We know Weber.

Jay: Yeah, I didn’t know this was a freshman course.

Dr. Donovan: But the idea here is pretty simple, which is to say that, if you’re a technology company, there’s been all this confusion about what a platform is. A platform can mean a place from which to speak. A platform can mean a political agenda, and a platform now can mean a computational infrastructure that delivers content. And so that designation of a platform is something that we’re going to argue about, because we’re going to say, is it the responsibility of the person who’s speaking to be responsible to these audiences, or are we going to say it’s the computational infrastructure, the actual technology itself? And what’s interesting about the New York Times example is the New York Times would put the burden or the onus of disinformation on the sources and say these are the people who are most responsible. And we’ve seen very similar things with Facebook, where they’re saying, “Well, we don’t know what’s traveling on this crazy superhighway of information. We don’t know where it goes. We don’t know.” And then you see something like Stop the Steal happen and you’re like, if not for your technology, these groups never would have been able to get aligned and meet and plan. And so you are culpable or responsible or accountable for these actions. And that’s really why we have to understand that these platforms are doing the organization and the coordination of things that happen from the wires to the weeds. And it’s really important that we understand that.

Jon: So what is our vetting system for this? Is it crowdsourcing? Like, I mean, Wikipedia to some extent does that. But you know, is the answer the blockchain of information?

Dr. Donovan: God, no. No. [KRIS LAUGHS]

Jon: That would be wrong. I have just failed out of the doctor’s class. [JAY AND KRIS LAUGH]

Dr. Donovan: Well, it’s slow and it’s reactionary, in the sense that what we need to do is actually stem the flow and the tide of the damage that an individual or small groups of people can do. And this is where the problem of platforming really comes in, because if you are able to distribute your information to millions and millions of people and you have no responsibility for what the aftermath is, the true costs are then paid by journalists that have to debunk it. The true costs are paid by academics that have to research it, and then, down the line, anybody who might be harmed by believing, you know, that you can treat COVID with some kind of bleach or light therapy. And so what’s important is that we think about platforming differently in terms of the scale. And one of the things that I’ve really advocated for is that we reduce the scale and the speed by which information travels, so as to be able to do what you’re suggesting: to have some kind of crowdsourced intervention, or not to let information scale to a huge amount of people before we can actually have any evidence that it’s true or false. And this is like misinformation that ends up leading us to runs on, you know, the grocery stores, where there’s no toilet paper and we’re like, why are we having a toilet paper shortage? And it’s because people are reading that, you know, there’s going to be a military call to arms and everything’s getting shut down.

Jon: What!

[LAUGHS]

Dr. Donovan: Again, it’s happening again. Get toilet paper Jon, get toilet paper. But that’s the thing is, it has these real world effects. The wires to the weeds is important.

Jon: So in terms of platforming, is engagement, in your mind, a fool’s errand? Do you recommend pulling back or do you recommend engagement? Because I still believe in engagement, like I’ve taught. I mean, damn, I had Donald Rumsfeld on my show. Like, how do you learn nuance without engagement, and how do you get understanding without nuance? And I guess that’s my fear, is that we lose that.

Jay: Can I say something also about that?

Jon: Please.

Jay: There’s one more thing that also kind of ties into the story and the narrative of that. For any other person, for any marginalized group, you don’t have the privilege or even the space to not engage with people who might not have your best interest at heart. There was never a time when women, black people or queer people were able to live in America and say, I don’t want to engage. It is a new take on interactions and power dynamics, but it is also a very privileged take, and I don’t throw that word around all the time because it starts to, like, lose a lot of its meaning. But I’m a queer black person from Mississippi.

Jon: Wait, what?

Jay: I — Yes, yes, yes.

Jon: How did you get past our hiring process?

[LAUGHS]

Jay: Jon, I want to say Twitter does a lot of bad, but it also does some good. Now, the way that people talk about not engaging with someone is they go, “Oh, I just would never even talk to them.” And that’s amazing, as long as that person isn’t your boss, as long as that person doesn’t provide you with housing, as long as that person doesn’t provide you with an opportunity to make some money, and as long as that person isn’t a gatekeeper for you to have access just on day-to-day things. So I think engagement for people online shouldn’t start to mean engagement in general and in total, because the story of America is a lot of people having to engage with other people who do not have their best interest at heart. It’s not a kumbaya moment, and you don’t even always have to engage with them in a positive way. But I think there’s an ability of a lot of people to say, “Hey, f*** you, that’s wrong.” And that’s still engagement. I think black people saying, “I don’t like this.” Black people telling Joe Rogan, “Hey, man, I know you have friends that are black comics who are f***ing getting in your a** over this new compilation that just popped up on the internet.” That’s a level of engagement that you have to discern, and you have to be very intentional and specific when you say, “Oh, I would never engage,” because women don’t have the ability, minorities don’t have that ability. A lot of people for the longest time haven’t had that ability.

Jon: Boy, that’s — Jay, I think that’s such a great point. For the majority of the world, you’re right, Jay. It’s not the privilege of like, “I’m taking my ball and going home.” This is my life.

Jay: And saying f*** you, saying f*** you, or saying I don’t believe that, or even saying, OK, this is the study you’re going to cite, this is the study I’m going to cite. That’s engagement. Yeah.

Jon: Doctor?

Dr. Donovan: Well, you’re all right. Everybody gets an A-plus.

Jay: Yay! Hit the air horn.

[JON AND KRIS LAUGH]

Dr. Donovan: But I think there’s, you know, there’s a couple of things going on here with engagement. On the one hand, we’re talking about it as this interpersonal relationship, where you’re saying, you know, should I talk to my aunt who’s, like, gone down this rabbit hole, right? And the familiarity that you’re talking about earlier, about why and when you’re willing to give someone a pass and what you’re going to engage on and how you’re going to have your boundaries, that’s really important. We just came out of four years of having one of the most divisive presidencies, one of the most politically polarizing moments in our history, not just because, like, Trump is who he is, but because everybody was called to attention. Everybody was called to say something. Everybody was called to have an opinion. And social media became the way in which they expressed that. And then Twitter trends and other kinds of technologies and these algorithms really worked on harnessing that information and then polarizing it so that you would go one way or another. And remember that the internet itself is highly participatory. Facebook is empty shelves. Amazon is an empty warehouse. You know, like YouTube is a Blockbuster on a Friday night where you can’t get any of the new releases. You know, there are all these places where people fill in the content, right? And so you have to think about it in terms of, like, how do you measure this participation where everybody is being called to have an opinion on things that they might not have had an opinion on, and would maybe in public conversation say, “Gee, I don’t know which way to come down on that issue, it seems highly contentious.”

Kris: I don’t think that’s allowed anymore to not know what side to come down on.

Dr. Donovan: It’s not. But as you think about it, and you start to scale up, the aggregation of all of those opinions makes us feel further and further pulled away from each other, which is why we look to influencers to set the kind of terms of the public debate that we’re going to have. And so someone like you engaging with someone like Rogan? People are going to argue that he or you are punching up. I’m reminded, of course, of the Crossfire moment where you were just like, “Can you take this work more seriously?” You know, to Tucker Carlson. And that was an important moment, though.

Jon: But it was much misinterpreted because everybody said it was about civility. It had nothing to do with civility. It had everything to do with honesty. I don’t care if people yell at each other. I just want them to be honest.

Dr. Donovan: Yeah, you could see a trajectory in the hardening of his position over the years as he became more and more wedded to being an opinion maker, more and more wedded to being someone that has these really outside-of-the-mainstream positions on race.

Jay: I can see the Newsweek article now: “Jon Stewart Responsible for Tucker Carlson.”

Jon: It’s a villain origin story, my God!

[LAUGHS]

Dr. Donovan: It kind of is, you know, but it’s to say that media is part of the issue that we have to address.

Jon: Would you have talked to Rumsfeld, doctor?

Dr. Donovan: I have a brain trust of people that would never, ever let me do a public show with Rumsfeld because their jobs are on the line.

Jay: Because you don’t have a Ouija board that’s why?

Jon: Last, last question, doctor. Here’s the last question. We have a show and we have a platform. So what would you suggest for us as measures to guard against, you know, even accidental harm, but still maintain kind of, like, my belief in engagement? Which, I think, unfortunately, I’m going to end up, you know, having forever.

Dr. Donovan: Yeah, I think that’s OK. And that’s, you know, be an advocate for the truth. Mm hmm. What brings us towards clarity is hearing from other people and understanding from other people. But don’t get hoaxed. Go through the vetting process and make sure that the person isn’t just trying to turn a dime on colloidal silver or whatever the supplement of the day is. You know, that they’re not going to come out here and be like, “I got a great new treatment. It’s called horse tranquilizers. You won’t feel a thing.” Right? You know, just do that background research and always try to tell the impact story. And this is something that I tell journalists all the time: platform the people who are harmed by this stuff. Platform the people who don’t have voices in the debate, or the people who are struggling with how to understand the world around them and what’s going to matter.

Jon: Thank you so much, doctor, for taking the time.

Dr. Donovan: Thank you.

Jon: I really enjoyed the conversation, such a fascinating world.

Dr. Donovan: Well, I’m always happy to clarify things. And if you need me at a moment’s notice, I’ll be here.

Jon: Thank you very much for all that. And I’m so glad not to have gotten detention.

Dr. Donovan: Well, I’ll see you in my office at 5:00. [JAY LAUGHS]

[WHOOSHING SOUND CUE]

Jon: Holy s***. Dr. Donovan was. She was awesome.

Kris: Yeah, I see why Harvard wants her there.

Jay: Too funny for Harvard. If you ask me, too funny.

Jon: Solid point. I would have, I would have pegged her at maybe Brown. Maybe UPenn.

Jay: Yeah, yeah, that’s some Dartmouth level humor. I liked her.

Jon: Here’s the good news. Apparently, now we have earned a third of a doctorate. [JAY LAUGHS]

Kris: That is good news.

Jon: That was just from that. But fascinating conversation. Jay Jurden, by the way, I thought your point on minorities, people of color and women not having the privilege of not engaging was f***ing top notch.

Jay: I just think that we have to look at these things historically if we’re going to say, “Oh, I would never engage,” because at a certain point you are kind of showing your implicit bias. It sucks, and I hate using Twitter words on here, but you have to be able to say, “No. My engagement is saying, ‘Hey, f*** you,’ or, ‘Hey, I don’t like this.’” That is a level of engagement, people.

Jon: You’re absolutely right. And it’s part of the dialog. But along the lines of that, I did get a chance to talk to Ev Williams, who founded Twitter.

Jay: You went straight to the source.

Kris: Belly of the beast.

Jon: Here’s why he founded it. So his name is only two letters. And he was like, if my name can be two letters, I could probably create a whole platform where people only communicate with like, let’s give them 11 letters and then people are like, “I think you’re going to have to expand it a little bit.”

Jay: Oh, does he know? I just want to — does he know how many people are naked on Twitter now? Is he aware of that transformation?

Kris: Yeah, what do you mean naked on Twitter?

Jon: By the way, Jay Jurden, if I can say, I don’t know that there’s a technology that has ever been invented for humankind that didn’t very quickly find a way to distribute porn. So it’s not like —

Jay: I just want to say, he’s got the best porn website out. Ev Williams, thank you.

Jon: Twitter?

Jay: Yes. Hey, you guys look in the chat I’m sending in the link, OK?

Jon: Oh, great, that’s going to completely corrupt my computer and the whole thing, and it’s not going to be good. All right. So we’re going to talk to Ev Williams, co-founder of Twitter and Blogger, founder and CEO of — If you’ve ever communicated through the spoken or typed word, Ev Williams is why. Can’t be stopped. Here we go.

[MUSIC]

Interview with Ev Williams

Jon: Ev, thanks so much for joining us.

Ev: Thanks for having me, Jon. Very much my pleasure.

Jon: You know, you have empowered people to express themselves with these new technologies in a way that was unheard of years ago. In my day, not to date myself, but the only way you could express yourself is sitting at a bar yelling at the TV. What drove you to begin this journey of giving people the tools that they need to express themselves in a personal way?

Ev: Well, I grew up in the middle of the cornfields in Nebraska.

Jon: Okay, so you were a literal child of the corn. You were. You were raised in —

Ev: — in Nebraska, in really the middle of the corn fields. And this is probably the other reason I wanted to get into computers because I was not really agreeable with the farm life and or the football, which is the other important thing in Nebraska.

Jon: So I think when you refer to it as the football that would be your —

Ev: — is it that obvious?

Jon: — that would be the giveaway.

Ev: This was before the internet, so I did not have a lot of outlets, didn’t travel much, didn’t really connect with a lot of people. But I was very, very curious and really wanted to, you know, see the bigger world. So when the internet came along, I was just like, mind blown! And to me, the cool thing was the idea that you could access knowledge and ideas and put your knowledge and ideas out there. It was sort of that utopian technological vision of a world without gatekeepers where we’re all going to be smarter. And this is before I knew that the same thing was said about radio and cable television and every other medium that had come before. But that really captivated me.

Jon: Did you think we would all vote on it? That that would be the way that we — I guess that’s Reddit, though. It would all be upvoting. [EV LAUGHS]

Ev: I just thought it’d be obvious. Oh, you make a good point, sir. You know, I agree with you. Let’s go with that.

Jon: Yes. But you start out with that idea of Blogger and sort of this — it’s essayistic, it’s almost an online diary.

Ev: Mm hmm.

Jon: And how does that lead to, you know, the concision of Twitter? You know what? Let’s start with these paragraphs are a little much. Let’s try a sentence and an emoji.

Ev: If you look at the evolution of almost all technology, it’s about making things easier.

Jon: Mm hmm.

Ev: And that wasn’t necessarily the conscious thought behind Twitter. We saw it as a new form. We didn’t see it as, “Oh, this is better than blogging or writing at length because it’s easier.” It was just, “What would happen if we made it really easy?” And we made it more real time. Twitter used to ask, “What are you doing?” It wasn’t even thought of initially as making a statement about something. People would write things like “eating tacos.”

[JON LAUGHS]

Ev: You know, it was all very innocent.

Jon: I don’t wanna say anything, but I think that’s still on there. I think there’s a great deal of taco eating still going on on Twitter. Was there a moment in the evolution of this where you sort of felt like, we’ve done it? We’ve created this utopian community and democratized communication form. There must be a sense of elation in that.

Ev: Yeah. Well, I was a twenty-something kid trying to do something cool that would, you know, make money and whatever. But then I started getting emails, and this is back in the Blogger days, from women in Iraq who wrote me and said they could never express themselves before, and now they were writing. And I was just like, I didn’t quite know how to handle that, but it started to open my mind a bit: this may be kind of important and a really powerful thing. And then with Twitter, I was a little more prepared for that. But again, it was about eating tacos. And then, you know, we saw all kinds of crazy things happen, and that was pretty weighty to hear about. And we never thought, aha, we did it, we created the ultimate utopian world. But I think in those early days, we were very elated by the success.

Jon: Was there an oh s*** moment? Was there a moment in that where you went, “This will be perverted by those who don’t view it as benevolently as we do?”

Ev: Yeah, yeah. I think that became pretty apparent a couple of years in, 2008, 2009, when that stuff was happening. We got a call from the State Department because we had posted about doing some maintenance, again because it was barely working, and the U.S. State Department asked us to reschedule our maintenance because there was going to be a protest happening at that time. I forget which country.

Jon: I think that was Iran. I think that was in Tehran. They were cracking down and they were worried that the maintenance was going to prevent people from being able to effectively organize.

Ev: Right, right. We were 50 people and we certainly didn’t have any policy experts. We didn’t have any political expertise and mostly we were seeing upside. But I think it was later that we saw more of the down side and people can say it should have been obvious at that time and, you know, hindsight is 20/20.

Jon: It’s interesting to hear your perspective on it because in many respects, like you say, it’s a bunch of guys in a room trying to figure out, how do you get something like this to work? Oh, that’d be neat. What if we call it Grunter? No, that’s not going to work. What if we call it — you know, I think when it emerges, there’s always a sense of grand design and grand intention. And to hear your stories about it, it does seem a lot more happenstance than design.

Ev: Yeah, and I don’t want to underestimate the design, the lots of people who are very thoughtful about the system. Some people have talked about: is the retweet a good thing or a bad thing? And there have been people who say that the retweet is a very bad thing because it makes it too easy to spread disinformation. And I was intimately involved in the design of the retweet, and there was an intent behind that design. It certainly wasn’t to spread misinformation; it was to spread quality information. The idea of Twitter, we thought at the time, is that it’s not a social network. We kind of opposed that definition, at least I did early on, because I wanted it to be an information network. And I thought the best possible thing is if you get quality information faster and it can spread, and we’re going to use the intelligence of the world, going back to that original notion of the internet. That’s going to make everybody smarter, make them more informed about important things. So let’s build the most efficient mechanisms possible to spread something when it’s good. And in fact, that was meant to fight an echo chamber, where you’re only hearing from your friends. If there’s something good that anyone can say, the retweet is this amazing mechanism to broadcast it throughout the network based on its quality, because it’s an intentional choice, and if your friend says this is important, you want to hear it. That was actually the intent, and I would defend that decision as well. A lot of users didn’t like it, because they were like, “Oh, there’s strangers in the timeline. I don’t know these people who are in my timeline. I don’t like this retweet thing.” And we’re like, “Well, that’s how it works, because Twitter is not just about your friends.”

Jon: Right. We all now know what everybody thinks all the time for, for better or for worse. But in terms of the more high minded theoretical communication aspects of it, like this is a way for us to spread information more quickly. Or it’s a way for you to get information that other people think is important that could help expand your worldview. How much of that is tied to what you thought the business model of this whole thing was?

Ev: I think the business model has a huge effect on how systems are designed, especially once they’re growing, especially once they are public companies. I think for Twitter, the equation is more or less: how do we get more usage? More usage means more money, period. And it’s just a matter of what is the most efficient system for generating clicks from the right people. And Facebook and Google are dramatically more efficient at that than anyone else in the world because they have more data and they don’t pay anything for content. And so once that happened, that became the game. They just sucked enormous amounts of money out of both existing advertising money and new money. It doesn’t matter what you say; the quality has no relevance on whether you can make ad money.

Jon: So it starts to incentivize to the lurid or to the extreme, because that’s what’s going to draw wider engagement, more engagement, conflict and excess is going to sell better. I guess it’s funny because it just speaks to the high mindedness of sort of democratized communication versus the reverse engineering of those who wish to weaponize information. And that battle feels like the crux of all of this.

Ev: I actually worry a little less about the weaponizing, oddly, because I think it’s easier to identify. Weaponizing runs the gamut from political intentions to profit intentions, but it’s more blatant. Remember the early days of email? There was a lot of spam, and it was like, “Oh, it’s just going to get worse and worse.” And we kind of figured out spam. I mean, I didn’t figure out spam, but smart people at email companies figured out spam, and we don’t deal with spam that much. You know, you used to be able to create a thousand Twitter accounts a day. Now, to create a Twitter account, you actually have to enter a phone number. Can sophisticated people generate phone numbers? Yes. But your average person is like, oh, how do I get another phone number? It’s sort of like, we thought we had a utopian society, so, you know, no locks on our doors, and we left the keys in the car like we did in the cornfields. And it turns out we’ve got to lock the doors, and I think that will happen and that will get better and better.

Jon: Is the difficulty there that the weaponization incentive aligns with the profit incentive? That’s what we see when, you know, people will go to a Facebook and say, “Hey, your algorithm drives extreme content and radicalization.”

Ev: Mm hmm.

Jon: But unfortunately for them, it’s also their business model.

Ev: Yep, yep.

Jon: And so that puts you in kind of a netherworld.

Ev: Yeah, for sure. And the thing I worry about more is where good, you know, well-meaning people are spreading misinformation because they believe it. If my uncle shares an anti-vax story.

Jon: That was your uncle.

Ev: That was my uncle.

Jon: I knew it was because those were coming to my inbox as well.

Ev: Yeah. Well, these systems are efficient at spreading that information. Then it’s essentially the same problem as offline media, I believe. You’re more aware of this than most people, I think, but like all the consternation about social networks spreading bad information, and then you have Fox News, and they play off each other at this point to an extreme degree. But Fox News has a very clear profit motive. The people who then propagate the stuff online don’t have the profit motive. They actually are concerned citizens. And so what do you do when you have the concerned citizen who has bad ideas spreading those to other people? In the beginning, it was like, “Oh, well, it will be obvious to people what’s true.” Clearly, clearly that was not the case.

Jon: Well, in some ways when you look back, I don’t know if you feel this way, but I would imagine anybody that created, you know, the radio or a form of communication didn’t realize how quickly it could be perverted. You know, I would imagine Alexander Graham Bell, when he made the first phone call, his second question to Watson was, “And what are you wearing?”

[EV LAUGHS]

Jon: Like, ultimately, these are tools, and tools will always be only as good as those that are wielding them. And it does feel like we’re in a moment where the tools of social media are too easily exploited by those who seek to create instability. Like you say, oh, we have to lock the doors. These tools can be used for good or bad, and that’s the negotiation I think we make with everything in our society: we have, as they say, all these wonderful tools, but they’re only as good as the quality of the individual that’s wielding them. And I guess the difficulty is, there are actors out there who have the time and tenacity to use them for perverse gains of power and money, and it feels like we don’t have a lot of bulwarks against that. Gotham City seems to be thriving, not a lot of Batmans. How do we create more Batmans?

Ev: Yeah, some people think the answer is decentralization, and that the problem is, well, there’s only a few of these massive networks, and it’s not likely that a lot more of them are going to be created. But there are two things happening. One is those big networks will exist, I think, for a long time. And there are more niche networks where the behavior is much worse. I mean, those are like —

Jon: Oh, sure, you know, but they started them because they were not allowed to behave in the way that they wanted to.

Ev: Right, right. It’s like there’s this massive house party going on, and there are some people who are kicked out for questionable behavior, and then the people who were kicked out are like, “Oh, we’re going to go start our own party.”

Jon: Right, you’re all upset about Reddit and then you go to 8chan and go holy s***.

Ev: [EV LAUGHS] Right.

Ev: Yeah. So I mean, one could argue that Twitter and Facebook have done a relatively good job of keeping things civil for, you know, hundreds of millions of people. I mean, you can go out and check out what’s happening elsewhere, and that will continue, completely unmoderated, in things like Telegram and Signal and just private groups, encrypted services.

Jon: Right?

Ev: Right. No one’s watching them. And then, in a decentralized world, there are people trying to imagine: what’s the Twitter that no one owns and that can’t be controlled centrally? It’s not clear at all why that won’t just be more anarchy and chaos if it were to be successful, and I don’t think that would necessarily be successful. But the flip side of that is, can you also decentralize the enforcement of standards and reputation and moderation? I think that’s kind of interesting. If the analogy is that these things should be policed like a city, or we need more Batmans, can that be something that people get together to do as citizens, or even private companies? Currently they can’t, because these systems are completely closed and centralized. But in a decentralized world, theoretically, that would be possible.

Jon: Maybe the idea, then, is you might not look at it like decentralization, but view it as a utility to some extent. If this is a utility, then maybe an algorithm of engagement is not necessarily the manner in which they need to conduct themselves. Part of it is you have to look at what the corrosive elements of it are and what the things are that are valuable to society. How do we maximize some of that value while trying to minimize maybe some of the corrosion? And it’s hard not to have it come back to the business model at some level, because that’s how it’s been designed. How do you break that apart without destroying what is their model?

Ev: I think YouTube and TikTok, you know, share that, and maybe even more directly. It’s like, if you’re into something, you can go very deep, and that can be amazing. People can learn incredible things from these systems. In that sense, it’s by design. But I wouldn’t necessarily say it’s by design that more engagement is good on almost any system, especially if it’s advertising based, even if your purpose is to help people learn something. You could go on a path on TikTok that is all about conspiracies, and you could go very quickly down that road, or you can go down a completely different road. There are just a million roads.

Jon: On YouTube, you can learn how to, you know, rebuild an old car or join ISIS, and it’s basically the exact same algorithm that’s going to take you there. And you said something earlier that I thought was maybe a clue to it, which was, not crowdsourcing in the sense of Wikipedia, but in the way of crypto or blockchain: is there a decentralized vetting that can occur? Look, I think we’d all agree that going down a rabbit hole to be radicalized for terrorism is very possible on some of these social media platforms, but is a bad outcome. These are really sophisticated algorithms, and I find it hard to believe that the part that illuminates can’t be made brighter and the part that corrodes can’t be slowed. That doesn’t feel right to me.

Ev: I completely agree. I think that is entirely possible, and there’s a bunch of ways you could imagine doing that. And I think the companies are trying; it’s really not obvious what to do. They do a bunch of stuff, but how do we enable going down this rabbit hole but not that one? That is really hard. But that said, I’m optimistic about a couple of approaches. One would be building more reputation into the system. So yes, these are powerful tools, but the solution isn’t to get rid of the tools for everybody, like we can’t have nice things. It’s that you earn your way to have influence over time. And then you get to the question: well, who should earn that, and on what basis? And that’s where maybe the answer is not just the company or how much engagement you’re driving, which is basically how reputation systems work today. And I think a flaw with them —

Jon: — It’s user-generated, usually those types of reputation things.

Ev: They are, but they can be.

Jon: Up votes, or you know.

Ev: Yeah. But I’m hopeful about the decentralization path in some aspects. I just think it’s going to take a very long time to work it out. I think people will opt in to healthier systems. That might not be a completely different system. It could be just like, well, I’m only going to see tweets from people who have been verified. I’m making that up on the spot, so I’m not actually saying that’s a good idea. I think these systems will evolve where people can kind of shield themselves. That could be bad. It could be like, OK, then we’re all living in our gated communities.

Jon: I think it’s important that everybody get a sense of what everybody’s thinking. I think where it changes is when the algorithm is driving you to see more and more divisive content and areas of conflict. Even if the platform itself were relatively neutral, it would still move towards the lurid. But I think the problem comes when the business model itself is built on creating conflict and misinformation, because misinformation travels better. Crazy s*** travels better than normal s***. That’s just a maxim. The difficulty is when your business model celebrates that dichotomy.

Ev: Mm hmm.

Jon: And you know, and I know that, and you’ve been inside it. You say there’s a lot of thoughtful conversation that goes on around it, but I never get the sense that there is an urgency from those individuals over what exactly this is doing, how this is hyper-accelerating the kinds of conflicts that we’re seeing. My concern is more: if your business model celebrates that and you don’t have something in there to help mitigate it, what are we doing? It’s too important, it seems, to be left to thoughtful boardroom conversation.

Ev: Yeah, I think it’s a fair critique to say there should be more urgency and more to be done. It’s been quite a while since I was on the inside of any of those conversations, so I can’t speak to their current nature. I know that a lot more is done, and a lot of the questions are just really, really thorny. That’s not meant to be an excuse. The business model really is: get people to engage, and then they’ll click on ads. It happens to be, you’re right, that misinformation and some of these things are profitable. It’s not the intent. I mean, I’d say my Twitter is delightful.

[JON LAUGHS]

Ev: If you can, it’s full of smart people who are —

Jon: — Alright, Ev Williams, he’s a good follow. Hey, everybody. Click on the site, subscribe.

Ev: I barely tweet. I’m saying the people I follow, it’s full of smart people. But I mean, this is part of the reason at Medium we’ve totally focused on: how do we align ourselves as a business with quality content?

Jon: Right, and you’ve made it subscription. It’s a totally different model.

Ev: One hundred percent subscription, and people pay for quality content. When we switched to subscription, we had been thinking about advertising for a while, and it became very clear to me once I thought deeply about it, especially when it comes to written content, that ads were completely contrary to quality, not just because bad information gets more clicks, but because good information is expensive.

Jon: Yeah, no you’re right.

Ev: Right. And it costs money. It’s not just what gets the most attention; it’s what costs the least amount of money to create, and that’s the vast majority of content online. It’s like, we’ve got to get this stuff out there. Clicks per dollar really is the equation. And that applies to a ton of media, as you know.

Jon: When you created Medium, was that a direct reaction to some of the negatives you saw from the more virulent social media? And where do you see its place? You know, where does it get a foothold? Are the more thoughtful communication tools relegated to a narrower audience just by their nature?

Ev: I started Medium almost 10 years ago, so a lot has happened since. So much of what was being published, because of this dynamic I was just talking about, was just getting worse and worse. There weren’t the right feedback loops to really support really good stuff online. I think that’s still the case; it should be much, much better than it is. And incentive structures: I think everything comes back to incentive structures.

Jon: No question.

Ev: Right. Money, money and status are the main reasons people put things out there. And so I wanted to create a system that rewarded really good stuff, and that was longer-form ideas and knowledge.

Jon: Mm hmm.

Ev: And so that’s really what we’ve been focused on.

Jon: So now, this is the way I sort of look at the media ecosystem, right? The 24-hour news networks or radio are the content engines. They’re sort of the sun; there is the fission that is creating just a constant stream of thoughts and ideas and talk and everything else. And then you have the sort of satellite internet aggregators, and they’re almost like, what do you call them, the panhandlers. They’ve just got the pans out, and they’re sifting through, looking for nuggets. Out of all this mass of content, they’re looking for s*** that’s going to pop in a headline. But that’s the business model. I’m going in, I’m a bot —

Ev: Perfect example.

Jon: Finding a nugget that I can weaponize, putting it out there, and then watching it go and watching it spread. And it’s a nuclear reactor. You know, it’s pretty clear how this ecosystem has evolved.

Ev: Right. I did an interview a few years ago with the New York Times and the reporter kept asking me if I was sorry about Trump.

Jon: Right?

Ev: You know, “If not for Twitter, Trump wouldn’t have been elected, probably.” And he kept saying, like, “Do you feel bad about it?” And I just hemmed and hawed and, you know, finally said something that sounded like, “Yes.”

Jon: You know, the headline, I see, I see it right now. Ev Williams apologizes for Twitter’s role in Trump’s rise.

Ev: Exactly. And the New York Times headline wasn’t that, but every other headline was that.

Jon: Because they’re the ones that are mining for that gold.

Ev: Yeah. What I’ve also seen is there’s a secondary effect. There are the people mining for the gold, like, how do we reframe this in a way to get a click? And I’m not letting The Times off the hook, because what I see is the media ecosystem that plays out largely on Twitter, where people are writing for each other. And, you know, what’s very uncool if you’re a journalist is writing anything positive about a tech company, or any institution whatsoever.

[JON LAUGHS]

Ev: Like, you will get some s*** on Twitter for that. It’s a different formula.

Jon: This is one of the things that I remember at The Daily Show. Towards the end, I started to feel the weight of social media. And, you know, one of the things I think is so important about any artistic pursuit is you develop kind of an internal barometer and integrity about the quality of what you’re doing, and you don’t ever want that bent by the magnetism of an audience reaction. But it reminds me that we’re not supposed to know what everybody thinks of us.

Ev: One hundred percent. I learned not to look at my mentions on Twitter long ago. It’s bad for your mental health.

Jon: Right. Is it Frankenstein’s monster? Did you create a monster, or do you still feel like, you know what, on the whole, it’s worth it?

Ev: On the whole, I think it’s great in general, and I use Twitter a lot. I’ll read people’s tweets. I don’t tweet much, and I don’t look at what people are saying, because I completely agree with what you’re saying. In fact, last night I was talking to my wife: I’m talking to Jon Stewart tomorrow. I used to do a lot more publicity. I was like, “Well, I’m kind of nervous about this.” And she goes, “Why? Why are you nervous?” Well, you know, Jon’s really smart, I’m out of practice, and it’s a f***ing minefield out there.

Jon: That’s right.

Ev: And it does affect what you say.

Jon: And even when it’s not a minefield, like, even when you’re not trying to get somebody, somebody will find a way to make it that. Dude, man, I’ve really enjoyed this; it’s fascinating. I hope it was all worthwhile and that you don’t feel nervous.

Ev: Well, thanks for having me, Jon. It’s been a pleasure.

Jon: All right, sir.

[MUSIC]

Jay: It’s interesting that Twitter came about because he did not like football. I did not know that was going to be the reason.

Kris: Right, right. He is corn fed.

Jay: And now, people talk about football on Twitter, Ev.

Jon: I think honestly, it was. It’s, you know, the way he described it was like, “I was bored. And so, I created. I just there was nothing to do out there. And so I decided.”

Kris: To run an experiment on the world.

Jon: Yeah. You know what? I think it also kind of speaks to the way you always have the mindset that things are more intentional and manipulative than they are. There’s a lot of happenstance and a lot of coincidence and a lot of good fortune, and a lot of things that occur that aren’t, “I would like to reduce communication to grunts, and it’s going to cause an explosion and it’s going to help fuel the divide in this country.” It’s just not as by design as you might think.

Kris: Yeah, it’s all an accident. They just have this idea, and then it gets super, super big, and then they’re like, how do we make money off this thing that everybody has?

Jon: That’s right.

Kris: And it doesn’t work out very well, ever, when they do that.

Jay: And venture capitalists say, you don’t have to worry about money for the first few years.

Kris: Yeah, just make it huge.

Jay: I will give you money if you can make everyone do this.

Jon: There was a gold rush. I don’t know if they have it anymore, but back in the day, people were like, what, anything that had a dot com on it? Here you go, $10 million for the startup. I mean, people were just pouring money into these things.

Kris: Well, and still the number one thing they ask you when you’re in, like, early seed rounds for these ideas is: how can you scale it to essentially everybody in the world? And that’s the only plan you really need. You don’t really need a good idea. You need to know how to —

Jon: How to scale.

Jay: Kris, you understand, that’s like the problem we talked about with Dr. Donovan: scale, and the ability to disseminate anything faster than you can even think.

Jon: But we have always had problems. I don’t think the human immune system adjusts very well to these new technologies, and it is my hope that my kids, or their generation, will develop an immunity to it. I mean, radio, when it first came out, was just mind-bogglingly destructive. TV as well. And now this. But I do think if the immune system can catch up and be more robust, we may find ourselves in a better place.

Jay: Or we become cyborgs and the AI becomes part of us and we just —

[KRIS LAUGHS]

Jon: Jay, I don’t want to hear any more of your porn fantasies. I’m just done.

Jay: Yeah, my favorite part of that porn was when the girl went zero zero one zero one zero one zero zero one zero one zero one one one.

Jon: Also, before we get out of here, we do have a new segment coming from Alexa Loftus, one of our fine, fine writers. It’s a little something called “Let Me Distract You.” I hope it’s juggling; I always find that to be the best distraction. But here’s Alexa Loftus.

Writer Segment

Alexa: Hello there. There’s some stressful news out there, like “Another winter storm cancels hundreds of flights” and “U.S. deploys troops to NATO allies in Eastern Europe.” And don’t get me started on high mercury levels in the Amazon. I will lose sleep about monkeys. So let me distract you with some not-so-stressful headlines. A recent pig-to-human kidney transplant was successful, and that’s good news for everyone except pigs. Not only could this save your life, it could also give you a great opening line at a party. All you have to do is say, “Hi, I’m Alexa. Inside of me is a pig’s kidney. It hasn’t changed me at all, except now I crave the taste of slop.” A man invented an instrument that measures odors, which is a great way to brag about how you have a lot of free time. I’ll buy one now if it’s under 20 bucks, and then when I’m like, “Does something stink?” I can whip out my nasal ranger and be like, “Yup.” And finally, whales don’t choke. And this kind of stuff is why I pay for The New York Times. They say it’s because they have huge throats. Yeah, I feel like that was assumed. They weigh 400,000 pounds. Thanks for letting me distract you. Now feel free to get back to “Tesla will drop self-driving feature that runs stop signs.” Hey, how thoughtful, Tesla. I have to say, though, it is unsatisfying giving the finger to an empty car.

Jon: That is our show, everybody. Thanks to Kris and Jay. This was awesome, guys, thanks so much. Thanks to Dr. Joan Donovan, Ev Williams. Thanks for listening. For more content from The Problem, check out our newsletter. Subscribe at our website, which is like a newsletter, but with an address. I don’t really know the difference, to be quite honest with you. It’s got content and links, lots of words. Problem.com. Check out the Apple TV Plus show, link in the episode description, and we’ll talk again next week.

Kris: Yes, we will.

Jon: Until then, bye bye.

Alexa: Bye!

[MUSIC]

Jon: The Problem with Jon Stewart podcast is an Apple TV Plus podcast and a joint Busboy production!
