MICHEL MARTIN, HOST:
Many Americans are facing different realities - disparities in COVID-19 deaths, for one. And right now, they're expressing outrage over different things. For some, stay-at-home orders have moved them to protest. For others, it's police violence against unarmed citizens. It can feel as though we're living in two different countries, so we're talking about that today.
And one of the places where you can feel America's divide is social media, where divisions are also stoked by trolls and disinformation. Social media companies try to monitor that to varying degrees and with varying success. But even that is divisive, as we've seen this week when President Trump expressed fury after some of his tweets were flagged by Twitter for potentially misleading information about mail-in balloting. Then
yesterday, the company flagged one of his tweets about the Minneapolis protests for, quote-unquote, "glorifying violence." President Trump signed an executive order aiming to weaken legal protections tech companies have had for years. It's unclear whether he has the authority to do that. But given all the events of recent days, we wanted to talk more about this, so we've called three prominent figures in this debate.
Kara Swisher is the co-founder of Recode - that's a site focused on tech news - and a New York Times contributing writer who's been pushing Twitter in particular to take far more aggressive steps in policing false information.
Kara Swisher, welcome back. Thanks for joining us once again.
KARA SWISHER: Thanks so much.
MARTIN: Vivian Schiller is the executive director of Aspen Digital. That's a nonpartisan organization that researches issues related to tech and digital media. And she has a long resume in both digital and traditional media, including as president and CEO of NPR.
So very much welcome back to you, Vivian.
VIVIAN SCHILLER: Thanks, Michel. Glad to be here.
MARTIN: And Matthew Feeney is the director of the Project on Emerging Technologies at the Cato Institute. That's a libertarian research and advocacy group here in Washington, D.C.
Matthew Feeney, welcome to you. Thank you for joining us as well.
MATTHEW FEENEY: No, thanks for having me.
MARTIN: So, Kara Swisher, I'm going to start with you because on Wednesday, you wrote an op-ed in The New York Times. This was after Twitter labeled two of the president's tweets regarding mail-in ballots as potentially misleading, and they added some contextual information. And you called this move a baby step. Why?
SWISHER: Well, it's the first thing they can do. I mean, a lot of - they could have banned him. They could have taken down the tweets. They could have done a lot of things. But this is the first time Twitter has ever done any kind of - I don't want to use the word punishment because it's the wrong thing - they've ever taken any action against President Trump when he violates their rules, which he does all the time.
MARTIN: You know, we could spend a whole program talking about this. But tell us more broadly, what's your point of view about this?
SWISHER: Well, I think they are like publishers, and so they have some responsibility. Instead, they allow all this dreck to wash over their platforms that, in fact, gets taken advantage of by malevolent forces. And so I think they should do more monitoring of it - first of all, I'd love them just to get rid of some of the stuff that's obviously used for propaganda. But they just let anybody do anything, and what it does is create this sort of giant mess of humanity in which nobody knows what is true or not. And then other players take advantage of it. Alex Jones was the most famous, obviously, and he got tossed off all the platforms.
MARTIN: So, Vivian Schiller, what - I want to say you are here in your own capacity as a media executive, as a person with a deep background. In addition to working at NPR as president and CEO, you've also worked for Twitter. But I'm going to ask you on your own hook to tell us what you think about this week's news.
SCHILLER: Look. I think Twitter has been slow on this. I think they should have taken action some time ago. That said, these are really tricky decisions. Twitter - now that they've taken an action with the president, it's going to be really difficult for them to keep up. You know, there are roughly 500 million tweets sent every day. And there are already calls across the board of, what about this tweet? What about that tweet? And it's going to be very, very tricky for them. I'm glad they made the move, but now the camel's nose is in the tent.
MARTIN: Mr. Feeney, what are your thoughts?
FEENEY: I think this puts not just Twitter but other social media firms into a tough spot here, which is that they are trying to figure out how to moderate this kind of content while remaining legitimate in the eyes of their users. And it's not a surprise to me at all that, in the wake of Twitter's decision to fact-check, there have been allegations of inconsistency.
But anyone looking for inconsistency in this debate is sure to find it. There are hundreds of hours of footage uploaded to YouTube every minute. There are hundreds of millions of tweets posted every day. The system won't be perfect. And, in fact, the way that these firms treat content oftentimes depends not on the actual content itself but on what's being said about it. And that is something I wish more people understood in this debate, which is, these are really, really hard decisions.
And unfortunately, I think you're seeing a lot of people on the political right in the United States feeling that somehow agencies would be better at this than private companies. And while I think we'll never be completely happy with how these private firms navigate these difficult content moderation decisions, I would much prefer private failure to public failure in this field.
MARTIN: Why so?
FEENEY: Because I think the history of government regulation of speech reveals lessons - namely, that we should be wary of it. The fact is that social media is much more than Silicon Valley. And there are alternatives out there, of course. The interesting thing about all of this is that, in this debate, many people on the American conservative space seem to forget that competition is valuable. There are other social media networks out there.
MARTIN: Well, Mr. Feeney, you've raised the issue that is - it's - you know, we're raising it kind of as a backdrop here, but it is actually a very big part of this whole discussion, which is that there is this argument on the right - which is certainly embraced by President Trump - that these companies, social media companies, are silencing conservative voices.
But Democrats and progressives have their own criticisms, which is that these platforms have no standards and allow people to be libeled and slandered in ways that are very painful and in some cases dangerous. I mean, there is a great deal of data about how women - and particularly women of color, especially public figures who are women - are treated and targeted on these platforms.
So the question then becomes, you know, how do these things get addressed? Kara, you want to jump in? Please do.
SWISHER: Yeah. I think - let's just be clear. These are private companies. They are not public squares. And that's the one thing that seems to happen with the right. They're, like, my free speech - I should be able to free speakly (ph). You can't. These are private companies owned by billionaires, and they can set whatever standards they want, No. 1. No. 2, there's no evidence that there's any bias in any way. What there is evidence of is abuse - of people getting abused, by the way, on all sides.
And that's really the problem here, is the ability to allow toxicity and lies to continue. And that's something a private company should be dealing with but not legislated by the government. There are certain things you can do and liabilities and things like that that media companies have dealt with for a very long time. And so there's all kinds of solutions that are possible. But I think the really problematic thing is considering these places to be the public square that you can do anything on.
MARTIN: So, Vivian, what do you think should happen here?
SCHILLER: Well, I completely agree that there is no place for government regulation here. That is a path to a very dangerous place. But social media companies must take responsibility. I applaud Twitter for actually making an effort here, even though I think the path they're choosing is very difficult.
I was very disappointed in Mark Zuckerberg's statements on Fox News, of all places, saying that they don't want to be, quote-unquote, "the arbiters of truth." They already are. Algorithms serve certain truths even if there's not human intervention involved. They are going to have to, I think, eventually take some action. The oversight board is a step, but it's not enough.
MARTIN: We are in a moment now where these divisions have come to the fore. We know for a fact that sometimes these mechanisms are used to organize protests. And globally, certainly it is the case that they've been used by people who are trying to find ways to resist government oppression, but they've also been used by governments to advance oppressive mechanisms, right? So do they have a role here to play right now that you think would be constructive? And I guess I'll start with you, Kara Swisher.
SWISHER: Yeah, absolutely. They've been doing a very good job on COVID-19 information. They've been stamping out all kinds of bad health information except for President Trump's. They took down Bolsonaro. They took down Maduro. They took down a lot of other stuff. So COVID shows that all of them - Facebook, Twitter, all of them - can do it. And they can do it well, and there was no controversy about that. So they're totally capable.
But they're also capable of creating commonality. Look at all the really wonderful stuff, from Sarah Cooper to comics to people making jokes. They can be used for commonality in a really positive way. It's just that engagement is designed towards enragement, and that's the problem.
MARTIN: Matthew Feeney.
FEENEY: These firms, I think, should probably be better at actually saying, you know, we do arbitrate truth. There's tons of First Amendment protected speech we don't allow, such as pornography and beheading videos and things like that.
The problem, though, is I think you can understand a lot of American conservatism has a bit of a persecution complex, and the more actions these firms take against prominent figures like Donald Trump, the more it feeds into that. And I don't have the answers for what these firms should do necessarily in response to that, but I think that should certainly be a consideration.
MARTIN: Vivian Schiller, I'll let you have the last word.
SCHILLER: You know, I think, look - we haven't really talked about the fact that this country has been largely in lockdown now for several months. And I think many people, including myself, have found some platforms to really be a blessing as a way to connect with people. So let's not forget the original promise of these platforms: to connect us remotely and, you know, to give us - literally - a platform to support each other and to coexist virtually when we can't physically. That is a wonderful thing.
MARTIN: Vivian Schiller is the executive director of Aspen Digital. We also heard from Kara Swisher, who is the co-founder of Recode and a New York Times contributing writer, and Matthew Feeney, who is the director of the Cato Institute's Project on Emerging Technologies.
As I said, we've only just scratched the surface, but thank you for helping us scratch that surface. Thank you so much for joining us.
SCHILLER: Thank you.
FEENEY: Thank you.
SWISHER: Thanks. Transcript provided by NPR, Copyright NPR.