
Berkeley Talks transcript: Berkeley experts on how to fight disinformation

100 cardboard cutouts of Mark Zuckerberg stand in front of the Capitol
(Photo by Joe Flood via Flickr)

Listen to Berkeley Talks episode #125: “Berkeley experts on how to fight disinformation”:

[Music: “Silver Lanyard” by Blue Dot Sessions]

Intro: This is Berkeley Talks, a Berkeley News podcast from the Office of Communications and Public Affairs that features lectures and conversations at UC Berkeley. You can subscribe on Acast, Apple Podcasts or wherever you listen.

[Music fades]

Henry Brady: Hi, I’m Henry Brady, former dean of the Goldman School of Public Policy and a political scientist who studies American institutions, especially such things as trust in American institutions. I’ll be moderating today’s session.

We have an all-star cast here today, and I’ll introduce them in a moment. First, let me set the stage. We live in an era where a majority of Republicans believe that Donald Trump won the presidential election, whereas Democrats believe overwhelmingly that Biden won. Where a substantial fraction of people believe that COVID is fake or that the vaccines for COVID have not been thoroughly tested and that they have bad side effects. Where watchers of Fox News believe that Christians in America face more discrimination than Black Americans and other people of color. These beliefs exist against the background of partisan polarization between the two political parties and lack of trust for major American institutions. Republicans trust the police, the military and religion, whereas Democrats trust education, science and the press.

Partisan polarization and disinformation, the decline of journalism, especially local journalism, and the rise of the internet with its ability to spread rumors and lies as truths seem to be at the root of these problems. What can we do about them? We’re going to spend some time first asking, “What’s the problem,” and then trying to see if we can come up with some solutions.

The panel is a distinguished one. Geeta Anand is a Pulitzer Prize-winning journalist and author and dean of the Graduate School of Journalism. Erwin Chemerinsky is dean of Berkeley Law and one of the nation’s leading authorities on the First Amendment and the Constitution. Hany Farid, associate dean and head of the School of Information, is an expert on digital forensics, deepfakes, cybersecurity and human perception. Susan Hyde is chair of the Department of Political Science, co-director of the Institute of International Studies and a scholar who studies democratic backsliding: countries that are becoming more authoritarian by the day. john powell is director of the Othering and Belonging Institute and an expert in civil rights, civil liberties, structural racism and democracy.

I’m going to moderate the panel, as I said. Let’s get going. What are the sources and nature of the problem? Let me start with Hany Farid, who knows a lot about the internet. What is disinformation? What has changed socially and technologically to ignite the current storm of disinformation? What are the dangers from social media especially?

Hany Farid: Thank you, Henry. Good to be here with such an amazing group of my colleagues here on the Berkeley campus. Let’s start with some definitions. Let’s start by distinguishing between disinformation and misinformation, which are often used interchangeably.

Disinformation is the intentional spreading of lies and conspiracies. Think, for example, state-sponsored actors trying to sow civil unrest or interfere with an election. Think partisan hacks and trolls on Twitter and Facebook. Misinformation, on the other hand, is the unintentional spreading of lies. Think your quirky uncle Frank’s Facebook posts about how Bill Gates is using COVID to implement a mandatory vaccine program with tracking microchips. By the way, that’s a pretty bizarre claim that some 28% of Americans believe.

So, disinformation, of course, is not new and we should acknowledge that. For as long as there’s been information, there’s been disinformation. However, it won’t surprise you to learn that in the digital age, and particularly in the age of social media, the nature and threat of disinformation are quite distinct.

First, we’ve democratized access to publishing. Many great things have come from that, but now anybody with nothing more than a handheld device can instantaneously reach millions of people around the world. Second, the gatekeepers of social media are not traditional publishers, so posts that drive engagement are favored over just about everything else, with little consideration to journalistic standards or harm.

Now, here it’s important to understand that critical to social media’s success is driving engagement and time spent on the platform, and in turn, ad revenue. This is accomplished not by chance, but by algorithmically determining what shows up on your social media feed. These algorithms aren’t optimized for an informed citizenry, civility or truth. Instead, repeated studies from outside of the social media companies and inside of the social media companies have shown that social media’s algorithms favor outrage, anger, lies and conspiracies, because that drives engagement.

It’s this algorithmic amplification that is the most significant difference today in the disinformation landscape. Let me just say a few more things because you asked a series of these questions and I want to try to hit each of them. An additional threat to this algorithmic amplification or manipulation is the risk of filter bubbles in which, as you said at the very beginning, Henry, we seem to have two alternate realities because we are all consuming content inside of an echo chamber and a filter bubble driven by social media.

Although disinformation is not new, what we are seeing is a scale of belief in even the most bizarre conspiracies that is unprecedented in history. Here’s an example: The far-reaching, far-right QAnon conspiracy claims, among many things, that a cabal of Satan-worshiping, cannibalistic pedophiles and child sex traffickers plotted against Donald Trump during his term as president. It’s pretty outrageous, even by the standards of American conspiracies.

However, a recent poll finds that 37% of Americans are unsure whether this conspiracy is true or false and a full 17% believe it to be true. In addition, we’re seeing widespread vaccine hesitancy promoted all over social media with huge, huge implications to our public health. We’re seeing, as you said at the beginning, widespread U.S. election lies with huge implications for our democracy. We’re seeing widespread climate change disinformation and misinformation with huge implications for our entire planet.

This disinformation is leading, and I don’t think this is hyperbolic, to existential threats to our society and democracy. I don’t know how we have a stable society and a democracy if we can’t agree on basic facts, because everybody is being manipulated by attention-grabbing, dopamine-fueled algorithms that promote the dregs of the internet, creating this bizarre, fact-free alternate reality.

I’d very much like to believe in Brandeis’s concept that the best remedy for falsehoods is more speech, not enforced silence. But this only works in a fair marketplace of ideas, where ideas compete fairly on their merits. Social media doesn’t come even close to being a fair marketplace of ideas. It is manipulating users in order to maximize profits. That, Henry, is the big difference today from 20 years ago: how we are being actively manipulated in terms of the information we are being presented.

Henry Brady: Thank you. So, john powell, we’ve just heard the technological reasons why things have changed and outlined really adroitly. What about human beings and our psyches and then maybe especially Americans, how much of this is based upon our tendencies towards tribalism and othering? What can we do to minimize that and to limit the degree to which those kinds of factors affect the way people process information? Is that part of the problem?

john powell: Thank you, Henry. First of all, it’s a delight to be here with such distinguished guests and I look forward to hearing and learning from all of you. We sort of have a better sense of the problems than we do of the solutions. The problems are multifaceted, as suggested, that the internet, social media has complicated the problem, by far. I’m reading Martha Nussbaum’s book now on religion and fear, and Aristotle was talking about this problem 2,000 years ago and that it could be hijacked. Part of it does mesh with human nature and society. Tribalism is interesting. I’m [inaudible] more in common than I… I looked at some of their materials in preparation for today’s call, today’s talk.

I’m not in favor of the term tribalism and I’ll tell you why. First of all, think about U.S. history and our relationship with tribes here. In a sense you could say, by many accounts, the tribes were much more welcoming to the Europeans than the Europeans were to the tribes. But even more pointedly, tribalism, as we understand it evolutionarily, involved tribes that were small. They ranged anywhere from 50 to about 150 people. These were people you had contact with every day, people that you knew. In that are all kinds of what we would call biases. These were the people you trusted. Tribes couldn’t be a thousand people, tribes couldn’t be a million people. So what we’re seeing, I think, is that tribes is actually a misnomer. What allows people who don’t know each other, who will never see each other, to actually feel like they’re part of the same group and hostile to another group, whether Blacks or Jews or Muslims?

I think tribalism, like I said, is a misnomer. I do think changing demographics actually plays a big part. As discussed there’s polarization and identity along ideological lines, but there’s also along social lines, along people. There’s very strong correlation between anxiety of change of demographics and polarization. It doesn’t have to happen. I think it sort of seeds, it creates an environment and then people use it. The elites use it to actually constitute or exaggerate the fear or the threat.

One thing that’s very important to point out, I think, is that the “other” is not natural. The other is socially constructed; the meaning and content of the other is socially constructed. It’s not saying we’re all the same, but the meaning, especially saying that someone’s not fully human, that they’re a threat, that they are like an animal, that they’re smelly… There are certain words that show up over and over and over and over again, whether you’re talking about, again, Blacks or Jews or immigrants. And it’s the dominant group, its leaders oftentimes, using that to create a sense of us and them.

So, this is calculated. As suggested, this is not misinformation, this is disinformation. The tools available are more profound than they were 20 years ago. And also the changing demographics… I’ll end by just saying this: think about the reporting of the census data. I was very unhappy with the reporting. The reporting, from my perspective, was laced with fear. It may have been implicit, but it was almost like saying, “White people, be afraid. You’re about to lose. The minorities, Black people, Latinos, they’re coming and you’re going to lose.”

It had just scores of stories about this white anxiety. It didn’t paint a picture of how we might be a society without a racial majority, living in harmony and peace and coming together. It said nothing about the explosive expansion of American families: now, one of the fastest-growing groups is mixed-race, mixed-ethnicity families. That’s potentially a positive; it was simply absent from the story.

Henry Brady: Thanks, john. That’s the human side and then there’s journalism. Historically, the way we’ve learned about others is through journalism. Geeta Anand, have things changed for journalism and is part of the problem the decline of journalism, or did journalism never have a chance with respect to the internet? Also, are there other historical periods that look like the one we’re in now and is there hope that we can get out of the mess we’re in?

Geeta Anand: Thanks Henry, and it’s a pleasure to be amongst this group discussing this incredibly enormous challenge to democracy and to our world. I mean, the rise in social media has shifted ad revenue and shifted public attention away from traditional news publications. There’s been a 62% drop in ad revenue for traditional news publications between 2008 and 2018. More than 2,000 of the 9,000 publications around in 1995 are no longer around today. More than half of people under 30 get most of their political news from social media. News publications just cannot compete with social media and with disinformation. Disinformation is cheap. It’s expensive to train people to go out and report news, to check sources, to make phone calls, to check public records. Social media companies are making billions and news organizations are barely hanging on. They’re weakened just at the time where we need them most.

As Hany and john powell both noted, negative information, controversial information draws attention and always has. Tristan Harris famously said that fake news spreads six times as fast as credible news. Again, this puts traditional news organizations at a disadvantage. There’s confusion in the public’s mind about what actually is a legitimate news publication and what actually are facts and what are not. There’s huge distrust in the media right now. I believe this is because of the decline in local news publications, those thousands of local news publications that have gone out of business. This means that most people have never met a reporter. They don’t understand how reporters do their jobs, they don’t understand journalism ethics. When they do meet a journalist, it’s when some huge catastrophe has happened in their community and someone has come in from far away to do a story on their community, someone who doesn’t know that community very well.

People think of journalists as elite outsiders, uninformed about their world and their lives. This is a problem and the situation is getting worse. More and more traditional news publications are failing and social media is getting more of the revenue and more of the eyeballs. There’s just a cacophony of sources on the internet, many of them random organizations, many with evil intent without editors demanding accuracy, without editors deciding what stories are the most important for the day.

As all of you know, and as all of you have said, democracy needs an engaged and informed public having debate and dialogue. If we can’t even agree on a set of facts, if we’re so polarized and so confused about what the facts are, we are at a huge disadvantage in being able to deal with the enormous crises of our times, from climate change on down. We absolutely need to, as a society, address this enormous challenge. The success and survival of journalism is vital to the success and survival of democracy.

Henry Brady: Thanks, Geeta. Erwin Chemerinsky, the legal framework here is complicated. Could you clarify two things? First, what exactly does the First Amendment do to perhaps create some of these problems, because of the so-called marketplace of ideas and the failure to limit some kinds of speech? Second, the Communications Decency Act of 1996, and especially Section 230, which gave the internet a very privileged role.

Erwin Chemerinsky: Of course. It really is a pleasure to be part of this discussion. Let me put this in context, I think the internet is the most powerful tool for expression since the development of the printing press. It was already mentioned, it democratizes the ability to reach a mass audience. It used to be you had to be rich enough to own a newspaper or get a broadcast license to reach a large number of people. Now, anyone with a smartphone [inaudible], a modem and a library can do so. It gives all of us access to seemingly infinite information and it doesn’t respect national boundaries. It’s very hard for any country to exclude speech over the internet from other nations. But this comes at a cost. Speech is very cheap over the internet, that cheap speech can be used for misinformation or disinformation. Speech that’s harmful, that invades privacy, can be immediately circulated. It also lets other countries influence electoral processes. We saw what Russia did in 2016 in the United States.

In terms of Section 230, your latter question: Section 230, I believe, was the key to the development of the internet. It says that internet companies can’t be held liable for that which is posted there. In fact, it’s been said that the words of Section 230 are the 26 words that created the internet. Without Section 230, internet companies would have to monitor everything that is put there, because if they didn’t and there was something illegal or tortious, they could be prosecuted or held civilly liable. It allows people to post things on the internet and social media without the risk of significant censorship from the internet companies. Now, that’s not to say the internet companies aren’t monitoring. They’re clearly doing things like excluding child pornography. They’re excluding hate speech. They’re doing this on their own, not for fear of liability. Section 230 protects them.

It’s in that context I can talk about the First Amendment, Henry, and answer your question. The First Amendment, of course, limits the ability of government at all levels to restrict speech. The First Amendment isn’t absolute. There are categories of speech that are unprotected or less protected. Child pornography is an example. Speech that incites illegal activity, speech that constitutes a true threat. And we could go on with the other categories of unprotected speech. How does this all relate to the internet? Well, in a couple of ways.

First, the internet companies, the major social media companies, things like Twitter and Facebook and YouTube and Google are private entities. They do not have to comply with the First Amendment. They can decide to include what they want or exclude what they want. When some of the social media companies excluded Donald Trump, he sued and said, “This violates the First Amendment.” As a matter of constitutional law that’s nonsense, because these social media companies are private. They’re not the government, they don’t have to comply with the First Amendment. Key principle, the First Amendment limits what government can do, not private entities.

But there’s a second way in which this is relevant, too. The First Amendment protects private entities from being regulated by the government. The government can’t regulate newspapers and what they publish, that would run afoul of freedom of speech and freedom of the press. Well, the First Amendment also limits the ability of the government to regulate social media companies, even if it wants to. Finally, what I would say, is the assumption of the First Amendment is that generally more speech is better and if false things are said, the best response is true things.

What we’ve talked about so far today is all of the problems with that, but I’m not sure what’s better than it. The alternative to allowing the marketplace of ideas to work is to give the government the power to decide what’s true and false and censor what’s false. I am much more afraid of that than I am of allowing all the ideas to be expressed, even in light of the problems we’ve discussed.

Henry Brady: Thanks, Erwin, a great synopsis of the issues. Susan Hyde, let’s get a little bit beyond America and talk about what’s happening around the world with various countries. Are these same kinds of factors operating in other countries and how are they affecting those countries with respect to the health of their democracy and the future of their democracies?

Susan D. Hyde: Thanks, Henry. It’s a real honor to be here and so important to continue talking about this, both today and continuing in the future. As Geeta said, it really influences our ability as a society and as humans around the world to address a long list of other challenges that require collective action. I think disinformation is really pernicious in our ability to address a whole host of problems. Of course, this is nothing new. In some sense, propaganda has been around for a long time and has been used very frequently to try to sway elections and voting behavior in other countries, both from within, often by the government in power, but also from without. Folks have already mentioned Russian interference in U.S. elections, but historically, it’s important to acknowledge that the former USSR, now Russia, and the United States have been active in promoting propaganda and election interference in a lot of countries around the world.

This is not new in some sense, but what is new about the disinformation toolkit, and the use of social media for disinformation specifically, is, I think, the scale and the ability to target messages at the individual level, in a manner that can be, but of course isn’t always, harder to document and therefore harder to understand the extent of. That matters for those of us who are interested in thinking about the degree to which these are problems for elections. All elections have some problems. In our typical evaluations, it’s not about one voter not having a chance to vote; it’s usually about the extent of the problem. And if we’re not sure how many people were affected by disinformation, or exactly what forms of disinformation they received, it can be very difficult to have a solid assessment in real time of how an election proceeded.

And this is a problem in a lot of countries around the world. We’re pretty used to thinking about election fraud as a game of cat and mouse, a game of evolving strategy, so lots of efforts to deter election manipulation have been met with counter-efforts. And this is one area in which I think the micro-targeting of disinformation actually makes it even more difficult to come up with those countermeasures. One of the things that I wanted to reference is that this is a phenomenon taking place in many countries around the world. There’s an ongoing study by a group of scholars at Oxford. The 2020 study documented the use of social media by governments and political parties to manipulate governance processes or elections in 81 countries. I don’t have time to delve into how the tools and strategies vary, but I think it’s important to acknowledge that not everything that we’ve seen in the United States is what other countries are seeing around the world.

And why is that? It’s important for us to acknowledge, in part, because we might see some of these other things here soon. But also, in thinking about solutions, I think it’s important to acknowledge that what we’re seeing here is just a small slice. There are also increasingly documented cases in which businesses are using these tools on behalf of political actors. A lot of this is for hire, and I think this is connected to the more general sense that we’re in a period of democratic backsliding in many countries around the world. And it’s clear from the experiences of other countries that there are really dozens, if not hundreds, of ways to combine these new tools of disinformation with the old menu, the classic tools, of election manipulation.

I think it can make things like vote buying easier, for example. It can make voter intimidation easier. And because it makes it easier to target specific individuals, it’s potentially more pernicious and more difficult to document. One other thing I’ll note that’s not disinformation: we are seeing authoritarian governments diffuse their surveillance technology to one another, sharing technologies for the surveillance of their own citizens. This is not exactly disinformation, but I do think it’s connected to this broader conversation about how disinformation is influencing voting and democracy around the world. I’ll stop there.

Henry Brady: Thanks, Susan. Okay. I think we’ve established that there is a problem. There are a set of issues, and it’s a complicated situation. Let me now go on to ask if we’ve got solutions. Let’s start with the ones that we always hope that we have, which are technological fixes. Hany, are there technological fixes for what’s going on? Can the internet police itself with technology?

Hany Farid: Since technology got us into this mess, you’d hope that there are some technological solutions to help us get out of this mess. There are a number of challenges here. Let me enumerate them. First, social media operates on an unimaginably large and global scale. Every day, four petabytes, that’s more than four million gigabytes, of data are uploaded to Facebook. And every minute, more than 500 hours of video are uploaded to YouTube. Mitigating harm at that scale can be an enormous challenge just because of the volume and the borderless nature of this content.

Second, we should acknowledge that while some disinformation is easy to identify: the earth is not flat; the video purporting to show Nancy Pelosi drunk is fake; Hillary Clinton is not, in fact, running a child porn ring out of the basement of a pizza joint in DC. On the other hand, other pieces of information might be harder to classify. For example, theories of the origin of COVID have been in flux over the last year. And so, deciding what is true and what is not can also be tricky. Third, and this was already mentioned, but it’s worth mentioning again, we have seen some less-than-democratic countries and at least one U.S. president stifling criticism by crying fake news in the face of inconvenient facts.

And so we have to tread very lightly here on labeling things as true or false. That’s the bad news in some ways. But on the other hand, recent studies have shown that despite the scale of Facebook, 65% of COVID-related disinformation on Facebook originated from only 12 people, the so-called disinformation dozen. Similarly, in our own studies, we found that by reducing the visibility of about a dozen channels, YouTube was able to significantly decrease the prevalence of conspiracies in its recommendations. In some cases, the problem is not actually that big, and the fix could be as simple as demoting a relatively small number of very, very bad actors responsible for a large amount of disinformation.

Now, we’ve also seen that tweaking the underlying recommendation algorithms that I was talking about earlier can have a big impact on mitigating disinformation. In 2020, Facebook conducted an interesting experiment called “Good for the World, Bad for the World,” in which their users were asked to categorize posts as one or the other. And what Facebook researchers found is that there was a positive correlation between the popularity of a post and its categorization as bad for the world. This is what Geeta was talking about earlier. Then, Facebook trained the recommendation algorithms to make “Bad for the World” posts less visible. They didn’t ban them. They didn’t delete them. They just made them less visible on our newsfeeds.

And the experiment was successful. It reduced content that was “bad for the world,” but you know what else it did? It reduced the amount of time that people spent on Facebook. And so what Facebook said was, essentially, “Nice try, but we’re going to turn this off,” and they now knowingly recommend posts that they know are bad for the world. There are mitigation strategies. Despite the challenges, the scale, the definitional problems, there are mitigation strategies that are fairly well understood and could be implemented. The problem, of course, is that these changes are not necessarily good for corporate profits. And here we run into the tension. I would argue that while the problem of disinformation is complex, the problem with disinformation on social media today is not primarily one of technology, but one of corporate responsibility.

I would also argue that we can mitigate harm without, and this is to Erwin’s point earlier, without necessarily banning specific types of speech or users, but instead we can tweak, as we have already seen, the underlying recommendation algorithms to simply favor civility and trust over hatred, lies, and conspiracies. And, of course, there are some definitional things that we have to get right there. The last thing I’ll say here is we have been waiting for now several decades for the technology sector to find their moral compass. And they have not seemed to be able to do that. They continue to unleash technology that is harmful to individuals, to groups, to societies, and to democracies. And left to their own devices that will continue. We cannot sit back and say, “Well, the technology sector will self-regulate.”

We need to start thinking about modest and thoughtful regulation that will put some pressure points on the technology sector. Erwin was talking about Section 230 of the CDA, which has removed many of the pressure points. You don’t want to add too much, because then the government risks overreaching, but with too little, we have the mess that we have right now. And so, the question is how do you balance those issues? But, again, I want to emphasize that while there are technological challenges, we actually know how to address a significant amount of these issues; we’re just choosing not to.

Henry Brady: Thanks, Hany. Geeta, journalism is in decline. It has problems, especially at the local level, but also otherwise. Is there an argument to be made that the social media companies should be asked to take some of their profits and give them to journalism? Maybe that could be done through a tax on internet exchanges or ads or something like that, and that revenue would be given to journalists so that local journalism could perhaps thrive more?

Geeta Anand: There’s definitely an argument to be made in that regard, Henry, and I think journalism and democracy would benefit from such a tax. There are other ideas, though, for rebuilding trust, because a key problem is the lack of visibility of good journalism, but also the lack of trust in the media. Something I’m really in favor of, and that the journalism school here at Berkeley is investing in, is local news. Unless we build back up local news publications, people are not going to see reporters doing that work. They’re not going to know them. They’re not going to understand what journalism is about. And they’re not going to believe in journalism. We have two news publications, Richmond Confidential and Oakland North, and we’ve been investing more in them in the last couple of years, hiring an editor. And there are other nonprofit efforts to do the same.

Report for America is an incredible effort to put local journalists in publications around this country. ProPublica, which is the best nonprofit investigative organization in the country, is collaborating much more with local reporters to produce local investigations holding local governments accountable. I think universities around the country could and should commit themselves to investing in the local publications around them, as one small step to improving the sustainability of local news and to promoting interest in journalism. But, to your point, Henry, when you asked about this tax, the economics of journalism are broken. More than 50% of advertising revenues have shifted to social media from traditional news organizations.

Many of the best news organizations have shifted to the subscription model and have seen subscriptions rise astronomically, especially in the last few years, as many people have recognized the value and importance of investigative reporting on everything from our president to climate change to Facebook, such as the incredible series in the Wall Street Journal this past week, “The Facebook Files.” The problem is that people aren’t willing to pay for news when there’s so much disinformation available masquerading as news.

For example, I was talking to a friend of mine, a colleague actually here at Berkeley Journalism, who is Mexican and was in Mexico recently, where all this anti-vax propaganda was being spread. Her friends were believing it. And there was one New York Times story countering exactly the misinformation, or disinformation, that was circulating. She shared it with all of her friends, but none of them had New York Times subscriptions. They weren’t actually able to open the story that countered the disinformation.

Clearly, we need a solution to fix the journalism ecosystem. The tax idea, I think, is a brilliant one, but the solution to the disinformation problem will need to be multifaceted. We’ll need to convene experts from all different disciplines, as are here on this call. I think Berkeley can do that. We’re well situated: We have the most incredible legal minds, technology experts, public policy and government experts, belonging experts and journalism experts right here on our campus.

And I’m hoping, and together with you I’ve been taking some steps, to convene people to find a solution to this problem, one in which there cannot be any area that we refuse to consider or reconsider. We have to think outside the box. And we have to include the industry in helping us understand where the solutions lie, in a way that doesn’t make them feel defensive, which they are. Somehow we have to bring people together and address this in a legislative way immediately.

Henry Brady: Great. Thank you, Geeta. It’s easy to believe fantastic things about other people when you’re othering them, when they’re not your next door neighbor or your family. john powell is an expert in thinking about othering. How can we, as a nation, go beyond trying to just fix the internet or fix journalism to actually fix the problem we have, maybe that’s at the root of a lot of this, which is that we other one another and we have done so throughout much of our history?

john powell: It’s an important question. That’s a great question. And you’re right, Henry. This is a problem that’s hyper-charged by technology, but it’s not simply a technological problem. For example, if you look at voting trends, part of what they show is that racial and ethnic segregation actually increases extreme voting. When people only hang out with people like themselves in homogeneous groups, the groups are more likely to be extreme. And America has never dealt with that. In fact, we came out with a recent study showing that the country is actually moving toward greater segregation, not just racial and economic, but also ideological. If you’re conservative, you’re more likely to live with people who are just conservative. If you’re liberal, you’re more likely to live with people who are just liberal.

And to your point, Henry, once you other people … We have a lot of data showing people actually don’t understand the other side. They exaggerate the other side’s views. Some of the polarization is more perception than reality; we actually are closer together. Also, the people who drive politics are a relatively small group. There’s a large section of people who have just tuned out. They don’t want to be in a [inaudible] fight. They actually want something different, but they don’t know how to get there. I think these are huge problems. And fear moves faster than positive emotion, so if you’re trying to create fear, you have a huge advantage. If you’re trying to increase hate, you have a huge advantage already on your side. It’s much more difficult to create these other mechanisms. And I agree with the other speakers, it has to be deliberate.

And I don’t think it will fix itself. I don’t think the technology will fix itself. I believe the container actually has a crack in it and it might break altogether. It’s not clear to me that democracies will survive this unless we do something very deliberate and very robust. And I agree, we have to put a lot of things on the table. We’ve just watched this for years. Even the study that was cited earlier, in terms of Facebook, amounts to saying, “Yeah, we could fix the problem, but at what cost to us?” The question is, if you don’t fix the problem, what’s the cost for democracy? That’s not the question they’re asking. They’re asking what the cost is to their shareholders. I think there should be a major effort, if not by Berkeley, by multiple universities and others. We really are facing what someone has called an existential, ontological challenge.

I started writing about authoritarianism in 2004. And I didn’t anticipate how broad it would become in 16 or 17 years. It was like this [inaudible] industry, a few of us writing about it, thinking about it. Now, I say democracy is in retreat and on its heels. And the last thing I’ll say is that it’s not just factual. We talk about people getting factual information, but people actually join groups to belong. There’s a lot of work on this around social media. People are not just going to social media to get the facts. They’re going to feel like they belong. They’re going for community. And once you’re in that community, in a sense that community polices you. And so part of the thing is that the sense of belonging in America is in steep decline.

Most Americans feel very isolated. They don’t feel connected to the community, to the nation, to institutions, and then they become prey to these really extreme groups where you can at least belong. So, again, it’s not just a cognitive thing, a question of how we get people to be smarter at processing facts. One last thing I’ll mention: There was someone being interviewed about Donald Trump, and the interviewer was pointing out, “Well, this is a lie. He just lied about this.” And the person basically said, “Of course he lies. I know that. I’m not stupid. But he creates a community. He cares about us. That’s what’s important.” I think we have to be much more sophisticated than assuming that this is just a question of truth versus fiction.

Henry Brady: Actually, I’m going to follow up with john. He has to leave at one o’clock and I want to get a bit more of his wisdom. A recent report from the American Academy of Arts and Sciences, called “Our Common Purpose,” made a bunch of recommendations for how we could maybe minimize or even eliminate othering. They suggested things like universal public service, where everybody would have a year of public service. This would not necessarily be the military; it could be AmeriCorps or all sorts of community activities. The idea would be to mix and mingle people so they would get to know one another. The other thing they recommended is a “telling our nation’s story” initiative that would fund efforts around the country to bring people together from diverse perspectives to tell their stories as part of the nation’s story. Are those the kinds of things we should be supporting and thinking about to reduce the amount of othering that occurs in our society? Are there other ways that we could do it?

john powell: That’s certainly part of it. People, especially at Berkeley, hate for me to give this example, but one of the most successful examples of addressing othering is the military. Part of it is that you bring people from diverse backgrounds together. Think about it. Not to exaggerate, but you bring a young Black man, a young Latino man and a young white man, often from the South, put them together and give them a gun. That sounds like a tragedy, like an accident about to happen. But instead what happens, and the military has worked on this, is you get good lifelong friendships. And part of it is what you’re saying, Henry: people doing something together with a common mission, getting to know each other and relying on each other. We know a lot about contact theory and telling better stories, so yes, I think something like that, because a democracy depends on being able to take another’s perspective, being able to see another person.

Othering is a caricature. When we other people, we flatten them. They become single dimension, just this, they’re just Black, they’re just gay, they’re just this. All of us have multiple dimensions. I’m rushing through things. It’s not impossible to hate up close, but it is harder. It is harder, especially if we do it right. So yes, telling each other stories, telling different stories, telling stories about a larger “we” and about a new future, but also having some common purpose. Even things like football games, I mean when people come together. There’s a whole bunch of literature showing how important Jackie Robinson was in terms of breaking the color line, not just in baseball, but in society. When you had whites who otherwise didn’t know any Blacks, that did not like Blacks, cheering him on, that made a difference. We’re not using a lot of information we know, and we need new information, as well.

Henry Brady: I grew up with Jackie Robinson and the Brooklyn Dodgers as heroes and vividly remember 1955, when the Dodgers finally won the World Series.

Erwin, the legal framework here is complicated. Let me use a vivid analogy, I know it’s a bit unfair, but I’m going to use it anyway. If we had a water company that didn’t check the quality of the water and people were pouring poison into the water system and that was affecting people’s health, would we be happy with that? Would we allow the legal framework to continue to do that?

Erwin Chemerinsky: No, but the analogy isn’t an apt one. To start with, the water company isn’t protected by the First Amendment. Also, there’s no harm in forcing the water company to monitor the quality of the water; there’s only good to come of it. But I think there’s great harm in creating liability for media companies for false and damaging information. A newspaper exercises editorial judgment over what’s in the newspaper; the water company exercises control over what’s in the water. The whole idea of Section 230 is that internet and social media companies should be platforms where any speech can be expressed. That’s why there’s no analogy to your water example or to newspapers.

It was mentioned earlier that there are 4.75 billion pieces of information posted on Facebook each day. If Facebook could be held liable for anything there that might be criminal or tortious, Facebook would have to monitor all of that. Undoubtedly, Facebook would err on the side of taking things down rather than face liability. We wouldn’t lose just the harmful water; we would lose so much of the good water as well.

For a while, I believed it would be possible to say, “Well, we’ll only create liability for media companies and social media companies if they have knowledge that there’s this harmful material and they don’t take it down.” But then I looked at an analogous law, the Digital Millennium Copyright Act, which creates an obligation to take down things when there’s an allegation of copyright infringement. I learned that the take-down provisions led to tremendous over-censorship and loss of information that we would want to have [inaudible] speech. Henry, everyone wants to criticize Section 230. The right criticizes it; President Trump threatened to veto a defense appropriations bill unless Congress changed Section 230. The left criticizes it, too. Having studied this carefully, I don’t see a better alternative. I worry that if we were to repeal or change Section 230, the result would be much worse.

In fact, you were talking about the American Academy. They said that the debate about political speech online, quote, “will require solutions outside the scope of reforming and repealing Section 230.” I think what we should be doing is putting social pressure on the media companies themselves to do a better job of excluding unconstitutional, harmful, illegal, tortious, hateful speech. They can do that because they’re not the government; they can regulate speech as they choose, and we should put pressure on them to change their algorithms. But I think it’s much better that it come from the social media companies, and pressure on them, than through government regulation. I’m convinced that any effort to significantly modify Section 230 will be much more harmful than beneficial.

Henry Brady: We’re going to get Susan Hyde in a minute, but I want somebody to maybe reply to Erwin and talk about his position, because he’s got a very absolutist First Amendment position. Let’s start with Geeta, and maybe Hany wants to say something, and then we’ll go to Susan.

Geeta Anand: I think of the burgeoning social media as just a whole new infrastructure in this world. Over the centuries, whenever a whole new system of infrastructure has been invented, a new rail system or a new air traffic system, we have regulated it. I’m just not convinced that we can count on social media, based on its track record, to regulate itself. I hugely support pressuring social media companies to see themselves more as news organizations with a responsibility for accurate information.

But I think that there’s… I’m interested in the suggestion that Tristan Harris put forward in a piece in the Financial Times a year or two ago, in which he suggested that social media be considered like a public utility and held accountable for the public good. That perhaps there’d be licenses, sort of like companies are held accountable for the environmental impact of their work. Perhaps social media organizations should be held accountable and regulated in the same way. Maybe the experience of being brought to hearings and having to answer questions about their impact on the public good would put the same kind of pressure, that Erwin is talking about, on them to regulate themselves in additional ways. Over to you, Hany.

Hany Farid: Thank you. First of all, let me say, I don’t like arguing with Erwin, who is arguably one of the finest legal minds in the country. I’m not entirely fond of being in this position, but I’m going to argue with him nevertheless. So, a couple of things. One is, Erwin is absolutely right that the DMCA has been misused. We should acknowledge that it is an imperfect piece of legislation. But you can’t point to the misuse of the law and not point to where it has been effective, in, for example, creating the Apple iTunes Stores and the Amazon Primes and the Netflixes and the Hulus. Unlike 15 or 20 years ago, when we were downloading movies and music all the time, we now have a very robust online ecosystem where creators of movies and music and books are in fact paid. Has there been abuse? Sure. But to point to that abuse and say the DMCA is on the whole negative, I think, is incorrect.

Now, Erwin’s suggestion is, let’s put social pressure on the social media companies. We’ve been doing that for 10 years. It’s hard to think of a week that goes by without some scathing article about Facebook, Twitter, YouTube, TikTok, Google, Amazon or Apple, and that has been going on for years now. Last year, we tried an ad boycott against Facebook. It was one of the largest ad boycotts in history, with hundreds and hundreds of companies, and it fizzled out with absolutely no effect. We slapped a $6 billion fine onto Facebook and they shrugged it off the next day, with their stock price going up. When you have these massive trillion-dollar monopolistic companies, there is no social pressure. Understand, this isn’t like the airline industry or the automotive industry, where I can go down the street and buy a different brand because I don’t like the practices of this company.

We don’t pay these companies. We’re not the customer, we’re the product. That’s a very different relationship with the corporate entity when it comes to putting pressure. By the way, I will point out that as we’re talking about the abuses of these companies, we are streaming this video on YouTube and Facebook. Why are we choosing to do that? We’re just ceding the middle ground to them. Why, for the love of God… Sorry Facebook, why are we streaming this on Facebook?

To Erwin’s point, that’s the problem: We all do this. I can’t tell you how many times I’ve talked to a reporter who writes scathing articles about social media. The brilliant Wall Street Journal “Facebook Files” from last week: You know what it says at the bottom of the article? “Follow us on Facebook, follow us on Twitter, share on Facebook, share on Twitter.” It’s easy to say put social pressure on them, but that has not been working.

The last thing I’ll say on this is, I don’t think any reasonable person says we should repeal Section 230, but I think there are reasonable proposals for modifying it. The one I like the most is from Eshoo and Malinowski, which says: We are not, in fact, going to hold you liable for every single piece of content that gets uploaded to your service. But if your algorithms reach into that sea of data, pluck out pieces of content, slap an ad on them and monetize them, you should have some responsibility for that.

Because now you sound a lot like a publisher to me. I think that’s a reasonable, modest proposal. Having said that, Erwin is right that we should tread lightly. But I don’t think we can sit back anymore and just wait, because what we have seen is horrific harm from online platforms, from child sexual abuse to terrorism and extremism, illegal drugs, illegal weapons, the sex trade and disinformation that, as john was saying, poses an existential threat to our democracy. Waiting around for Mark Zuckerberg to get it together, I just don’t think, is working.

Henry Brady: So, john, do you have anything to add? If not, because I know you have to leave in a minute, if not we’re going to go to Susan who’s going to tell us how other countries have dealt with these problems and what the solutions are.

john powell: Well, I’m appreciating the conversation. Just to add: I think we can’t wait, and at the same time, these steps are dangerous. It’s like with COVID and the anti-vaxxers, and I’ve talked to a number of them. It’s like, “There are dangers in the vaccine.” And yes, there may be some dangers in the vaccine, but there’s also a danger in the virus. That’s a legitimate conversation. But just to point to one danger on this: I’m agnostic, I don’t know what to do, but I know we need to do something. Simply saying that government involvement in regulation is a greater danger, I’m not convinced that’s accurate; maybe it’s an empirical question. I do know, as Hany suggested, that by just doing nothing, we’re sliding into the demise of our democracy.

I’d like to see some people seriously grapple with this. Then the last thing is, as I said earlier, how do we actually deal with this beyond just the United States? That fear of the other, the fear of people moving around, is a global problem, and a lot of countries don’t have the same kind of loyalty to the First Amendment. We are also talking about monopolies. Years ago, we said monopolies were bad, but now we’ve essentially accepted that these monopolies can do whatever they want and that we’re dependent on them. The terrain has shifted, so I think we have to shift the way we think about it. I’m going to put myself on mute and then stay on as long as I can, but there’s going to be some background noise.

Henry Brady: Thanks, john. So let’s go to Susan and what’s happening in other countries and what lessons can we learn from them?

Susan D. Hyde: One of the questions I was thinking about in advance is what we can do to rebuild from this moment we’re in. I’m really intrigued by this conversation about regulating social media and thinking about what can be done proactively, because I do think this is a problem that makes it more difficult for us to do almost anything else. I wanted to say that the most hopeful thing I’ve heard about the moment we’re in was a tweet from former UC Berkeley Ph.D. Anne Meng, who’s now at the University of Virginia. She said, in a very offhand manner that I just found mind-blowing, “For all of our worst case scenarios about where we’re headed as a country, this could be the moment in which the U.S. finally democratizes, not just on average, but for everyone, including those people whose participation has long been deliberately excluded.” Making participation in political life and real political representation available and accessible to all Americans is threatening to some people. That is where some of our tumult is coming from.

I think it’s important for us to think about that and to really counter it head-on. I’m talking about this in reference to the U.S., not just other countries around the world, in part because my research on election violence, election fraud and democratic backsliding is suddenly relevant to the United States. I’ll say that in other divided countries, for example, those experiencing the immediate aftermath of a civil war in which neighbors were literally killing each other, not decades ago but in the last year, there are really high levels of distrust as well.

It can be very hard to find domestic political actors who are viewed as neutral across the political spectrum and who can cut through this hyper-division we’re experiencing. This is particularly acute around elections, I think, because it has to do with whether people accept the outcome of those elections: whether they’re willing to protest elections that are in fact stolen, but also to accept the results of elections that were democratic. Having a resource available that can offer an opinion on whether an election was in fact problematic is a need a lot of countries have encountered. This is where international election observation basically came from: Countries with super high levels of domestic distrust couldn’t find a resource at home that was trusted across the political spectrum to make this critical judgment about whether or not elections were stolen, which can be very difficult for an average citizen to discern.

When we look toward solutions, I think there are a number of things we should talk about that are very common in other countries and that might move the needle a little bit in the United States. First of all, I keep saying this, and I’m going to say it in this forum, too: Election administration should be nonpartisan. We are the only country in the world that is even vaguely democratic and has partisan election administration. We need to change that; I just don’t think it’s sustainable in the long term, and it’s a real problem today. I can think of a number of other similar proposals that are worth a shot, including some electoral reforms, among other things.

I think it’s important to also say something about what’s going on right now with the Republican Party. Henry mentioned some of this at the beginning. It has really been taken over by a set of anti-democratic forces, many of whom also support a white nationalist agenda. They’re actively using disinformation, or taking advantage of it, for their own political advantage. I personally long for a Republican Party that returns to its roots as a driver of policy, that puts forward serious policy ideas and debates those ideas with other political parties. It’s very hard to imagine a good outcome from the game they are playing right now, which is a rejection of our country’s political institutions, a questioning of the way we’ve been doing things for so long and an outright, deliberate manipulation of their supporters through fear rather than through persuasion and better policies.

I think this is a very dangerous game, and it’s hard for me to see a way out of the place we are in as a country without confronting this problem. I don’t think it is a partisan issue; it is a democracy issue, and one that is particularly afflicting one of our two political parties. In order to have a functioning democracy, we need two political parties that play by the rules of the democratic game and that are not doing what many members of the Republican Party leadership are doing right now. I think that’s a problem. I’ll say one other idea that’s concrete. It’s small, and I’m going to say it really quickly because I think it’s interesting. I’ve been fretting about what to do just for my own self, what I can do to help support democracy in this country. I think the conversation about social media is really interesting.

But there’s another thing that a lot of other countries have experience with, something that, again, came out of these periods of distrust and hyper-partisanship. That is a grand coalition of organizations, leaders and civic groups that participate as nonpartisan domestic election observers. It’s a very small thing, but I think it’s very possible that people need that additional experience. I’m curious whether it would make people more likely to put democracy above their own narrower political interests in this country. It has worked in a lot of other countries. It also could potentially provide an opportunity for pro-democracy Republican actors, as well as civic, business, religious and other groups, to unify around defending U.S. democracy.

We can imagine campuses getting involved with this kind of thing, too. That’s very common in other countries. It’s a little bit of a lark, it’s an idea I’m throwing out there. I know some people are working on it, but I think things like this, initiatives like this that are participatory for lots of people and that involve pro-democracy political participation, I think could move the needle a little bit in the right direction. So I wanted to end on a hopeful note.

Henry Brady: Thanks, Susan. I’m going to go back to Erwin, but I want to first propose a bunch of things. These social media companies are enormously rich. They have tremendous amounts of money, and they’re monopolies. Can’t we require more of them? Maybe not that they monitor every single transaction that occurs through their pipes, but maybe we can say they have to support public service forums at a very large scale and really support local journalism, for example. Perhaps they could support efforts to hold deliberative polls in local areas that would bring together a random sample of local people, who would get together and discuss politics, with that discussion put on the platform, and so on and so forth.

Can’t we find some way to make sure that they’re thinking about their social responsibilities in a bigger way? Another approach might be to rate them on social responsibility and to make that rating public, so that everybody knows about it and there’s pressure on them to make sure they are socially responsible. Can’t we do some of those kinds of things?

Erwin Chemerinsky: Yes, unquestionably so. I don’t deny the threat that exists to democracy right now. I think to some extent, though, blaming social media and the internet is blaming the messenger. The problems in our society that are leading to the threat to democracy are much greater than, and not caused by, the internet and social media, though the internet and social media contribute to them. And, as you were just saying, they can be part of the solution.

And contrary to what you said about me, I’m not an absolutist when it comes to the First Amendment. I believe that there is speech that’s unprotected by the First Amendment, child pornography, incitement, true threats and other things. I also believe that there is a benefit in our society of having platforms for speech that anyone can participate in and anyone can use to reach a mass audience. Up until the development of the internet, one of the main problems with regard to speech, was the scarcity of media and how little most people had access to be able to get their message across. We’re now in the golden age of free speech and I want to be sure we don’t lose that.

Now, in terms of creating liability, something that was raised earlier: If you create liability for social media companies for anything that’s posted there, they will have to monitor it all and they will greatly over-censor. There was the proposal that some have offered up: “Well, if they have notice that it’s harmful speech, force them to take it down.” It was in that context that I referred to the Digital Millennium Copyright Act. We can talk about that in detail, but I think the take-down provisions have been much more harmful than good.

In fact, there have been other efforts to regulate speech on the internet and social media, and I think they’ve been counterproductive. There was a law adopted a few years ago called the Allow States and Victims to Fight Online Sex Trafficking Act, FOSTA, that was meant to keep sites like Backpage from advertising for sex. What we found is that it hasn’t decreased trafficking, and in terms of protecting sex workers, it’s been very harmful. It hasn’t achieved what it intended; it did greater harm. So, my point isn’t an absolutist one against regulation. My point is that we’ve got to be very careful that what we do doesn’t end up being much worse than what we have now.

Now, with regard to what you said a moment ago, Henry, I think there are things that can be done. You mentioned three. Tax the social media companies to help local media: I think that would be constitutional, and I think it would be appropriate. Martha Minow, the former dean of Harvard Law School, has proposed something like that in a new book. Having some entity that rates social media platforms: I think that would be constitutional, too. That’s just more speech. But forcing the social media companies to hold events and then publicize them, I think, unquestionably would violate the First Amendment, because the government regulating these media platforms raises First Amendment questions. If nothing else, what this discussion shows is that this is enormously complicated. But in the end, I’m so distrustful of government regulation that I’m willing to accept the harms of unregulated speech for its benefits.

Henry Brady: And I think this is the nexus of the problem: On the one hand, we want to make sure that we continue to have free speech, but on the other hand, speech has gotten a bit out of hand. The question is, what can we do in those circumstances? Hany, you’ve been shaking your head in various ways. Tell us what you think about these issues.

Hany Farid: A couple of thoughts. Again, I don’t like to argue with Erwin, but a couple of things about SESTA-FOSTA first of all, which was designed to protect children online. Let’s first acknowledge that companies like Backpage were hiding behind Section 230 of the Communications Decency Act while knowingly trafficking in young children, and they got protection. That is insane.

And so, SESTA-FOSTA was a response to this absolutely horrific misuse of that law. Now, if you look at the impact of SESTA-FOSTA, Erwin’s right, it didn’t actually reduce sex trafficking. But the reason is not the law itself; the reason is the global nature of the internet, because this law only impacted U.S.-based companies. What happened is that everybody just migrated to other platforms. So it wasn’t so much that the law wasn’t effective; it’s that we have a very leaky border in the digital world.

And by the way, the claim that SESTA-FOSTA was going to lead to more violence against women has not been borne out. There's a recent large-scale study from Carnegie Mellon University showing that, in fact, that has stayed steady. What actually happened is that right after the law was passed, there was a decrease in sex trafficking, as the U.S.-based platforms were no longer able to carry the ads, but then there was a rebound over time as everything migrated offshore. So was the law effective? Was it not effective? Well, it depends on how you actually count.

Now, to Erwin’s point two about, “I would rather have everybody participating because once you start taking down speech, we run into this problem.” But here’s the issue is we have a problem on say, Twitter, where women on a daily basis are subjected to horrible abuse, people of color, people from the LGBTQ community, immigrants.

And so what happens to their voice? What happens to their voice when the most vitriolic, hateful, spiteful, angry — and not necessarily illegal — content shuts out other voices? So saying everybody should have a voice, I think, is a little naive about the way the internet works. Because if you are a person from an underrepresented group, you are going to get off Twitter, and you're going to get off Facebook, and you're going to get off YouTube, because on a daily basis there are horrors you're being subjected to, and frankly, you just decide, "Look, I don't want it." And then the bullies win.

So, I think this is where Erwin and I disagree. I don't actually trust the government either, but I certainly don't trust private companies who have one mandate and one mandate only, which is to maximize shareholder returns. And what we have seen in every industry, online or offline, is that left to their own devices, these companies will do exactly what they are mandated to do, which is to maximize shareholder profit. And when you have a monopoly in this space, rate the social media companies all you want — what are you going to do, go to MySpace? I mean, where are you going to go if someone rates Facebook a zero on a scale of zero to 100? Where are people going to go? We're still going to be streaming this video on Facebook.

So, maybe Erwin's right that the government can overreach, but not having the government step in and apply some pressure points doesn't seem to be working either. And I saw Susan's hand up.

Henry Brady: Yeah. Susan, Hany has made it clear that there are global dimensions to this — maybe you can discuss some of that.

Susan D. Hyde: I wanted to just say something else, which is that I'm thinking about the change in the rules of the game, right? Not just playing the game. In the United States, we have had for a long time a constitution that we uphold in order to live in the system we currently live in. The concern that I have is that the problem we're facing is one that's going to send us into a system of government — not to be too alarmist — that is authoritarian and does not allow for any kind of free speech.

And so, at the extreme, what we're talking about is continuing to sort of dance the tango on a sinking ship. It's just not working for democracy. What we're seeing right now is a fundamental threat to democracy, and the Constitution is not going to matter, on some level, if we get to this really extreme, worst-case scenario. I'm not there yet, but I do think about the worst-case scenario, right? And I think we have to confront that the Constitution is not going to matter under that circumstance.

Henry Brady: So, I want to go around. Geeta, I just want to ask you about journalism, and then we'll get back to Hany and maybe Erwin. What do you think journalism could positively do to solve some of these problems? Is there a way, through better journalism and methods like that, to give voice to some of the people who Hany worries will be driven off Facebook and Twitter?

Geeta Anand: Journalism itself needs to do a better job of giving voice to underrepresented groups. Leadership in journalism is disproportionately made up of privileged social classes. That has to change. As a journalism school, we're committed to trying to change that — to take the lid off who gets to become a journalist in this country — because who the storytellers are matters; we all see the world through the prism of our own lived experiences. But all of that said: If we change who the storytellers are, if we produce the highest-impact investigative stories, if we tell stories from the points of view of Indigenous people and underrepresented people, and no one is reading those stories, because they are buried on the social media platforms where everyone goes to get their news in an increasingly polarized world, then journalism itself becomes irrelevant.

We can be producing the best work, but if no one has access to it, because the monopolistic infrastructure we've created has algorithms that bury journalism, then our work is meaningless. And as Susan and many of us have said, this is a huge threat to democracy, because we're creating an un-level playing field where disinformation and angry, fearful fear-mongering proliferate, and the best stories, however excellent and equitable they are, are just hidden.

Henry Brady: Hany, is there any way to rate sources so that people could know which ones are reliable? I know there are fact-checking entities out there — I don't know that they get much play with most people — but is there some way we could direct people to better places?

Hany Farid: Sure, absolutely there are. And there are many very good, serious journalists and fact-checking organizations that will fact-check posts and fact-check sites. But here's the problem, and it gets back to something Susan said that I think is incredibly important: What we have seen unfolding over the last few years is not just that people believe the Earth is flat, or that Hillary Clinton is running a child porn ring out of a pizza joint in D.C. It's that, as Geeta was saying, they also don't trust the media. You know who else they don't trust? They don't trust the government. They don't trust institutions. They don't trust experts. They don't trust you, Henry, and they don't trust me, because we're a bunch of liberal-loving academics. And then you get into that world, where we don't trust institutions, we don't trust governments.

We don’t trust experts. We don’t trust scientists, fact checked all you want. It’s not going to matter because people know what they know. And they listened to who they need to listen to. And back to Susan’s point is, if we get into this world, which we’ve already, are dangerously into, where we don’t trust our government, we don’t trust the media. We don’t trust the experts. How do we address social change? How do we address climate change? How do we have democracy? How do we deal with a global pandemic? And so that’s the fear I have is that we’ve completely eroded trust. And so, is it too late? Can we return from this? Are we ever going to get to a place where people are going to trust the fact checkers? I don’t know. I’m fearful that we may be getting close to that tipping point of no return.

Henry Brady: Well, in my research, what I've shown is that over the last 50 years, there's been an extraordinary diminution in trust for institutions, and a polarization of trust. As I mentioned at the outset, it's something that has me equally worried. It used to be that most people from both parties trusted major American institutions and didn't have particularly different opinions about those institutions. Now it's highly polarized, and that makes it very hard for those institutions to operate — for example, in the midst of the COVID pandemic, when people don't trust science, don't trust medicine, don't trust all sorts of institutions. So, Susan, are there things other countries are doing that might help solve some of these problems, or is there really very little experimentation that looks useful?

Susan D. Hyde: I tried to talk about a couple of those things in my last round of remarks, but I'm going to re-emphasize one of the problems and pose a question — Henry, this is really a question for you, but I'm happy if others want to answer it. What we know is that partisans follow party leaders. And so part of the problem we're in right now is not that citizens are pulling the Republican Party really far into this arena of distrust — distrust of the media, distrust of government, all of these things. We can go back to Reagan; we can talk about where this came from — "I'm from the government and I'm here to help" as a terrifying phrase. I would like to talk about whether there's anything that can be done to increase the pro-democracy nature of the Republican Party. That could be through business interests.

I think there’s a lot of evidence to suggest from other countries that businesses prefer to live in democracies, right? They prefer to do business in democracies. They prefer to have their headquarters in democracies. It is authoritarian regime tend to be a lot more corrupt and you just have to pay more bribes. You have to, you have to deal with less stringent regulations.

So, I could go on about this forever, but I wanted to pose the question: Is this a locus where we should be focusing attention for reform? And is there any reason to be hopeful about that? Because from my view, comparatively, you just can't fix this problem when you have a major political party in a two-party system leading its supporters in this super extreme direction — and the party leaders themselves seem to be afraid of their most extreme supporters. I'm not sure that they're all true believers. There may not be that many people who are real true followers of QAnon, but man, it does seem like congressional leaders are afraid of them right now.

Henry Brady: Yeah. I think it was interesting that during the last election, we found some companies that actually came down on the side of democracy, and of course, people on the right were highly critical of that — shocked, in fact, to find that traditional supporters of the Republican Party were suddenly being critical of the party and of what happened during the November election. And by the way, that brings me to Fox News — a question from the audience asks what we can do about Fox News. I'd be interested in anyone who has ideas about that. Or are they just a fact of life that we have to live with because of the marketplace of ideas, Erwin?

Erwin Chemerinsky: Yeah. As you said at the beginning, 30% of Americans believe that Donald Trump won the election. Fifteen percent of Americans believe that what QAnon says is true. What do we do about that? That's the same question you asked about Fox News. Do we want someone to have the power in our society to say that those things are false, and to exclude speech that we believe is false? Who would we want to give that power to? I'd be very afraid to say somebody has the power to declare what's true and false about the election, or QAnon, or on Fox News. Because if you give that power to us today, tomorrow we're not going to be in authority, and they're going to decide that what we believe is false and censor us. So, if you don't like Fox News, don't watch Fox News. But I don't think the solution can be censorship or liability for the speech we don't like.

Henry Brady: We’ll get to Hany in a minute. Let me just say one of the things that stuns me is the Tucker Carlson, however, is used in court cases, the defense that nobody believes pretty much what he says and therefore, how can you possibly criticize him for saying things that are untrue? Because actually he’s just a storyteller. So there with our liable loss Erwin?

Erwin Chemerinsky: This has come up in the context of Sidney Powell and the false things she said about the election machines. There is liability for defamation — false statements that injure reputation — and there should be. And I think it's a silly defense for Sidney Powell to say, "Oh, no one believed me anyway." That's never been a defense to defamation. So, I'm not an absolutist: I do think we can have liability for defamation, among other things. But I'm very afraid of giving anyone the power to decide what's politically true and what's false in our society, and to censor what we don't like.

Hany Farid: I’d like to make the point that I don’t think that’s what I’m saying Erwin, I’m not saying that the government should decide what is true and false. And I don’t think any reasonable person would say that. What I’m saying is that social media don’t take two ideas and put them on an equal platform. They don’t take Trump won the election, Trump didn’t win the election and give them fair ground. They are biased. They give the most outrageous conspiratorial and sensational content more airtime than less airtime.

And I’m simply saying, I don’t want to hold the companies or the government responsible for what’s true or not, but I want a fair marketplace. And I don’t think it’s too much to ask for a fair marketplace by all means, let all the ideas be out there, but let’s let it be fair. Let’s not favor the most outrageous and salacious. And because they’re doing exactly what you are afraid of, but in the opposite direction. And I’m just trying to level the playing field, as opposed to saying what is true and what is not true, which I don’t think we want to get in the business of doing,

Henry Brady: But how do we get there, Hany? I think many of us wish that the algorithms didn't focus on the most outrageous communications — because that gets the adrenaline pumping and gets us all very excited, and we want to know more. But how do we do that? It's not clear to me that there's an obvious way, short of really getting inside the algorithms and telling the media companies how to do those things.

Hany Farid: I don’t think there is an obvious way to do it, and I think it has to be a multidimensional. So, first of all, we need some competition in Silicon Valley. There’s basically five tech companies right now, and they’re multi-trillion dollar companies and there’s no oxygen in Silicon Valley for better ideas for better business model, for a better moral compass.

And so, we need to really think about how to give some oxygen. And by the way, when Google steps up and says, "No, no, no, we should not be regulating the technology sector or doing anything with antitrust," we should remind the folks at Google that the reason they exist is that the Department of Justice stepped in and told Microsoft to knock it off and give room to this little upstart. So there is room for the government to step in and say, "We need more oxygen."

I disagree with Erwin here. I think we need modest, sensible regulation to hold the companies somewhat responsible for how their algorithms are selecting content — not for what is being selected, but for how it's selected. We obviously need more education, and we obviously need technology to do better on the algorithmic side without introducing a whole other set of biases.

For example, we don’t want algorithms that are biased content generated by women or content gendered by people of color. So, we have to be thoughtful about that. And I don’t think any one of these things is going to get us, but I think we have to pull on all of these strings equally and find a more civilized online platform. And look, don’t get me wrong, I’m a technologist. I’m a computer scientist by training. I believe in the power of technology. I really do, but this is not the internet I was promised 20 years ago.

Henry Brady: Thank you, Hany. Susan next and then I want to end with Geeta, who’s going to tell us what she thinks the future of journalism is.

Susan D. Hyde: I've made this point before, but I want to say it very bluntly: I think it will be a real shame if democracy dies on the altar of free speech. You don't get to have free speech in authoritarian regimes. It's just not how it works; we don't see that anywhere. And so, I just want to emphasize that we have to get out of this. It's been a long time since I studied constitutional law, but I'm pretty sure that you can't falsely yell "fire" in a crowded theater. I believe that still holds, and there just has to be a way for us to do something about this.

The other thing I wanted to add, for those who are interested in this: There's a wonderful, older book by Robert Dahl called After the Revolution?, in which he talks about the power of expertise versus the power of democracy. It is wonderful, and you should read it. I think it has some really interesting insights about when expertise is necessary for a democratic system to continue to function, and I think that's some of what we're talking about here. Geeta, I feel like that's journalists, in many cases.

Henry Brady: And Geeta, I hope you can say some encouraging words that the future is possible for journalism.

Geeta Anand: I think, as has become clear in this conversation, the solution has to be multifaceted. It's not getting rid of free speech, it's not just government regulation, it's not just focusing on social media — because, as Erwin rightly pointed out, what about Fox? We need to get beyond this stage in human evolution where disinformation proliferates, where it's given an unfair advantage, where we have a deeply polarized public here in the U.S. and around the world, and where authoritarian regimes can use the unfair advantage that social media gives that kind of speech to benefit themselves. All of this is putting democracy in great danger, and putting journalism — the oxygen of democracy — in great danger, and it matters. I'm gravely worried, but I'm also hopeful, because journalism is so vital to democracy and because, as some of you have noted, powerful business interests also want to operate in a democracy.

I’m hopeful that a coalition of the good and of people deeply committed to democracy will emerge and is emerging right now on this campus and elsewhere to stand up and invest in journalism as a vital tool for democracy and in democracy itself, and in being bold, taking risks, thinking out of the box. I’m not holding on to any one principle that worked in the past as being the ultimate one that we all need to serve for the greater good of democracy in this country.

Henry Brady: Well, thank you, Geeta. And it shows, by the way, what an important role the university is playing in a lot of this. I want to thank an extraordinary panel for a wonderful discussion: Geeta Anand of the Graduate School of Journalism; Erwin Chemerinsky, the dean of the law school — and Erwin, thank you especially for being willing to push a particular point of view and give us something to discuss, which was great; Hany Farid, the head of the School of Information; Susan Hyde, the chair of the Department of Political Science; and john powell, the director of the Othering and Belonging Institute. This has been a fabulous panel. I thank you all.

[Music: “Silver Lanyard” by Blue Dot Sessions]

Outro: You’ve been listening to Berkeley Talks, Berkeley News podcast from the Office of Communications and Public Affairs that features lectures and conversations at UC Berkeley. You can subscribe on Acast, Apple Podcasts or wherever you listen. Also, check out another podcast of ours, Berkeley Voices, about the people who make UC Berkeley the creative, quirky, world-changing place that it is. You can find all of our podcast episodes with transcripts and photos on Berkeley News at news.berkeley.edu/podcasts.