Hany Farid: To limit disinformation, we must regulate internet platforms

"Thoughtful, sensible" rules could make media companies legally liable for damage caused by disinformation, the scholar says.

By Edward Lempinen

UC Berkeley scholar Hany Farid has emerged as one of the world's leading authorities on the spread of online disinformation — and how to bring it under control. Farid is a professor in the Berkeley School of Information and the Department of Electrical Engineering and Computer Sciences.

Brittany Hosea-Small for the UC Berkeley School of Information

The U.S. is enveloped in a storm of digital disinformation so powerful that it’s putting lives, and democracy, at risk. Policymakers and many others blame Facebook, the former Twitter, TikTok and other social media, but that’s too simple, says UC Berkeley computer scientist Hany Farid.

Berkeley News is examining threats to U.S. democracy in a series drawing on the expertise of UC Berkeley scholars.

Yes, Farid says, social media deserve some blame — they’re now the main news source for most Americans. But, he argues, many politicians today lie with impunity, especially on the right. And all of us who rely on partisan news shows and social media for our understanding of the world are responsible, too.

But to change the communications landscape, "we need government oversight," he said. "That's what the government is supposed to do: take my tax dollars and keep me safe. Deliver clean water, clean air, keep the roads safe and make sure nothing kills me."

In remarks that were blunt, irreverent and sometimes ominous, Farid advocated for "thoughtful, sensible regulation" to make media companies legally liable for the very real damage caused by disinformation.

And policymakers must act swiftly, he warned, because new artificial intelligence tools are already making disinformation more powerful than ever.

Farid is a professor in the School of Information and in the Department of Electrical Engineering and Computer Sciences. He is one of the world’s leading experts on digital forensics, human perception and "deepfake" images in politics; he has made presentations at the White House and testified before Congress, the United Nations and policymakers overseas.

The interview has been lightly edited for length and clarity.

Berkeley News: Just a short time ago — 10 or 15 years — smartphones and social media were so exciting and so hopeful. Now disinformation pervades our lives, and I can’t help but think: Something has gone terribly wrong.

Hany Farid: I'm glad you asked the question, because I think it is easy to point your finger at social media and technology and say, "Aha! They’re to blame!" They absolutely have a part to play in the mess. But I don't think it's just social media.

Social media and partisan news media bear responsibility for the rising tide of disinformation, said UC Berkeley scholar Hany Farid. But the increasing prominence of political leaders who "have normalized lying" — especially on the right — also plays a critical role, he said.
Joyce N. Boghosian/White House
First of all, our politicians have normalized lying. I don't want to be political about it, I don't want to be partisan about it, but Donald Trump has normalized lying. The Washington Post counted the number of lies he told when he was president: It's more than 30,000 lies in four years. It's not subtle — it’s outright lies. That's insane.

We now have a member of Congress, George Santos, who made an entire campaign based on a fabricated backstory. Again, this issue is not uniquely on the right, but I do think you're seeing it more on the right. I'm sure there are politicians on the left who are lying, too, and in the middle.

But distortion, propaganda, lying — these aren’t new phenomena.

Some of us are old enough to remember when there were three news stations, and they reported more or less the same news. That wasn’t the best thing, but we were all on the same page. We were all getting the same information.

You can't say that about news today. The vast majority of Americans now get their news, if you can call it that, on social media. And that is highly curated — curated to be consistent with your worldview, because they want you to keep coming back for more.

So our politicians are lying, but they can do that both because of social media technology and also because of the mainstream media. We now have echo chambers — Fox News on one side and MSNBC and CNN on the other side — that have become highly, highly polarized. Social media amplifies and fans that flame in a way that, I think, is new in the last 10 years.

If you're a Republican, you turn on Fox News and you don't hear about Access Hollywood. You never even get it.

Hasn’t government, in the past, tried to regulate media — broadcast media, at least — to support information integrity?

Ronald Reagan, in the 1980s, eliminated the Fairness Doctrine. The Fairness Doctrine told broadcasters, "Look, you can't be partisan." When we eliminated that, it was the beginning of the end. The shock jocks, Alex Jones — that’s the result.

Talking heads are cheap, lies are cheap — and they’re engaging. You outrage people. You make them angry. You keep them coming back for more. And that's the information landscape we’re in.

And here's the thing: There doesn't seem to be any real consequence for lying.

Disinformation from highly partisan news media and social media has left many Americans living in a separate reality, said UC Berkeley computer scientist Hany Farid.
Alex Milan Tracy/Sipa USA via AP
So we no longer have two parties arguing about government intervention, taxes, international affairs based on facts. Now Democrats say over a million people have died from COVID in the United States, and Republicans are saying, "No, they haven't."

How do you have a democracy without a shared factual system? We can disagree on abortion and prayer and taxes, all of these things. But if we don't agree on basic fundamental facts, there is no conversation to be had.

You suggest that people are drawn to disinformation because of confirmation bias — they’re drawn to information that confirms their worldview, even if it’s distorted or wrong. Why is there such a high demand for disinformation?

I'm not a social scientist, but here’s my understanding: We are drawn to being outraged. You’d think we'd be drawn to things that make us happy and content, satisfied and calm and peaceful. But no, we actually want to be outraged.

That's what social media has tapped into. It's like the worst of human nature.

I don't think, by the way, that the social media companies set out to outrage people so that they could deliver more ads. The algorithms just figured it out. If you tell the algorithms, "Optimize for user engagement because this will help us optimize profit," the algorithms are like, "Alright, whatever it takes."

If it takes delivering images of dead babies, that's what it's going to do. It doesn't have morality.

What you’re describing — that really happened?

Back in the day when Facebook went crazy and went from, like, 100 million to 3 billion users, their policy was engage, engage, engage — engagement-driven metrics were what drove the company. We know this from internal documents, we know this from whistleblowers.

It's actually not complicated. Facebook delivers an article to you, and you click on it. Then the algorithms start learning the patterns of getting people to click on different posts. The pattern may be different for you and for me, but there’s a thing about human nature: We like being outraged.
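The feedback loop Farid describes — show content, observe clicks, show more of whatever gets clicked — can be sketched as a toy multi-armed bandit. Everything below (the category names, the click rates, the epsilon-greedy rule) is invented for illustration; it is not a model of any platform's actual ranking system.

```python
import random

def run_feed(click_prob, rounds=10_000, epsilon=0.1, seed=0):
    """Toy engagement-driven feed: epsilon-greedy bandit over content
    categories. Hypothetical sketch only, not a real recommender."""
    rng = random.Random(seed)
    clicks = {cat: 0 for cat in click_prob}
    shows = {cat: 0 for cat in click_prob}
    for _ in range(rounds):
        if rng.random() < epsilon:
            # Occasionally explore a random category.
            cat = rng.choice(list(click_prob))
        else:
            # Otherwise exploit the category with the best observed click rate.
            cat = max(click_prob,
                      key=lambda c: clicks[c] / shows[c] if shows[c] else 0.0)
        shows[cat] += 1
        if rng.random() < click_prob[cat]:  # simulated user click
            clicks[cat] += 1
    return shows

# Assume simulated users click "outrage" content twice as often as "calm".
shows = run_feed({"outrage": 0.10, "calm": 0.05})
assert shows["outrage"] > shows["calm"]  # the feed skews toward outrage
```

No one tells the code to prefer outrage; it converges there simply because that is what gets clicked — which is the point Farid is making about optimizing for engagement.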

Last May, an image emerged on social media that appeared to show an explosion near the Pentagon. Initial reports sent tremors through the stock markets, but experts soon debunked the image as a fake generated by artificial intelligence.
And so the algorithms just want to deliver that because that's what kept us clicking like a bunch of monkeys.

Social media are not here to make the world a better place. They're here to maximize profits for their shareholders. … People overuse the word "dystopian," but this is the dystopia we've been worried about. I think we're in it. We just don't know it.

In fact, many people are cheering it on.

That's the landscape. There are plenty of people who blame social media. And social media are absolutely high up on the list. I think also that politicians are to blame. I think the mainstream media is to blame. And I think we're to blame — we, the people. We shouldn't let ourselves off the hook — we're the idiots who keep clicking on this stuff and keep going back to Facebook and Twitter.

So it’s evidently harmful, but if people like it, and it’s profitable, how can we fix it?

Facebook has shown us that it’s possible. They can just change the algorithm that decides what content to show us. They know how to get rid of the nasty stuff and recommend the good stuff instead. There’s no technological limitation.

So what we need is external pressure. How about bad press? We've been writing bad articles about Facebook for five years — nothing. They just brush it off.

How about ad boycotts? We've tried to have boycotts. There have been big ones, too, against Facebook. They all peter out — they don't work.

A lot of the discussion in recent years, and a lot of the conflict with social media companies, seem to focus on the potential impact of government regulation.

We need government oversight. That's what the government is supposed to do: take my tax dollars and keep me safe. Deliver clean water, clean air, keep the roads safe and make sure nothing kills me.

You want to deal with global pandemics, climate change, elections? You've got to deal with tech.

The Europeans are doing a good job of it. They recently came out with the Digital Services Act. The Brits are coming out with an Online Safety Bill. The Australians have passed one.

The Americans?

The Americans are lost at sea.

What might be a good approach to government regulation in the U.S.?

Again, some of us are old enough to remember a time when there were serious problems with products — a car would explode, pajamas would catch on fire. There was no legal liability for product safety. There was no legal incentive to keep us safe. Ralph Nader (the consumer protection activist) changed the world. He said, "Look, we're going to sue you back to the Dark Ages if you knew or should have known that your product is dangerous and that it maims and kills people."

UC Berkeley disinformation expert Hany Farid testified before the U.S. Senate Judiciary Committee on March 8, 2023. Farid urged lawmakers to scale back protections for social media and other internet platforms so that they can be held legally liable when the design of their systems leads to damage or injury for users. 
U.S. Senate Judiciary Committee
You can't say that about social media. They have no responsibility. It's enshrined in the laws — Section 230 of the Communications Decency Act. (Section 230 holds that internet service providers and platforms are not legally liable for material disseminated on their sites.)

And I'm saying: "Facebook, if over 50% of people who join a hate group do so because you recommended those groups to them, then you have a responsibility for what comes after that." That's actually true, by the way. Over 50% of people who join white supremacist groups on Facebook do so because Facebook says, "I see that you like Donald Trump. You might also like these groups." How do you not have responsibility?

What sort of regulation could address that?

The solution — this is where it's going to get hard — is thoughtful, sensible regulation that changes the calculation for the social media companies to say, "We can't have this free-for-all, where anything goes."

We should not repeat history. We should learn that if this is left unchecked, it is not going to end well for us.

The problem is, if you go to Capitol Hill and talk to our regulators on the left and the right, they all hate the technology sector. But the right has fallen into this false narrative that technology is anti-conservative. It is not. Conservatives dominate social media. The left thinks that they're destroying the world, which is closer to the truth. But we don't agree on the problem, and that means we can't agree on solutions.

Twenty percent of Americans believe Bill Gates created COVID to put a tracking device in you. Why? Because the internet told them so. Thirty percent of Americans believe climate change is a hoax. Fifty percent of Republicans believe that Donald Trump won the 2020 election. These are existential threats to public health, our planet and our democracy.

We've got to deal with tech. It's the whole ballgame.

Last question: Artificial intelligence (AI) has advanced so quickly in recent months. It’s so much more available. How does this change the landscape?

We're dealing with a decade-old problem because we've been dragging our feet for so long. Now we're seeing all of these advances in AI — ChatGPT, deepfakes — and there's a whole new set of problems on the horizon, and we are already behind in thinking about how to manage them.

There's something exciting about it, but we’d better start getting our heads around this before it's too late again. You might excuse the first 20 years of the technology sector — we were like, "Oh, great, we’ve got an iPhone in our pocket. This is fun!"

But we should not repeat history. We should learn that if this is left unchecked, it is not going to end well for us. My hope is that the old saying is true: “You can always count on Americans to do the right thing, eventually, after all the wrong solutions have failed.”