Why conspiracies are so popular — and what we can do to stop them
UC Berkeley Professor Timothy Tangherlini uses lessons from folklore and AI to understand how social media fuels the spread of conspiracies, and how we can use storytelling tools to stem the tide of misinformation.
![An illustration shows a man being manipulated like a string puppet by a disembodied hand.](https://news.berkeley.edu/wp-content/uploads/2025/02/Illustration.jpg)
Getty Images for Unsplash
February 5, 2025
Even in the face of overwhelming evidence, false narratives can be incredibly sticky. Many people insist that the earth is flat, that childhood vaccines cause autism, or that climate change is a hoax, despite ample scientific evidence to the contrary.
“Stories are very powerful,” said Timothy Tangherlini, a UC Berkeley professor in the Department of Scandinavian and the School of Information. “We’re much more comfortable with hearing stories that confirm our beliefs than ones that challenge them.”
Tangherlini sees narratives like these, and the many other conspiracies that are rife in today’s internet culture, as a type of modern-day folklore. As a computational folklorist, he uses AI tools to study how social media networks have accelerated the spread of conspiracies and false beliefs, and what, if anything, we can do to slow them down.
Following an election cycle dominated by conspiracies and hoaxes — from elites controlling the path of hurricanes to 20 million missing votes for Kamala Harris to immigrants eating people's pets — Tangherlini's work is more relevant than ever. Berkeley News spoke with Tangherlini about why conspiratorial thinking has flourished in recent years and how we might spread stories of inclusion and truth that are powerful enough to stem the tide of false belief.
UC Berkeley News: What motivated you to study conspiracy theories through the lens of folklore and storytelling?
Timothy Tangherlini: I think of conspiracy theories as narrative constructs, as fictional. And they can be very powerful because they are stories. Narratives are very efficient at encapsulating norms, beliefs and values — and when we tell them over and over, they get pared down to the most efficient kernel of narrative weight.
![Timothy Tangherlini](https://news.berkeley.edu/wp-content/uploads/2024/11/tim-tangherlini-550-01.png)
UC Berkeley
These belief narratives — stories that we tell each other that we believe to be true — can influence belief, and these beliefs then create a feedback mechanism, so that once you’ve got a belief, it’s very hard to change it. You start to seek out narratives that confirm your beliefs.
I’m particularly fascinated by the fact that so many of these stories wind up being about outside threats. Often, it’s the Ghostbusters question: When ghosts appear in the neighborhood, who are you going to call? Or, how are we going to deal with some sort of threat to the integrity of our community?
These threats can then force real world action, an example of which we saw with the Jan. 6, 2021, U.S. Capitol attack. The question becomes: How do you interrupt these kinds of narratives when they start to have a significant impact on democratic institutions and civil society?
We’re living in a world that seems rife with conspiracy theories. What is it about our current society that makes it so prone to conspiracies?
We as humans tend to surround ourselves with people who have similar beliefs, and we also align our beliefs with the people around us. You might believe things, but you want to be part of the group, so you adjust your beliefs — you negotiate the boundaries of belief.
This process has been profoundly interrupted by the advent of social media. The groups that we interact with online are no longer the close, homogeneous groups that we are used to and were socialized in, so the social brakes that used to be there have come off, and the speed and reach of messaging have changed in magnitude: things can get out much faster.
People have also worked really hard to erode our trust in the media. We used to have newspapers of record, like the New York Times or the LA Times. You might not have agreed with their opinions, but you could trust the underlying reporting. Now there's been a concerted effort to challenge the underlying reporting itself. And with the advent of generative AI, it's possible to generate not only audio recordings but also visual recordings — deepfakes — and newspaper articles that give the illusion of being true, but really aren't.
As soon as you start losing confidence in your news sources, then you’re going to turn to these other narrative sources — those could be your friends, they could be your family, or they could be people who you think share your values on the internet.
Could you talk a little bit more about these social brakes and how social media has interrupted them?
We’re all part of groups in real life, even if they are just friend groups or families. When I start talking, my family will often shut me down because they know that I just talk too much, right? Or, if I was out with friends for tacos and beer and I said, “Well, did you hear what happened in Roswell?” my friends would respond, “Shut up, Tim.”
Those are the kinds of social brakes that we’re all familiar with. It can be as simple as that. But there are effectively no social brakes on social media. You might be interacting with people who just love to see a train wreck, and so they give you a thumbs up and away you go, off to the races.
And on social media, your idea that the people that you’re interacting with share your beliefs, values and norms may not apply, in part because many of them may actually be robots. I like to point out that no one sits down to pizza and beer with robots, but on social media, that’s what many of your engagements are. It’s very easy — shockingly easy — to create a bot army. It does not cost a whole lot of money. And that can influence behavior.
Some recent conspiracies seem to be driven by actual problems in the world that are difficult or complicated to understand. For instance, rather than understanding that climate change is making hurricanes stronger, many people believe that elites are secretly manipulating the weather. What is it that makes these alternative stories so much more compelling than the truth?
That’s a very good and hard question, and if I ever have the answer to it, I’m out of a job. But I think there are a couple of things at play. When you don’t have access to information or when you don’t trust the information that you do have access to, that will encourage you to turn to people who you do trust to understand what is going on. This is well established. And one of the things that we do to structure our understanding is to tell stories.
Say I’m trying to figure out what’s going on with the climate. I trust my community, but my trust in other information sources has been eroded. Those information sources might be framed in a way that makes them hard to understand, or maybe they contradict my own personal experience. These kinds of things then promote anecdotes, and these anecdotes — particularly related to personal experiences — can trump the scientific papers that most of us don’t really have the training to read or understand.
I may not trust the global warming narrative because how could there possibly be global warming when it’s freezing cold today? Or, I may not trust the narrative that vaccines save lives because my daughter cried all night when she got her shots.
These kinds of anecdotal stories hold a lot of weight within a community, particularly when you’ve started to lose trust in other information sources. And it requires a pretty heavy lift to try and figure out how to create stories that resonate with the community.
Are there any ways that scientists, politicians, journalists, etc., can nudge people back in the right direction?
Often people have wanted to look at these (conspiratorial) threats and say, “Well, those aren’t really a threat.” But when you discount somebody’s concerns, you are no longer one of them. You are not part of their group. And so you then lose any kind of opportunity you have to engage in any potential positive strategies.
One option might be to propose alternate strategies for dealing with threats. So if people believe that immigrants are eating the dogs and cats in Springfield, Ohio, then there are a couple of things that you can do. You can take the strategy that was proposed in these narratives, which is to get the immigrants out of your community. Or you can say, “This is a problem in our community. People are going hungry. Let’s do things that mitigate hunger.” You could start flooding the market with stories that might actually appeal to other parts of this belief framework that everybody is walking around with in their head.
I’m working on a project right now with colleagues at Indiana University, Boston University and Stanford University that is trying to understand belief resonance and narrative. We’re trying to understand how a narrative resonates, how long it resonates and what impact it has. For instance, if you hear a story as a kid and then hear it again a little later, maybe as a young adult, you’re going to correct the story back to the way you first heard it. And even if you don’t think it has had a huge impact on your belief network, once you’ve heard something, you can’t unhear it — it’s going to be very hard to get you off that path.
Are there ways that we can make stories of inclusion or scientific understanding as compelling as these threat narratives, so that they are able to take hold and spread, rather than conspiracies?
I certainly hope so. Part of the research that we’re trying to do is to understand resonance: What sorts of stories resonate with different groups, and how can we interrupt stories that potentially have a negative outcome for everybody involved? If you know the storytelling of a community, you can start telling stories that will resonate with that community — for instance, you can tell stories that show how vaccines are actually very helpful to the community. But it may take a while to get uptake, and you might have to push out a whole bunch of different versions of stories before it becomes part of the cultural ideology of the group.
It might be that we can use social network analysis to understand how social networks are put together and get endorsements from people who have centrality or status in a group. At this point, you are much more likely to pay attention to an anecdote from someone that you trust in your community than to something that’s coming from a government institution or a newspaper.
We have to understand community. We have to understand belief. And then we have to be very empathetic to those beliefs and try to understand how we can generate messages that resonate, that don’t insult people, but also help them get information that they just don’t have.