Pandemic provides real-time experiment for diagnosing, treating misinformation

By Public Affairs

UC Berkeley faculty discuss the nature of misinformation during the pandemic and how to assess and engage more effectively with the information sources we consult.

With COVID-19 cases spiking worldwide and accurate information about the virus more important than ever, four data science experts from UC Berkeley held an online discussion on the origins, amplification and impacts of the current infodemic of mis- and disinformation that is jeopardizing measures to control the pandemic. The Dec. 8 Berkeley Conversations event was hosted by Nobel laureate Saul Perlmutter, director of the Berkeley Institute for Data Science (BIDS), which co-sponsored it with the Division of Computing, Data Science, and Society.

The experts’ prognosis? The overabundance of information, both online and offline, about COVID-19 — including false information deliberately planted there — continues to polarize the nation, and it will take a massive effort over many years to overcome the problem. The general public is too divided or unaware, social media companies profit from the status quo, and the law shields platforms from responsibility for the incendiary posts they host, according to the panel. The conversation can be viewed on YouTube.

Deirdre K. Mulligan, a professor in the School of Information, opened the session by explaining that misinformation typically refers to objectively false information that is spread by people who think it’s true. Disinformation is intentionally false information that is designed to deceive the public, destabilize trust in public institutions or reap economic gain, such as through advertising. As examples of disinformation, she cited President Trump’s tweets that undermined trust in science, medicine and government institutions.

When the pandemic began in early 2020, “the social media companies had created the ingredients for the COVID misinformation and conspiratorial landscape we’re dealing with today,” said Hany Farid, a professor in the School of Information and the Department of Electrical Engineering and Computer Sciences. He cited two reasons: The firms have little to no editorial oversight, and they algorithmically reward conspiratorial content, which attracts more advertising views than other content does.

Farid cited a longitudinal study he conducted of YouTube recommendations, which found that, in 2018, one in 10 videos recommended on informational channels like PBS and the BBC promoted conspiracy theories. He also noted that 70% of the videos watched on YouTube are ones the platform recommends.

As an example of how this plays out, Farid cited surveys showing that Americans who self-identify as being on the right side of the political aisle are 11.9 times more likely to believe that gargling with warm water and salt or vinegar gets rid of COVID-19, and 6.3 times more likely to believe that COVID-19 was man-made rather than a natural virus. Those on the left side of the aisle are 4.5 times more likely to believe that Trump tweeted that stimulus checks would be sent only to people who had not criticized him on social media.

“Our online information landscape is just a mess, and we need to start to get a handle on it,” Farid said.

Nick Adams, a former BIDS Research Fellow and founder of Goodly Labs, has developed collaborative tools that let citizen scientists engage with publicly available data. He first honed one of them, called Public Editor, with Perlmutter and others at BIDS. Today, it is being used to assess the credibility of the news. Suspected false statements found online are sent to volunteer reviewers, who assign corrective labels to misleading words and phrases as part of an eight-step system. Eventually, these efforts could be used to create artificial intelligence-based automated reviews, but Adams said that the data science of analyzing natural language “is still the frontier.” In the meantime, “the community is excited to take on this challenge,” Adams said.

Social media platforms are reluctant to take such measures on their own because doing so can cut into revenue, Farid said. In the days before and after the Nov. 3 election, Facebook changed its news feed to favor information from trusted news outlets, which caused marginal content to slip lower in users’ feeds.

“What else happened was that people engaged with Facebook less,” he said, so Facebook switched back to feeds that contained mis- or disinformation. “The crazy stuff … the most outrageous, conspiratorial, is good for business.”

The problem is that there is little recourse for making the social media companies more responsible for deterring mis- and disinformation, Mulligan and Farid said.

Users can’t really apply pressure because “we’re not the customer, we’re the product” that Facebook is actually selling to its advertisers, Farid said. And even though a number of advertisers boycotted Facebook earlier this year, the company still posted record profits, he said, adding that “the government is the last place we can look to for some relief.”

At issue is Section 230 of the Communications Decency Act of 1996, which states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” That provision shields online platforms from liability for what users post, unlike print media outlets, which are subject to libel laws.

Adams said that some clauses in Section 230 should make it easy to apply third-party filters to social media sites, but that this is not happening.

Mulligan said that the issue is not just a technical problem, but a human-centric one.

“Increasing the average person’s awareness of the ways in which their information environment is being curated and influenced not just by their own choices, but by the choices of lots of other players who are vying for their eyeballs, is super important,” she said. “So, pull back a curtain so people understand that, in an algorithmic society, things are not just what you see, that these things are being chosen and fed to you for particular purposes.”