By Elizabeth Lepro


A conversation among three people tackling disinformation from three distinct angles surfaced salient insights on threats to journalism, including the psychology of why some people believe untruths online and how reporters can best respond to these misguided beliefs.

The National Press Club Journalism Institute hosted the conversation, the first of four webinars on ethics in the age of disinformation, bringing together the manager of PEN America’s Journalism and Disinformation program, a New York Times misinformation reporter, and an associate professor of psychology and neural science. Beth Francesco, executive director of the institute, led the conversation with a simple question – one that many journalists are grappling with as they cover the 2024 election:

Why do we believe what we believe? And how do we undo beliefs rooted in misinformation?

 

An overhead view of the crisis

PEN America has been talking with journalists about how they’re engaging with disinformation since 2021. 

In 2022, the organization released “Hard News,” a report based on a nationwide survey of more than 1,000 reporters and editors focused on how disinformation is disrupting the practice of journalism. 

More than 90 percent of those surveyed said disinformation affected their experiences as journalists in recent years; 65 percent of respondents had faced hostility from the public, 48 percent reported feeling frustrated or overwhelmed, and 42 percent felt some portion of their audience had lost trust in them.

“It was clear that disinformation is having a very significant impact on journalism,” said Shannon Jankowski, manager of the journalism and disinformation program at PEN. “These results were from 2021… you weren’t even thinking then about generative AI and the prospect of what we’re hearing now from journalists of wondering, ‘how do I communicate with a public that is starting to distrust everything?’” 

Jankowski said the rise in disinformation is coinciding with layoffs and reduced resources across the industry. 

One question for journalists is how to engage with media that can’t be authenticated when almost anything could be fabricated. Jankowski said disinformation researchers are also increasingly becoming targets of those seeking to sow discord, including through frivolous lawsuits. 

“Groups that were trying to report on social media and on the ethics of social media, on their content moderation policies, [are] getting hit with lawsuits and defamation suits,” she said. 

PEN has created a resource, Facts Forward, for journalists looking to understand more about the foundation and framework of disinformation. PEN’s Online Harassment Manual also provides resources for journalists facing harassment.

 

The science of belief

For Jay Van Bavel, director of the Social Identity and Morality Lab and an associate professor of psychology and neural science at New York University, there’s a methodical way to answer big questions about susceptibility and conspiracy. 

“The reason that we believe misinformation or conspiracy theories is in some ways just based on the way that we believe anything,” he said. 

His research over several years and across multiple studies has highlighted four key factors in what makes mis- or disinformation go viral: 

 

Negativity

A 2023 study by Van Bavel and his colleagues used a dataset of roughly 105,000 news stories from Upworthy, a news site dedicated to promoting positivity, and found that negative words in headlines increased consumption rates. In other words, even on a site focused on positive news, more people were clicking on stories with upsetting headlines.

 

Moral/emotional language

How social posts are worded makes them more or less shareable, according to a 2017 study by Van Bavel and others. When tweets included words that drew an emotional response, such as “afraid” or “love,” words that indicated morality, such as “crime” or “mercy,” or words that did both, such as “abuse” or “spite,” they were 20 percent more likely to be shared. 

 

Outgroup animosity 

Van Bavel said news stories that include negativity about the “other side,” politically, tend to generate more attention. He attributed this to a sense of schadenfreude, a German word for the inclination to take pleasure in another person’s suffering. 

 

Extremism

Van Bavel’s most recent co-authored paper, “Inside the Fun House Mirror Factory,” examines how people with the most extreme opinions often post more than everyone else. Overexposure to extreme opinions warps our perception of reality and heightens our sense of polarization. This pattern aligns with the impact of the “disinformation dozen,” a group of 12 people – including presidential candidate Robert F. Kennedy Jr. – who studies have found are responsible for the majority of COVID-19 and vaccine misinformation online. 

 

A reporter’s perspective

Tiffany Hsu, a misinformation reporter for The New York Times, said she’s seen first-hand the way social media discourse warps our understanding of truth and reality.

“There is kind of a false dichotomy when it comes to ideas and facts,” she said. “People think it’s just two sides, when really there’s a huge blend of facts in the middle … That’s how you get this effect that Jay was talking about of the middle being shouted down by the extremes.” 

Hsu recently published a story about conspiracy theories regarding Catherine, Princess of Wales. After she stepped out of public view to recover from abdominal surgery, the internet went into overdrive inventing conspiratorial reasons for her disappearance, including the idea that she had died and been replaced by a clone.

This same accusation has been directed at Britney Spears, Elon Musk, and Kanye West. It’s a popular notion, in Hsu’s opinion, because it stems from distrust and a general sense of chaos, and offers a “communal bonding activity.” 

“Social media makes us feel like we want to be engaged, we want to be participating in the solving of these crimes or these mysteries,” Hsu said. 

Disinformation researchers have pointed out that the participatory nature of political conspiracy theories, like QAnon, draws people in and makes them feel like they’re part of a community. This same desire for engagement contributes to the spread of election-related misinformation, Hsu said, including the widespread, disproven claims of voter fraud in 2020.

 

Tips for journalists

Here were the panelists’ quick tips on how to avoid contributing to the spread of misinformation:

  • Jankowski: When a piece of news triggers an immediate emotional response, “take a beat to stop and consider the situation before acting on your emotions.”
  • Van Bavel: Create a system of checks and balances by “embedding yourself in a community who will fact check you,” he said. “We are smarter by creating norms of accuracy and reinforcing them, and then also [joining] institutions that reinforce, support, and incentivize accuracy.”
  • Hsu: Don’t decide on a narrative for a story before you’ve done your reporting.

“I think journalists – and really all members of the disinformation-fighting community – need to remember to be sympathetic to people who disagree,” Hsu said. “Disagreements are a critical underpinning of democracy.

“I think a lot of journalists are starting to understand that the only side we can take is the side of provable truth.”