Researchers compared data from a real conspiracy—the "Bridgegate" political payback scheme in which New Jersey political operatives closed down lanes on the George Washington Bridge—with those from "Pizzagate" conspiracy theories to create an artificial intelligence tool.
December 8, 2020

The eternal problem with conspiracy theories is that we know, from both history and current events, that there are very real conspiracies at work in the world. How can we distinguish them from the utterly fabricated fantasies that make up the conspiracy-theory universe?

There are some simple rules of thumb for distinguishing them, but those rules are fairly crude and generalized, and the distinctions can sometimes be nuanced. So researchers at the University of California, Berkeley have devised an artificial intelligence tool that can help people figure out whether they're tapping into an actual conspiracy or just participating in a cockamamie fantasy.

Cal Berkeley cultural analyst Timothy Tangherlini and his team “developed an automated approach to determining when conversations on social media reflect the telltale signs of conspiracy theorizing,” using machine learning tools capable of identifying narratives “based on sets of people, places, and things and their relationships,” with the hope of forming “the basis of an early warning system to alert authorities to online narratives that pose a threat in the real world.”

Once the layers of the narrative are identified, the model determines how they come together to form the narrative as a whole. It can then map all of this data into charts whose shapes are utterly distinct for actual conspiracies and for conspiracy theories, showing that the two have little in common.
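
The team's code isn't reproduced in this article, but the general recipe it describes (pull the people, places, and things out of each post, connect the ones that appear together, and then look at the shape of the resulting network) can be sketched in a few lines. The snippet below is a minimal illustration of that idea using spaCy for entity extraction and NetworkX for the graph; the library choices, entity labels, and toy posts are assumptions made for illustration, not the researchers' actual pipeline.

```python
# Minimal sketch: build a narrative network from a handful of posts.
# Assumes spaCy's en_core_web_sm model and NetworkX are installed;
# this illustrates the general idea, not the Berkeley team's pipeline.
from itertools import combinations

import networkx as nx
import spacy

nlp = spacy.load("en_core_web_sm")

posts = [
    "Officials closed two lanes on the George Washington Bridge in Fort Lee.",
    "Emails tie the lane closures to staff in the governor's office.",
]

graph = nx.Graph()
for post in posts:
    doc = nlp(post)
    # Keep the "people, places, and things" mentioned in each post.
    entities = {ent.text for ent in doc.ents
                if ent.label_ in {"PERSON", "ORG", "GPE", "FAC", "LOC"}}
    # Entities mentioned together in one post get a weighted edge.
    for a, b in combinations(sorted(entities), 2):
        weight = graph.get_edge_data(a, b, {}).get("weight", 0) + 1
        graph.add_edge(a, b, weight=weight)

# A crude summary of the narrative's "shape": how many separate pieces
# it has and how densely its elements are interconnected.
print(nx.number_connected_components(graph), nx.density(graph))
```

On real data, summary measures like these are where the distinct shapes of real-conspiracy and conspiracy-theory networks would show up.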

There are some useful rules of thumb already available for distinguishing between a real conspiracy and a conspiracy theory, beyond recognizing that the former has a reasonable likelihood of being real, while the latter is almost certainly a falsehood intended to scapegoat other people. As I explain in my book Red Pill, Blue Pill: How to Counteract the Conspiracy Theories That Are Killing Us, we’re already capable of distinguishing them based on the basic parameters imposed by reality upon conspiracies:

Real conspiracies, by their very nature (including their dependence on secrecy), have three major limitations:

  • Scope. Their purpose is usually to achieve only one or two ends, often narrow in nature.
  • Time. Their actions necessarily occur within a relatively short time frame.
  • Number of participants. All successful conspiracies are the product of only a tiny handful of people.

As any of these three limits expands, however, the likelihood of the conspiracy failing or being exposed rises exponentially. The broader the reach (if it attempts too much), the more likely it is to fail simply as a matter of raw odds and institutional inertia. The longer it takes, the greater the risk of exposure, and the more opportunities there are for components of the conspiracy to go awry. Involving more people creates similar problems: it increases both the likelihood that someone will fail to complete their part of the conspiracy and the chances of exposure. And exposure is fatal to every conspiracy: once the secret is out, it's no longer a viable plan of action.

Conspiracy theories, on the other hand, almost universally feature qualities that contrast sharply with these limits.

  • They are broad-ranging in nature, and frequently boil down to (or play key roles in) a massive plot to enslave, murder, or politically oppress all of mankind or at least large numbers of people.
  • They are believed to have existed for long periods of time, in some cases for hundreds of years.
  • They involve large numbers of people, notably significant numbers of participants in high positions in government or the bureaucracy.
  • The long-term success of these conspiracies is always credited to willing dupes in the media and elsewhere.

The Cal Berkeley AI model largely reflects these same parameters when it goes to work. The team studied three primary and sometimes overlapping zones of the conspiracy-theory universe: Pizzagate, the COVID-19 pandemic, and the anti-vaccination movement. (It is currently applying the tool to the QAnon conspiracy cult; the results should be interesting.)

The Pizzagate world (which is closely related to the QAnon phenomenon) was particularly rich with data:

We analyzed 17,498 posts from April 2016 through February 2018 on the Reddit and 4chan forums where Pizzagate was discussed. The model treats each post as a fragment of a hidden story and sets about to uncover the narrative. The software identifies the people, places and things in the posts and determines which are major elements, which are minor elements and how they’re all connected.
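
The quoted description separates "major" from "minor" elements without saying how that split is made. A common, generic way to do it is to rank the nodes of the narrative graph by a centrality score, as in the sketch below; degree centrality and the toy graph are assumptions for illustration, not the team's published method.

```python
# Sketch: separate "major" from "minor" narrative elements by ranking
# the nodes of a narrative graph with a centrality score. A generic
# illustration; the actual model's ranking method may differ.
import networkx as nx

# A toy narrative graph; in practice this would come from the
# entity-extraction step sketched earlier.
graph = nx.Graph()
graph.add_edges_from([
    ("pizzeria", "emails"), ("emails", "campaign"),
    ("pizzeria", "basement"), ("campaign", "handler"),
])

centrality = nx.degree_centrality(graph)
ranked = sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)

# Treat the top-ranked nodes as major elements, the rest as minor.
major = [node for node, _ in ranked[:2]]
minor = [node for node, _ in ranked[2:]]
print("major:", major)
print("minor:", minor)
```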

The analysts then examined the same kinds of data regarding the so-called "Bridgegate" conspiracy, a very real political payback operation in which New Jersey public officials, mainly members of then-Gov. Chris Christie's staff, deliberately created traffic jams by closing lanes on the George Washington Bridge. The results show unmistakable differences in the basic structure of the respective narratives, and how the superficial appearance of similarity between conspiracy theories and the real thing falls apart, much as the theories themselves do.

Conspiracy theories, the researchers found, are collaboratively constructed and form quickly. “Actual conspiracies are deliberately hidden, real-life actions of people working together for their own malign purposes,” Tangherlini explains. “In contrast, conspiracy theories are collaboratively constructed and develop in the open.”

Conspiracy theories are deliberately complex and reflect an all-encompassing worldview. Instead of trying to explain one thing, a conspiracy theory tries to explain everything, discovering connections across domains of human interaction that are otherwise hidden — mostly because they do not exist.

While the popular image of the conspiracy theorist is of a lone wolf piecing together puzzling connections with photographs and red string, that image no longer applies in the age of social media. Conspiracy theorizing has moved online and is now the end product of a collective storytelling. The participants work out the parameters of a narrative framework: the people, places and things of a story and their relationships.

By mapping out how these conspiracy theories originate and spread, and particularly the networks through which they are generated, analysts may be able to anticipate when they will explode into their inevitable real-world violence. More to the point, such mapping can help researchers identify the wellsprings of misinformation on social media and elsewhere so that those spigots can be shut off.

As I explain in Red Pill, Blue Pill:

Conspiracy theories are a problem for healthy democracies not only because they encourage people to disengage from their communities and abjure their political franchise by discarding it all as useless, but also because they represent serious pollution of the information stream. Democracies rely on robust debate, but that “marketplace of ideas” cannot function if the debate is founded on falsehoods, smears, and the wild speculations that all combine to take the place of established facts in any discourse with conspiracy theorists.

Published with permission of Daily Kos.
