But what’s a conspiracy theory?

The New York Times reports on a new chatbot intended to combat conspiracy theories:

DebunkBot, an A.I. chatbot designed by researchers to “very effectively persuade” users to stop believing unfounded conspiracy theories, made significant and long-lasting progress at changing people’s convictions, according to a study published on Thursday in the journal Science.

The bot uses “facts and logic” to combat conspiracy theories, and in the study it had some success in arguing people out of beliefs that, for example, the CIA killed JFK or that 9/11 was an inside job. I do believe that most people, if guided onto a field of factual argument, can be convinced by facts and logic; the trouble is that people who hold opinions dearly may take any attempt to guide them as a personal attack. A bot, by seeming neutral and objective, may have more success.

Ah, but is the bot, in fact, neutral and objective? Specifically, what counts as a conspiracy theory? The invitation to participate in the study defines a conspiracy theory as a belief that “certain significant events or situations are the result of secret plans by individuals or groups.” But this is literally and incontrovertibly true of, for example, the JFK assassination, not to mention 9/11. Had the plans for those events not been secret, they would have been prevented! People who say 9/11 was an inside job are just arguing about which individuals or groups did the secret planning. They may be wrong, they may be nuts — but not, by this definition, because they’re conspiracy theorists.

The mention of COVID, now, is more interesting. A researcher mentions the belief that COVID was deliberately created by humans to reduce the population, which I grant is X-Files stuff. But the mere belief that it was created by humans was treated by many as a conspiracy theory four years ago, yet more recently the NYT itself ran a fairly long article saying it likely came from a lab.

I would observe that (1) some things are the result of deliberate human action, (2) some things are the result of screwups, and (3) some things happen for natural reasons. Sometimes things happen for a combination of reasons: 9/11 was 1 (the terrorists) and 2 (the FBI et al.); COVID now appears to have been 2 (lab technicians) and 3 (the way viruses work).

It can take time and effort, investigation and debate, to sort out the causes of an event, and that process can be messy. But it has to happen. And once it has been settled, it’s time to move on. Right?

If only it were that simple! The trouble is that most people, I’ve observed, tend to like or dislike explanations of type 1, 2, or 3. Some people like having someone to blame. Some people are uncomfortable accepting the broad impact of screwups on the world. Some people appreciate the way natural causes absolve humans of responsibility; others resent it. Some people think highly of human agency; others think little of it.

If you think little of human agency, you may for example believe that aliens built the pyramids. If you want someone to blame, you may have been quick to assume a lab-leak origin for COVID. And of course blame-seekers gravitate towards particular sorts of targets—our government or theirs, labor unions or Big Pharma, liberals or conservatives, white men or immigrants. It’s possible to come to a correct conclusion for racist reasons, and to dismiss valid possibilities because we dislike the beliefs of others who embrace them. Maybe your blame-seeking leads you to embrace conspiracy theories; or maybe your blame-seeking leads you to attack people you think are inventing conspiracy theories. We’re not always consistent in our sensibilities, and they may steer us right or wrong in this or that situation—but we have them.

Those biases are at least partly natural. But they become dangerous when we don’t recognize them in ourselves, and when we assume them in others. The only way to have a vigorous, honest, valid debate about anything of real import is to recognize our own leanings, and not jump to conclusions about those of others. It requires both introspection and epistemic humility — in quantities that are extremely rare at the moment.

Back, then, to DebunkBot. What troubles me is the unspoken qualifier in that definition of “conspiracy theory.” It’s all very well to laugh at people who think Kubrick filmed the moon landing. But seriously: who decides what’s a conspiracy theory, and on what grounds? The bot’s creators, of course, along with whatever materials it was trained on (which its creators select). I don’t need a conspiracy theory to say that the bot will therefore reflect the biases of its creators: I only have to suppose that they haven’t questioned their own biases. Which is to say, I only have to suppose that they’re human.

Granted, I tend toward explanations that rely on human error. But that doesn’t mean I’m not right.

(And for the record, had Kubrick filmed the moon landing, he would have made sure Neil Armstrong got his line right.)