In 1969 Neil Armstrong set foot on the surface of the moon. The collapse of the two World Trade Center Towers on 9/11 was caused by aircraft impacts and the resulting fires. During the Second World War, the Nazis systematically murdered millions of Jews with the aim of exterminating the Jewish population of Europe.
These are all things I know. But I also know that each of these claims has been questioned by conspiracy theorists and others. There are people who seriously believe that the moon landing was faked, that the twin towers were brought down by a controlled demolition and that the Holocaust is a myth. There are people who not only believe these things but present what they regard as evidence in support of them. How should I respond to this evidence? What should my attitude be?
I might reason as follows: since I know that Neil Armstrong did set foot on the moon it follows that any evidence that suggests that he didn’t has to be misleading. Similarly, I can infer from what I already know that the “evidence” that aircraft impacts didn’t cause the collapse of the twin towers must be misleading. In that case, why should I bother with it? Why should I waste my time responding to evidence or arguments that I know, in advance, are no good? Indeed, one might go even further: it’s not just that I am under no obligation to engage with the alleged evidence against what I know to be true, I should firmly resolve to avoid it. If I can’t avoid it then I should ignore it. The philosopher Saul Kripke describes this as the ‘dogmatic attitude’ and argues that this attitude can be both rational and justified.
Can dogmatism protect knowledge?
The case for dogmatism is that it can protect our knowledge. Suppose I know what really happened to the twin towers on 9/11 – they were brought down by aircraft impacts – but a conspiracy theorist bombards me with data that supposedly prove that aircraft impacts couldn’t have been responsible for their collapse. If I am sufficiently bamboozled by the data I might give up my belief that aircraft impacts brought down the towers. As a result, I now no longer know something that I previously knew. My knowledge has been undermined by the conspiracy theory. If I want to avoid losing my knowledge in this way I should steer clear of conspiracist websites. By the same token, I should avoid reading books by Holocaust deniers.
The idea that dogmatism can be good for us as knowers has been described as the ‘dogmatism paradox’. As Kripke notes, the common sense view is that future evidence could lead me to change my mind even about things I know, and that I should leave myself open to such changes of mind. It is partly for this reason that dogmatism is often regarded as an ‘intellectual vice’. Intellectual vices are character traits, attitudes or thinking styles that get in the way of knowledge. They prevent us from acquiring knowledge, or from retaining the knowledge that we have previously acquired. Standard examples include closed-mindedness, intellectual arrogance, prejudice, wishful thinking and, needless to say, dogmatism. If dogmatism is an intellectual vice then it gets in the way of knowledge, but the dogmatism paradox suggests otherwise.
Kripke isn’t the only philosopher to have pointed to the benefits of dogmatism. Back in the 18th century, David Hume argued that one is justified in rejecting reports of religious miracles even if one can’t disprove them. They can be rejected out of hand. In the last century, the eminent philosopher of science Thomas Kuhn identified the dogmatism of mature science – its preconceptions and resistance to innovation – as fundamental to productive research. Yet these claims are hard to swallow. A free and responsible thinker is surely one who bases their beliefs on the balance of evidence and whose mind is open to views that are at odds with their own. Even if I know that aircraft impacts brought down the twin towers should I not at least be prepared to give a hearing to people who think otherwise?
One way to see that the intellectual benefits of dogmatism have been exaggerated is to challenge the suggestion that it has a proper role in protecting our knowledge. A resolution to avoid certain types of counterevidence to things I know is supposed to protect my knowledge by eliminating the danger that I will be persuaded by the counterevidence to change my mind. But there is a simple response to this line of thinking: if I really know that aircraft impacts brought down the twin towers then why am I so afraid of evidence that they didn’t? If this evidence is misleading then I shouldn’t fear it and should trust myself not to be taken in by it. A thinker who is worried about having their mind changed by spurious or misleading evidence is one who lacks intellectual self-trust or self-confidence. A lack of these things is itself a threat to knowledge, and one that can’t be remedied by dogmatism or closed-mindedness.
What if, despite my best efforts to avoid it, I am confronted by detailed “evidence” that aircraft impacts couldn’t have brought down the towers, and I have no idea how to refute this evidence? Deciding to ignore the evidence and stick to my guns won’t protect my knowledge because my original belief about what happened to the towers is already under threat. For the misleading counterevidence not to undermine my original belief I need to be able to say what is wrong with it, and I can’t do that if I ignore it. If I know that aircraft impacts brought down the towers then my belief that this is what happened must be a justified belief, and it can’t be a justified belief if it’s based on a dogmatic refusal to say what is wrong with the contrary view.
Engage with the evidence
Doesn’t this make it too easy for us to lose our knowledge in a world in which we are constantly bombarded by dubious or crackpot views that are contrary to what we know but that we also don’t know how to refute? The way to avoid a loss of knowledge is to make the effort to engage with the alleged evidence against what we know and to figure out, at least in broad outline, where it goes wrong. If one is unwilling or unable to rebut the claims of 9/11 conspiracy theorists then that weakens one’s right to suppose that the twin towers were brought down by aircraft impacts. As the philosopher Adam Morton observes, ‘when you believe something for a reason you can defend it against attacks or undermining’. Unless they are confronted and rebutted, fallacious conspiracy theories are liable to metastasise through the body politic. If one can’t be bothered to argue against them one can hardly complain if millions of people believe them.
The same goes for Holocaust denial. If it is important for people to know that the Holocaust happened, it is also important that they know in some detail why Holocaust deniers are mistaken. It isn’t enough to know that they are mistaken. In practice, knowing why the so-called “evidence” presented by Holocaust deniers is worthless is less difficult than it sounds, thanks to the work of historians such as Richard J. Evans. His book, Telling Lies about Hitler, is about how to tell the difference between truth and lies in history. It leaves no room for doubt about what is wrong with the “evidence” against the Holocaust. It is incumbent on responsible thinkers and knowers to base their views on the best evidence, and this means taking the trouble to find out what the evidence is and what it does and doesn’t show. Only on this basis can one effectively protect one’s knowledge. Dogmatism is not the answer.