Humans frequently fail to make their beliefs match what is true. In many areas of life this is not a problem: when asked what a ball thrown in the air will do, or whether the keys will be where they were left, almost all people express beliefs that match reality. In many other areas, however, people hold beliefs that bear no relation to what is true. This is most common in religion, morality, and philosophy, but it is not limited to them; it appears wherever feedback on the veracity of a belief is absent or indirect. Examples include astrology, crystal healing, and conspiracy theories. False beliefs are very widespread. According to a 2016 Gallup poll, 79% of Americans believe in God.1 Another survey shows that 72% of Americans believe in an afterlife.2 Indeed, even a small percentage of “atheists” claim to believe in an afterlife, according to a Pew Research poll.3 Large percentages of Americans believe in various conspiracy theories. A 2012 survey showed that 10% of Americans believe astrology is “very scientific” and 32% thought it was “sort of scientific”.4 Those are only a few of the more egregious examples of false beliefs; everyone holds many minor beliefs that don’t comport with reality.
What are we to make of the fact that so many people hold false beliefs when most profess to be competent to determine the truth? Perhaps a better question is: what reason is there to think that humans will believe true things? Humans are evolved animals, and natural selection does not inherently select for individuals that hold true beliefs. Evolution selects for individuals that propagate their genes. In many instances, that is a reasonably good approximation of selecting for true beliefs. An individual holding correct beliefs about which animals are its predators and what is edible will be more likely to survive and pass on its genes. Thus, there is reason to believe that in some areas human beliefs should tend to be true. In other areas, however, there may have been no selection pressure, or even selection for a tendency toward false belief. It has been speculated that widespread belief in God is the result of overactive agency detectors in the human brain: a bias toward seeing external events as the actions of another person or animal rather than as mere coincidence. A human ancestor would be more likely to survive by ascribing the rustle of nearby grass to a predator rather than to the wind. Even if doing so means fleeing when it is not necessary, it ensures they flee when it truly is. Consistent with the hypothesis that we evolved to often hold false beliefs, cognitive psychology has identified many biases common to humans.
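The adaptive logic behind an overactive agency detector can be sketched as a toy expected-cost calculation. All the numbers here are purely hypothetical illustrations, not empirical estimates; they only show why a bias toward false alarms can be the cheaper policy when misses are catastrophic.

```python
# Toy error-management model: why "always assume a predator" can beat
# "never assume a predator" even though predators are rare.
# All numbers are hypothetical, chosen only for illustration.

p_predator = 0.01        # chance the rustle really is a predator
cost_flee = 1            # small energy cost of running away
cost_eaten = 10_000      # catastrophic cost of ignoring a real predator

# Expected cost of always fleeing at a rustle: you pay the small
# fleeing cost every time, real predator or not.
always_flee = cost_flee * (1 - p_predator) + cost_flee * p_predator

# Expected cost of never fleeing: you pay nothing 99% of the time,
# but occasionally pay the catastrophic cost.
never_flee = cost_eaten * p_predator

print(always_flee)  # 1.0
print(never_flee)   # 100.0
```

Under these assumed numbers, the jumpy ancestor pays one unit per rustle while the calm one pays one hundred on average, which is the shape of the argument the paragraph above makes in words.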
We humans have many reasons for holding the beliefs we do besides that the beliefs are the best explanation of the evidence. Holding a belief may be beneficial for fitting in with a social group. One may believe that they “ought” to hold a certain belief: that it is a right or proper belief to hold, or that virtuous people hold it. Beliefs may have been learned from an authority and never questioned. Often, people do not have the spare cognitive resources to spend on questioning a belief, or feel that getting to the truth is beyond their capabilities and so not worth the attempt. Frequently beliefs aren’t selected for truth, but for expediency or benefit. Cognitive science finds that humans are prone to engage in motivated reasoning, in which the desired belief has already been selected for some other purpose and the reasoning mind is engaged only to provide justification, to others or to oneself. Often motivated reasoning is used because facing the actual truth carries too many negative emotions. Ultimately, too many simply fear the truth. They find truth to be ferocious.
It is not so much the mundane beliefs and facts of everyday life that people find ferocious, but the answers to the great questions of morality, meaning, religion, and philosophy. These are what they find too ferocious to face. Yet they already live in the world as it actually is. Psychologist Eugene Gendlin wrote about this in his book Focusing.
What is true is already so. Owning up to it doesn’t make it worse. Not being open about it doesn’t make it go away. And because it’s true, it is what is there to be interacted with. Anything untrue isn’t there to be lived. People can stand what is true, for they are already enduring it.
This quote is referred to as the “Litany of Gendlin” in the rationality community.5 This community grew around the website Less Wrong, which Eliezer Yudkowsky started in 2009 using material he had written for the blog Overcoming Bias. In this community, rationality is considered to have two aspects: epistemic rationality and instrumental rationality. Epistemic rationality is having accurate beliefs about the world. Instrumental rationality is taking the best actions to achieve your goals, whatever those may be. In pursuit of being more rational, members focus a great deal on learning about and counteracting their own cognitive biases. When thinking about whether their beliefs are accurate, they like to use the metaphor of the map and the territory.
This metaphor originated with amateur philosopher Alfred Korzybski, who remarked that “the map is not the territory.”6 Actual reality is compared to a territory, and one’s beliefs to a map of that territory. This serves as an important reminder that our beliefs don’t always match reality, just as a map may not accurately reflect the territory it is meant to depict. Embedded in this metaphor is the view of truth philosophers term the correspondence theory of truth: that what makes a belief true or false is whether it properly corresponds to reality.
Using the map and territory metaphor, we can more easily see that there are a wide variety of ways our beliefs can fail to match reality. Put another way, it is not simply that we hold false beliefs, but that there are many ways our beliefs can fail to correspond to reality.
One of the least problematic ways a map can fail to match the territory is to simply have gaps or blank spaces. Modern maps generally don’t have these, but historical maps often show distant lands fading off into blank space, or leave the unexplored interiors of regions blank. Here, the map makes it clear that something is there, but does not indicate what. We experience this in our beliefs when we consider something and find that we don’t know the answer. We are knowingly ignorant of a topic. This is generally less dangerous than other kinds of map/territory errors because at least we know that we don’t know.
A more serious issue is an omission from a map, as when it fails to indicate the presence of a hazard or barrier. Reading the map, there seem to be no gaps; nothing appears to be missing. There is no indication of the omission. In our beliefs, this is experienced as unknown ignorance. We believe we know about a given subject, but, unbeknownst to us, our beliefs have important omissions. This is problematic because we make decisions believing we are adequately informed when we are not. We do not know to seek out more information to help us make a better decision, and so may make the wrong one.
Before modern cartographic methods, it was common for a map to show the shape of a coastline or other feature incorrectly. The map indicates the presence of a feature but describes it incorrectly. This can be almost as insidious as an omission. Again, we do not recognize the error in our map, in our beliefs. Yet our beliefs are subtle distortions of reality, leading us astray.
Perhaps the most insidious error is a map that shows features that don’t exist in the territory, as when ancient maps showed mythical lands. We may, like bearers of maps of lost pirate treasure, set off on hopeless quests. Beliefs that correspond to nothing in the world are more likely to cause active pursuit of incorrect goals. The other map/territory errors tend to manifest in failures to act; this error tends to manifest in incorrect action. Great harm can be caused when people act on such beliefs, often attempting to force others to conform to their false picture of the world.
Given the many ways our maps can fail to match the territory, and the potentially high price of the resulting mistakes, it behooves us to attempt to correct our maps. Indeed, that is the very project of epistemic rationality. Yet has the rationality movement actually made its members more rational? Consider that in a 2016 survey of Less Wrong users, only 48 of 1 660, or 2.9%, of respondents answering the question said that they were “signed up or just finishing up paperwork” for cryonics.7 This despite the fact that Eliezer Yudkowsky has argued in the strongest terms possible that it is the rational thing to do, saying “I want you to live.”8 While this is certainly a much higher portion than the essentially 0% of Americans who are signed up for cryonics based on published membership numbers,9 it is still a tiny percentage considering that cryonics is the most direct action one can take to increase the probability of living past one’s natural lifespan. If “rationality is systematized winning,”10 then it would seem that involvement in the rationality community hasn’t increased rationality very much. It has been objected that the problem is not a failure of epistemic rationality but of instrumental rationality, and that the Less Wrong site does not focus on that aspect. This objection is consistent with the fact that 515, or 31%, of respondents to the question answered that they “would like to sign up” but haven’t for various reasons. Beyond that, when asked “Do you think cryonics, as currently practiced by Alcor/Cryonics Institute will work?”, 71% of respondents answered yes or maybe.7 I will concede that Less Wrong does not focus on training instrumental rationality and that there is a disconnect between beliefs and actions. However, the distinction between instrumental and epistemic rationality is not so clear cut.
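The survey proportions quoted above follow directly from the raw counts in the cited 2016 Less Wrong survey; a quick check of the arithmetic:

```python
# Verify the survey percentages quoted in the text.
respondents = 1660   # respondents answering the cryonics question
signed_up = 48       # "signed up or just finishing up paperwork"
would_like = 515     # "would like to sign up"

pct_signed = 100 * signed_up / respondents
pct_would_like = 100 * would_like / respondents

print(f"{pct_signed:.1f}%")      # 2.9%
print(f"{pct_would_like:.0f}%")  # 31%
```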
If a reliable and trustworthy source said that, for the entire day, a major company or government was giving out $100 000 checks to everyone who showed up at a nearby location, what would be the rational course of action? It might be argued that the source is more likely mistaken or lying, but assume for the sake of the argument that one does not believe this to be the case. Any course of action that did not involve going down and collecting the $100 000 would likely not be rational. Failing to go would indicate that one didn’t actually believe the money would be received upon claiming it. Likewise, when offered a chance to live beyond our natural lifespan, responses that fail to result in actually taking that opportunity must call into question whether the opportunity is believed to be real. Yet there is a large gap between those claiming to believe and those acting as we would expect based on that belief. This is the divide between professed and actual belief that too often occurs. So we must find ways to correct not only our professed beliefs, but our actual beliefs.
Very few individuals are able to routinely seek and discover the truth. It is my experience that those who do place a great value on knowing the truth. They hold curiosity as a virtue. Curiosity admits its ignorance and seeks to replace it with knowledge. But curiosity is not sufficient unless it is paired with diligent study; otherwise, it is merely idle curiosity. Diligent study comes when we wholeheartedly follow after something. Personally, I place a high value on knowing the truth and learning new things. I wish this value were more widely shared. And so I say to all: let us be curious disciples of ferocious truth.
As of December 31, 2016, Alcor had 1 606 members and patients and The Cryonics Institute had 1 706 members and patients, as compared to the approximately 324 million people living in the United States. ↩