In the previous section, we discussed how we know what we know. The goal was to make the clear and forceful point that truth exists, in the practical sense that we have been using, and that we have methodological tools to find it. Yes, it is difficult. Yes, it is confusing and complex, and we must accept that we will never arrive at absolute certainty. But this lack of total certainty should not lead us to believe that we know nothing and that there are as many realities as there are people in this world.
What we know about anything is never complete. Even in a best-case scenario, that information is incomplete and noisy; in addition, we often fail to weigh and assess it properly. And when we disseminate it (on the Internet, at the family table or at the hairdresser's) filtered by our own opinion, we add additional noise and often contribute to confusion, to irrational doubt. We have no intention of deceiving. Yet, unintentionally, we amplify and spread assertions that are simply not true. Every time we hear an interesting, controversial or amusing rumor and we pass it on without confirming it; every time that, instead of analyzing a fact as objectively as possible, we are more influenced by its source or by whether it agrees or disagrees with what we already believe; every time we blur the boundaries between experts and non-experts in a given subject, we are making things worse.
In this case, our attitudes and behaviors end up generating and propagating, involuntarily, a post-truth situation in which we are both victims and victimizers. And because we all have some degree of responsibility, there is the risk that none of us may really feel responsible. Were we not active and unwitting agents of post-truth, the intentional version of post-truth could not occur. Thus, these two sides of the same coin are inseparable, exacerbate each other and are interdependent.
Several problems in the way we think and relate to each other favor the emergence of this type of post-truth: our beliefs, our mistaken thinking, the way we gather in tribes, our distrust of experts, and the way we relate to the information we access and propagate.
We need to know more about these problems in order to identify, limit or eliminate them, so that we can fight against unintentional post-truth and thus, by extension, address intentional post-truth as well.
Like a hungry person standing in front of the refrigerator, we instinctively move towards what hurts us without realizing it. To get out of the trap, we have to understand our own behavior. We are all vulnerable to post-truth, but awareness of how it works, how we function, is the first step towards vanquishing it.
One of the mechanisms that generate post-truth is the influence of irrational aspects in our way of incorporating information, assessing it, considering it and responding to it. These may be beliefs based on religious, ethical or aesthetic principles, values and traditions, or they may simply be the result of the path we have taken in our lives.
Let me be clear: I am not making a value judgment of the word irrational. I am using it as a practical description of situations in which our beliefs are not based on evidence. We should not be offended, as it is one more characteristic of human beings, one that may help or harm us. What’s important is to try to understand which is the case on each occasion.
We all have beliefs that are sometimes justified by evidence and sometimes not. Let us call irrational beliefs those that are not based on evidence, on facts; beliefs that we are willing to hold without evidence in their favor or, even, despite evidence against them. Some people can easily identify beliefs of this type in themselves, and others, less so. We are very diverse and very complex. To deny our complexity or our diversity in the name of some abstract ideal would be to ignore what is happening, what is real, and on ignorance one can only build more post-truth.
Facts will not tell us what to love or what kind of societies to build, but they will tell us how to find our way to them and know that we have reached our goal by helping us measure it.
One of the various manifestations of our irrational gaze is our values, the moral compass we all have that points us towards what is right and distinguishes it from what is wrong. These values are often influenced by our "journey": where and how we were raised, our cultures and traditions, our experiences, the education we received, the precepts of the religion we profess (if any), or even our genetics. For example, while some may argue that men and women should have equal rights and obligations, others may say that men should have more rights, or more obligations, or more of both than women. While some may think that in their community certain moral principles apply and that different ones should be allowed to exist in other cultures, others may believe that humanity as a whole, regardless of its origin and context, should follow the same moral principles.
Is the welfare of all people important to us? Whatever answer we give, it will be moral and based not on facts but on beliefs about how the world should be. Now, having decided this is our North Star, clearly defining well-being and measuring it with the tools of science will tell us whether we are creating the world we envision. To a large extent, scientific tools give us the tangible possibility of knowing whether or not we are being effective in defending the moral principles we claim to uphold.
An invitation to introspection, for each of us to think about our own moral principles. For example, do we believe that all people deserve moral consideration or that there are exceptions? Do we believe that we should try to increase the welfare of all humanity or should we focus only on our community (understood as our family, close friends, country or culture)? Should democracy be protected at all costs, or should an alternative be accepted in certain circumstances? Are there limits to freedom of expression or should it be unrestricted?
What do we do with all this? How do we balance these values with what the real world is like? How do we bond with each other, knowing that we may agree or disagree on them? How do we fight for what we believe is right while trying to preserve human bonds? These are all urgent and real problems, and we must try to solve them.
BELIEFS, EMOTIONS AND POST-TRUTH
How could irrational beliefs influence the generation of an unintentional post-truth? To begin with, we must set aside all beliefs that allude to subjects that do not refer to the real world and for which there is, therefore, no truth in the sense we give to the word here. We will not discuss these situations, which are very important, but exceed the scope of this proposal. An example of these issues in which we cannot (nor could we) find a truth is the existence of God. And since no truth is possible, one would expect there is no post-truth either, but it is not so simple.
Believers are convinced of God’s existence. Some non-believers are convinced of God’s non-existence. There is no evidence for or against, nor could there be. For others, the question is irrelevant and they behave as if God did not exist, as they find no evidence. The existence or non-existence of God can be a crucial question in a person's life, or something to discuss lightly over coffee. Either way, it will not be possible to go beyond that point, because we will be unable to answer that question by means of evidence and, therefore, to show the truth of our conclusion to those who do not already believe in it.
Religion is a very complex phenomenon that emerged in diverse ways in all civilizations, and many believers feel comforted by their beliefs, which give them a frame of reference and help them bond with each other (more on tribes in Chapter VII). Religious beliefs can be observed even in those who claim not to be religious ("Like the one who says he never prays / and recites Trotsky's program at the table," goes a song by Leo Maslíah).
In principle, it should be nobody else’s business what a person believes or does not believe in, unless that belief somehow affects the freedom of others or puts someone in danger. And this, of course, is a belief of mine. As we see, our beliefs are everywhere. In my case, if I succeed in identifying them (I know I might not), I will make them explicit throughout this chapter.
From now on, then, we will focus exclusively on factual issues for which we can obtain evidence (in the sense discussed in Chapter XV), with the understanding that, whether we have it or not, the important thing is whether or not we could have it.
On most issues, our positions are based on a mixture of evidence and irrational beliefs.
Many discussions on controversial issues - say, the legalization of abortion or the death penalty - are unresolvable because those who argue with us do not clearly recognize the nature of the discussion. Since we argue from our values and not from the facts, the facts do not convince us, and since our values are not shared by those who argue with us, they do not resonate with them either. I am not making an argument either for or against these examples here, but I am raising them to illustrate issues where we do not all believe, in principle, in the same set of ideas. Before we can proceed, we need to keep in mind that people do not have a single way of looking at these issues. Recognizing that we are different is the first step towards rebuilding what binds us, and this is the starting point for fruitful exchanges of ideas. In extreme cases, recognizing that we are different also helps us be aware of when the gulf that separates us is so vast that we would prefer to keep it that way. In either case, with the hope of reaching agreement or with the certainty that we will not, I believe we need to recognize that others have views of the world that are different from our own and as complex as our own. If we do not do so, we remain locked in post-truth, using pseudo-arguments in disguise (and they are pseudo because what we want to change are other people’s values, which are not easily changed with arguments).
These situations are quite different from when Herschel discovered Uranus: he got evidence, which was tested and finally accepted by the expert community, and his discovery was smoothly accepted as new knowledge, a fact of reality that was added to other facts such as the existence of the other planets. Our belief in the existence of Uranus is supported by evidence. We can justify this belief because the evidence is so clear and so consensual that it allows us to consider that the existence of Uranus is the truth. It does not conflict with people's irrational beliefs.
In contrast, complex problems, such as the question of whether or not euthanasia should be legalized, allow approaches based on evidence and also on the culture, tradition or values of the individuals directly involved, as well as of society as a whole.
For example, a discussion on the legalization of abortion can be approached from evidence (How many women die from clandestine abortions? Does legalizing abortion affect the number of abortions performed?), from values (Is causing the death of the embryo equivalent to killing a person?) or from legal arguments (What do our laws say about it? How do our laws compare with those of other countries?).
Aspects related to values may be talked about and discussed, one may agree or disagree with others. But the evidence side is different. The valid way to reject evidence is to show that it is wrong, or to replace it with better quality evidence. Science is challenged with more science. If we have quality evidence, we cannot reject or ignore it simply because we don't like it; if we do, we surrender to post-truth. We surrender even more readily if we conflate values and evidence and assume they are interchangeable. Saying "according to science, an embryo is a person" is constructing a fallacious argument. Not because science claims otherwise, but because science cannot make such a claim. Defending that statement would be just as fallacious as defending the opposite tenet, "science tells us that an embryo is not a person". The attribution of the category of person is not a scientific fact, but a normative, legal categorization, a human evaluation. One that, of course, may be based on (or at least not contradict) scientific evidence as one of the reasons to grant or deny that status to that organism.
What can be assessed on the basis of evidence is discussed at the level of evidence. On the other hand, what is based on values or traditions must be approached in discussions about values or traditions, and here evidence has nothing to contribute. If we try to reduce these issues to an exclusively evidence-based point of view, we will not be able to have a conversation, and this may lead to segregation and fundamentalism. Just as an evidence-based approach that disregards the beliefs of the participants is reductionist, thinking only in abstract ethical or dogmatic terms is tantamount to assuming that no factual analysis is possible at all. Let us try to understand the different aspects and identify which ones can be approached with evidence and which ones cannot.
Truth in politics or in the legal system, for example, is not the same as truth in science, even if it is nurtured by it. What does post-truth have to do with all this? In factual matters, if our irrational beliefs prevent us from seeking and accepting evidence, and lead us to ignore it completely, or cause us to select isolated evidence that supports our position, we are unwitting generators of post-truth. If we have powerful evidence, and we follow it, it is more difficult for us to be wrong, but if our belief is irrational, then we may be holding the wrong position.
If we do not take the bull by the horns, the truth is lost; it is diluted as if it were just another opinion. This creates fertile ground for falsehoods, lies and everything that comes with post-truth to spread. Arguments become emotional, exclude what we do know, and we get caught in a loop.
If we want to participate rationally (that is, in service of a purpose) in major decisions, we need to train ourselves adequately to make informed decisions, and to demand that our political representatives do the same. For that, we need to embrace uncertainty. Fighting against it by denying it is not fighting, it is giving up. This is no easy task, because we tend to transform everything into categorical statements that help us feel secure. But there are many shades of grey between black and white. With practice, we can become more comfortable with uncertainty, accept it and, in spite of it, try to move forward. We need to train ourselves in flexibility: flexibility to think, and think again, if necessary. Flexibility to place our trust in certain places, and to change if we see that we are wrong. Flexibility to be able to look within and understand what is happening to us, to analyze our knowledge, our doubts, our beliefs, our desires. Flexibility to make decisions with all the information we have, even when, as is often the case, it is not enough. Flexibility, also, to change our position when we know more.
Physicist and science communicator Richard Feynman said, “I have approximate answers and possible beliefs and different degrees of certainty about different things, but I'm not absolutely sure of anything, and many things I don't know anything about, such as whether it means anything to ask why we're here, and what the question might mean. I might think about it a little bit, but if I can't figure it out, then I go on to something else. But I don't have to know an answer... I don't feel frightened by not knowing things, by being lost in the mysterious universe without having any purpose, which is the way it really is, as far as I can tell, possibly. It doesn't frighten me."
One of the easiest ways to deceive ourselves is to ignore our irrational beliefs on factual issues. To avoid this pitfall, we should ask ourselves if there is evidence which might change our minds. If there is, we can look for it and assess it, find the consensus, and adjust our position. If there is not, then our belief is irrational.
This can be even more confusing because we do not usually admit that we dismiss evidence when we prioritize our beliefs; rather, we usually turn to reason to find isolated evidence, or apparent evidence, to justify what we already believed anyway. If a newspaper publishes an article stating that, according to a study, drinking coffee is very healthy, coffee drinkers will say "See, it's healthy!", even without knowing whether that study is of good quality or whether the article adequately reflects what was researched. It is still an irrational belief, but it is disguised. And it is an irrational belief because chances are that if the article had said that drinking coffee is unhealthy, we would have flatly ignored that information or questioned it, without changing our stance. It is not that the evidence alters what we believe, but that what we believe alters the evidence we accept, a phenomenon usually known as motivated reasoning.
It is not easy to distinguish between facts and opinions, but the worst are facts disguised as opinions and opinions disguised as facts.
This also has effects on our relationships. When we believe in something, based on evidence or not, we are usually convinced it is true. Since one of the values we hold is that "the truth must be defended no matter what", we will fight for it. And, since our belief "is true", when talking to others, we take it for granted that we are right and that others are wrong.
Post-truth appears here not as a denial of truth, but as the consideration that one's own position is true, as a certainty of something we do not know to be true. The trap is to think that "since I am so capable, intelligent and rational, what I think must be true". Conversely, if you disagree with me, you are wrong. And we are not aware that we are deceiving ourselves. We adopt an irrational position, but we believe that evidence led us to it. As Thomas Henry Huxley said, "what we call rational bases for our beliefs are often irrational attempts to justify our instincts."
A new invitation to introspection.
Belittling the beliefs of others is also a problem. If we cannot see that sometimes people live in tension between their needs and the needs of others, if we do not understand that there are local and global problems with solutions that sometimes conflict with each other, we will miss much of the complexity that we need to understand in order to truly solve problems.
But things can be even more complex. In addition to the irrational beliefs we have, which are part of who we are and are the pillars that support us, there is also the influence of emotions, the factor that modulates the way we receive or incorporate information.
Most things that happen to us pass through an emotional sieve. We feel joy, love, affection, respect, gratitude and many other positive emotions. We also experience negative emotions, such as hatred, anger, indignation, distrust, guilt or fear. Our emotions are leveraged by our values and other irrational beliefs, but they also respond to the particular situation of each moment.
In principle, our emotions should not be of any concern to us. After all, they are also part of who we are and how we see the world, and in matters that are not factual, such as our relationships, they have a lot to contribute. But, again, if we are dealing with a factual issue and our emotions are hindering our access to the truth, problems can arise. When an issue elicits a strong emotional response, particularly from negative emotions, unintentional post-truth is likely to emerge. In particular, if we have beliefs that run counter to the evidence, and add to that an emotional component, what we have is a cocktail of post-truth.
DOES IT ALL MATTER?
Which aspects should be governed by the facts, or at least influenced by them, has to do not only with which ones are intrinsically empirical, investigable through the tools we have already mentioned (see section 1), but also with which ones we believe are sufficiently relevant. What would happen if our beliefs or emotions were making us hold as valid something that is not?
The public sphere (an organization or a state making decisions) is one thing, and the private sphere (each of us acting in our own lives) is another. The reason for this distinction is that the harm caused by bad decisions grows in proportion with the influence and power of the decision-makers. It is certainly possible to make bad decisions with the best available evidence, or for good decisions to yield bad results, since the world is always more or less uncertain. But deciding on the best available evidence increases our chances of not being wrong, and the more potential harm, the greater our obligation to make the best possible decisions.
Thinking about potential damages can help us to be discerning and try to prevent the emergence of an unintentional post-truth based on irrational beliefs, and this is not just an abstract idea. Between 1959 and 1961, China went through a great famine, caused by a combination of factors such as droughts, bad weather and faulty planning. One of the main problems was the implementation of agricultural policies based on the pseudoscience of Soviet agronomist Trofim Lysenko, an ill-spun mixture of Darwinian and Lamarckian evolutionary ideas that, without any experimental evidence, became the official dogma of Soviet agronomy and was later exported to China. The Chinese harvest fell by more than 30% in those years, and estimates range from 10 to 20 million additional deaths in this period. When the going gets tough, adding post-truth (in this case, supported by pseudoscience) makes matters worse quickly. And the cost can be enormous.
Now, even on issues where there is evidence with a very high consensus and that meets high quality standards, some still believe the opposite of what the evidence indicates. To understand how and why (and to serve as a model for understanding other post-truth content that is constructed in the same way), we will see three post-truths in action: the belief that the Earth is flat, the belief that vaccines are dangerous, and the belief that alternative medical therapies work.
A PALE BLUE DISC
Unless we make calculations to launch missiles, or study planetary climate, believing that the Earth is flat doesn't do much harm. Those who believe it are probably neither ignorant nor stupid, but came to that idea by dismissing all the evidence to the contrary.
My position is that people's beliefs are their own business and we should not intervene, except when those beliefs harm those people or their environment, or constrain other people’s freedom, in which case the moral duty to protect prevails. Of course, this is only my position. Some think that no one should believe in something irrational and clearly erroneous, such as that the Earth is flat, simply because something that is not true should not be tolerated. Here, the belief would be that it is crucial to defend the truth in every situation. Others think that if someone believes in something, and that belief can harm them, it is their problem: "Let them harm themselves and others. That’s what they get for believing the wrong things". Here, the belief would be that it is crucial to defend individual freedom over the repercussions that those beliefs may have on the believers and others.
Similarly, something we might consider harmless at the individual level may not be so at the public level. Would we allow it to be taught in schools that the Earth is flat, that belief in a flat Earth is a valid hypothesis, a potentially true statement? Would we be okay if our state devoted taxpayers’ funds to research on the subject?
For these questions I have a clear answer, but I understand that others might not agree with my position because, again, we are at least partially in the realm of beliefs. Particularly, I believe that we do not have to fight against irrational beliefs as a whole, but perhaps only against those that represent a loss of freedom or a risk, such as, for example, those related to health issues. In that case, the choice is clear: collaborating with post-truth can be very harmful.
Vaccines are one of the public health measures that have saved and continue to save the most lives. They are very cheap, safe and effective, and the adverse effects that may be observed are generally very mild. They make it possible to prevent diseases that, until not so long ago, killed millions of people and disabled many more every year.
All this is known, and it is very well known. And when we say "it is known" it is because there is very high quality evidence, including clinical trials, epidemiological studies, meta-analyses, and so on. Moreover, the consensus on efficacy and safety is ample. However, some choose not to vaccinate their children because they believe that vaccines are somehow toxic or dangerous.
Just as, in Anna Karenina, Tolstoy says that "all happy families are alike, but unhappy ones are each unhappy in their own way," so it is with vaccine hesitancy: all pro-vaccine people are alike, but those who doubt do so each in their own way.
Although people who believe that vaccines are harmful are very different, for the sake of simplicity, I will divide them into two large groups, even at the risk of oversimplifying a phenomenon that is so complex and that touches very intimate personal chords. On the one hand, there are vaccine doubters: these people are not fanatics and are where they are because of personal situations. We need to listen to them, to understand them better and thus, perhaps, we will be able to help them. On the other hand, antivaxxers are very few, but very outspoken people, who also behave very differently. They are extremists, they have made antivaccination their identity and feel the moral duty to preach their position to others. We will discuss them later. These two groups of people are very different, and I think that the worst thing we can do is to lump the former in with the latter.
Vaccine doubters are not lacking in information. They know that most people get vaccinated and are generally educated. But something happened in their lives that sowed the seeds of a doubt that grew over time. You don’t have to be convinced of the danger of vaccines to decide not to get one. Doubt is enough, and we will return to this often because it is one of the keys to understanding post-truth.
There is a myth, which was totally discredited years ago, that vaccines could somehow cause autism. It is very clear from the evidence that this is not so, as clear as that the Earth is not flat but round. And we can know this by studying both things: autism and vaccines. Autism is a condition one is born with, not one that develops over the course of life, and it is clear that vaccines do not affect the number of diagnosed cases of autism. Every vaccine is subject to extensive scrutiny before and after it is marketed. Today’s vaccines are of the highest quality and proven efficacy, and the inevitable adverse effects they may cause are generally minimal. Of course, as we advocate evidence-based decision making, we will say that, so far, in very large studies in terms of number of patients and duration, this is so. That is the evidence.
But let us picture this situation: a family with an autistic child who has been vaccinated begins to frequent circles of families in a similar situation, who support each other and, in these circles, they hear about the possibility that the vaccines may have caused autism in their child. What happens then? Faced with this situation, some families begin to doubt, without a factual basis. It is not that facts do not matter, but it is very difficult to take them into account when what seems most relevant is feelings. Accurate information is of little use, in this case, if what is triggered is a set of negative emotions: anger for the situation and its difficulties, distrust towards the State that vaccinates, the society that agrees to be vaccinated and the companies that manufacture vaccines and benefit from it. Let's add guilt (Is our child autistic because we vaccinated him?) and fear (Could the same thing happen to our next child?). All this is a cocktail that can lead, unfortunately, but also understandably, to some of these families deciding not to vaccinate the rest of their children.
Perhaps these people are not totally convinced that vaccines are harmful, but that slight doubt may be enough to tip the scale towards the decision "let's not vaccinate, just in case". This is post-truth too. A post-truth generated by our emotions, involuntarily and as a result of what we do. And, when it comes to health, it is a post-truth that we need to fight effectively. There is great certainty that vaccines are safe; let's look at the evidence, which will protect us better than our negative emotions.
Some of us may try to understand what is wrong with these people to see if we can help them reconsider their position by showing them that it is not only contrary to the evidence but, above all, dangerous for their children and other people’s children. Others may choose to let them have their way, thinking "they're only hurting themselves or their children". But this is not true because, if there are many unvaccinated people in a community, the protection of the community as a whole is impaired, increasing everyone’s risk.
ALTERNATIVE MEDICINE, ITALICS MINE
While we are on the subject of health, where post-truth can clearly do harm, what to do about so-called alternative medical therapies? Alternative medicine is a set of practices that are applied with the intention of curing diseases or improving people's health, but that are not based on evidence: they have either not been proven to be effective or have been proven to be ineffective. These alternative practices have always existed. When we did not have a mechanism to validate their effectiveness, we called them medicine. Things have changed, but alternative practices are still around and will probably continue to exist.
A huge number of practices are considered alternative medicine, such as homeopathy, acupuncture, chiropractic, reiki or ayurvedic medicine. Each is different, of course, and it is difficult to discuss them all at once. But, generally speaking, the response they trigger in patients is indistinguishable from a placebo effect, which is why they are not considered to work. (As we discussed in Chapter III, a placebo effect can be comforting and relieve many of our ailments, but it is not enough to cure us of serious diseases, infections, etc.)
What guides a person to place their trust in an alternative therapy? Sometimes it is the failures of the health care system. Tired doctors who see patients every ten minutes, don't listen, don't offer comfort, and merely write prescriptions. Overcrowded hospitals. Difficult treatments that sometimes are not very effective or that generate serious adverse effects. Those who seek the answer in alternative medicine may not have found it in conventional medicine. Other times, it is not that they firmly believe the alternative therapy will work, but perhaps they have already tried a medical treatment and were disappointed, or they have a disease that has no cure and they say to themselves, "What's the harm in trying?"
Perhaps the alternative therapy works no better than a placebo from a medical point of view, but it provides comfort and improves the quality of life.
In the case of homeopathy, for example, an immense number of studies show that homeopathic preparations are nothing more than a placebo. However, it is less clear whether other aspects of homeopathic practice might be effective to some extent. For example, homeopathic practitioners often spend much more time in the office asking questions and listening to patients, which, in some cases, may result in a better experience for them (even if their underlying ailment does not improve). Sick people need their loved ones close by, just as they need the best medicine the system can offer them. Even if the ritual is "fake," it can function as a bond between people, as an opportunity for a genuine encounter.
But again, we could ask ourselves: what's the harm of wrongly believing that these therapies work? And here is where things get trickier. Suppose a person has a chronic, progressive disease and feels that medicine is unable to provide answers. In these cases, well-meaning people often make recommendations, offer advice such as "doctors know nothing, try this" or "this worked for me". And this person, who is in a highly vulnerable situation, may say "what do I have to lose?" But there is something to lose, and there lies the harm of falling into post-truth. On the medical side, the person could end up delaying, or abandoning, a therapy of proven effectiveness. Generally, this is the issue most often mentioned as a danger of alternative medicine, and with good reason. A study assessed how much the use of alternative therapies instead of anticancer treatments such as chemotherapy or radiotherapy affected the survival of cancer patients. The results were alarming: using alternative cancer treatments doubled the risk of death.6 See Johnson, S. et al. (2017). Use of alternative medicine for cancer and its impact on survival, Journal of the National Cancer Institute, 110(1): 121-124. While not all those who use alternative therapies stop using conventional ones, the risk exists and is measurable.
Some believe that pointing out that these practices have not proven to be effective is cruel because, in some way, it shatters the hope placed in them. What I believe is cruel is giving false hope to someone who is also extremely vulnerable. Once again, a matter of beliefs... Mine is that anyone who convinces a person with cancer that they can be cured with the mind, or a special diet, or quartz crystals, or whatever, simply has no empathy. It is not just the claim itself, which is unsubstantiated; they also somehow seem to be telling the patient that if they are still sick it is because they are "not making an effort," as if having cancer were their fault.
Surely, having a positive attitude benefits the patient. It adds, it does not subtract. But telling someone with cancer that they don't need chemotherapy because they would benefit more from taking a meditation course seems inhumane to me.
There is another issue with "trying an alternative therapy, just in case". Assuming that something works unless proven otherwise is an argument tantamount to saying that reading this book prevents naval accidents, on the grounds that a ship never sank while one of its sailors was reading it. If, unfortunately, one day a ship were to sink with a reader of this book on board, we might argue that he had a bootleg edition, or that it had been underlined with yellow highlighter, which demonstrably reduces its efficacy against the perils of navigation. Those who believe, believe...
Personally, I would never judge the beliefs of someone who, when faced with the despair of a serious illness, turns to the hope offered by alternative medicine.
But do those who apply these practices to their patients really believe in them as well, or is it just an opportunity to make a profit? Who is "responsible" for proving that the practice actually works? We have already remarked that the burden of proof should be on whoever makes a claim.7 See Chapter II. Asking others to prove that the alternative practice is effective reverses the burden of proof. This, which may seem confusing, is often a deliberate strategy to delay processes: claims that are based on evidence must be defended all the time, and this means that resources, which are always limited (money, time, attention), are spent on proving things that we already know.
Again, if a single person believes in an alternative therapy and feels that it is good for them, we can live with that. But would we be as flexible if our public health system decided to make the practice universal? Can we allow these therapies to be taught in medical schools?
Even if something is "known" to be wrong, even if the information is available, the discomfort we feel from facts that are contrary to our beliefs makes us dismiss that information unconsciously. Paraphrasing the idea of Occam's razor, biologist Sidney Brenner coined the expression "Occam's broom" for the process by which facts that contradict our beliefs are "swept under the rug," something that is key to post-truth.8 We discussed Occam's razor in Chapter II.
The question is what to do with personal and irrational beliefs within the framework of States, which must promote effective public policies that really improve people's lives. When States reconsider their health policies, they may or may not take into account cultural aspects, values, traditions, etc., but we hope that they will be able to look for and incorporate evidence.
In these cases, where there is a reality that we can know to a greater or lesser extent, allowing our irrational beliefs to obscure the truth is a problem. If we believe that what we think and feel is equivalent to the facts, we have a problem. Basically, if we consider that our internal perception of how the world is - or should be - is as true as reality, we are in trouble. If we get there, then we’ve arrived at post-truth. And finding our way back is often difficult.
I know it is hard to accept that this happens to every one of us. We have no doubt that it happens to others, but does it also happen to us? That is why I thought it was important to first go through how we know what we know. We can know, and we have ways of knowing. Once we accept that solid premise, we can see that treating reality as nothing more than a social construction, or saying "you have your truth and I have mine", somehow legitimizes the idea that reality and fiction are equally valid. Accepting any statement as potentially true just because it seems accurate to me, or because it should be, while failing to consider the evidence (or flatly opposing it), is surrendering to post-truth.
Our personal beliefs can be challenged; we can rethink the issues and course-correct. We need to be able to understand the truth even if it goes against what we believe to be true. As Carl Sagan said in 1995: "Science is more than a body of knowledge, it is a way of thinking. I have a foreboding of an America in my children's or grandchildren's time, when America is a service and information economy; when almost all the major manufacturing industries have moved to other countries; when enormous technological powers are in the hands of very few, and no one representing the public interest can understand these issues; when the people have lost the ability to set their own agenda or intelligently question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what is true, we slip, almost without realizing it, back into superstition and darkness."9 The Demon-Haunted World: Science as a Candle in the Dark, 1995. Understanding the truth, knowing how the world works, is not, then, just a matter of personal satisfaction. It is also a political agenda: that of living in a society where we know enough to, at the same time, take advantage of what knowledge has to offer us and use knowledge to intelligently question the direction in which we are going.
BELIEVING DESPITE THE EVIDENCE
When we take certain positions or make decisions, we are not always motivated by what is true. It's not that facts don't matter: they may matter a lot, more or less, or not at all, and much depends on how willing we are to let them matter to us, or to what extent we are able to take them into account.
As an example, let's consider three extreme scenarios in which, even in the face of high-quality evidence and a very wide consensus among experts, positions are held that go against the evidence: belief in conspiracy theories, denialism, and postmodern relativism. They can feed off each other, although they also operate independently. They all happen in a post-truth context: truth is brushed aside and one's own perception of the world is prioritized. They all feature an influence of irrational and emotional aspects that overshadows the ability to consider what is known, and thus allows the emergence of post-truth. All three scenarios are also similar in being absolutely refractory to the available evidence. There are no uncertainties in these positions, only certainties. Of course, those who hold these views see themselves as rational, as people who are capable of seeing what others do not see: an enlightened few in a deluded mass.
Conspiracy theories are based on the idea that there were and are powerful groups (governments, companies, etc.) that use their power to hide truths and impose lies. They do this in secret, without anyone ever knowing, except for some people who, for some reason, are able to see what others do not see, or to access information that others cannot.
Of course, there were, are and will always be real conspiracies. Sometimes, it takes a long time before they are exposed. In those cases, convincing evidence allows us to identify them. But the term conspiracy theory is different from conspiracy, because it implies that it is something not known to be true, or known not to be true because it contradicts everything that is known.
Some of these conspiracy theories are almost naive, and very amusing. Perhaps some actually believe in them, while others may spread them more for fun than in earnest. Take, for example, the claim that Paul McCartney died in 1966 and was replaced by a doppelgänger. This implies assuming that his relatives, his acquaintances, his manager and his lawyers are all keeping the secret. It also implies assuming that nobody but these few enlightened people who know that Paul died has identified the look-alike. And one must also assume that there are powerful groups at work to protect the secret, and that they are being successful!
The world is very complex and very confusing. In the midst of all this, even if conspiracy theories go against the evidence, it is understandable that some people may believe in them. This feeds our human need to assume that things happen for a reason and are not due to some uncontrollable combination of complexity and chance. If something is coincidental, conspiratorial explanations are likely to appear, if for no other reason than to attribute the event to some agent. Just as we perceive (exaggeratedly) that our acts are the result of our will, as soon as we see acts in the world, we assume that they also originate from someone’s will. And if we do not see the supposed agent, the puppeteer, then we create it, as we created the shapes of constellations in the starry sky. It is a way of defending ourselves: it gives us back a sense of control and rationality in this world that sometimes makes us feel as if our influence over it is non-existent. It is not ridiculous for someone to believe in a conspiracy theory. But we need to determine whether doing so brings harm or not.
Going back to our previous examples, some of those who believe in a flat Earth believe that NASA is hiding that fact.10 See Chapter V. And not only NASA, but also all the airline pilots and all the people who have traveled around the globe. They believe that space travel is a lie, and consider every single photo of the planet taken from space to be fake. Since the supposed conspiracy must be kept secret, thousands of people would have had to be involved in it for decades. In any case, neither the death of a single (talented) human being nor the Earth being flat (or not) will have a significant effect on most people's lives (unless we consider the damaging effect of unquestioningly believing something to be true that is not).
This, as before, may apply to individuals, but not to States or educational institutions.
But let's go back to vaccines. The group we mentioned earlier, the doubters, is not a group that believes in conspiracy theories. Even if that doubt is enough to keep them from vaccinating, if we can listen to them and understand their fears, we may be able to help them. But there are other people, anti-vaccination activists, who are totally resistant to evidence: the true "antivaxxers".
These people hold ideas such as this: if Bill Gates donates vaccines in Africa, it is because he actually seeks to poison Africans so that they will die and thus depopulate the planet. The fact that populations that receive vaccines have longer, rather than shorter, life spans does not affect the belief. Others are convinced that vaccines cause cancer, and they continue to believe it even when shown that there is no difference in the number of cancer cases between vaccinated and unvaccinated people. Moreover, today there are two vaccines that can prevent cancer: the vaccine against hepatitis B and the vaccine against HPV, the human papillomavirus, and it has been shown that, since mass vaccination, the incidence of these types of cancer has dropped sharply. But none of that makes them change their position.
What they also sometimes do is to take real evidence and misinterpret it. For example, a typical argument among those who hold this idea is that cancer is on the increase. From this they conclude that, "clearly", it is vaccines that cause cancer. It is true that there are more and more cases of cancer, but the correct explanation is this: people are living longer and longer, thanks to recent medical and technological improvements (including vaccines!), and cancer is more likely to occur in older people. But if you believe with total certainty something that goes against everything we know, you will fail to incorporate the correct information. Most likely, each new piece of evidence will be reinterpreted to fit that previous belief.
That's why I came up with the Pocket Survival Guides. If the most reliable evidence points to something different from what I believe; if the consensus is contrary to what I believe; if what I consider evidence is of poor quality; if I invoke the existence of things that nobody manages to find and give ad hoc explanations such as "all the evidence of what I say is being silenced by the power groups"; then I should, at the very least, review my position, as long as I consider that what motivates me is the search for truth, and not living in a deception that comes not from power groups, but from myself.
There are radical antivaxxers who believe that a microchip (curiously as undetectable as the unicorn we talked about in the first section of this book) is injected with the vaccine, and that this is a government plan to control the population (the assumption that successive governments, even if they cannot agree on practically anything else, will agree on the imperative need to continue to keep this secret does not change that belief either).
These extremists are very few, but very active, and their message, unfortunately, may influence others and cause them to hesitate enough to decide not to vaccinate.
Let me take this opportunity to clarify something in all honesty. Post-truth in vaccination is an issue that I consider extremely important and dangerous. I believe that vaccine doubters are not usually well treated, listened to or taken into account. Their doubts usually fill them with anxiety and nobody gives them an answer. If we want to connect with them, we should reach out and listen to them with empathy and respect. The case of antivaccine fanatics, I believe, is totally different (and I say "I believe" because my personal position on the subject is not based on evidence, but rather on my irrational beliefs). They do not hesitate; they are certain and, by spreading an erroneous message, they keep others from getting vaccinated. Whether they do it out of a firm conviction does not seem relevant to me. I think they are a danger to public health, and we should not let them proceed or help them spread their messages.
How can we identify a conspiracy theory? Generally speaking, it goes against the consensus and is not supported by evidence; or it claims to be supported by evidence, but, upon closer inspection, we see that this evidence is of poor quality or that its interpretations do not follow the principle of parsimony. And when that evidence is dismissed because of its poor quality, the idea of conspiracy, of concealment, is only reinforced. No evidence can debunk a conspiracy theory: it feeds on ad hoc explanations and on negative emotions such as fear, hatred, and the insecurity that comes from feeling out of control. Conspiracy theorists consider that the experts on the subject are not real experts, or that they are in the pockets of interest groups, always secretly, of course. More importantly, they fail to take into account the number of people who must be keeping the secret, and how effectively. In a world where we know everything about the private lives of presidents and the most powerful people on the planet, this is hard to accept.11 Some estimates have been made of how many people would have to be keeping the secret in different conspiracy theories. For some, it is tens to hundreds of thousands of people, over many years. See, for example, Grimes, D. R. (2016). On the viability of conspiratorial beliefs, PLOS One, 11(3). Conspiracy theories are "zombie" ideas: even if we try to kill them, they just won't die.
Challenging knowledge is welcome, as long as challenges are based on evidence. In the post-truth era, what we have is a breeding ground for the emergence and spread of extreme irrational beliefs such as conspiracy theories.12 We have discussed beliefs in this same chapter. Perhaps we should try to identify and block these ideas, not only because they are not true, but also because allowing them to spread can help create a climate of doubt that favors the emergence of unintentional post-truth on other issues. The problem with living with other people's conspiracy theories is that, to a greater or lesser extent, they are contagious. Creating doubt nurtures post-truth. When ideas are harmful, such as those promoted by antivaxxers, there is a real danger to consider. Beyond that, tolerating conspiracy theories leads to the feeling that "anything goes". This anything-goes-ism is dangerous per se, because it spreads distrust and doubt to other topics, carelessly and indiscriminately, and makes us forget that distrust and doubt are essential tools of critical thinking, as long as we adjust them to quality criteria that consider the evidence.
Conspiracy theories have always existed, but today they are easily fed by the extreme polarization observed in society and by the filtering of the information we allow to reach us.
There is another extreme, sometimes very harmful, version of irrational beliefs that set aside everything that is known: denialism. In this case, there is outright denial of something real. Some denialists rely on conspiracy theories; others less so. But something they all share is absolute certainty about something that goes totally against what is known, even on topics we know a lot about. This is intentional post-truth in its purest form. Denialists cannot help believing what they believe, and it is very difficult for them ever to correct that mistaken belief. If someone denies that we landed on the Moon, or that species evolved by natural selection, at most they will be wrong, but that will have no major consequences. Holocaust deniers, or those who deny that HIV causes AIDS, are in a different league.
Not so long ago, HIV/AIDS denialism had appalling consequences. When there was no longer any doubt about the causal relationship between HIV and AIDS, the physician Peter Duesberg, ignoring the abundant and overwhelming evidence available on the subject, became an active campaigner for the denial that HIV causes AIDS, with considerable success. He managed to get his message across to many people, including Thabo Mbeki, who was president of South Africa from 1999 to 2008, following Nelson Mandela. Like many other African countries, South Africa is heavily affected by the AIDS epidemic. Despite the strong scientific consensus, Mbeki listened to Duesberg and, in 1999, at the height of the epidemic, stated that HIV did not cause AIDS. According to him, AIDS was caused by poverty, malnutrition and immune system issues. But that was not all. Although antiretroviral drugs to combat AIDS were available through international aid, Mbeki actively blocked their delivery to those in need. Instead, and without any evidence to back it up, his government claimed that AIDS could be treated with vitamins, garlic, lemon juice and beet.
This decision had very clear and specific effects. Between 2000 and 2005, some 2 million people died of AIDS in South Africa. It is estimated that at least 330,000 of these deaths, one in six, could have been prevented if a proper HIV/AIDS health policy had been implemented.13 See Chigwedere, P. et al. (2008). Estimating the lost benefits of antiretroviral drug use in South Africa, Journal of Acquired Immune Deficiency Syndromes, 49(4): 410-415. Denialism can kill. But because we sometimes find it hard to see the relationship between actions and consequences, and because we find it hard to judge action and inaction in the same way, neither Mbeki nor Duesberg was ever, or will ever be, charged with genocide at the international tribunal in The Hague.
We arrive at the third position that is totally resistant to evidence. A very extreme position (perhaps the most extreme we've discussed) but, at the same time, a very common one, which we must watch out for. Some consider that there is no truth, period: that there is no reality that we all share, no facts, only interpretations. This position is a type of relativism, to which the rules we discussed in section 1 of this book do not apply. Reality is considered a social construction, and truth is assembled by each person based on his or her perception of the world, on what he or she feels. The possibility of knowing through evidence is denied; the ability of science to obtain answers is denied. Instead, it is argued, there are as many ways of knowing as there are subjective realities, and all are equally valid. If you have your truth and I have mine, if everything is subjective and nothing is objective (attainable or not), what can evidence say about that? Nothing. It is not a question of thinking that, sometimes, there may not be agreement about what the truth is, which is clearly possible, or that there may be different ideological positions or values. In this view, for something to be true, all it takes is for someone to believe it is true; that is the mark of relativism.
This type of relativism is strongly influenced by the postmodern cultural current that emerged during the 20th century as a reaction to the enlightened values of Modernity, which, with optimism and confidence, held that the human capacity for art and science would give us totally objective knowledge. Thus, we moved from the idea of total objectivity to that of total subjectivity. Today, we understand that science can never be totally objective, but it does manage to be, in practical terms, better than the alternatives.
Often, in the face of a cultural current that proposes something extreme, an extreme reaction soon arises, but in the other direction. Today, it is clear that science will not provide totally objective answers, nor will we reach absolute knowledge. Our position was precisely to introduce science as a tool that, for certain purposes, works better than the alternatives, in the sense that it allows us to obtain answers that correspond better to reality. It is by accepting its limitations that we can value it. The problem of using a screwdriver to drive a nail is not solved by claiming that all tools are equally wrong, but by finding the hammer.
Postmodern relativism is still prevalent, even though it rests on a completely erroneous conception of what science is.
Somehow, in certain circles it became acceptable, and it was even considered a sign of intellectuality, to doubt the existence of a shared practical reality, as if believing in facts were a way of self-deception, of submitting to the arbitrary rules of others, and what should be considered valid is one's perception of the world.
People who believe this are usually educated people, considered intellectuals and even progressives. But this current of anti-scientific progressivism is, in practice, anti-progress, because instead of challenging the authority of scientists (something that is always necessary), it challenges the authority of science. However, this criticism is not methodological. It does not propose to replace one way of evaluating evidence with another, but rather confronts the very notions of evidence and of the objective existence of the world. Ultimately, it is a technique of self-protection: if nothing is true, then everything is equally true, and thus ideas are shielded from any criticism. It is a denialism similar to those we discussed before, but not about a particular subject: about science as a whole.
In postmodern relativism, the role of evidence and of the factual analysis of situations is blurred. So, just as we have evidence-based medicine, it is considered possible, and even more promising, that there are other types of medicine, with other "rules". Under these rules, claims are not tested; the argument rests instead on vague ideas such as "there are other ways of knowing", "there are ancestral medicines", "ancient civilizations already knew how to cure diseases", etc. Thus, it becomes possible to speak of a "Western" science or medicine, opposed to an "ancestral" one which, in some way that is neither specified nor tested, would be more "respectful of individuality".
I know that I am allowing my position on postmodern relativism to creep in. Since I cannot help it, I am making it explicit. I think there is something very damaging at play in this extreme belief. Someone holds it, and those who listen to it often remain silent, as if disagreeing about whether there is a reality were a matter of positions that can be put aside and moved on from, as if it were inappropriate to discuss it in public. And this silence is also harmful, because it makes it easier to spread that message without questioning it.
We can be wrong about what the truth is, and in that case we must recognize our error, but to deny outright that it exists and, therefore, that we have ways, however imperfect, to access it? Personally, I often see this position in circles whose members consider themselves very intellectual, but it seems to me exactly the denial of what intellectual activity should be.
Discussing the existence of reality is not just a mental exercise. It is a posture that slowly but surely creates mistrust where there should not be any and, to make matters worse, directs its "skeptical" attention there rather than where it could be useful: a review, within the framework of science, of whether things are being done well or not.
Postmodern relativism is seductive. It offers a narrative of great ideals, of emotions. It is a posture that, by not being subject to "external rules", allows all "truths" to be equally worthy and equally respected. And this, of course, is comforting, especially if the alternative is to submit to a world with unknown, complex and often inaccessible rules, a world that looks cold and distant and does not care if we live or die, contribute or not, a world in which we are not as important as we feel we should be.
I can sympathize with relativists. Rationally, though, I cannot agree that theirs is a valid position, because it ignores the role of evidence. In factual matters, there is one truth, and the rest is false. We may fail to know that truth and still hold that it exists.
I believe that much of the emotional pleasure derived from relativism can also be found in other activities, not all of them intellectual. Curiosity, the challenge of going beyond, is the attitude that gives me the comfort and emotional support others may find in relativism.
Postmodern relativism is one of the supports of post-truth: when anything seems possible, we have a problem. Not only does truth take a back seat, but "truth" starts to be the truth of the loudest person in the room, or a "subtruth" of each isolated group which, then, can no longer inhabit the same reality as the others. Along the way, we lose not only the truth, but also the human bond.
Relativism rejects both the mechanism of science and all of the answers science provides. And this is key: any knowledge obtained by science is fully open to criticism.
Even with its limitations (which are intrinsic, because a perfect science would not be science), today there is no better way to obtain reliable answers to factual questions. Attaching adjectives to science's results (regardless of the adjective used, because there are circles in which people speak of Western, feminist, patriarchal or hegemonic science) is not a way to solve its problems, but a blatant attempt to tear science down, and not in favor of other, more democratic, more egalitarian, or better forms of knowledge. It is an attempt to replace it with tribal forms, secure in their isolation and poverty. It is a return to magical thinking.
We may not know a subject, we may know it imperfectly. But there are not so many alternative ways of knowing. If we don't agree on this, we can't move forward. There is no alternative knowledge. The toolkit is the same for everyone. If we set it aside, we are not playing the same game, and we could be fooling ourselves and others.
As Marcel Kuntz puts it, "the danger of a postmodern approach to science, which seeks to include all points of view as equally valid, is that it slows down or prevents the scientific research that is needed, even denying that science has a role in those decisions."14 Taken from Kuntz, M. (2012). The postmodern assault on science. If all truths are equal, who cares what science has to say?, EMBO Reports, 13(10): 885-889.
More importantly, while they act with good intentions and innocence, convinced of their position, many relativists leave fertile ground for extremists who use the same arguments to deny the validity of well-demonstrated facts.
IN PRAISE OF UNCERTAINTY
The two great difficulties we encounter when we approach irrational beliefs linked to factual issues are the discomfort generated by the lack of absolute certainty (science, as we have already said, cannot provide it) and the fact that obtaining evidence is often a slow and complex process. What is known is always known incompletely. We move along a continuum of certainty in which each new piece of evidence adds or subtracts a measure of support for our belief. We could always wait for new evidence, make more observations or experiments, and consider the issue from another point of view. What we already know, or think we know, may be proven wrong. This lack of absolute certainty in science, which many consider a weakness, is its greatest strength, since it allows us to continuously test what we already know in order to get closer to the truth. Of course, if we expect absolute certainty about the world, science and its results will cause us much anguish.
At best, what scientific activity can do is reduce uncertainty to a minimum, almost zero. But it often fails to do even that, and this is extremely uncomfortable for all of us. There is no such thing as absolute objectivity, the absolute impartiality sought by the Enlightenment, nor the absolute subjectivity of postmodernism. What exists is a partial objectivity, accessible through the mechanisms of science, sustained by evidence that, even when carefully considered, may have biases, limitations and errors. Although we understand that the analysis of reality through evidence entails uncertainty, we want certainties. And if there is something in this uncertain world that provides certainty, it is our irrational beliefs. Beliefs can mitigate this anxiety; science cannot, even though science enables us to understand reality better than beliefs do. This means that, if we feel confused or lost, if we feel insignificant in this complex world, it is often comforting to place our trust in beliefs, or in people who offer us relief from this discomfort. If we find the uncertainty of science intolerable because it does not provide clear and convincing answers, or because it does but we do not understand them, then it makes sense to seek refuge among the solid pillars of our traditions, values, ideologies or religions.
The second point is that science is difficult, slow, complex, and does not even guarantee results. It involves always doubting whether you are on the right path. But this is how you arrive at answers that, limited as they may be, are genuine. Once we have the evidence, and we have reviewed it thoroughly, we must still move cautiously, on the alert, with an attitude of healthy skepticism in which, temporarily, we trust assertions to the extent that they are supported by quality evidence. A tiresome process, one that requires constant effort.
It is something personal, but to me it does not seem positive when a scientist or a science communicator presents science as something simple, clear, firm, immaculate (even glamorous or funny); when its flaws are hidden and the tortuous process that led to the validation of an assertion is edited out of the story. In my view, being afraid of complexity and hiding this aspect of the scientific narrative is problematic because it does not portray science fairly. Moreover, I think this is sometimes done out of a certain contempt for the intellectual capacities of society.
Denialism is sometimes confused with an attitude of healthy skepticism. But these are two very different attitudes, and we need to distinguish between them, because just as denialism is a possible path to post-truth, healthy skepticism is one of our best weapons against it.
Skepticism involves weighing the evidence and, from it, drawing a conclusion; it is doubting an assertion if it does not have solid evidence behind it and trusting a statement based on the weight of evidence. That is also why reaching conclusions is a slow and complex process.
Unlike skepticism, denialism starts from the desired conclusion and, based on it, rejects the evidence that contradicts it. What distinguishes skepticism from denialism is that, if contradictory evidence emerges, the former allows us to correct our position, whereas the latter does not.
But it is easy to get confused, as some denialists call themselves skeptics: those who deny the existence of climate change consider themselves "climate change skeptics". However, to oppose a scientific consensus as overwhelming as the one behind climate change is not to be skeptical, but gullible. Credulous enough to embrace an irrational belief that implies not only disbelieving the scientific consensus, but actually believing that there must be a conspiracy, and that thousands of people have agreed to keep the secret.
The key is to maintain a state of reasonable doubt. If we continue to doubt something despite plenty of evidence that it is true, we are not being skeptical: we are holding an irrational belief. And, while demanding more evidence, we are dismissing what is already known and choosing ignorance. Expecting science to make a definitive statement about something --where "definitive" is associated with absolute certainty-- is expecting something that is not going to happen, and it fails to admit that science may already have made a definitive statement on the subject, where "definitive" implies a fairly high degree of certainty, supported by evidence.
This attitude, which may be innocent and unintentional, is also one of the strategies used by some who hold discredited positions, such as climate change deniers or antivaxxers: requesting new evidence while dismissing all the existing evidence, however powerful it is and however strong the scientific consensus it generates. The same strategy is used in "non-scientific" topics more closely related to post-truth, in politics and other fields. Identifying it in scientific topics could help us identify it in others. This is particularly difficult because those who apply the strategy appear to be respectful of evidence, skeptics rather than denialists. We need to be able to distinguish criticisms that are reasonable and help a claim to be adequately tested from criticisms that create, intentionally or not, a cloud of doubt around something that can already be considered settled.
Believing against the evidence cannot be equated with Galileo confronting the Church with his belief that the Earth revolves around the Sun. The fact that science has validated some beliefs which were once rejected by the establishment, or considered crazy, cannot be extrapolated to any situation. Rejecting ideas because there is evidence against them is not being "closed-minded": it is using science.
Sometimes, however, the evidence is not really strong, but the pressure is on us to make a decision. This is one of the great difficulties of evidence-based decision-making in this real, non-ideal world. What if there is reasonable doubt, but we still need to make a decision? Evidence-based medicine offers us a possible way forward: making a decision based on the best available evidence. In these situations, too much hesitation can be paralyzing. And not making a decision is also a decision. We have to trust and doubt at the same time, in an attitude of healthy skepticism. Blind doubt is as harmful as blind trust.
To conclude, we will briefly comment on an irrational belief that has a particularly clever disguise: the irrational belief in science. This is another way of unintentionally creating a post-truth scenario. Here, it is not a matter of casting doubt on something that is known, but of promoting certainty about something that is not. We may not realize this is happening, and intentional post-truth can easily be generated in this way.
Having to make an urgent decision without enough evidence is one thing: without certainties, you do what you think is best, considering the best available evidence. But considering something true just because some isolated evidence seems to support it is quite different. Perhaps, from a distance, both situations appear similar, but in the first case a healthy skepticism may allow course-correction if new evidence that warrants it emerges, while in the second case there is an irrational belief, a certainty that may reject new, contradictory evidence.
The world is complex, difficult, and has no interest in us understanding it better. We do the best we can, but our emotions and beliefs will often come between us and the truth, further complicating our access to it. Still, tools that can protect us are within our reach.
American comedian Stephen Colbert says that there are "those who think with their heads" and "those who know with their hearts". On one side, those who are guided by reality. On the other, those who embrace fantasies. However, we should not get caught in a false dichotomy between a "scientific side", based on evidence, and an irrational side, based on emotions, values and various beliefs not supported by evidence. This dichotomy is useful to talk about the subject, but it is also false, because each of us lives on both sides, which, in practice, cannot be separated. Except in very extreme situations, our positions are generally shaped by evidence whose reading is influenced by our values and emotions. In any case, each of us is thinking with our head and knowing with our heart at the same time.
Sometimes this is positive, because it allows us to contextualize the truth of science in particular human situations. The analysis of many complex problems we face today needs to be evidence-based, but the evidence must be framed by values, traditions or emotions. Other times it is negative, as when our irrational beliefs overshadow the evidence and prevent us from identifying and accepting the truth. In the fight against post-truth, identifying our irrational beliefs and checking them against our emotions can make the difference between life and death.
There is a reality out there, and we all share it, whether we like it or not. Part of that reality is the fact that we can never totally rid ourselves of the irrational. Nor should we, for the sake of our well-being and that of others, blindly follow our personal perceptions, even going against what the world is telling us about itself.
This is how we stand regarding our own irrational beliefs. But what about those of others? Just as we cannot label ourselves as purely "rational" or "irrational," the same is true of others. It may seem to us that a person's belief is ridiculous or stupid, but to think that without looking at ourselves in order to identify which of our beliefs may give that impression to others is, at the very least, incomplete. If others, or we ourselves, are brushing facts aside in order to adopt an irrational belief, we should not assume that this happens out of ignorance, stupidity, or malice. The mechanisms that make us think this way are more complex, and we all have them. We can all mix evidence-based arguments with arguments based on values or other similar considerations. Understanding and accepting this can help us understand and correct ourselves. In addition, there is a permanent trap in our thinking: we all see ourselves as rational beings with access, to a greater or lesser extent, to what is true.
And when I say "all", I emphasize it. Those "others" from whom we sometimes feel so different also think the same thing.
People are neither totally rational nor totally irrational. Believing otherwise is not only false; it also makes us miss all the nuances in between.
With this in mind, one thing we can do is not only try to identify our own irrational beliefs, but also disclose them. If we say something, we should make it explicit whether we are speaking from the evidence --and, if so, provide it--, from our beliefs or from a combination of both.
I have tried to do this throughout this chapter. That is why I used "I believe that": to signal my beliefs so that others can decide whether or not they agree with them. It may seem like "weak" language, as if I were not sure of what I was saying. It is exactly the opposite: it is a strength, not a weakness, to make clear whether our lack of certainty about something we say stems from the fact that we simply cannot be totally certain about it (because the evidence does not provide absolute certainty), or from the fact that we are expressing opinions rather than unquestionable truths. Likewise, it does not seem appropriate to me for someone to present their opinion as if it were a fact, or to speak with absolute certainty when they have no way to support that certainty with evidence.
We need to recognize that, although we may want to be guided by reason, we may be guided by emotion. Also, there is always the issue of respect and empathy. In that sense, it can be helpful to distinguish people from their ideas.
People deserve our respect and empathy, but ideas do not: we should be able to put the latter to the test without it being threatening to the former. It is good for others to test our ideas, and to allow us to test theirs. If irrational emotions and beliefs may be hindering access to the truth, we need to identify them and try to set them aside, or we risk contributing to the creation of unintentional post-truth.
This chapter, the first of five that make up the second section of this book, discusses the influence of our irrational beliefs and emotions on the unintentional creation of post-truth. We need to identify to what extent our emotions and our beliefs affect us, and modulate our positions as necessary. If they are blocking us or making it difficult for us to access the truth, we are allowing them to create unintentional post-truth.
Here we introduce the fourth Pocket Survival Guide, with new questions that serve as tools to help us identify our irrational beliefs, assess how much they matter, and decide how to take it from there. These questions are grounded in introspection, without which we cannot move forward.
We have just considered irrational aspects that can make it difficult for us to access the truth. We will now turn to, if you will, more rational issues: the way in which we reason may --and usually does-- cause us to make mistakes. These are errors in our thinking that we cannot easily identify, and here, too, we will need introspection to detect and overcome them. Let's get to it: we'll take the fourth Pocket Survival Guide, with these new tools in the box, and proceed.