Common ground




The search for truth takes many forms, from a private and personal quest to a shared, collective and practical one in which we try to discover and inhabit that truth with others and thanks to their help. 

In our fight against post-truth, seen as the impossibility of inhabiting the same space as other people, one of the aspects we will have to understand better is how to help find that common ground. That is, how to get people to accept the truth, incorporate it and use it to build their position and decide on their actions. All this without forgetting that, many times, we ourselves are the people who need to accept the truth, incorporate it and use it to change our position. 

If we believe something to be true, that is, if we consider it a fact, but another person does not believe the same, what do we do? It is at this point that the most difficult question becomes pertinent: are we going to settle for being right in the face of that mistaken other, even at the cost of knowing that we inhabit realities that do not overlap, spaces we cannot share? Or are we going to seek to connect with that other sincerely, asking them to be willing to consider new information? Can that connection exist if we ourselves are not willing to consider their point of view? And given that the mismapped territory could be our version of the truth and not the other's, are we able to modify our positions if new information shows us that we were wrong? 

To share with others our perspective on the territory we understand as truth, the easiest approach would be the intuitive one: simply talk and hand over the information we have. This tactic would be equivalent to getting up on a soapbox in the square and preaching our truth to the four winds. But does that, besides satisfying our desires, work? If what motivates us is to feed our own ego and show how smart we are, how well we know the terrain, it will work, and we will feel very good about ourselves. If we want to talk to people who already think like us, it will also work, and it will be a clear signal to the tribe. But if what matters to us is that our message reaches everyone, and particularly those who do not think like us, what do we do? How do we construct an effective way of sharing our perspective, so that an outsider who sees things differently manages to bring their view of the ground around them close enough to ours to overlap? 

Fortunately, just as evidence allows us to find out how things are, the same applies to communication. We can study, with the methodology of science, which ways of communicating work and which do not, and when we do, we find several surprises, because, of all the territories of truth, the cartography of communication seems to be very much a work in progress. Let's turn to that. 

Some of our positions refer to ideological issues, or values. They are opinions on which we may differ because we have different ways of looking at the world. There are neither truths nor falsehoods here. But in other cases we are in the realm of the factual, and there, not everything is valid. There are certain rules. Sometimes the information exists and the facts are known, but they do not effectively reach all people, which makes it easier for doubts to keep being cast and for post-truth to grow. Intuitively, it may seem that if a person does not accept something as true, it is because they are ignorant of the subject and lack information. This is known as the information deficit model. In factual matters, matters for which we can have evidence and arrive at an approximately objective truth, this model suggests that those who distrust those facts do so because they do not know them. But is this intuitive idea correct? 

If this were true, we would expect that giving them the missing information would be enough to change their position on that issue. What happens in reality is more complicated. Sometimes, when a person does not know something, they are given correct information and incorporate it without much difficulty. This is how we learn many topics. 

But other times, it is not that the person does not know, but that they think they know. They hold an erroneous version of the subject, not an absence of a position on it. And this illusory territory, this erroneous version, once established, is very difficult to correct. 

The tricky questions emerge. On reading the word erroneous, some will think it is too strong. Didn't we say that when we think in terms of evidence, we cannot have absolute certainty? So how right is it to talk about right and wrong? And, on another axis, isn't it aggressive, politically incorrect and even bad manners to say that someone holds an erroneous idea? On this I want to be clear: we may not know, but if we do know, because we are weighing the evidence, then it is important that we defend that truth. Looking the other way, or saying that everyone can have their own opinion on a subject, puts us at risk of falling into relativism and thus allowing unintentional post-truth to emerge and take hold. 

My position is this: 1) when we know the truth, we defend it, being explicit and honest about how confidently we know it; 2) a priori, people deserve respect and have the right to express their ideas; 3) ideas are different: they have to earn my respect, which I do not grant a priori. If an idea refers to factual issues for which there is evidence, but ignores it, it must be challenged. 

By criticizing ideas, separating them from the people who hold them, we put them to the test and allow them to be polished and improved, where improvement means becoming better adjusted to reality. 

So yes, I defend speaking of right vs. wrong in cases where we have so much evidence that we can be practically sure what is right and what is wrong. I defend it also because, when something is not yet very well known, I simply refrain from assigning it those categories: in that case, I would not say that there is a self-evident truth, but that it is not yet very clear what it is. Also, the advantage of this perspective is that, instead of pitting two people against each other, we can make them observe a phenomenon together and try to describe it accurately. Where before there was a zero-sum game,[1: A zero-sum game is a situation in which the gains and losses of the different players balance and add up to zero. If there are two players, one wins and the other loses.] in which one person "won" and the other "lost" by defending a position, we now have a cooperative situation of two subjectivities in search of something that more closely resembles the truth. A single team winning, together, accurately mapping the terrain of what is. 

It is often observed that, if someone already has a position on an issue and that position is wrong, trying to correct it by giving them the correct information does not change their mind. One of the first papers to show this was a 1994 experiment in which people were told that there had been a fire in a warehouse caused by a short circuit that had occurred near a cabinet containing cans of paint.[2: See Johnson, H. M. and Seifert, C. M. (1994). Sources of the continued influence effect: when misinformation in memory affects later inferences, Journal of Experimental Psychology: Learning, Memory, and Cognition, 20(6): 1420-1436.] Shortly thereafter, they were informed that, in fact, the cabinet was empty, thus correcting what they had been told earlier. When evaluated, participants remembered and accepted the correction (they said there were no paint cans), but when asked why they thought there was so much smoke, they said it was because the paint was burning. They responded using the wrong information despite remembering the correct information, possibly because it allowed them to construct a narrative in which the smoke was a consequence of the burning paint: we would rather have incorrect explanations than no explanation at all. 

In other cases, holding a mistaken belief probably has more to do with the passion that this belief awakens in the person. Emotion, values and distrust of the system, elites or experts are important here.[3: We see some aspects of this in Chapter II.] When today, with all the information we have about the Earth being a planet, some argue that it is flat, it is not that they do not know that the "official version" is that the idea of a flat Earth is incorrect. Moreover, they almost certainly learned in school that the Earth is an almost spherical planet that revolves around the Sun like other planets, and so on. However, at some point, they came to believe in a flat Earth, and once they are in that space, it is very difficult for them to get out. Since the narrative they put together is that there is a NASA conspiracy to hide information,[4: More on beliefs in Chapter V.] even if we try to show them the evidence that contradicts their idea, everything will be interpreted as more evidence of a plot. 

And here, I think we have to be careful and distinguish the person from the idea, as before. Perhaps we arrive at this belief, or others like it, as a defensive response to critical personal situations. Perhaps we find in those ideas, and in the new tribe with which we share them, something that comforts us, that gives us a sense of control, of confidence, of being listened to. If we understand that the person who believes in a flat Earth is not some distant other but, in a way, any of us, and that we may have begun to believe this mistaken idea for reasons that have much more to do with emotion than with reason, it will be easier for us to accept that we are not stupid or ignorant, as some may think, but simply victims of our biases. 

When on a factual issue there is available, overwhelming, coherent information that allows us to distinguish between a correct and an incorrect statement, and yet some set that information aside and continue to hold a wrong position, we are in the realm of post-truth. 

The existence of these situations shows us that the information deficit model is not correct or, at least, is not applicable to all cases. We believe that the path we follow is to know the facts and use them to form our opinion. We do not realize that many times we do the opposite: we have an opinion and we accept or reject the facts depending on whether they agree or disagree with that opinion. Opinion based on facts versus facts based on opinion. 

So, there are at least two major situations in which we see that the information deficit model is not necessarily correct: when we have already adopted a position and it is difficult for us to change it, or when, for some reason, the subject triggers emotions in us or touches on beliefs, values or other elements that are not necessarily based on evidence and that constitute the core of our identity. 

When "external" facts threaten these identity cores, both personal and tribal, we will try to defend ourselves without realizing either that we are doing it or how we are doing it. Moreover, if what the facts threaten is one of our most deeply rooted positions, we feel that without it our world collapses: an attack on the position is an attack on us, on our construction, on what we are. It is not easy to get out of this space. Thus, we reject the information that contradicts us and embrace the information that supports us. 

What we see in this behavior is called motivated reasoning,[5: We can see more about this in Chapter V.] a set of cognitive strategies that allow us to reduce the discomfort produced by the existence of evidence that contradicts our beliefs. We select the facts that agree with our position (confirmation bias), we fail to notice that beliefs and emotions are interfering with what we consider a rational position, or we believe we are adopting an unassailable moral position when in fact this is not the case.[6: See Epley, N. and Gilovich, T. (2016). The mechanics of motivated reasoning, Journal of Economic Perspectives, 30(3): 133-140.] 

More information does not necessarily lead to better understanding, let alone a change of position. In a very interesting experiment, Dan Kahan and his team researched this issue in the following way: they presented numbers to participants who had to draw conclusions from them, but in one context those numbers referred to whether a topical cream made a rash better or worse, and in another context the same numbers referred to whether allowing guns in a city increased or decreased crime. The researchers observed that, in the latter case, the responses were distorted according to the political leaning of the participants, showing a type of motivated reasoning.[7: See Kahan, D. et al. (2017). Motivated numeracy and enlightened self-government, Behavioural Public Policy, 1(1): 54-86.]
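The arithmetic trap in this kind of task can be made concrete with a small sketch. The numbers below are illustrative, not the study's exact stimuli: the correct conclusion requires comparing proportions between the two groups, while the tempting shortcut is to compare raw counts.

```python
# Illustrative 2x2 outcome table (hypothetical numbers, in the style of
# Kahan's rash-cream task): did patients who used the cream improve?
used_cream = {"improved": 223, "got_worse": 75}
no_cream = {"improved": 107, "got_worse": 21}

def improvement_rate(group):
    """Fraction of the group that improved."""
    total = group["improved"] + group["got_worse"]
    return group["improved"] / total

rate_cream = improvement_rate(used_cream)    # 223/298 ~ 0.748
rate_no_cream = improvement_rate(no_cream)   # 107/128 ~ 0.836

# The intuitive (wrong) reading compares raw counts: 223 > 107, so the
# cream "worked". The correct reading compares rates: the no-cream group
# improved more often, so these data suggest the cream did not help.
print(f"cream: {rate_cream:.3f}, no cream: {rate_no_cream:.3f}")
```

The point of the experiment was that people can handle this comparison for a skin cream, but the same numbers attached to a politically charged question (gun control) get read through the lens of prior allegiance.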

As Jonathan Haidt, a social psychologist who studies this phenomenon, said, "The reasoning process is more like a lawyer defending a client than a judge or scientist seeking truth."


Sometimes, when we try to communicate with someone who holds an erroneous position, what we actually end up generating is a "rebound effect" (also known as the "backfire effect"), whereby the person becomes more entrenched in their position. People who believe something incorrect become more certain that they are right precisely because someone tried to correct them. And, thus, they end up further from changing their position than before. 

In pioneering work on the subject, researchers gave participants fictitious texts containing a wrong statement by a politician, or that same statement accompanied by a correction, and then asked them questions.[8: See Nyhan, B. and Reifler, J. (2010). When corrections fail: the persistence of political misperceptions, Political Behavior, 32(2): 303-330.] For example, in one of the experiments, the information was that George W. Bush had said it was necessary to go to war with Iraq because it had weapons of mass destruction, and the correction clarified that Iraq did not have these weapons. The researchers found that, often, the correction not only failed to reduce the persistence of the misinformation, but sometimes produced a rebound effect. In particular, conservatives who received the correction were more likely to disregard it and continue to maintain that Iraq had such weapons. 

When information somehow threatens the way we see the world (our ideology, values or beliefs), our minds reinterpret it to strengthen our previous position, even if we are wrong, and so we end up further from realizing our error than we were before. We resist incorporating the information into what we previously thought and, if someone tries to get us to do so, all they accomplish is that we become even more entrenched in our erroneous position. This type of rebound effect is stronger in people who hold more extreme positions, setting them all the more decisively on the wrong path. 

A few years ago, scientists John Cook and Stephan Lewandowsky wrote The Debunking Handbook, in which, based on research, they described different ways in which the rebound effect can arise and proposed simple courses of action to avoid it. There, in addition to the rebound effect mentioned above, which they call the "worldview rebound effect", they point out two more that can arise if, when trying to disprove a myth, certain specific mistakes are made. 

One of them is the "overkill backfire effect". In this case, one tries to refute the myth with a correct explanation that is very complex, extensive and full of details. This overwhelms the other person, who then prefers to stick with the explanation provided by the myth, which is usually simpler in structure. To prevent this from happening, Cook and Lewandowsky propose the "KISS" strategy, for "Keep it simple, stupid!”, i.e., give simple explanations to try to correct misinformation and not "err" on the side of excess. 

Generally, myths succeed as myths because they have seductive narratives that explain some observations in a simple way or agree with people's previous beliefs. When an attempt is made to refute them, the facts sometimes lack the same seductive power, so that, after a while, the details are forgotten and what remains is the explanation provided by the myth, which by then is more present and more entrenched. Cook and Lewandowsky call this the "familiarity backfire effect", and it is something we may all be contributing to every time we try to explain to someone why their position is wrong. 

With all this in mind, the authors of the handbook suggest that, in order to avoid the rebound effect, the refutation should focus on the most relevant facts and not on the myth, so as to prevent the myth from becoming more familiar to people. It should also be taken into account that, by destroying the myth, the person is left without an explanation, however wrong, that allowed them to accommodate their observations and beliefs. This generates a discomfort that, if not resolved, leads the person back to the myth. To avoid this, it is not enough to destroy the myth; it is best to offer an alternative, correct explanation, a new narrative. This helps the person to accept the refutation rather than reject it. 

So, basically any media story titled "The 10 most common myths about (fill in the blank)", listing each myth alongside its refutation, is perhaps helping to spread the myths among people who have not yet heard them and to reinforce them in the minds of those who have.[9: See Peter, C. and Koch, T. (2015). When debunking scientific myths fails (and when it does not): the backfire effect in the context of journalistic coverage and immediate judgements as prevention strategy, Science Communication, 38(1): 3-25.] Remember that myths endure because they are generally attractive in themselves, so doing this would basically be communicating misinformation rather than information.


One of the characteristics of science is its counterintuitive nature. The science of science communication is no exception, and it shows us that some ways of communicating are more effective than others. At worst, it tells us which ones are not effective, but even that is valuable information, because it allows us to avoid mistakes and to avoid causing, through our action, something worse than our inaction would. Sometimes, neither good will nor intuition is enough. Not only are they not enough; they could lead us to cause more damage. 

Experiments can be performed to identify which types of messages work better and which work worse. Beyond the rebound effect discussed above, what does the evidence say about communication on issues inundated with post-truth, such as the myths that vaccines cause autism or that anthropogenic climate change does not exist? 

If we are communicating on health-related issues, such as vaccine safety, we have a huge responsibility to try to get our message across and get it accepted, and to avoid creating a rebound effect. In this context, an evidence-based approach becomes essential, not a luxury. What does seem to work when researching the effectiveness of different messages? 

In the case of vaccines, it appears effective to make the risk of the disease more present, as shown in a study which found that images of sick children work to make people change their attitude towards vaccines, while an informative message refuting the connection between autism and vaccination does not.[10: See Horne, Z. et al. (2015). Countering antivaccination attitudes, Proceedings of the National Academy of Sciences of the United States of America, 112(33): 10321-10324.] Parents who do not vaccinate their children believe that in this way they avoid the supposed risk of "generating" autism in them, but do not see that they are choosing, at the same time, the risk that the children will become ill with the disease the vaccine prevents. Even for a person who thinks that vaccines can cause autism, the two options are still vaccination or no vaccination, and each carries its consequences.

And here the cognitive biases reappear. When we act, we feel that what happens afterwards had to do with our action. When we do not act, we feel that what happens afterwards has nothing to do with our inaction. In both cases, we can be wrong. One of the reasons the myth linking vaccines and autism "caught on" so strongly in some people is that autism is usually diagnosed around the same time that children receive vaccines, in early childhood. Here, it is not true that the action of vaccinating "caused" autism: the supposed causal relationship between the action and what happens afterwards is incorrect. Conversely, if someone does not vaccinate a child, and the child then gets sick with something vaccination could have prevented, the person does not see that it was the inaction, not vaccinating, that created the risk of the disease, and attributes it to fate or chance. Another mistake, because here there is a causal relationship between the inaction and what happens afterwards. Since vaccines prevent us from getting sick, it is not obvious to us how many lives they save.

This is also a problem for vaccines and for preventive medicine in general, which is always at a disadvantage compared with medicine that treats established diseases. A "miraculous" cure of someone who is sick is always more striking than not getting sick in the first place.[11: See more on this in Chapter XI.]

Another message that seems to have effectively counteracted negative attitudes toward vaccines was not to discuss individual pieces of evidence, which may require a person to understand some of the subtleties of science and medicine, but to talk about the scientific consensus. One investigation showed that it worked very well to say that 90% of scientists agree that vaccines are safe and that all parents should vaccinate their children.[12: See Van der Linden, S. L., Clarke, C. E. and Maibach, E. W. (2015). Highlighting consensus among medical scientists increases public support for vaccines: evidence from a randomized experiment, BMC Public Health, 15: 1207.]

It also seems to have been effective to appeal to empathy and emotion by explaining that vaccinating the population protects the most vulnerable, who may not be able to be vaccinated themselves.[13: See Betsch, C. et al. (2017). On the benefits of explaining herd immunity, Nature Human Behaviour, 1(3).]

The evidence-based communication approach was also investigated in relation to anthropogenic climate change. Dan Kahan believes that, in an issue so polarized along political-partisan lines, "we need science communication strategies that recognize the best available evidence that is in turn compatible with membership of different cultural groups." He argues that trying to strip the issues of their "identity signs," trying to depolarize them, could help, although he does not give specific suggestions. 

As with vaccines, emphasizing the enormous scientific consensus around anthropogenic climate change does seem to work: 97% of experts agree that anthropogenic climate change is a fact.[14: See Lewandowsky, S., Gignac, G. E. and Vaughan, S. (2013). The pivotal role of perceived scientific consensus in acceptance of science, Nature Climate Change, 3: 399-404.] Even talking about the consensus would be important to "vaccinate" people who have not yet had contact with the disinformation: if the consensus is emphasized first, then, when the myth that anthropogenic climate change does not exist arrives, it is more easily rejected than if it had arrived before the person had any awareness of the consensus.[15: See Van der Linden, S. et al. (2017). Inoculating the public against misinformation about climate change, Global Challenges, 1(2).]

Not everyone is so convinced by the evidence provided by experiments in effective communication, and for several reasons. To begin with, the rebound effect described in some cases, which we discussed above, does not always appear. There are even those who have suggested recently that the rebound effect may hardly exist at all.[16: See Wood, T. and Porter, E. (2018). The elusive backfire effect: mass attitudes' steadfast factual adherence, Political Behavior, January.] Possibly, the rebound effect appears mainly in people whose position is already so strong that they will not change it. These people are somewhat "radicalized", but they are very few. In vaccination, for example, it is estimated that there are about ten vaccine doubters for every anti-vaccine extremist. 

In this resistance to information that contradicts our beliefs, and that could even create a rebound effect, some researchers have observed something that offers a glimmer of hope. If we are repeatedly exposed to the correct information, the growing discomfort it causes reaches a kind of "breaking point", personal to each of us, at which we begin to reconsider our own position and sometimes change it.[17: See Redlawsk, D. P., Civettini, A. J. and Emmerson, K. M. (2010). The affective tipping point: do motivated reasoners ever get it?, Political Psychology, 31(4): 563-593.] 

In evidence-based communication, we still have a long way to go. We are still in a "pre-consensus" situation, and therefore, we see stumbles and some twists and turns. This, rather than a weakness, shows the strength of the mechanism of science itself, which always reviews what has gone before and puts it to the test. We must continue. 


If we are uncomfortable at not having clear answers, and feel tempted to assume that others adopt wrong positions because they do not know enough and that our explanation will give them the information they need, we are in trouble. At bottom, it is the discomfort of lacking a clear and simple explanation that makes us prefer a clear, simple and perhaps also wrong alternative, such as the one that assumes that people simply do not know and just need to be informed. 

A bit of introspection here to recognize the traps, the same old traps... Perhaps we prefer the myth of the information deficit model, and ignore the solid evidence that it does not work simply because of what we said before: myths have particularly seductive narratives. Meta-myth on the attack, again.

So, evidence-based communication has given us some answers. They are neither complete nor definitive, but neither is it true that we know nothing. Why, despite the fact that there is evidence, even if it is limited and not conclusive, do we still see messages that presuppose that people lack information? What is going wrong? Why does the information deficit model look like a zombie that never dies? 

In practice, the communication strategies being pursued do not seem to be changing much. States do not modify them when communicating their public policies; scientists and science communicators (for the most part) do not take the evidence into account or change the way they approach the issues. Neither do doctors with their patients, nor teachers with their students. Nothing. But just as it was important to obtain the evidence that smoking causes lung cancer, it is also important to find out how best to communicate it, in order to get fewer people to start smoking and more smokers to quit.

Still, many communicators, in many different fields, continue to assume that the audience does not know, and explain the evidence in detail, always in the same format and the same tone. This is fine for topics that do not "generate" post-truth, that is, those that are not taken as an identity sign, do not contradict people's beliefs, and do not arouse negative emotions. But when we face polarizing issues, and interact with people who may be subject to that very polarization, shouldn't we try to use the evidence on effective communication that we have, now that we know that the traditional approach, at the very least, often does not work?

If, in spite of everything, the communicator does not change the message, then, unfortunately, signals will be sent to their tribe to gather the "good guys" (those who already agree) and drive away the "bad guys". And then we are surprised that these people feel left out of the conversation, that they believe no one listens to them, that they define themselves by their distance from the enlightened elites who "know it all". Sometimes communicators even make fun of these people or talk about them in a derogatory way, which makes their followers rebroadcast the signal, amplified, on social networks, in blog comments or in conversation generally. 
For example, on the subject of vaccines, a frequent mistake is to stereotype a person who only has some doubts as if they were an anti-vaccine fanatic. If we treat a doubter as an extremist, not only will the communication not work, but we run the risk of making them become one. All this is the opposite of communicating, it is as if the other person does not matter. It is to deepen the rift and further alienate those we want to be close to. It is tribalism, and among supposedly "rational" people. 

Some scientists identify several reasons why the information deficit model is still present, such as the training of scientists, who communicate the way they learned and as best they can (again, intuition and tradition) and do not usually have specific training in communication.[18: See Simis, M. J. et al. (2016). The lure of rationality: why does the deficit model persist in science communication?, Public Understanding of Science, 25(4): 400-414.] Moreover, communication occurs in a context of institutional traditions that establish a culture that is difficult to change. In particular, that study observed that the scientists who least recognize the value of the social sciences would be the most likely to base their communication on the information deficit model. 

It may be that what research is finding out about effective communication has not yet reached these interlocutors, society, and is still kept within the academic world. After all, you don't change cultures overnight. It may also be that this information is getting through, but it is not being accepted and the usual, simpler explanations are preferred. 

Wait a minute, what does this remind us of? Is information not enough to change positions? Could those who communicate be setting science aside to continue relying on beliefs, intuitions and traditions?

Likewise, the evidence-based communication approach is relatively recent, with the most relevant advances occurring in the last decade. 


As we have said, in the case of medicine, we speak of evidence-based medicine, but that does not mean that evidence that a treatment or a drug works is enough. Medical practice takes that as a basis, but adds to it the physician's experience, the context of what is happening, the patient's values and particular situation, and so on. Perhaps it would be more appropriate to speak of evidence-influenced, rather than evidence-based, medicine. We can draw a parallel between this and communication, analogously to what we did in the previous chapter for public policy: an evidence foundation and, on top, layers of another category. 

In the Rhetoric, Aristotle discusses the art of persuasion, which for him rests on three pillars: logos, ethos and pathos. Logos is knowledge, what we have been calling evidence. Ethos is character and values: a person with a good ethos, who appears irreproachable, has more credibility. Finally, pathos seeks to convince through emotion. With these three axes, according to Aristotle, it was possible to make people change their positions. 

In these post-truth times, it would seem that logos does not matter much, and persuasion is achieved only with ethos (often apparent rather than real) and pathos. It is pathos that explains why a good emotional narrative, whether or not it reflects real facts, is sometimes enough to persuade. 

For example, the myth of the relationship between vaccines and autism persists through very emotional stories (pathos) and also because it continues to be promoted by some people who appear to be influential and who are sometimes even physicians (ethos): they look like reliable sources, even though they are not. 

When you try to refute these myths using only logos and, at most, ethos, you don't get very far: "this is true, and you will accept it because my information is good and I am an expert". The facts do not speak for themselves. They are not self-evident. For this reason, some communicators try to include a narrative that adds pathos. 

We also need to understand the starting point of the audience and, if possible, of each person in the audience separately. Aristotle had this in mind, both as part of ethos and of pathos, under what we would today call empathy. But to emphasize this aspect, we could speak of "empathos", a word invented to stress the importance of knowing the point of view from which the other observes and understands the world. Thus, we can build a path attentive to the perspective of the other: a bridge built by knowing where the end point lies, the territory the other currently inhabits. 

With what we know so far, we can say that there is no such thing as a "perfect message" that will work with everyone. What does exist, clearly, are some "imperfect messages" that we know do not work with some people and, to a lesser extent, "perfect messages" for some particular people in particular contexts and with particular issues. We need effective communication to be sustained by the logos (otherwise it would be dishonest manipulation) and to include the other three components as well. 

In particular, emotion is very powerful, both the possibility of expressing our own emotion and that of connecting with the emotions of others. If the person who tells us a fact does it in a passionate, convinced and kind way, they will inspire more confidence than if the same fact is told to us by an apathetic or unpleasant person. If what they say is wrapped in a narrative that appeals to emotions, that helps too. It is probably true that what is often lacking is not information, but emotion. Of course, those who hold wrong positions have no shortage of emotion either, so many communicators see appealing to emotion as a kind of "shooting themselves in the foot": something that, even if effective in the short term, may in the long run generate greater distrust in the evidence. What is the right thing to do? What approach should we follow? We do not know; at present, it is simply not clear. 

In political issues, where post-truth seems to be raging, the focus is often on motivated reasoning, and it does not matter so much whether the information is correct or not. This implies, then, that trying to correct a person's information does not have much effect on their behavior.19 See Flynn, D. J., Nyhan, B. and Reifler, J. (2017). The nature and origins of misperceptions: understanding false and unsupported beliefs about politics, Advances in Political Psychology, 38(51): 127-150. 

In itself, polarization is not necessarily a problem. Perhaps it reflects positions that cannot be reconciled. It is different when that polarization arises not so much from ideological differences of substance, but from the inability of the extremes to inhabit a shared reality, an inability that comes from post-truth. 

Communication has a better chance of being effective if it is done with kindness and respect for the other person, even if we do not respect the idea that person holds. This also helps the other person not feel threatened and become defensive. 

Manners matter. Ideas should be attacked, not people, as we said at the beginning. Attacking ideas is a way to strengthen the good ones and thereby distinguish them from the bad ones: we are all on the same team, and that shared team is strengthened by the generation of powerful ideas. If we treat people as if they were stupid, we are feeding the myth that they are. It is a myth of ours, and one we have to try to destroy. 

Once we decide to be nice, what do we say? We know very little about how to get a person to accept facts that contradict their position. But we know even less about how to encourage that, once the new information is accepted, it is effectively used to change the attitude. It is one thing to accept that vaccines do not cause autism; it is quite another to have one's children vaccinated. It is one thing to recognize that one's "own" politician lied, and quite another to decide to stop voting for him or her. One accepts the lie, the incompetence or whatever, but that does not change the emotional issues that make us continue to prefer that politician. It is one thing to know which habits are healthy and which behaviors are risky, and quite another to use that knowledge to change our own behavior. 

At least in medicine, things do not seem to be going very well in this regard: as we mentioned in the previous chapter, most of the time people's behavior is not modified by the information they have.20 See Marteau, T. M. (2018). Changing minds about changing behaviour, The Lancet, 391(10116): 116-117. Logos is not enough. On the other hand, many alternative medical practices, whose effectiveness is unknown or which are known to be ineffective, continue to be very popular among many people. Why? One reason is that these people feel rejected or not listened to by doctors; they feel dehumanized by a system that does not take the time to know and understand them. What alternative practices often offer, even if they are not treatments that work medically, is the possibility of a bond between patient and practitioner. The patient is listened to and feels that someone is taking their pain and point of view into account. From then on, the fact that the supposed therapy is not based on evidence becomes information that does not modify a position generated by entirely different components: emotion, attention, empathy. Effective communication, for sure. 


In Inventing the Enemy, Umberto Eco mentions this counter-intuitive way in which we construct what we believe in: "Once, in a monastery on Mount Athos, talking to a monk-librarian, I discovered he had been a student of Roland Barthes in Paris and had taken part in the demonstrations of 1968—and therefore, knowing him to be a man of culture, I asked whether he believed in the authenticity of the holy relics he kissed devotedly, each morning at dawn, during a magnificent and interminable religious ceremony. He smiled kindly, with a certain malicious complicity, and said that the problem was not one of authenticity but of faith, and that when he kissed the relics he sensed their mystic aroma. In short, it is not the relic that makes faith, but faith that makes the relic." 

Perhaps this is where we find the most mythical myth of all: we believe that people base their positions on facts, as if the path were facts first and a position on the issue later. Reality seems to be quite different: it's as if we first arrive at our positions through "irrational" paths (values, beliefs, emotions), and then our motivated reasoning helps us protect them from facts that might threaten them. As we said before: opinion based on facts versus facts based on opinion. Even if a person eventually accepts the information provided to them, unless their values, beliefs or emotions also change, not much else will happen. They may not change their behavior despite updating their position. 

Things look more complicated than they seemed at first, don't they? And it gets worse. We have to ask difficult questions, but better to ask them and have no answer than to sweep them under the rug. For example, we tend to assume that education will save us, and quite possibly in most cases it will. However, it clearly finds a limit in situations where, for whatever reason, we dismiss information if it does not agree with our previous positions. Should we supplement traditional education with this approach to effective communication? 

To communicate better with others, we need a little introspection to understand our motivations and biases, a little empathy for others, and a lot of information and critical thinking. 

Here is a Pocket Survival Guide that can help us communicate better with others in these post-truth times: 



Do we distinguish well which aspects of a position are factual –where there may be truth– and which aspects are not? 
Are we motivated by the truth, and are we willing, upon joining the conversation, to change our factual position if presented with evidence that contradicts it? 
Are we motivated to develop a meaningful bond with the other person? Are we treating them with empathy and respect, even if we do not respect their ideas? 
Do we know the other person well? What ideas do they hold? What motivates them? What are their beliefs? How do they see the world? What can we learn from their perspective that we don't have on our own? 
Are we reasoning in a motivated way? What are our biases and emotions? 
Can we contextualize the issue so that it is not so threatening to the other person? Can we depolarize it, take away its identity marks, recognize ourselves as belonging to the same tribe as the other person? 
What is the best available evidence on effective communication in the topic and context we are dealing with? Are we willing to take it into account? Do we want to talk at or talk with someone? 
Do we try to adapt our communication to the other person (empathos)? 
Do we know the facts (logos), do we have credibility with others (ethos), and do we want to add pathos to our communication? 
Are we careful not to generate a rebound effect in the other person by focusing on facts and not on myths, using the "KISS" strategy and not overwhelming them with too much information? 
Do we care about "educating" the other person, i.e. getting them to accept the correct information, or changing their position and behavior, i.e. persuading them? Are we willing to be that "other person"? 

Post-truth is not an abstract discussion about the nature of reality. Once its practical consequences are perceived, allowing it to grow can be seen as a public health problem. It is about avoiding post-truth in order to survive, and it is at that point that we must ask ourselves whether we would rather be right or win, remembering that "winning" is defined as getting closer to the truth together, and includes all of us. 

In this chapter, we addressed one of the most complicated aspects of the fight against post-truth: how to communicate with each other when some accept the truth and others push it aside, for whatever reason. We started with a simple question about whether or not it works to "try to educate" someone who rejects the truth, assuming they do so because they don't know about it, and moved to the question of what does work to communicate with that person. And then we saw that, at least for now, we don't quite know. This "we don't know" is what we see when we look at the best available evidence on communication and recognize that we are not yet in a situation of clear consensus. In other words, we look at what communication science tells us, and what we notice is that it does not yet give us a conclusive answer. Along the way, we noted how easy it is for all of us to prefer a myth with a simple structure and seductive content to a confusing and complex reality. 

We do not yet have an answer as to what is effective communication for each situation, but we do know that basing ourselves on evidence when trying to communicate will surely give us answers little by little. We do not offer definitive answers, but we do offer this proposal: we should continue researching communication and make decisions based on the best available evidence. 

And we have not yet addressed a truly complicated issue: it is extremely exhausting to constantly think about how to get to know the other person in order to understand them better, and also to check whether we ourselves are making mistakes, whether we have the right information and understand it well. It is difficult, and we have to deal with it while trying to live our lives in the midst of all our worries and joys. But, as always, we need to ask ourselves what the alternatives are. Not to deal with this is to surrender control to others, to keep sinking gradually into something that seems true but is not, to allow ourselves to remain more or less unmoved by what surrounds us, as long as it does not bother us too much, as long as it resembles our expectations of reality even if it does not adequately reflect it. 

"All that is necessary for evil to triumph in the world is for good people to do nothing", the saying goes. Struggling to find a truly common ground can be exhausting, but we can’t afford not to.