Part of our sense of identity arises from our social identity, based on the social groups to which we feel we "belong". Social identity makes us, usually unconsciously, favor people who hold ideas that we identify as those of our "tribes" and avoid those who do not.
This idea, proposed in 1979 by Henri Tajfel and John Turner to explain behavior among groups, is made up of three elements: categorization, identification and comparison. Firstly, we categorize ourselves, we "separate" ourselves according to different criteria: social class, religion, nationality, gender, profession, the neighborhood we live in, the sports team we support, the party we usually vote for, the operating system we choose for our computer. We may even separate ourselves by what we reject, united not for something but against it: the "anti-s". Thus, we generate stereotypes, caricatures of exaggerated traits, and we attribute their characteristics to the whole group, glossing over the differences and personal nuances of its individuals: "immigrants are lazy", or "immigrants are the driving force of development", according to the stereotyper's taste.
Secondly, we identify as belonging to a group, based on whatever criteria. An "us" and, perhaps even more importantly, a "them" is created. Where there is a group, there is a boundary, and on the other side of it is the other group. If this were not so, if there were no limits, we would be an "all" and there would be no group or group identity.
Finally, we compare our group with the others. In this comparison, we assign positive values to our group and highlight them, while we look for negative qualities in the others. Our sports team is the best, our language is the best, our country is the best. Our group being the best reaffirms our self-esteem. Since we belong to that group, which is "good", and others are excluded, we are special for belonging to it. [1: Some strategies for communicating effectively between tribes are discussed in Chapter XV.]
When our social identity in the group is strong, there is tribalism. Based on our belonging to one group or another, we generate tribal behavior, which is acted out in various ways. We protect our sense of belonging and our group.
What comes next may seem a bit strange. At least in my case, it took me a long time to accept that I am not as "free-thinking" as I thought, that tribal behavior is not something that only happens to others, but that it is also in me. When I became aware of my tribal behavior, I was able to identify my various tribes. Or, at least, I could say with some confidence that I identified several, because quite possibly there are others that eluded my radar. Developing introspection is essential to see these things.
How is tribal behavior related to the unintentional creation of post-truth? Mainly in this way: we often think that, in order to curb post-truth, we just have to look for the truth amidst the sea of misinformation or irrelevant information. But this is easier said than done. We have already discussed how we always interpret facts and information within the framework of what we already believe. What we have not discussed is to what extent that belief also defines the tribe with which we identify: when the issue in dispute is associated with the identity of our tribe, although we believe that what matters to us is to identify what's true, it is quite possible that we are unconsciously prioritizing what our tribe considers to be true and failing to challenge it.
If a "threat" towards our group emerges, we protect it, we remain loyal to it and, quite often, we go out of our way to defend its ideas tooth and nail, without reflecting too much on their value. That threat comes from others, who are, a priori, wrong. If they are right, it is because I am wrong, or worse, I am not wrong, we are wrong. My us, which sustains me. The truth becomes a threat to my tribe. One against which we immediately seek to defend ourselves.
Since we want to preserve the social network that sustains us and the ties that bind us to the members of our group, this truth can threaten even the continuity of our membership in that group: sometimes, accepting it means putting that membership in jeopardy.
Few things are more difficult for social animals like us than to leave a tribe: to leave a religion, to stop supporting a certain political figure that until then we felt represented us, to change our stance on "difficult" issues, the kind that usually generates a sense of belonging.
This is a good time to try to identify those "difficult" issues that give us a sense of belonging. To do this, we have to look at ourselves and also accept that what is very relevant for us may not be so for others.
Leaving our tribe has an emotional cost, and sometimes other kinds of costs which can be very significant when it comes to our bonds. For this reason, we often prioritize "behaving" in front of our tribes even if it means continuing to be wrong.
On issues that trigger a strong emotional response, which are the most vulnerable to post-truth, we need to tread carefully. If any of the positions that define our tribe relates to a factual issue, it is crucial to let the data guide us. This helps, but sometimes it is not enough, because once we have the evidence, we must be able to accept results that do not align with our position or that of our tribe, which is extremely difficult.
Moreover, it may be more acceptable for us to see that we belong to groups with lofty ideals and purposes, but groups often form around any criteria, even the most trivial ones, such as tastes in music, sports or food. Recognizing this is important, because we may believe that an unbridgeable gulf separates us from "the others", all the while being in a group for reasons that are not as fundamental as they seem at first glance. When we argue with members of other tribes, much of the distance we perceive may not be based on discordant points of view, but on justifications we make for ourselves and for others in order to reaffirm our loyalty to our group.
The criteria by which we separate into groups can be so trivial that this has been observed to happen even... if groups are formed at random! In a study in which people were split into two groups by flipping a coin, social identities of "us vs. them" and "we are better than them" emerged. [2: See Tajfel, H. et al. (1971). Social categorization and intergroup behaviour, European Journal of Social Psychology, 1(2): 149-178.] Even as part of randomly defined groups, participants generated narratives that justified their group being better than the rest.
Personally, I admit that the idea of the "human family" is very important to me: we are all part of one big family and perhaps, if we tried to erase the borders that split us into groups, enhance respect for each other, and get to know and understand each other more, our well-being would increase. This is one of my irrational beliefs. I don't see many of the "boundaries between groups" that others see. Of course, this idea is not shared by everyone, but only by a fairly small group. This implies that one of my many tribes is not that of all human beings, as I would like it to be, but of those of us who think that the distinction among groups is not so useful. I am in the group of those who believe that there should not be groups, but understand that it is inevitable that they exist.
All this happens unconsciously, and it is one more feature of how our minds work. It is not an easy thing to accept. Often, the idea of being part of tribes can make us feel like sheep in a flock. This has also been investigated: the need to see ourselves as individuals often conflicts with the need to be part of groups. There is a permanent tension between both aspects, but, equally, since we are all part of groups, the very idea of "not belonging to the flock" is often also an identity idea that brings people together.
Generally, our social behavior is neither purely intergroup nor purely interpersonal, but lies somewhere in between. But in a high-conflict situation between tribes, such as two armies at war, two rival groups of sports fans or a polarized debate in Congress, something different can be observed: behavior is so strongly conditioned by the group to which the parties belong that it is almost unaffected by the individual relationship between those parties. In these cases, behavior is less interpersonal. The greater the conflict, the more tribal the behavior and the smaller the awareness that, on the other side, there are people not so different from us regarding matters we consider essential. Or, even more importantly, that on the other side there are ideas or truths worth considering that challenge what we and our tribe believe.
Let’s pause for a minute and try to find some specific example of this in our daily life. Of course, some will really pause here to reflect and some won't, and the reflective tribe is clearly superior to the non-reflective one because it is my tribe. Now wait a minute...
Perhaps for this reason, many dictatorships or corporatist governments perpetuate conflict: it increases group cohesion, silences dissenting voices and promotes the surrender of individual needs in favor of the real or imaginary needs of the group. The others first become a formless mass to which we assign characteristics that make them seem less human than ourselves. At the most extreme, this can lead to racism and even genocide of one group at the hands of another, based on criteria such as ethnicity or religion.
Possibly, many thousands of years ago, when we lived in small communities that were sometimes no larger than the nuclear family, it was easier to identify the members of our group and for them to identify us. This would even seem to be etched in our minds, in the so-called Dunbar's number. Many years ago, primatologists noticed that non-human primates tend to maintain very strong social contact with their ingroup but, more interestingly, that the number of individuals in the ingroup correlated strongly with the volume of the neocortex in each species. Robin Dunbar hypothesized that, if this were applicable to humans, given the volume of our neocortex, we should have a significant social group of approximately 150 individuals. This number recurs in the most primitive human social organizations, from the maximum number of nomads in a group of hunter-gatherers to the size of Roman military units or even the maximum number of academics in the subspecialty of a given field. Some studies find it again in the number of meaningful interactions on social networks or in job searches.
But today we are no longer organized in groups of 150, and our evolutionary history has seen the emergence of different forms of identification that go beyond that number; today, we join together and find "our kind" in much larger groups. Today, we are many and we live intermingled in very complex societies. We still need to gather, but we lack obvious or "natural" groups. In our current societies, we have options unknown to the Homo sapiens of a mere thousand years ago, who died where they were born, ate what their family had always eaten, continued their parents’ work, and married at most someone from the neighboring village. How do we "meet" the members of our groups today?
Why does someone wear the jersey of his or her sports team? The Star Wars or Che T-shirt? A crucifix? A branded backpack? The latest book by a fashionable philosopher? This book? We may not always be aware of it, but we are constantly signaling the tribes to which we belong: we stick stickers on our laptops, we wear certain clothes, we alter our bodies with tattoos, piercings, exercise, diets.
The signs are not necessarily material, or marks on the body. They can also be ideas (or memes or kitten videos, whatever) that we display for others, for example, on social networks. Yes, we do this for ourselves, of course, because these are all things we like or that stand for something very important to us, and we want to share them with others. But we also do it, not necessarily consciously, so that others recognize us as members of the same group, or of a different one. Sometimes, too, we mean to show other tribes that ours is a large, powerful and determined tribe that is struggling for recognition.
It is inescapable: not doing this (not dressing fashionably, not having the sneakers everyone else has, not broadcasting something on social networks, etc.) also sends tribal signals. These say "I don't belong to those groups; I am among those who don't belong to those groups; look at me, people from my group".
Although sending tribal signals is nothing new, the ease with which we can broadcast those signals far away and to many people is relatively recent. Through social networks, blogs or forums, the Internet is now a megaphone through which each of us can shout. An idea can cross continents instantly, and if it "goes viral", it will reach many more people than we could have imagined just a few decades ago.
Why would we be interested in sending signals to our tribe? What advantage does this give us? If we appear to be "good members", the tribe accepts and favors us, which increases our self-esteem. When what we share are ideas or points of view, the tribe will believe us to be intelligent (because we say what it accepts as "correct"), and we will likewise regard its members as intelligent people (since they recognize that what we say is "correct").
In the world of ideas, one of the spheres in which we manifest our identity is politics, and here too, and perhaps more than anywhere else, we construct a social identity for ourselves. If we want to fight post-truth, we need to pay attention to this.
Do we want a large and protective State, or a smaller and more flexible State? Do we favor immigration as a way to enrich our culture, or do we close borders to protect it? Are we better off recovering traditions, or should we be forward-looking and embrace progress? Should certain religious values be further incorporated into the State, or should we separate Church and State even further? Are we citizens of our city, our country, or the world?
We certainly identify more with some of these stances and less with others. We all have some kind of political identity, in the original sense of the word. Some feel a strong identity, act accordingly and campaign for their ideas. Sometimes an identity is more embodied, more a "part of us", and sometimes it is less relevant to our idea of who we are. But it is there, and it has a strong social component.
When we think about our tribal behavior in the context of politics, some additional "dangers" arise: is our social identity affecting public discourse? To what extent are our political decisions due to substantive ideological positions, and to what extent to tribal loyalties? Do we care more about who is saying something than what they are saying?
Faced with these types of questions, we tend to believe that others are more subject to their social identity than we are. Just like when we discussed irrational beliefs, [3: See Chapter V.] we tend to consider that the beliefs of others are irrational, while ours are not. When we think about mistaken thinking, it seems to us that other people's reasoning is flawed, but not ours. We all believe that our positions are objective, realistic and based on evidence, while those of others are not (unless, of course, they happen to coincide). "I'm right side up, you're upside down". This relates to an idea that cognitive psychologists Hugo Mercier and Dan Sperber called the argumentative theory of reasoning, according to which the evolutionary advantage of reasoning is not so much to enable us to attain knowledge and make better decisions, but to give us tools to justify ourselves and convince others. Our way of reasoning encourages us to find errors in the way others reason, while helping us to hide our own.
In politics, we do not talk enough about how much we are influenced by the tribal context, and so the same public policy measure may seem appropriate if it is proposed by the political party with which we identify and inappropriate if it is proposed by another.
Given that all these factors influence who we vote for, what information we take into account or, directly, what we identify as true or false, what may be at stake is nothing more and nothing less than democratic life itself.
In the United States, racial tensions continue unabated. In many developed countries, immigration is seen as a threat. Since tribalism has always been with us, could it be that globalization, by blurring "boundaries", is encouraging it? Could it be that the more the world becomes one and multiculturalism surrounds us, the more our minds tend to reinforce our bonds within small groups? How much is globalization influencing the resurgence of nationalism witnessed today in some European countries? According to The Economist, identity politics was generally the domain of the left, but nowadays the right is taking that rhetoric and adapting it to its own style: in Europe, a young, generally nationalistic, right-wing activism is emerging, which, moreover, is often strengthened by establishing collaborative ties between different countries. The ideas it puts forward are those of protecting a certain cultural identity (tradition, language, belonging, etc.), sometimes by proposing to close borders to immigration, to confront the European Union or to exclude Islam. Communicatively, these movements often send extremist messages that provoke strong emotional responses, typical of post-truth. The rejection that this style triggers in some only succeeds in victimizing these movements and giving them more exposure. This incendiary and categorical communication finds an echo in social networks, through which the messages spread at great speed: [4: More on this in Chapter IX.] again, signals for the tribe members to meet, identify each other and grow in numbers.
Research is being done on how our social identity in terms of partisan affinities influences the way we analyze information: which data we highlight and which we "sweep under the carpet" --with "Occam's broom", as defined by Sydney Brenner--, or how we are affected by what someone we recognize as a member of our group says about a certain political figure.
Let's think about every politically partisan meme we receive or share through social networks: how much of it is a real exchange of information, verified and reliable, and how much is it a way to send a signal to our own tribe?
By this I do not mean that there is anything per se wrong with exchanging tribal signals. There isn't. What I do find interesting is that we can realize whether or not we are doing it and, if we are doing it, whether or not it is what we mean to do.
Regarding partisan identity, psychologist Jonathan Haidt discusses the importance of morality in our political identification. He argues that not only do we consider our political party to be right and "the other" wrong, but we also believe that members of “the other” party are dangerous and morally suspect people. Within our tribe, our bonds are also sustained by morality, and somehow we need, then, to consider "the other" to be people we would not trust to make morally sound decisions. Of course, in the other tribe the same happens, but in the opposite direction. The problem is that, if the game is played in the realm of morality, or worse, the appearance of morality, we put information, facts, aside and fall squarely into post-truth.
We may believe that we are capable of realizing that our tribe is wrong and that we can change our minds, but that is not the case. In fact, we resist. How come? To understand how we manage to ignore or counteract ideas that contradict our beliefs, researchers have looked into our brain activity. For this purpose, in a series of experiments, participants were presented with arguments that provoked low or high resistance, and the brain circuits that were triggered were identified. [5: See Kaplan, J., Gimbel, S. and Harris, S. (2016). Neural correlates of maintaining one's political beliefs in the face of counterevidence, Scientific Reports, no. 6.] Among them, a region called the amygdala, which plays a part in emotions, particularly fear, seems to be relevant: information that somehow threatens our identity, our tribal belonging, literally generates a fight-or-flight response. Again, negative emotions make us particularly vulnerable to post-truth. [6: More on this in Chapter V.]
I wish we were not culturally split between "science" people and "humanities" people (more tribes and more sub-tribes, all separated), as if someone who is interested in scientific subjects is automatically perceived as uninterested in humanistic subjects, and vice versa. In this split, we lose all that the other tribe has to offer us. In the research above, we see how a cognitive-psychology approach complements the study of our social behavior as citizens, which is in turn explained by specific brain mechanisms. Humanities and science intermingle, both in the set of topics ("what") and in the research approaches ("how"). The problems to be solved in the world do not come in little boxes with the label of a single field of knowledge on the outside.
In the world of fact-checking a very interesting phenomenon occurs. When there is clear evidence that a politician of party A lied, those who actively spread that information are those of party B. This is to be expected, because it is information that allows the tribe to justify its position. But look at what party A members do: they do not share the information, nor do they discuss it. They ignore it because, if they were to accept it, it would conflict with their political identity. The skeptical look is more powerful outside the tribe than inside. In a way, "this shouldn't be true, but it seems to be true, so for me it doesn't exist".
Another moment to invite introspection.
There are some very interesting experiments that show this in action. Suppose we want candidate A to win an election. If candidate B were to win, we would be disappointed. Intuitively, if we bet money on B winning, that might somehow compensate for the disappointment of A losing (this is called hedging in financial markets). What actually happens? In such a situation, people prefer not to bet at all. [7: See Morewedge, C., Tang, S. and Larrick, R. (2016). Betting your favorite to win: costly reluctance to hedge desired outcomes, Management Science, 64(3): 997-1014.] They forgo the possibility of winning money, since winning it at the expense of their tribe losing would be interpreted as a betrayal. They prefer not to jeopardize tribal loyalty by sending signals inconsistent with those accepted by the group.
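The logic of the hedge that participants rejected can be made concrete with a toy calculation. The stakes and the even-odds assumption below are invented purely for illustration; they are not taken from the study:

```python
# Toy illustration of hedging an election outcome with a side bet.
# All numbers are hypothetical, chosen only to show the logic.

def payoff(b_wins: bool, bet_on_b: float) -> float:
    """Monetary outcome given whether candidate B wins and how much
    we bet on B (even odds assumed: a winning bet pays the stake,
    a losing bet forfeits it)."""
    return bet_on_b if b_wins else -bet_on_b

# Without a hedge, a B victory costs us nothing in money but hurts
# emotionally. With a $10 hedge on B, the bad outcome (B wins) at
# least pays $10, while the good outcome (A wins) costs $10.
for bet in (0, 10):
    print(f"bet={bet}: A wins -> {payoff(False, bet)}, B wins -> {payoff(True, bet)}")
```

Monetarily, the hedge smooths the two outcomes, trading a little of the good scenario for compensation in the bad one; what the experiment suggests is that the social cost of visibly betting against one's own tribe outweighs that compensation.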
Something similar is also at play in this scenario: a group of people are given the option of reading an article that agrees with their previous stance on same-sex marriage, or reading an article that takes the opposite stance. In the first case, they are offered $7, while in the second they would earn $10. Nearly two-thirds of the participants chose the first option, even though it meant less money. They preferred not to expose themselves to a position contrary to their own. [8: See Frimer, J., Motyl, M. and Skitka, L. (2017). Liberals and conservatives are similarly motivated to avoid exposure to one another's opinions, Journal of Experimental Social Psychology, vol. 72, pp. 1-12.] This was also observed when participants had to listen to political positions with which they did not agree. To be clear, this is not something that happens only to the people who participated in the study. It happens to all of us, and it has clear consequences in terms of partisan politics.
I emphasize that it happens to all of us because it is very easy to consider that these studies are nothing more than anecdotes. If we do not realize that we are also susceptible, there is no escape from this "trap".
How can we listen to the position of the other tribe if we do not want to hear it? Or, in other words, how can we expect the other tribe to listen to us if they do not want to hear us?
Intra-group loyalty is so strong that it even punishes those figures of the same party who decide to change their position on an issue: suppose that we identify with a politician who holds an idea that is a "mark of identity" for us. If that person changes his or her mind, he or she is usually accused of being "fickle", of "selling out", etc. In our minds, it goes something like this: "It's not that he is wrong now, but that he was never really one of us". All his past acts are reinterpreted as betrayals. He is erased from the pictures of the past, as was customary in the former Soviet Union.
Ideas are not discussed; identities are discussed. Researching someone's past is often used not to show that we were all sinners once, but to reveal the worst sin of all: changing our position on an issue. But if we penalize updating our positions, what are we left with? Our secret, unconfessed even to ourselves, is that we embrace someone who changes their position to that of our tribe, but punish someone who does the opposite.
This puts us in another bind when we look beyond ourselves and consider the media. Their business model requires them to seek out as many readers, viewers, etc., as possible. Some people consume some media, others consume others. Could it be, then, that there is no incentive for media outlets to challenge the tribal identity of their followers by reporting the facts? Could it be that what they do is feed the tribalism so as not to lose those who financially sustain their business? Is it possible that a media outlet acts as individuals do: staying silent about facts that create conflict for it, while actively disseminating the facts that benefit the tribe? [9: More on this in Chapter IX.]
Tribalism is all around us in our daily, professional and civic lives. How, then, do we achieve the consensus that is necessary in democracy, or simply agree on what is true and what is not? If we prioritize sending signals to our tribe that we are good members, even if we are wrong, how will we be able to make informed and careful decisions? How can we change our minds on an issue if we dare not challenge our tribe?
SO FAR, SO CLOSE
We meet someone who thinks differently on a topic that is relevant to us, but we are friends, colleagues or relatives, we like each other, and that difference of opinion is not a central aspect of our bond. This is enough to identify us as two people who are on opposite sides of an imaginary line drawn on the floor. Surely we all experience this, be it on political, economic or social issues, or on very personal and small aspects of our own preferences. It may be about whether there should be a death penalty or not, whether you can wear socks with Crocs, whether God exists or whether the existence of God is a relevant question. We identify ourselves as people on opposite sides of that line. The edge that separates us appears. We "move closer" to those who are similar to us, we "move away" from those who are different.
This also happens in politics. What we said before about extreme tribalism in groups in a scenario of high conflict between them also applies to party politics: when the rift is wide, it is more difficult for us to identify those on the "other side" as individuals with personal characteristics, and we assign to them as a whole the characteristics that we attribute to the group to which, in our mind, they belong. This gradually increases polarization, and the phenomenon is exacerbated.
Extreme polarization, in which one group views the other group as full of negative qualities, is seen in politics in many countries. People who have a greater identity commitment with their group are more mobilized and tend to be more "noisy", which contributes to polarizing the rest. And in many of these cases, although we may not be aware of it, polarization is not so much about how different the ideas are, but about how much we like those of our group and dislike those of the other. In other words, it is not so much about the underlying ideological differences that may exist, but about tribalism.
In the United States, partisan politics has long been dominated by two major parties: the Democrats, who identify themselves as more progressive, and the Republicans, who are more conservative and traditionalist. Identity politics is seen to be leading to growing polarization: surveys conducted by Pew Research show that, in recent years, the proportion of people of one party who have a very unfavorable opinion of the other has been increasing steadily. [10: Pew Research Center (2014). Political Polarization in the American Public. Available online.]
Again, negative emotions influence how we react to events and how we bond. Sometimes, we hate what we hate more than we like what we like, and it is around that hatred that we coalesce as a group.
We can think of two types of polarization: ideological, based on the identity ideas held by each group, and tribal, which arises not only from one's own attitude and that of one's peers, but also from the unfavorable attitude towards the other party. Ideological polarization would imply that the people who feel closest to the two extremes are more numerous and, apparently, today there are fewer moderate people than in the past. But perhaps there is not really as much initial ideological disagreement as it seems, and this growing polarization is mostly driven by tribal issues.
If the two political tribes overlap less and less, many moderates who do not see themselves represented by either of the two polarized extremes will not vote at all, something that is particularly relevant in countries where voting is not compulsory, such as the United States. Who benefits from this? Not democracy, not truth.
Regardless of the depth of the gulf that separates these two tribes, they perceive each other as so different that the possibility of conversation is nullified, intransigence increases and all the ills of the world are attributed to the opposing tribe, which then feels ignored and blamed. It is no longer just a matter of the different worldviews that may exist in the two political parties: a large part of the rift is based on negative emotions such as fear or anger.
TALK TO EACH OTHER
One of the possible mechanisms to explain at least part of the gradual political polarization seems to have to do not so much with the bond --or lack thereof-- between different groups, but with what happens within the same group as it discusses a particular issue. Several research studies show that when a group of people with similar positions discusses a policy issue, each person's attitude becomes more extreme after the discussion. This phenomenon is often referred to as group polarization.
Some of this research, led by Cass Sunstein (co-author of the book Nudge, along with recent Nobel Prize winner Richard Thaler), examined group polarization in the United States, "taking advantage" of a local situation that allowed experiments: in the U.S. state of Colorado, there are two communities that are very similar in many respects, except that one is particularly liberal (Boulder) and the other conservative (Colorado Springs). People from these communities were asked to discuss for fifteen minutes three issues that are traditionally associated with people's political stance and elicit strongly emotional responses: environmental policy to reduce greenhouse gases, same-sex civil unions, and affirmative action.
The results were very interesting: after discussion among people of similar views (those on the left with those on the left and those on the right with those on the right), the Boulder liberals had an even more liberal political stance on the three issues proposed, and the Colorado Springs conservatives a more conservative one, also on all three issues. [11: See Schkade, D., Sunstein, C. and Hastie, R. (2007). What happened on Deliberation Day, California Law Review, 95(3): 915-940.] Regardless of whether the participants were left-wing or right-wing, the phenomenon observed was the same: after that short fifteen-minute discussion, their individual positions became more extreme. This also implies that the distance between liberals and conservatives, i.e., polarization, became even greater. Not only this, but the diversity of positions, within liberals and within conservatives, also decreased.
In arguing our position to others, we self-justify: we highlight the evidence that supports our position (cherry picking) and ignore facts that contradict us, which, almost inevitably, ends up strengthening our idea that our opinion is the correct one.[footnote content="More on this subject in Chapter VI." id="12"] We come together with people who think like us and exclude from the group those who do not. Eventually, the only opinion we listen to is that of people who think like us, because all the people around us do, and we begin to believe that this is the only correct opinion, and even the only possible one. Thus, we create insurmountable rifts.
We do all this in an involuntary effort not to change our mind, to protect our previous belief, our tribal identity, especially in public. A small difference in point of view that did not seem so relevant at the beginning becomes significant by the end. Polarization gradually increases, dialogue decreases and the other person becomes the enemy or, at the very least, a good person who is deceived or manipulated. Each tribe believes it owns the truth and devalues the other. Friendships, relationships and even democracy suffer from this pattern.
Of course, it is also possible that there are different opinions, or that aesthetic, ethical, ideological or value-related issues are appreciated differently. Not everything is attributable to our social identity, of course, but recognizing that this factor is also present allows us to bring positions closer together or, at least, to try to build bridges with others in order to understand each other better. The social identities we have lead us to tribal behaviors that, when manifested in politics, make intolerance and polarization grow. This leads to an increasing inability to reach consensus or, simply, to converse. All this is a breeding ground for post-truth, a monster that feeds on incomplete or misinterpreted information, on issues so strongly emotional that we fail to disentangle the problems in order to access the evidence.
So far, we have discussed social identity, tribalism, signaling to the tribe, how this is observed in politics, and the phenomenon of gradual political polarization. But we did not specifically address a distinction that it is now time to make: sometimes we are separated from others by our different values, moral qualms, or ideas about how we should act on certain issues. These are truly different ways of looking at the world. We may never agree, but we are in the exclusive realm of ideas, of value attribution, and we cannot necessarily speak of viewpoints being right or wrong.
Other times, what is at stake are the facts themselves, which we already know but which end up distorted by identity politics. These factual questions could be answered with confidence and there is not much room for dissent; there is truth in the practical sense. However, despite the scientific consensus, sometimes in society there is disagreement about what the facts are. When the truth is challenged, we have entered post-truth.
Our tendency to adjust our perceptions to our values --or those of our groups-- is often referred to as cultural cognition. We believe that our behavior, and that of the groups with which we identify, is correct and good for society. When confronted with an issue on which we have a (cultural) position, we will remember with less cognitive effort the evidence that supports it. Even if all the information comes to us, we will not assimilate it in the same way. Dan Kahan is a psychologist who studies cultural cognition. In one of the studies he carried out with his team,[footnote content="See Kahan, D., Jenkins-Smith, H. and Braman, D. (2011). "Cultural cognition of scientific consensus," Journal of Risk Research, 14(2): 147-174." id="13"] he introduced fake experts in different fields to a group of people, made up short texts supposedly authored by these experts, and then asked participants in the study if they considered these to be true experts in their field.
For the climate change expert, all the participants saw the same image, but read only one of two texts, assigned at random: one argued that anthropogenic climate change is real and extremely dangerous for all, and the other argued that it was premature to conclude that greenhouse gases contribute to climate change. Previously, participants' perceptions on different issues had been assessed, and they had been classified as "hierarchical individualists" or "egalitarian communitarians," which, roughly speaking, could be equated to Republicans and Democrats, respectively.
The results obtained with this experiment showed that the expert's position presented through these two different texts influenced people's responses as to whether or not they considered him to be a true expert.
What is going on here? If an expert says something contrary to what we believe, it is very likely that we will consider him a false expert.[footnote content="See Kahan, D. (2013). "Ideology, motivated reasoning, and cognitive reflection," Judgment and Decision Making, 8(4): 407-424." id="14"]
Our minds eliminate or diminish the cognitive dissonance we feel when someone considered an expert questions our pre-existing positions. We protect our tribal identity, and this does not depend on which tribe we belong to (there is not one tribe that is always wrong and another that is not).
Kahan argues that when an evidence-based issue becomes partisan, it immediately becomes a "moral" and, therefore, tribal issue. This happens when the leaders or opinion makers of each party adopt a position that is then transmitted as identitarian to the rest of the tribe, when the information reaches us first --or only-- in this way and only later --or perhaps never-- through real experts. If a factual and well-known issue is first discussed by politicians who most probably do not know enough about science and are not adequately advised, and society accesses the issue through the position they take, the mistake (or lie) spreads and post-truth invades us. When a previously non-partisan issue becomes partisan, for those with a strong partisan identity, it ceases to be something that can be reasoned on the basis of evidence and becomes an issue that functions as tribal signaling.
The evidence is set aside, although we remain convinced that what leads us to our position are incontestable facts. Something like this happened in the United States with the issue of climate change. A person's stance on anthropogenic climate change is very much aligned with whether they identify as a Republican or a Democrat (very broadly speaking, Democrats accept that climate change exists, while Republicans do not). But anthropogenic climate change is a fact of reality and there is no scientific dispute about it: the real experts in climate science agree that it exists, and that it must be addressed urgently.
On factual issues such as climate change, we are most likely not able to evaluate the evidence directly because we do not know enough about the subject. In such cases, we can follow opinion makers or the scientific consensus. But if the opinion makers we follow hold positions that are objectively wrong, so will we. Otherwise we would be questioning the leadership of those opinion makers and, therefore, betraying the tribe, with all the implications we mentioned. Once we start defending an erroneous position, we keep defending it because, otherwise, we should recognize that we were wrong before. Make no mistake: in situations like climate change, there is a right and a wrong view. This is not just a matter of opinion.
Perhaps this is another good opportunity to examine our beliefs and which opinion makers we follow. None of us exhibits tribal behavior in all aspects, but all of us do in some. What are my tribal aspects? Can something like this be happening to me? When was the last time I identified a factual error in a member or leader of my tribe? What about the last time someone from a tribe I detest held a factually accurate position that I chose to amplify and endorse?
It is not just others who are wrong and trapped by cultural cognition. It is all of us. There are many issues where being wrong and following a tribe that identifies with that wrong stance doesn't have much impact on the real world or our daily lives. But there are other issues where being wrong can be dangerous to us, to our loved ones, or to society as a whole.
OUT OF THE TRAP
Given all this, what could we do to combat post-truth? We need to challenge tribalism, or we may end up not only being unwitting generators of post-truth, but also vulnerable to others who are able to harness tribalism for their convenience, to dominate us through intentional post-truth. Thus, the greatest risk of not recognizing ourselves as sheep is to make the shepherds invisible.
If part of the problem is our tribalism, our own behavior, can we fight this unintentional post-truth without fighting either among ourselves or with ourselves?
There is much to be done. Some of these suggestions are simply that, suggestions that stem from what we know about how our social identity works. Some are more supported by evidence, others less so. Let's take it one step at a time.
What follows is a series of "tips" that I put together guided by the available literature on this topic. I don't know if they work --no one really knows, because there is little evidence on the subject-- but I confess that I would like them to work and I believe that, based on what is known about these issues, they have a high probability of working.
The first thing we can do is within everyone's reach and very positive: train ourselves in introspection. We need to analyze what is happening to us, to what extent tribalism could be affecting us with respect to our position on different issues. If at some point we are thinking that "what is happening is that people are influenced by their social identity and do not realize it", let us not forget that we are most probably among those people. Let's analyze whether what we do is send tribal signals, whether our ideas are actually the identity ideas of a tribe. Let us also try to look for factual answers without feeling that our identity is being diluted or that we are betraying our tribe. In these cases, we may be able, to some extent, to keep some issues from becoming identitarian. If we allow no separation between what we think and who we are, it is inevitable that we will interpret other people's attacks on our ideas as attacks on us. Thus, we become defensive, and our ideas, unfortunately, cannot be challenged even if they are wrong. But if introspection shows us that we are acting in a tribal manner, we can review our ideas and allow others to put them to the test, and, if they are found to be "bad ideas," we can let them go and replace them with better ones.
Introspection allows us to be alert. If we are alert to what happens to us, we can fight tribalism.
It would be easy to say: "If we are alert, we may modify our behavior. The problem is that some of us are not introspective." But this would also be, in addition to being tribal, a way of thinking of introspection as something that one is, not something that one does. Introspection may be innate, but it can also be developed and activated in specific domains. So, just as the danger of not reflecting on our thought processes is always present, so is the possibility of beginning to do so.
Understanding that what happens to us happens to others can help us overcome the otherness that is based on tribalism. Then, in addition to looking at ourselves, we can look at others with empathy --understanding what they feel-- and decide and make explicit that all people deserve moral consideration and are worthwhile regardless of their differences.
Let's not assume that others are bad, dumb or ignorant (unless we have evidence that they are). Treating them as such is only a signal to our tribe, and it is a signal to others that they are not part of it. If we do this, others will not be able to process the content of our argument, but only the tone, the formal aspects, the tribal signal. Let's assume that their intentions are good, that they may be acting in a tribal way unconsciously. Just as we may think the wrong way, or be influenced by our motivations, emotions and identities, so can they. It is not just a matter of recognizing that all this may be happening to them, but to keep this in mind when we bond with them, when we connect with them. Let's remember that they are people. Let's try to listen to them and understand them, let's reach out, even if in our tribe that gesture is considered a betrayal.
Another aspect we can take into account to combat tribalism is diversity. If we recognize diversity, we will be more willing to seek out and listen to positions different from our own. When we only communicate within our tribe, we miss the richness of other worldviews, and we become more prone to the false consensus effect and the illusion of objectivity.
The false consensus effect is what we observe when we surround ourselves with people who are like us and are very detached from those who are different. If in the election we voted for candidate A, for whom the people we interacted with also voted, and then candidate B wins the election, we may think something like "how could B have won if I don't know anyone who voted for him?". We think that others think like us because those we know think like us. This is a false consensus. Deep down, what surprises us is that, knowing what is known, they do not all vote for our candidate. Here the illusion of objectivity comes into play, in which we believe that any reasonable person must be seeing things as we do: we are objective, but they are confused, ignorant, manipulated or tribalistic. Even if we don't like where this leads us, we need to consider that other people who have access to exactly the same information as we do may reach a different conclusion. This may be due to the tribalism of both groups (even if we agree on what the facts are, we apply different filters to the available data and may thus interpret them differently), and also to deep ideological differences, so deep that they result in the generation of today's so-called alternative facts: factual truths that we are willing to dismiss and replace with constructs that represent no more of reality than our need to avoid questioning our tribal narratives.[footnote content="The term alternative facts was used by Kellyanne Conway, counselor to President Trump, in an interview in which she defended press secretary Sean Spicer's claim that the inauguration of the Trump presidency had drawn larger crowds than those of previous presidencies. Photos showed otherwise, but Conway said Spicer was offering alternative facts. For his part, Spicer maintained that sometimes we can disagree on the facts." id="15"] Denying that our differences exist weakens us, isolates us and does not allow us to understand what is going on.
We are different, but let us defend the possibility of bonding.
It is also useful to foster flexibility and embrace uncertainty. Let's advocate the flexibility to accept that sometimes we don't know, that we don't need to align ourselves with some position if we are not sure, that we can change our minds, that we can contradict, and perhaps even abandon, our tribe.
Let us try not to "unify" all our different tribes under one banner. If we truly believe that on an issue we have a position that splits us into an "us" and a "them", so be it. But if we feel that the extremes do not represent us, let us defend the moderate position and not allow others to pressure us to take sides.
It is also important that we enable dissent as a way of getting closer to the truth without it undermining our bonds. But to achieve this, we first need to disentangle ideas from identity, both our own and those of others. Perhaps we even agree on things, but fail to see it because we look at everything through our tribal lenses. Here, too, unintentionally generated post-truth comes in: we can no longer recognize the truth because we fail to break through tribal barriers. If we do not manage to set tribalism aside, let us remember that a situation of high conflict induces us to try to protect our tribe more and to make enemies of the others. That leads us to extremism, which in turn leads us to high conflict. And so on and so forth, in a catastrophic spiral, functional only to preserve the Orwellian scheme of fictitious tribes in permanent conflict. In this context, we are capable of defending wrong positions in order to remain loyal.
Now, if we can get our own and others' tribalism out of the discussion, we may find that we agree on more than we first thought, or we may continue to hold opposing positions because we are divided on more fundamental issues. We need to find out. And here we have the option of having a conflict or avoiding it. If we believe that "we are all entitled to our opinion," we may think that we have to tolerate the ideas of others and want to avoid confrontation. The problem with this attitude is that, although we are all entitled in principle to express ourselves, we do not have to accept what others say as true or worthy of being taken seriously. Daniel Patrick Moynihan put it beautifully: "We are all entitled to our own opinion, but no one is entitled to his own facts".[footnote content="You are entitled to your opinion. But you are not entitled to your own facts." id="16"]
If we dare not point out that others' ideas may be wrong, that they have incomplete information, or that their arguments are bad, we are also complicit in post-truth. Conflict can be a good thing if we have it not with people, but with their ideas. That's why we need to separate the tribal components first. We need to allow ourselves to disagree also as a way of respecting others --we take them and their ideas seriously-- and not to let issues that seem wrong to us slide. Otherwise, ideas are protected behind their tribal identity and we will never be able to separate the good from the bad. Of course, enabling dissent is everyone's task. If we don't allow our ideas to be tested, they are not ideas we have, but --again-- tribal signals. We need to surround ourselves with people who can challenge our ideas with rational arguments.
Another hopeful possibility for fighting tribalism is to encourage curiosity, manifested by the desire to learn about something we know little about, and to enjoy learning about it. This idea comes from some experiments conducted by Dan Kahan's team that showed that people with greater scientific curiosity --curiosity, not knowledge-- are more likely to change their positions on issues like climate change or evolution, issues that tend to be partisan in American society.[footnote content="See Kahan, D. et al. (2017). "Science curiosity and political information processing," Advances in Political Psychology, 38(51): 179-199." id="17"] So far it is only a correlation, and it is not yet possible to say with much certainty that, if stimulated, curiosity makes a person more open-minded and able to counter politically biased information processing, i.e., that there is a causal relationship between the two variables. But it certainly seems like something interesting to keep in mind. Should we "train" ourselves to be more curious? Should we value curiosity more?
Another valuable tool we have in the fight against tribalism is preventing issues from becoming partisan. We have seen that if a factual issue reaches society not through experts, but through tribal opinion makers of different political parties, for example, it becomes identitarian. From that point on, it is very difficult to find agreements, consensus or, at least, to have a civilized conversation based on rational arguments. When faced with an issue that is new to society, it is preferable that it be conveyed by the appropriate experts to prevent the evidence from being "contaminated" with tribal loyalty and the generation of an "us" and a "them". If the subject is already partisan, we can try to de-partisanize it, to remove the tribal signals, especially in factual matters.
Finally, we must pay attention to communication in many situations. Sometimes, tribes form around the same rough set of values.[footnote content="Of which we speak a little in Chapter V." id="18"] Thus, tribal aspects become intertwined with our irrational beliefs. In the United States, broadly speaking, Republicans value respect for authority, traditions and individual freedom, while Democrats identify more with values of equality and protection of minorities. It is important to keep this in mind when communicating with a tribe to which we do not belong. If we want our message to have a better chance of being heard, perhaps we should appeal not to the values that are important to us, but to those that are important to them. In terms of values, putting facts aside for a moment, what convinces us is not what convinces others.
On the other hand, when faced with a factual question such as climate change, where a correct answer not only can be obtained but is already known, we intuitively believe that if someone thinks that anthropogenic climate change does not exist, it is because he or she lacks information. It is easy to see that this is not the reason: when we try to give information to these people, not only do they not correct their position, but, many times, they reinforce their misconceptions.[footnote content="More on this in Chapter XV." id="19"] If we are given information that contradicts our beliefs, we have many ways of dismissing it: we deny that this information comes from real experts, we interpret it incorrectly or we flatly ignore it. There are issues where countering disinformation, or misinformation, with correct information works. And there are issues where it doesn't. Which issues belong to which category depends on our identification with the respective groups, on how important or relevant we find those positions to our view of ourselves. If sports do not interest us at all and we do not identify as belonging to a particular team, then, if we believe that one team is better positioned than another and we are shown evidence that it is the other way around, we will be able to update our belief to align with the new information. But if sports are one of the central aspects of our lives, if we believe that our team is the best and we carry its colors painted on our hearts, it is much more difficult for evidence of what is happening in reality to modify our previous position.
There is a whole, relatively young branch of science that deals with approaching communication in an evidence-based way, that is, by first finding out how to get someone to incorporate information that contradicts their beliefs and then communicating in that way, even if it differs from what seemed most obvious to us. Simply giving information in these situations does not work. The party that has the correct information recommends to the other party Internet links, scientific papers or experts on the subject, but the other party will always find other links, papers or false experts that claim the opposite. At the end of this war of links, which nobody actually reads, no one changes their position. Quite the contrary.
When we assimilate this, we understand why so many ways of arguing don't work: atheists treating believers as stupid, believers treating atheists as immoral, people on the right telling people on the left that they are ignorant, people on the left telling people on the right that they are dinosaurs. All these are only signals to one's own tribe, because in the other tribe they make no dent, or even reinforce its position: "what we say is right since they, those others, oppose us". If we really want to reach out to others, we may have to use non-intuitive communication strategies.
This is in relation to intergroup communication. But what about communication within the same group? We saw that, when conversing with people similar to us, our stance becomes more extreme than at the beginning. With this in mind, wouldn't it be better to converse with someone with whom we disagree? How about following on Twitter someone who thinks radically differently from us? Maybe that way we could combat extremism a bit.
Not much is yet known about evidence-based communication, a fast-growing field of study. But even what is already known hardly ever gets implemented in real life. Most of us continue to act intuitively and ineffectively, sending tribal signals to our own and increasingly alienating others.
In this chapter, we introduced tribalism and showed why it can contribute to the creation of unintentional post-truth. This builds on what we have seen in the previous two chapters: how we are influenced by our irrational beliefs and how our thinking works.
In the previous section, we suggested several possible ways to keep tribalism from hindering our access to the truth. As a way of systematizing this and transforming it into specific tools, here is a new Pocket Survival Guide.
This Survival Guide encourages introspection, getting a better grasp of what happens to us, but it also raises questions of bonds with others and of communication between tribe members and between tribes. It is comforting to think that we are rational beings, but what we see is that we tend to be wrong, and in many different ways. Tribalism has always been and will always be with us. Belonging to groups is important; it has social and emotional value for us. It may well have been selected by evolution to a greater or lesser extent, because being in a tribe that is good for us can be very important for our survival. So why fight or control tribalism? Because sometimes, even innocently, it can hinder our access to the truth, our fact checks, and this often poses a greater threat than ceasing to belong to a particular group.
If only what mattered were not where someone is speaking from, that is, from which tribe, but rather what they are saying and what evidence supports it… but it does matter. If we gather in tribes in which it is more important to adjust reality to the image and likeness of our narratives than to seek the truth, we end up creating insurmountable rifts. We gradually isolate ourselves from others and their worldviews, confusing the "what" with the "who". We shut ourselves up in our cozy echo chambers, in which we are not challenged.
Rather than taking offense, we should accept that tribalism exists, without resigning ourselves to it. Only in this way can we remain alert and fight it, no longer fighting each other, but uniting against post-truth.
This chapter, the third of five devoted to analyzing various phenomena that contribute to the generation of unintentional post-truth, exposed us to more uncomfortable questions, and we will continue in that vein. Identifying the problems helps us focus on possible solutions. Now we will move on to another issue: what if we do not understand the evidence, and need to rely on what an expert tells us? How do we take an informed position, taking into account what is known, if we are suspicious of experts or find it difficult to identify them?