Knowing and knowing who knows




Sometimes, we do not recognize an expert as such, or we attribute expertise to someone who does not really have it. As a result, we consider false facts to be true, or vice versa. 

When this is combined with factors such as our beliefs and emotions, our faulty reasoning and our tribalism, we have a problem: it becomes harder for us to recognize the truth and to value it as such, even if seeking it is what drives us. 

In elementary school, we learned the basic operations of mathematics: addition, subtraction, multiplication and division. Was there anything else? When we left elementary school, perhaps we thought that, in high school, mathematics would be limited to learning to perform operations with larger numbers... instead of hundreds, millions! What a surprise when we learned that there are more numbers, trigonometry, logarithms, or ways to calculate the limit of functions. And, for those who study mathematics in college, isn't it surprising to discover that all these are just details, particular applications of other larger and more general things? 

We were not only learning more content about what we thought mathematics was. We were learning something else: the whole field of mathematics turned out to be so much bigger than we had intuited at the beginning. 

Moreover, it is as if, as we know more, our perception of what we still don’t know changes. The more expertise we acquire in a field, how much do we think we know about the "totality" of what there is to know in that field? 

The phrase "I only know that I know nothing" is often attributed to Socrates. Although the attribution is generally rejected, it may be worthwhile to philosophize about its meaning. One possible interpretation could be that Socrates claimed to know nothing, but that for that very reason he knew more than the others, who also knew nothing. They did not even know that they knew nothing. If only it were so for everyone. If only we all knew that we know almost nothing. But we don't. In general, we don't know that we don't know. 

The good news is that we can know about what we know and don't know, and how accurately we perceive our own knowledge. In 1999, Justin Kruger and David Dunning published a wonderful, foundational scientific paper: “Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments”.1Kruger, J. and Dunning, D. (1999), Journal of Personality and Social Psychology, 77(6): 1121-1134. Perhaps this paper should be studied in high schools, published in newspapers, and discussed at the family dinner table. It is key. It is essential. 

Dunning and Kruger did the following. On the one hand, they assessed how competent a group of people were in several skills: 1) sense of humor (social skills); 2) logical reasoning (intellectual skills); 3) English grammar (specific knowledge). On the other hand, they asked the participants to estimate how good they thought they were at each of the skills assessed. 

To give an example, the results they obtained with respect to logical reasoning were surprising. 

The less capable have a harder time realizing that they are incapable. This is sad. As the authors put it, the most incompetent suffer from two problems: "Not only do they come to wrong conclusions and make unfortunate decisions, but their incompetence prevents them from realizing it". Possibly, the skills that enable us to be competent in a given field are the same skills needed to assess how competent we are in that field. To realize whether or not we are doing something correctly, we need the same skills and knowledge required to do it well. 

It is as if, in addition to not knowing, these people have a limited ability to reflect on how they think, so we can say that they are not very introspective about these tasks. 

Dunning and Kruger obtained similar results for the three skills they assessed (an important clarification: a person could be very competent in one skill and score very poorly in another). They performed other experiments and concluded that metacognition (our thinking about our own thinking) involves not only the ability to assess how good one is at a given task, but also the ability to realize how good others are. In further experiments, Dunning and Kruger observed two things: when participants are trained in a skill, their scores on the tests improve, and their metacognition improves as well, as evidenced by the shrinking gap between their actual performance and their perception of it. So there is hope! 

When Dunning and Kruger published this work, many other researchers sought to replicate it, and succeeded. For new knowledge to be reliable, it is essential that it be reproducible. This phenomenon has since been observed many, many times and in many different situations. Today, "unskilled and unaware" is accepted as a particular cognitive bias and is known directly as the Dunning-Kruger effect.2For more on cognitive biases, see Chapter VI.

Surely, we can all think of many particular examples that seem to follow this sort of general rule. The probability that a new business (a craft brewery, a bookstore, a coffee shop, an Internet company) will survive five years, for example, is very low. Let's say, 30%. If you give this information to people who want to start a new business, and you also ask them what they think is the probability that their particular business will succeed in that context, they will surely overestimate it and give a higher value, for example, 60%. In a way, it is as if they think that the statistic has nothing to do with them. This is often called the above-average effect, and it is consistently observed, to a greater or lesser extent, in different situations and cultures. It is a bias because, clearly, it cannot be that everyone does better than the average, or the average would not be the average, right? 
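The arithmetic behind that last claim can be made concrete with a tiny sketch (an illustration of ours, not taken from any study; the scores are invented):

```python
# Hypothetical skill scores for five drivers (invented numbers).
scores = [3, 5, 7, 9, 10]

# The average (mean) of the group.
mean = sum(scores) / len(scores)  # 6.8

# How many are strictly above the average?
above = [s for s in scores if s > mean]
print(f"{len(above)} of {len(scores)} drivers are above the average of {mean}")
# prints: 3 of 5 drivers are above the average of 6.8

# If *every* score were strictly above the mean, then the mean of the
# scores would have to be greater than itself, a contradiction. So when
# (nearly) everyone reports being above average, someone is biased.
```

Whatever numbers we plug in, at least one score can never exceed the group's mean, which is exactly why universal "I'm above average" reports must be a bias.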

When we watch a sports game and our team is not doing well, we know with great certainty what changes should be made to the team's formation or game strategy. We are all coaches. In our city, most drivers drive very badly, although, if we think about how each one of us drives, maybe many of us will say that we do better than average, right? And we're getting closer to the problem here: most of us drive better than others. Do you see where we're going? 

Every time I study a new subject, out of curiosity or necessity, this is what happens to me: at the beginning, I feel I know quite a lot, and as I learn, I see how much more there is to learn and how little I really know. The final destination gets farther and farther away, as in the paradox of Achilles trying to catch up with the tortoise. As I progress, the "goal" of reaching total knowledge of a subject always seems to be a little farther away. 

Now, what if this, which we so often see as happening to "others", we perceive in ourselves through introspection? It is not the other person who is incompetent. It is each and every one of us. We are probably not incompetent in all areas, but surely in many of them. 

Why this aggression? Why this apparently gratuitous destruction of our self-esteem? Because many of us prefer to know that we do not know to the illusion of knowing. 

And if you are now thinking that we should speak only for ourselves, because none of this happens to you, think again. And again, if necessary.

An Internet search gives us the illusion that we can be experts after a few hours of "research". First, even if we spend many hours on the Internet, we have to keep in mind the Dunning-Kruger effect and accept that we are unlikely to be on the heels of the real experts in the field we are reading about. Chances are that we do not have the expertise either to understand the issues in depth or to assess the evidence for ourselves. Our Internet "research" is not comparable to real, professional research, conducted with proven methodologies and quality controlled by experts in the field. 

I don’t mean to defend the experts as an inaccessible sect. It is not inaccessible. In fact, any of us who wants to become an expert in something can certainly do so, and the Internet will be a tremendously useful tool for that purpose. But that is not enough. You have to add dedication, patience, exposure to multiple perspectives, an analysis of how reliable each source is, and an assessment of which information was peer-reviewed; often, from that research we will emerge with ideas, new knowledge that must be, like all knowledge, validated. Becoming an expert requires both specific training and experience. 

Most of us wouldn't tell an airline pilot how to fly an airplane, but we would not hesitate to yell “Where did you learn to drive?!” at a car driver. In both cases, these are skills we might consider technical, but perhaps our greater familiarity with driving cars makes us feel more entitled to judge what makes a good or a bad driver. However, let's keep in mind that not all of us can drive better than average.

So, when we don't know something, we don't always realize that we don't know it. We confuse knowledge with the knowledge we can access. The good thing is that the sooner we realize that others know more than we do, the more we know. We plant a flag on one side, the "we don't know that we don't know" side, and now we head for the other side, that of the competent experts. First, because they exist. Second, because they contribute to the generation of consensus, and validate or discard the evidence. Third, because combating post-truth involves being able to recognize them, so that we can navigate worlds that are unknown to us.


Let's go back to what Dunning and Kruger observed with the group of top performers. In that case, there is also a slight distortion of how much they think they know: they underestimate how well they did. What goes wrong here is different from what went wrong with the less competent group. While the incompetent don't realize they are incompetent because they don't realize they are wrong, the experts believe they are not as good as they are because they overestimate the ability of everyone else. They believe that what is easy for them is easy for others, that what they know is known by everyone else. This is another cognitive bias, known as the curse of knowledge: sometimes experts know so much that they end up forgetting what it was like not to know, making communication between experts and non-experts difficult. Added to this is the false consensus that comes from the echo chamber of communicating only with colleagues, which makes them assume that what is thought within their community is representative of what the rest of us think. 

In the experiments, unlike the group of those who do not know, experts are able to modify their position when shown evidence that they are wrong: when top performers are told that others perform worse than they do, they correct their perception and bring it closer to the real value, whereas when the same is tried with the incompetent, they fail to correct theirs. 

Do we agree that there are experts, that there are people who are very competent in fields of knowledge in which we are not? Let us hoist the other flag, that of competent experts. 

Between our two flags, incompetence and expertise, there is a long distance. As with other issues, these are not discrete values, but a scale, a continuum, a spectrum. 

To become an expert in something takes years of study, experience or both. As Dr. Stephen Strange said, "study and practice, years of it". 

It is time to welcome introspection to try to identify in ourselves, with honesty, what we are really experts in. Once we make the mental list, everything that does not fit in it belongs either to the field of what we are moderately competent at, or to the field of what we do not know anything about. 

Moreover, this expertise is not transferable to other fields. The expert theoretical physicist Stephen Hawking went so far as to claim in his book The Grand Design that "philosophy is dead", which showed his ignorance of the area. Nobel laureates Kary Mullis and Luc Montagnier, for example, are experts who made impressive advances in their disciplines. Mullis invented a technique that revolutionized molecular biology: PCR. Montagnier identified HIV as the virus that causes AIDS. But Mullis also stated that he believes HIV does not cause AIDS, that astral projection exists, that extraterrestrials abduct people and that climate change is not real. For his part, Montagnier said, for example, that vaccines are dangerous and that autism can be cured with alternative treatments. Both, then, share Dunning-Kruger platinum status. 

Today, to know is also to know what one knows and what one does not, and, for the latter, to know how to look for those who do know. 

One expert knows, but many experts in the same field know more. And from there we come back to scientific consensus.3A little of what we discussed in Chapter IV. Since none of us can aspire to become truly expert in all areas in a lifetime, sooner or later we must rely on experts in almost every aspect of daily life. That trust should not be one of total surrender to what a particular expert says, but we can be reassured, not only because the consensus is large, but also because we expect that, within the expert community, there will be permanent scrutiny by the other scientists. 


But we have a practical problem. There are people who pretend to be experts and are not. If, in addition, we are not experts in the field in question, we do not have the necessary knowledge to realize whether we are dealing with a real expert or a fake expert. 

Sometimes, we can sense that we are in front of one of them. For example, if we read a newspaper article on a subject we have mastered, in which a supposed expert says certain things, we may notice that he or she is talking nonsense and that it is impossible to take that person seriously. But we realize this... because we are experts. If we are not, it becomes much more difficult. What to do? In real life, people don't come with a "certified" stamp that tells us whether they are true or false experts. 

In the United States, a surgeon known as "Dr. Oz" became famous by participating as a "health expert" on Oprah Winfrey's popular television show. This led to his own program, The Dr. Oz Show, where he addresses medical issues and gives health advice to a daily audience of almost 3 million viewers. But is he a true expert or a fake expert? 

We may wonder whether or not what he says about health is in line with the consensus in the field. Much of what Oz argues on his program falls within what is known as alternative medicine,4We discussed this extensively in chapters V and VI. so a group of experts evaluated the content of his programs and found that half of what he says about health is wrong or not supported by evidence.5See Korownyk, C. et al. (2014). Televised medical talk shows - what they recommend and the evidence to support their recommendations: a prospective observational study, The BMJ, 349. This research was published in a journal whose papers are peer-reviewed, that is, reviewed by other scientists. So much for the claims he communicates on his program, but is Dr. Oz really a doctor? Yes, he is a cardiothoracic surgeon. He is a trained person, but in a particular specialty. However, the health advice on his TV show covers all kinds of fields. To think that a surgeon, even a very good one, has much to contribute regarding other sub-disciplines of medicine is equivalent to thinking that Lionel Messi, a spectacular soccer player, must also be a good basketball player. 

On the other hand, we may wonder if he published scientific papers. By searching for "Oz MC[Author]" in the PubMed search engine, the most prestigious in the biomedical area, we see that he published some scientific papers, but in his specific area. Something that can also help us to know whether or not someone is an expert is to see if the real experts consider him to be one of them, and Oz is not particularly recognized within the medical community (even though his program has received television awards, including several Emmys, which say little about his expertise as a physician). 

Dr. Oz is, then, neither as expert as he makes himself out to be on his show, nor a total fraud. Not everything he says on TV is false, so what do we do with this information? Perhaps, at the very least, we can ask ourselves whether we would go to a doctor who gives us accurate and valid information 50% of the time and wrong or unsupported information the other 50%, something we could just as well get from a coin toss. 

In order to distinguish a competent expert from a false expert, we can look at the aspects we have just mentioned, but bearing in mind what we mentioned in the previous chapter: our beliefs and our tribalism could influence this, making us consider a false expert to be competent or vice versa depending on whether he or she holds ideas with which we agree or disagree. We need to be alert to this possibility. 

Now, suppose we identify someone as an expert and trust what they say. Are we falling into a fallacy of authority? Not necessarily. It depends, in large part, on how we relate to that expert's words. Do we take them as a guide, or do we trust them blindly? If the expert is proven wrong, or turns out to have lied to us, can we change our position, or do we defend them to the end, accusing others of wanting to destroy them? Much of this depends on us, and not only on others. 

Similarly, if we return to post-truth, the issue of experts would not be too serious if the worst that can happen is that we confuse a false one with a true one. The reality is that we distrust experts, and perhaps that distrust is at least partly justified.


We distrust experts for many reasons, and if we become more aware of what they are, we will at least be able to assess whether or not they are justified in each particular case. That is the first barrier we can put up against post-truth. 

To begin with, competent experts are often simply wrong. Doctors give us health advice that later proves incorrect, or incomplete information; governments make incredible mistakes, and so on. In some cases, this is due to a misinterpretation of the consensus, to considering that something is "already known" when in fact it is still not so clear. Or a claim may be poorly communicated, appearing more certain than it actually is. In any case, it sometimes becomes evident that the experts are not so sure after all. Contradictions, shifting positions... none of that helps to inspire confidence. 

In other cases, experts lie. They are human beings, with all that that entails, even the miseries of wanting to attract attention, succumbing to the pressures of lobbies or even bribes. 

We also perceive them as in a different league from us. As if they were an elite in an ivory tower, inaccessible to mortals. They speak and we do not understand them. And they themselves, under the curse of knowledge, do not realize that we do not all know what they know. Thus, we feel them gradually moving away from us. 

It does not help either that, at times, some of them communicate with condescending and morally superior language, as if we were stupid or ignorant and needed them to point us to the path of "good", as if we were incapable of finding it on our own. Perhaps we feel that the approach of some experts does not represent us, or that we do not share with them the same set of values that are essential to us. We may feel that they belong to a different tribe. Some, moreover, show a total lack of empathy for people's concerns, sufferings or joys. Or it may be that we simply don't like experts telling us what to do or not to do. 

Another possible reason for generalized distrust is the distrust of everything, of the system, of those in power, no matter who they are. This "anti-system" attitude may make people dismiss what an expert says, or, perhaps more worryingly, lead people to think, "the truth is exactly the opposite of what the expert said". 

These may all be reasons for distrust, but we need to ask ourselves where that distrust leads us if it is directed at all experts equally. If we trust no one, then we must rely on our intuition, our irrational beliefs or what our tribe says, and if we are truly motivated to identify the truth, this is not an effective approach.


If we are indeed fed up with experts (and even if we decide to assume that the fault is entirely theirs), we are going to have to find a way to repair this relationship, because we need them to fight post-truth. While some experts may betray us, not all of them will. Indeed, the fact that we know that some experts are wrong, or corrupt, or lie, shows that our control mechanisms can work. Yes, there is a risk, but "the perfect is the enemy of the good", and the alternatives are worse. Let us regain confidence in them, always with caution and awareness. 

Sometimes, it makes me very uncomfortable to trust someone's expertise. But then I notice how many times I unconsciously, tacitly trust, sometimes even putting my life at stake: every time I cross a street, I trust that drivers will hit the brakes because they saw me and know how to drive, I trust that their cars were checked by mechanics who know their job, I trust that the cars were manufactured by companies that know how to do it. They are all "faceless" experts that I can't identify. And yet... that's where I take heart. When I see discomfort creeping up on me about a particular expert, I try to reassess whether they are indeed an expert, and not a fake expert, and whether I can then trust their judgment.

These days, social networks encourage anyone to give their opinion on anything. It is as if, all of a sudden, we are surrounded by experts, on everything. This is nothing more than a way of helping create unintentional post-truth, because consensus is confused, positions are equated and the feeling is normalized that arguments from those who speak their personal opinion, without evidence, have the same validity as arguments arrived at by someone benefitting from extensive training. 

In fact, how sure are you of your knowledge before you give your opinion on a given subject? Just as mansplaining is the word invented for when a man gives a woman an explanation, overestimating his own competence solely because of gender, we could speak of dunningkrugersplaining when a person with no knowledge explains to an expert how things are, owing solely to the elusive relationship between the confidence we have in our opinions and our actual competence to give them.

Regarding vaccine distrust, in addition to beliefs, emotions, cognitive biases and tribal components, we also have distrust of competent experts and trust in false experts and even in anonymous Internet pages, blogs or forums. Someone who already has a settled doubt about the safety of vaccines will search the Internet and very easily find justifications. In fact, these people will often tell others to "do their own research". In doing so, they fall squarely into confirmation bias, continue to act on their prior beliefs, disregard the quality and quantity of evidence, the certainties and uncertainties, and also do what is considered appropriate within the vaccine-doubting tribe. 

What if our family doctor is not an expert, if, for example, he tells us that vaccines are dangerous, or that a healthy lifestyle is enough to be healthy? When we choose doctors, we do not usually take into account their academic training or experience. Or, if we do consider these aspects, perhaps others take precedence, such as whether they "inspire confidence", whether we "feel comfortable" sharing our doubts or whether they were recommended by people we trust. The bond is thus reaffirmed and, at a certain point, it becomes difficult for us to doubt what they say. 

A particular practitioner may be wrong, or may go against the consensus in his or her area. Generally, these are exceptional cases, but they do exist. What if that happens? Again, what matters is our behavior in this situation. We have to choose whether to follow the consensus or to follow that practitioner. If we are driven by the search for truth, the answer is clear: we must follow the consensus. If, despite the fact that, for example, vaccines are known to be safe, we follow what a physician who rejects them tells us, then we would be falling into the fallacy of authority: we do what the authority tells us even against the evidence. 

Of course, we may prefer doctors who think like we do; but if weight loss is the goal, searching for nutritionist after nutritionist until we find one who recommends a diet based on chocolate and cookies will result in a great many things, except weight loss.


We have addressed the importance of experts and the difficulty of identifying and assessing them. This contributes to the creation of unintentional post-truth because it "muddies the water": the truth may seem less clear than it is, and doubt becomes more powerful. 

There are two central points that we chose not to address here. One is that, sometimes, experts are indeed susceptible to some power or interest or, at least, to whoever is funding their research. The other is that, regardless of what we do or do not do, and regardless of the care and consideration we put into it, it is clear that other people may not do the same. The first point will be dealt with in the next section. Let's put the second aside for now, because it actually applies to all the issues we’ve discussed so far, and we’ll take it up again at the end.6Section 4.

We need to identify competent experts in order to access, however indirectly, knowledge about a subject. This new Pocket Survival Guide aims to offer some new practical guidelines to unravel all this: 



Are we ourselves really experts on the subject? Could we be under the Dunning-Kruger effect? 
If we are not experts and we distrust experts, do we understand where that feeling comes from? Can we put it aside on this issue? 
Is the person we believe to be an expert holding a position that is for or against consensus? 
Does the person have training and/or experience in the specific field? 
Is the person recognized within the expert community or endorsed by recognized academic institutions? Does the person have peer-reviewed publications or specific credentials?

In this chapter we added the lack of trust in experts and the difficulty of identifying which are competent and which are false to the individual and collective phenomena we already know, and when we combine everything, we have a post-truth time bomb. Let us now turn to the last chapter in this section, in order to analyze our relationship with information, from what knowledge comes to us to what we accept and whether we share it with others. As with beliefs and emotions, problems with reasoning, tribalism or distrust of experts, it is our relationship with information that will largely determine whether or not we will be active disseminators. Amplifiers of post-truth.