From evidence to knowledge


OBSERVATIONS, EXPERIMENTS AND IMAGINATION

TO OBSERVE IN ORDER TO UNDERSTAND 

If post-truth is a series of cognitive and social mechanisms that make us believe what is not true, our first line of defense against it is to identify what is known and what is not, how confidently it is known, and how we know it. We need to distinguish more clearly when we effectively know something, when we are not quite sure or do not know, and when we know that something is clearly not so. For this we need a better grasp of the nature of evidence.1 In sciences other than logic or mathematics - strongly deductive fields where saying something is true means we can deduce it from the axioms we accept - the issue of truth is trickier. We do not have demonstrations, nor do we have axioms. What we have is evidence: experiments, studies and observations that point in a certain direction. When many point in the same direction, we can say, with some degree of certainty, that something is true. But there are a thousand ways in which we can later change our minds: new data contradicting previous data, aspects that the theory cannot explain and that another theory does explain, and so on.

****We have discussed evidence and why we need it, but I would like to make a comment at this point, especially because from now on the discussion will focus on these issues. I often find that problems arise when I discuss them with people whose backgrounds and interests differ from my own, people who usually come from what we call the humanities (artists, historians, writers). They tend not to react well to this language, which they often describe as distant, cold or not very human, nor to this approach, which seems to them to fail to capture our complexity. If you are one of these people, I would like to show you, little by little and over the course of these pages, that we have much more in common than you might think right now. That science is not only not opposed to the human side of things, but is an inseparable part of what we are. It does not subtract: rather, it adds.

Just as it was (and is) very important for me to have had access to non-scientific perspectives (such as the artistic one) in order to observe and experience the world, I would like those more familiar with those perspectives to be able to experience this one. I don't believe in this division between the sciences and the humanities, which I find artificial. The word evidence may elicit negative or positive feelings. Let’s keep that in mind when we talk to others, or we could be creating distances instead of getting closer. As Jorge Luis Borges says in "The Library of Babel": "A number n of possible languages use the same vocabulary; in some, the symbol library admits the correct definition 'ubiquitous and enduring system of hexagonal galleries', but library is bread or pyramid or anything else, and the seven words that define it have another value. You, who read my words, are you sure you understand my language?". 

Having said this, I will continue discussing evidence and methodologies, requirements and uncertainties because they are precise words that help, I believe, to avoid ambiguities. I hope my words do not alienate you.

Evidence is a thick forest in which it is easy to get lost, so let's start with the simplest cases before adding layers of complexity. There are several ways to obtain evidence, to find out what a fact is like, but, broadly speaking, we can classify them into observations and experiments. In experiments, we control variables and modify them at will, compare alternative outcomes, and thus understand the influence of the variables on those outcomes. For example, if we want to know the relationship between the mass of a pendulum and its period of oscillation, we can take different masses, put them on a pendulum, and see how the period changes. In observations, on the other hand, we strictly analyze what happens by taking measurements and making comparisons, without changing the variables. A scientific observation does not stem only from what our eyes see, nor even from what we can "see" with technological aids, such as microscopes or telescopes. Scientific observation is what we can "see" in a broader sense: facts that emerge from our interaction with reality, data that we obtain using instruments, measuring variables, etc. It comes from analyzing the real world with everything we have at our disposal, while minimizing the alterations we introduce when measuring it and accepting that we cannot have absolute control.
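Incidentally, the pendulum example has a clean closed-form answer, worth a quick sketch here (this is the standard small-angle result for an idealized simple pendulum, not something derived in the text):

$$ T = 2\pi \sqrt{\frac{L}{g}} $$

Here T is the period, L the length of the pendulum and g the local gravitational acceleration; the mass appears nowhere. A series of controlled trials with different masses would reveal precisely this: varying the mass leaves the period unchanged, while varying the length does change it.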

There are entire fields of knowledge in which, for practical or ethical reasons, we cannot conduct experiments, or experiments are not always the best way to obtain evidence. In those fields, our knowledge is supported almost exclusively by evidence obtained through observation. All historical fields belong to this category. For example, to understand issues that play out on very large temporal or spatial scales, such as the origin of the universe or the evolution of living things, not many experiments can be conducted. We cannot change the amount of carbon dioxide in the atmosphere of the early Earth, or the distance between planets, to see what would happen. Similarly, to find out the impact of environmental pollution on an ecosystem, we cannot pollute it on purpose; instead, we observe what happens in a polluted ecosystem and, where possible, compare it with an unpolluted one.

Let's start with observations, through the story of how William Herschel found the planet Uranus.

Humans have always been fascinated by the sky. Not only for aesthetic reasons, but also for survival: if you cannot foresee the length of the seasons, you can hardly know when to plant, or where to hunt. Besides, who can gaze at the night sky and not be amazed? Who does not, upon contemplating the movement of the stars, wonder how it happens? We began by telling myths about the heavens, and then we noticed their regularity. Within that regularity, we found some small stars that did not follow the rest and went their own ways: aster planetes, the Greeks called them, wandering stars. We call them planets.

Until 1781, we knew of the existence of five planets, which are clearly visible to the naked eye: Mercury, Venus, Mars, Jupiter and Saturn. That year, William Herschel discovered Uranus, the seventh planet from the Sun. This, incidentally, doubled the diameter of the solar system. 

The planet had always been there. Was it discovered by chance? Is finding planets simply a matter of looking through a telescope? Not exactly. Uranus had actually been observed many times before its "official" discovery - recorded as just another star as early as 1690 - but it was only identified as a planet by William Herschel. At a time when scientists often had titles of nobility or wealth, Herschel made his living as a musician and by selling telescopes of unprecedented quality that he made himself and whose mirrors he polished by hand with infinite care. A German living in England, he gave concerts by day and was an amateur astronomer by night. A sort of Batman/Bruce Wayne double life, although it is not clear which was the superhero.

One of those nights, Herschel observed something no one had ever noticed before: a bright spot that did not behave like a "fixed star", that is, it did not maintain its position with respect to the others. It moved differently, which could only mean one of two things: it was either a comet or a new planet in the solar system. Part of examining the world with the eyes of a scientist lies in this active observation, which goes beyond light entering through the eyes and taking shape with the help of the brain. Science had broadened Herschel's senses, just as ours are broadened when an artist helps us walk through a painting for the first time, and we see with fresh eyes that which had always been there but was invisible to us. New information, perspectives and ideas allow us to see beyond the obvious.

At that time, new comets were discovered frequently, but it was almost inconceivable to find a new planet, since the five known ones had been observed since prehistoric times. Herschel was not the first to notice the existence of Uranus, but he was the first to understand that it was a new planet. To discover is not to see for the first time, but to understand for the first time. He was not looking for a planet, but by observing very carefully he was able to notice its movement, and so he was able to include it within the framework of what was known so far, thereby expanding that framework.

Herschel was able to discover something, to gain knowledge that he then left to all of us, by means of extremely careful and attentive observations. No experiments can be conducted to discover a planet. Once Herschel identified this new object, other astronomers around the world searched for it with their telescopes and made their own observations and measurements, which confirmed what he had described. This replicability, another essential aspect of scientific observation, finally convinced the scientific community that the "star" was indeed a new planet.

Once Herschel had seen Uranus for the first time, we could all do so, which reveals a little-publicized feature of science: the ability to extract small pearls of reality from the universe that, once found, are potentially accessible to all. 

This discovery by Herschel became new knowledge because his observations were repeated and validated by other people, and thus reached a very high degree of certainty. This is how something unknown became known. Over the years and with technological advances, we learned much more about Uranus, just as we know more about the other planets and, on a larger scale, about the galaxies and the entire universe. And here we have a bastion to defend ourselves from the onslaught of post-truth: Uranus exists and we know a lot about what it is like, though not everything, of course. But its existence is mostly uncontested.

****When I say this, I am tempted not to say "mostly". After all, who could deny that Uranus exists? But I keep it for a reason that may seem a bit obsessive: if I do not say "mostly", a single counterexample can turn my whole statement into a falsehood. I do not know (nobody knows) whether there are people in the world who deny the existence of planets. Therefore, I prefer to say "mostly". Absolute statements, which include categorical words like never or always, everyone or no one, are potential traps. Almost always.

The existence of Uranus does not stir crowds; it does not fill newspaper columns or disrupt the fragile peace of Sunday brunch. It does not seem to be a subject besieged by post-truth. But on other topics, just as well validated as the existence of Uranus, post-truth raises its head. So that it does not all seem the same, it is important to stress that observations generate knowledge. They are not always simple, nor do they always concern the natural world: measurements of inflation, poverty, the growth of countries, etc., are refined observations of the social world. Some observations are particularly complex, and we cannot weigh them without considering aspects such as how each term is defined, what is measured and how reliable the measurements are, methodologically speaking, or how much consensus there is around all this. But once we agree on the methodology and all these other aspects, once we agree on how to establish something, once we have the values and their interpretation, we cannot go back and deny them. Later on, we will add layers of complexity, but let's leave it like this for now: an attempt to build solid foundations, a sponge cake that we will complete little by little.

There is an eternal discussion around observations: are they the product of luck or do you somehow seek them out? Perhaps out of envy, some astronomers said that Herschel had discovered Uranus by chance. This annoyed him very much, since, according to him, chance was not involved in this case. Herschel was very methodical in his observations, and considered that discovering Uranus was an "inevitable consequence" of his way of working, which consisted of scanning the entire sky with his telescope and carefully noting everything he saw. It was these precise notes that allowed him to identify that this celestial body was moving, since the movement was minimal. In this regard, he wrote: "The new star might not have been discovered even with the best telescopes, had I not undertaken to examine each and every star in the heavens, including those that are very remote, to the number of at least 8 or 10,000. I discovered it at the end of my second review, after a few observations... The discovery cannot be said to be due to chance, but it would have been almost impossible for such a star to have escaped my attention... From the first moment I directed my telescope to the new star, I saw with 227 magnification that it differed quite a lot from other celestial bodies, and when I put in more magnification, 460 and 932, I was already almost convinced that it was not a fixed star." 2 From the book The Age of Wonder, by Richard Holmes.

Another aspect of observations as evidence comes into play here: new information must be interpreted by human minds and contextualized within what is known about the field. Data do not speak for themselves, as is clear from the fact that Uranus had already been observed several times but never before recognized as a planet. That interpretation is a mixture of imagination, prior knowledge and experience; moreover, it passes through the sieve of the theory we hold at the time of observation, which overlays the data and without which the data have neither meaning nor relevance.

There are real phenomena, and we can generate evidence as to what they are like. But on top of that evidence, every time, there are ideas invented by human minds. A scientific idea - in which, as we have been saying, scientific has to do with how it was arrived at and not with the field of study - has a component of abstraction, of imagination, which seeks to explain empirical evidence.

This becomes very evident in some situations in which, from the same result, two contradictory interpretations are derived. In the history of science, there are several clear examples. One of these strong "differences of opinion" occurred at the end of the 19th century between the Italian Camillo Golgi and the Spaniard Santiago Ramón y Cajal. Both brilliant scientists, they had absolutely opposite opinions regarding the structure of the nervous system. 

At that time, very little was known about the subject, nor were there many techniques available to investigate it, whether structurally or functionally. There were two competing hypotheses about how the nervous system was formed. One of them, supported by Golgi among others, considered it to be a continuous network. This idea was known as the reticular doctrine. The other postulated that the nervous system was composed of independent cells that contacted each other but did not join. This was called the neuron doctrine.

Ramón y Cajal noticed that it was difficult to understand what was seen in histological sections because, in mature nerve tissue, what was observed was so complex that it was not clear whether it was something continuous or formed by independent structures. And then he had a brilliant and beautiful idea: what if one observed developing nerve tissue instead of adult tissue, a forest still sparse enough to let the trees be seen?

Ramón y Cajal then went on to investigate developing nervous tissue and, there, a clearer picture began to emerge. He was able to observe clearly that there were independent units, cells that, as they matured, generated branches and prolongations that contacted, but did not join, other similar cells. As Ramón y Cajal himself said, "to observe without thinking is as dangerous as to think without observing". It was now known that the basic structure of the nervous system was composed of specialized cells, which were called neurons.

Both Santiago Ramón y Cajal and Camillo Golgi were awarded the 1906 Nobel Prize in Physiology or Medicine, "in recognition of their work on the structure of the nervous system". At that time, Golgi and other scientists still defended the idea of a continuous network. Why? Wasn't it obvious that the idea was wrong? Not at all. The same images were available to everybody, the observations were the same, but not everybody interpreted them in the same way. And this is important, because it comes up often in post-truth discussions: sometimes scientists hold wrong ideas. This may happen because there is a dispute and the evidence is not entirely clear (so we can only know who was wrong in retrospect). Or because, when a new idea begins to appear superior to others, the scientists who constructed the previous ones make a generally legitimate effort to defend them, to demand credentials from the new idea. Or because, as we have already said, scientists are human beings and, like all human beings, they love what they do even when they are mistaken. But when an idea clearly shows its superiority, there is no longer room for two simultaneous interpretations, and it is not intellectually honest to claim it is "fair" to consider them equivalent when only one of them is supported by evidence.

Sometimes it takes time for a new idea to be considered valid in the scientific community. Perhaps, as Louis Pasteur said, "in the field of observation, luck favors only prepared minds".3 Dans les champs de l'observation, le hasard ne favorise que les esprits préparés. Or, to paraphrase Pablo Picasso when he said that "inspiration exists, but it must find us working",4 L'inspiration existe, mais il faut qu'elle nous trouve au travail. truth exists, but it must find us observing.

THOUGHT EXPERIMENTS

After a long day, we lie down on the couch and decide to turn on the TV. We're not expecting anything special: maybe the latest game, the news, a movie that looks mildly interesting. We pick up the remote control, point it, press the power button... and nothing happens. We pick up the remote control again, point it, press the power button... nothing. What do we think the problem is? The remote control ran out of batteries, or maybe it finally broke after being dropped so many times. How do we know which of these two options is the right one? We find new batteries, replace them, and try again. Now the control works, so we know that the problem was that the batteries were dead. We solved the problem. How did we do it? Without realizing it, we conducted an experiment: a very specific mental strategy that allows us to answer questions by obtaining evidence. In this case, the question is: "What's wrong with the remote control, why doesn't it work as usual?". We imagine possible answers, such as, for example, that the batteries had run out. In science, we call these possible answers hypotheses, and they are another great example of how the human imagination is essential. After generating our hypothesis, we test it by changing a single variable (the batteries) and analyzing what happens. That is our experimental design. In our design, it is important to change only one variable. What would have happened if we had used a new remote control and also new batteries? Maybe it would have worked, but we could not have answered our question of whether the remote control was broken or the batteries were dead. But, by using the old remote control, putting new batteries in it and seeing that it then worked, we got a result from which we could draw our conclusion, which is nothing more than the answer to our original question.

In common parlance, an experiment is often taken to be something that is done "with your hands", that has content, a theme, that looks scientific (and usually involves tubes with colored liquids or machines that ping). Any quick search on the Internet will give us lists of "easy and fun experiments" that, in reality, are just a series of steps to follow, in the style of a cooking recipe, to obtain a result that is already known. Cutting up a lemon and using its juice to make invisible ink will entertain the young - and not so young - in the house, but it is not an experiment, because there is no question or answer, only a procedure followed blindly to reach a predictable result. This is not only not science; it is practically anti-science, because it only proves, once again and without any intention of challenging existing results, something that is already known. When surprise does not generate further questions, we can call it entertainment, but not science.

Why so much emphasis on the distinction between what is an experiment and what is not? Because we want to emphasize the methodological aspect of a well-designed experiment. The crucial aspect of an experiment is that it answers a question, which is why what goes on in our heads when we conduct it is much more important than whether or not we use our hands. So much so that “using our hands” is optional, as we will see below. 

Legend has it that, in 1589, Galileo Galilei threw two balls of different mass but equal shape from the Leaning Tower of Pisa in order to determine once and for all whether the time they took to reach the ground was independent of their mass. Actually, this seems to be an urban legend (not so urban, considering the time), because there is no evidence that Galileo actually performed this experiment. What is believed is that he imagined the experiment and reasoned about that idea, and in doing so he gave us one of the first, simplest and most beautiful examples of the power of reason rebelling against the principle of authority.

Since the time of Aristotle, it had been held that a heavy object falls faster than a light object. Let us take this as our hypothesis, and now imagine two objects of different masses joined together by a rope. We throw them from the top of a tower. If our hypothesis were true, we would see this: the heavier object would begin to fall faster than the lighter one and, as it speeds ahead, it would tighten the rope that binds it to the lighter one trailing behind; the lighter object would pull on the rope and slow the fall of the heavier one. Therefore, the objects tied by the rope should fall at an intermediate speed: neither as fast as the heavy object alone, nor as slowly as the light one alone.

But we could also think of it this way: since the two objects are attached, they form a single object that is heavier than either of them separately, because it is the sum of the two (and the rope). It should then fall even faster than the heavy object thrown alone. We thus arrive at a contradiction: does the tied pair fall faster or more slowly? The only way out of the contradiction is to conclude that the light object alone, the heavy object alone, and both objects joined by a rope must all fall at the same speed. Thus, if two objects of different masses are thrown at the same time from a tower, they will fall with the same acceleration and hit the ground at the same time. Galileo reasoned this out and concluded that the falling time of an object is independent of its mass, and he achieved this without having dropped a single object. As a bonus, establishing that a heavier object falls at the same speed as a lighter one challenged Aristotle's idea that things fall because they "want to go to their natural place" (the natural place of terrestrial substances being the center of the Earth). This led to the question of why things fall at all. Half a century later, Newton would answer it. Great questions help us ask even better questions.
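The contradiction can be written out compactly. As a sketch (the function v(m), the fall speed of a body of mass m, is my notation, not Galileo's), assume the Aristotelian hypothesis that v(m) increases with m, and take two masses m_1 < m_2 tied together:

$$ \begin{aligned} \text{rope argument:}\quad & v(m_1) < v_{\text{pair}} < v(m_2) \\ \text{single-body argument:}\quad & v_{\text{pair}} = v(m_1 + m_2) > v(m_2) \end{aligned} $$

The two lines cannot both hold, so the assumption must fail: v cannot depend on m at all, and every body falls alike.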

This imaginary experiment shows that we can discover aspects of reality using our minds. We call these experiments that are not actually performed, but only thought about, thought experiments.5 A thought experiment is also called a Gedankenexperiment, a beautiful German word. As exercises of "pure thought", thought experiments must have logical consistency, be constrained by what is already known about the subject in question, and predict possible outcomes. Galileo's thought experiment is beautiful not only because it allows one to conclude that bodies fall at the same speed regardless of their mass, but also because it illustrates one of the most difficult aspects of science, and one that takes the most training: its counter-intuitive nature.

Going back to Galileo, how do we explain our everyday reality, in which we see heavier objects fall faster than lighter ones? If we take a light object like a feather and a heavy one like a hammer or a ball, and we drop them at the same time from the same height, the heavy object hits the ground before the feather. Does this prove Galileo wrong? No, because other factors affect the fall, mainly the viscous resistance of the air. A feather and a hammer are also very different in shape, not just in weight. Galileo's reasoning works if there is no air interfering with the fall. How can we diminish the influence of the air to see if Galileo was right? We could think about eliminating the air itself, but this is neither cheap nor easy. Alternatively, we could make the effect of the air similar for both objects by giving them the same shape and size, so that they differ only in mass. Galileo did exactly this, rolling metal balls of different mass but the same size and shape down inclined planes.

In any case, it used to be technologically impossible to remove the air from the experiment, but this changed relatively recently: we can now build vacuum chambers and, even more interestingly, we can also leave planet Earth. In 1971, the crew of the Apollo 15 mission conducted Galileo's thought experiment on the Moon - where there is clearly no air - almost for fun, and as a way of sharing science with the millions of eager viewers watching on Earth. We have a video of this moment: Commander David Scott drops a 1.32 kg hammer and a 0.03 kg falcon feather from a height of about 1.6 m at the same time, and we see both land together on the surface of the Moon!
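As a quick check of the physics (a back-of-the-envelope sketch; the 1.6 m height comes from the text, while the lunar surface gravity of roughly 1.62 m/s² is a standard figure, not from the author), free-fall kinematics predicts the landing time with no reference to mass:

$$ t = \sqrt{\frac{2h}{g}} = \sqrt{\frac{2 \times 1.6\ \text{m}}{1.62\ \text{m/s}^2}} \approx 1.4\ \text{s} $$

The same 1.4 seconds for the 1.32 kg hammer and the 0.03 kg feather, which is just what the video shows.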

Indeed, as Galileo had concluded, two objects fall at the same speed in a vacuum regardless of their mass.

But why are we discussing this in a book about post-truth? We raise these examples because of the simplicity with which they illustrate mechanisms of evidence generation. Once we understand how evidence works in these cases, we will have a solid basis to move towards increasingly complex places, and more complex discussions. 

POWER TO THE IMAGINATION

Galileo, Herschel, Ramón y Cajal... all of them carefully tested their hypotheses against evidence. Now, where do these hypotheses come from? Imagination, of course. Sometimes it may seem that obtaining scientific evidence through observations or experiments is nothing more than methodically repeating a series of steps, something that could even be automated and systematically performed by robots. Nothing could be further from the truth; human imagination and creativity are fundamental at many stages of this journey towards a better understanding of the world around us. They are crucial in generating the first hypotheses, in thinking about how we are going to solve a certain problem, even in identifying something as a problem to be solved. They are also crucial in interpreting and analyzing the data we obtain, and in the ability to build a general, abstract idea from partial empirical evidence. Data do not speak for themselves: we need our brains to make sense of them.

Imagination is not everything, just as data are not everything. Their power lies in their combination, without losing sight of what is data and what is interpretation, what is a real fact of nature (or, at least, the best approximation we have to it) and what is our view of that fact. As Henri Poincaré said: "Science is built with facts, just as a house is built with stones or bricks. But a collection of facts is no more science than a pile of bricks is a house".

On the other hand, the misnamed scientific method is not so methodical either: unlike its general portrayal in textbooks or popular science, the process by which new knowledge is generated is not a recipe, it is neither orderly nor linear. Sometimes, it starts from a question and strategies are devised to answer it. Other times, it does not: an unexpected result may generate a new question and redirect an entire investigation, or the conclusions show that an experiment had been poorly designed, and it is necessary to go back to the beginning. 

Physicist and science communicator Richard Feynman said that nature's imagination is far greater than ours. Much of what scientists do is try to imagine what nature "does", to unravel the mysteries of the world, to understand how things are and how they work. To achieve this, we have to be creative; we have to let our imagination run wild, as in any other area, but with one difference: in the case of science, we then have to test what we think to see if it is correct or not. We can imagine with absolute freedom as long as we do not forget that we must contrast our idea against the real world in order to get closer to the truth and to understand how our new knowledge fits into the general framework of what we already know. This is why Karl Popper spoke of science as "conjectures and refutations": observations and experiments for which we imagine an explanation that we must rigorously test, again and again.

Science is a way to get to know better what the world is like by minimizing errors that may come from our intuition, traditions or biases. Each field of study then refines this to suit its particular scope, but, broadly speaking, the "toolbox of science" is the same for all. 

Unfortunately, the major role that imagination, curiosity, and even a certain aesthetic sense play in science remains one of its best kept secrets.

****I say "unfortunately" because I believe that if this were shown more explicitly to the very young, many of them would stop seeing science as the boring routine that it is in most of our teaching. Perhaps we would not have more scientific researchers, but we would have more scientifically literate citizens, who are able to understand what we are talking about when we talk about evidence, and also, willing to creatively challenge what is known, or thought to be known, looking at problems with new eyes and questioning everything. Today, in professional science, it is possible to make a great career without ever having had an interesting idea. Being creative is not essential for this, but I think there is still much to be discovered, and it is possible that much of what remains to be discovered is within the reach of all of us, but we have trouble seeing it. As was the case with Uranus.

These mental processes related to creativity and imagination are rarely documented and, even when accounts reach us, they are ex post reconstructions; we never know how much they reflect what actually happened. What we do have is their product: knowledge, new evidence.

Is there an opposition between the power of reason and the power of imagination? This apparent dichotomy often arises. In fact, many treat the two as mutually exclusive and lump them together with the stereotype we mentioned about "people of science" and "people of the humanities".

In the late 18th and early 19th centuries, it was not uncommon for poets to know science and for scientists to know art. The French chemist Antoine Lavoisier studied respiration and discovered oxygen. He wrote this scientifically accurate and wonderfully beautiful sentence: "Respiration is nothing more than a slow combustion of carbon and hydrogen, similar in all respects to that of a burning lamp or candle, and, from this point of view, breathing animals are basically combustible substances that burn and consume themselves." Samuel Taylor Coleridge was not only a Romantic poet; he also conducted experiments with light and prisms and worked with chemists. Science and art intermingled. Poets like Lord Byron or Percy Bysshe Shelley were aware of the latest scientific advances. To write Frankenstein, for some the first science fiction novel, Mary Shelley took inspiration from experiments Luigi Galvani had conducted a few years earlier, in which he observed that electricity could make the legs of a dead frog move.

Understanding science requires imagination. Imagination can contribute to science. Why do we classify ourselves using artificial, mutually exclusive labels and separate the science and humanities communities? One problem with this approach is that it is false, because we all have both components within us and the capacity to develop them. Another problem is that, within the framework of university education, we generate professionals with one-sided views who do not know or understand those on the other "side". 

On the scientific side, another problem with splitting into science and humanities has to do with an incomplete conception of science, one that sees it only as a product and not as a process. On the artistic side, there is the idea that science somehow "spoils" the world, that it takes away its magic by explaining it. But isn't that what makes it even more beautiful? Does its being real make it less interesting? What greater magic than the ability to extract secrets from nature?

Some English romantics said that Newton, by showing with a prism that light was composed of all colors, had ruined the beauty of the rainbow. The poet John Keats said that he had "destroyed all the poetry of the rainbow, by reducing it to a prism". Or did Newton actually enhance the beauty of the rainbow even more by being able to explain it?

Regarding the beauty of science, physicist Richard Feynman tells this anecdote:6 It is an excerpt from an interview he gave to the BBC in 1981 and is known as Ode to a Flower.

"I have an artist friend who, on occasion, takes a stance that I don't quite agree with. He holds up a flower and says, 'Look how beautiful it is,' and we agree on that. But he goes on to say, 'See, as an artist, I can see how beautiful it is, but you, as a scientist, take it all apart and make it boring.' I think he's talking nonsense. To begin with, the beauty he sees is also accessible to me and to other people, I think. Although I may not have the aesthetic refinement that he has, I can appreciate the beauty of a flower. At the same time, I see much more in the flower than what he sees. I can imagine the cells in it, the complicated actions that take place inside it, and that also has its beauty. What I mean is that there is not only beauty in the dimension that the eye sees, but you can go beyond that, to the inner structure, and also the processes. The fact that colors in flowers have evolved and attract insects means that insects can see color. (...) All kinds of interesting questions arise from scientific knowledge and only add mystery and interest to the impression left by a simple flower. It only adds. I don't understand how it could subtract.” 

There is a discussion about college education that has not yet found a clear answer: shouldn't those who study science learn more humanities, and vice versa? The problems to be solved in the world are often so complex that a purely scientific outlook, although it allows us to know the truth little by little, will not be enough if it excludes the big questions, if it does not allow us to ask ourselves what kind of societies we want, how we can handle the challenges of globalization, or how we can value each individual as unique while considering them a citizen with rights and obligations. The humanities bring the perspective of art, values, ethics, knowledge of how societies function, philosophy. Science adds a robust methodology for providing answers about the world.

****That is why I am not convinced by how some high school students, driven by education, by the family, by the environment or by themselves, label themselves: "I am good at natural sciences" or "social sciences are my thing". These are false classifications that force us to exclude a whole area of culture from our focus and training.

 There are not really two sides, there are not "two cultures," as C. P. Snow - who, ironically, was a novelist as well as a chemist - called them. In a very famous essay he wrote in 1959 entitled The Two Cultures and the Scientific Revolution, Snow argues that there is a gulf between intellectuals and scientists that causes them to misunderstand each other and even to treat each other with disdain or outright enmity. 

Although, as disciplines, the arts and sciences would probably benefit from a greater connection between them, the truth is that many people do see this rift. Since understanding and conversation between the two "sides" are indispensable, it may be interesting to explore the possibility of training as a "connector", a "bridge builder" between the "two cultures". In 1995, John Brockman began to talk about this idea and presented it as the "third culture". Perhaps, in the context of the fight against post-truth, we should explore this further.

HORSES, ZEBRAS, UNICORNS 

When we use our imagination in order to know, it is not the case that "anything goes", and this distinction can get lost along the way. However indispensable our imagination is before, during and after the gathering of evidence, we are always constrained by a reality that is one way and not another. Therefore, we need to encourage creativity and imagination, but we must also be disciplined in our thinking and willing to have our ideas about factual issues tested, and to change them if they do not conform to what is actually happening.

In this regard, it is useful to keep in mind the principle of parsimony, also known as "Occam's razor",7 Alternatively, Occam is often also spelled Ockham or Ockam. which is often expressed as "if you hear hoofbeats, think horses, not zebras".

A razor, in this context, is a rule of thumb that, faced with many possible explanations for a phenomenon, allows us to "shave off" those that invoke more complex, metaphysical or unprovable notions, in order to prefer the one that requires fewer additional assumptions, or that is more probable. All other things being equal, we prefer the simplest explanation, the one that best accommodates the available evidence. Thus, we place a limit on our imagination; at least, a practical one.
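One common way to make this rule of thumb quantitative is Bayesian (this gloss is mine, not the author's): by Bayes' theorem, the credibility of a hypothesis H given evidence E is proportional to how well H explains E times how probable H was to begin with, and a hypothesis that is really a bundle of independent extra assumptions pays for each one with a factor below one:

$$ P(H \mid E) \propto P(E \mid H)\, P(H), \qquad P(H) = P(A_1)\, P(A_2) \cdots P(A_n) \ \ \text{if } H = A_1 \wedge A_2 \wedge \cdots \wedge A_n $$

So when two explanations fit the evidence equally well (equal P(E | H)), the one resting on fewer assumptions keeps the larger prior and ends up more probable: hoofbeats as horses needs nothing new, as zebras needs an unusual visitor, and as unicorns needs a whole new biology.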

Following the principle of parsimony does not guarantee that we will get correct answers, nor does it prove anything, but we can consider it a tool that allows us to advance until we know more. If, when the time comes, we see that we have made a mistake, we can retrace our steps and adjust our path. Thus, the road to knowledge is much "dirtier" than it often appears in Hollywood movies. 

So, if one night we leave a glass of water on a table and in the morning we find it washed, we can imagine that extraterrestrials came and washed it, or that someone who lives with us did it. Occam's razor leads us to suppose that the correct explanation is the latter. If we want to invoke extraterrestrials (or goblins, ghosts, or whatever), we should have some convincing proof that these beings exist. The same applies if we resort to time travel or alternate and simultaneous universes.

This approach to the problem does not mean that we are proving the non-existence of these beings or phenomena, but that the most practical thing to do is to think of more probable explanations. If new evidence appears, then what is more probable changes, and we can reassess the situation. 

The world is very confusing, and it is difficult to know what to trust. In the media, news stories appear one day and are refuted the next. The difficulty lies not so much in finding evidence to support a particular piece of information as in navigating a stormy sea that contains both quality evidence and pseudo-evidence, anecdotal testimony and categorical claims asserted by false experts. Occam's razor is a great tool to have on hand because it allows us to trust something in proportion to the available evidence. In the total absence of evidence, we can even invoke Hitchens' razor, which says that the burden of proof is on the person making the claim, and which is summarized as "what can be asserted without evidence can be dismissed without evidence".

It is an imperfect tool, of course. But we need to think about the alternatives. Taken to an extreme, not using Occam's razor (or Hitchens') would lead us to consider any explanation as possible, regardless of what evidence supports it. The only limit would be our imagination, and we would end up thinking that any hypothesis deserves the same a priori attention. It would be politically correct to say that they can all be valid but, in practice, this could lead us to give equal weight to an ad hoc hypothesis (i.e., one that invokes anything to explain a phenomenon, without requiring it to conform to what is already known) and to a hypothesis framed by information, processes and mechanisms that are already known. This is even more delicate in the case of ad hoc hypotheses that, because of the way they are formulated, cannot be disproved, such as those that would require us to prove that something does not exist instead of proving that something does exist. One can prove that something does not exist in mathematics or logic but, in science, it is basically impossible. That we have never observed something does not make it impossible, just highly improbable. But things can change with new observations.

Sometimes we have evidence and sometimes we don't, and the path forward is to gradually obtain more evidence to answer our questions. But a difficulty arises when we do not have evidence: is it because we do not have it yet, but could get it? Or because we will never be able to have it, even in principle? Konstantin Tsiolkovsky said that "the absence of evidence is not evidence of absence". The problem of the absence of evidence is very complex. On the one hand, it is clear that we cannot assert that something is a certain way unless we have evidence to support it. This is one of the points that distinguish atheists - convinced of the non-existence of God - from agnostics - convinced that they cannot know whether God exists or not, but who choose to assume non-existence since there is no evidence that he does exist.

****If I say that I have an invisible pet unicorn that eats invisible food and doesn't make noise or leave footprints, nobody would believe me. Why? Because, with what we know so far about our world, we have no evidence that unicorns exist, much less that they are invisible. But, on the other hand, if someone wanted to prove the non-existence of my unicorn, they would have a hard time, since I can always justify in some way why they can't detect it: because it just hid, because they don't look for it properly, and so on. Those would be, on my part, ad hoc hypotheses.

It is easy to confuse "absence of evidence" with "evidence of absence". Strictly speaking, we cannot really prove the non-existence of something. In practice, if after multiple attempts we do not obtain evidence of its existence, we assume it does not exist. We are following the principle of parsimony. If we hear hoofbeats, we think first of horses before zebras, but without concluding that it is impossible for this particular case to be zebras. However, zebras are one thing and unicorns are quite another, because we have evidence that zebras exist and none that unicorns do. In other words, a horse is very likely, a zebra rather less likely, and a unicorn so unlikely that we can consider it impossible.

We could ask ourselves: what's wrong with thinking that, perhaps, there are unicorns? Isn't it beautiful that our imagination allows us to think about that possibility? It depends. If we approach it as an artistic or even intellectual exercise, this hypothesis can be interesting and lead to fascinating reflections. But if we let it enter the realm of what we actually consider as a candidate for truth, our chance of getting things right diminishes considerably. 

Often, people make unfounded assertions and expect others to prove them wrong. That is reversing the burden of proof, and it brings us back to Hitchens' razor: the onus is on whoever makes a claim to show that it is right, not on others to show that it is wrong.

****If I insist on the existence of my invisible unicorn, much of your resources (in terms of time, money, or even attention) will be directed at trying to disprove what I say. But because of my specific approach, you will never succeed. In our daily life, some people hold ideas that do not conform to what is known and that are also impossible to disprove, and they still manage to monopolize resources that, then, cannot be used for other purposes. A great tool of post-truth, and one to which we must pay attention.

Whether intentionally or unintentionally, some may demand that our limited resources be used to try to prove them wrong, while modifying ad hoc hypotheses at their convenience. Fighting post-truth is also about putting a stop to this dynamic. Occam's and Hitchens' razors can help because they consider the evidence that exists so far and, on that basis, favor the simplest explanation, while still considering alternative explanations as possible, and even potentially correct. But the burden of proving that an alternative is more than just an idea is on the proponent. 

JUST A THEORY?

So far, we have been discussing evidence (observations and experiments) and the role of imagination and rigor. We will need to take these into account in order to discuss, later, what to do in the face of the onslaught of post-truth. But, before that, let me add something about what we can consider truth in science. There are many more examples of post-truth outside science than within it, but some do involve scientific subjects on which people hold ideas that have already been thoroughly refuted. In these cases, too, there is post-truth: the information is available but, for some reason, these people are unwilling or unable to accept it (unintentional post-truth), or some interest group actively pushes incorrect versions (intentional, or malicious, post-truth).

Two examples of post-truth in matters that science has already settled are the rejection of evolution by natural selection by those who believe in an intelligent creator or designer who made living beings - particularly humans - as they are today, and the belief that the Earth is flat rather than roughly spherical. Beyond the fact that they propagate disinformation, these two examples of post-truth about scientific knowledge seem innocuous compared to cases in which public health is directly at stake, such as claiming that HIV does not cause AIDS, or that vaccines are dangerous.

Some very powerful bodies of knowledge not only explain the evidence we already have, but can also incorporate new evidence and ideas as they emerge. We call these scientific theories. They are the closest thing to a fact that we have in the scientific fields; the term is reserved for what is most "powerful" in science.

A scientific theory is as close to the truth as science can get. The word theory does not mean the same thing in science as it does in everyday language, which is problematic. Interestingly, this distinction is often not clear even among scientists and science educators, so the confusion continues to grow. Outside of science, we use the word theory almost as a synonym for idea ("my theory is that they do it with mirrors"). There, it is nothing more than an opinion, a suspicion, a supposition that is not necessarily supported by specific evidence. In science, a theory is something entirely different. It is an idea supported by observations and experimental results, with evidence coming from several different fields of knowledge and in which there is concordance - or convergence - of evidence: a coherent idea is formed that explains what is observed. This is very solid knowledge that, in addition, allows us to adequately predict what will happen in a given situation. If, when the case arises, what the theory predicts does not occur, the theory is weakened. If this happens frequently, or with very important and high-quality evidence, the theory may be refuted. But if what is predicted is observed, the theory is strengthened.

****Scientific theories are based on evidence that generally comes from many different fields of knowledge, but they go far beyond that evidence. I find this aesthetically beautiful: in a theory there are abstractions, leaps of thought that go beyond the evidence but inevitably contain it. Theories show the ability of the human mind to extract secrets from nature.

For example, in the case of the theory of evolution, confusion over the meaning of the word leads to fundamentally antagonistic interpretations. In the everyday sense of the word theory, we could understand that evolution is "just a theory",8 The expression just a theory applied to the theory of evolution is often used by groups that oppose it and seek to discredit it. one explanation among several possible ones. But this is not the case. Evolution is a scientific theory because it is supported by the fossil record, by our understanding of how the inheritance of traits works, by experimental evidence, and so on. All this forms a coherent and extremely resilient body of knowledge, which has withstood, and continues to withstand, many obstacles and criticisms. The same is true of other theories, such as cell theory, atomic theory, the Big Bang theory or the theory of relativity, among others.

A scientific theory is not just any idea, an unfounded suspicion, but a powerful idea that is built on a very firm foundation, a foundation of accumulated evidence. 

A theory is something so powerful that, in practice, it can be considered a fact. Even so, this does not mean it is immutable. It can undergo changes and even be set aside in favor of something else that "works" better. Like everything in science, theories can be disproved if new evidence appears that contradicts them. But, in practice, current theories are so strong that it is very difficult for good evidence to appear that completely destroys them. They are refined and expanded to explain more, and sometimes there are details on which scientists disagree, but their fundamental propositions are preserved. The theory of evolution has changed a lot since Darwin postulated it, but it remains fundamentally the same.

FACTUAL STATEMENTS 

Everything we discuss in this chapter relates to assessing, albeit broadly and with caveats, factual claims, i.e., claims that refer to real-world facts. Claims of other kinds, such as artistic, cultural or moral ones, cannot be framed in terms of truth versus post-truth, so we shall leave them outside the scope of our analysis. Not because they are not important, but because they live in a different space.

Now, how do we know whether or not a factual statement is supported by evidence? This is not straightforward. Since we cannot be experts on all subjects (and, therefore, on most subjects we are not), we will inevitably have to decide whether or not to trust based on other criteria. 

Let us introduce, then, the first of several Pocket Survival Guides you will find throughout this book. These Survival Guides pose a series of questions we can ask ourselves that are not intended to be exhaustive or "foolproof," but can function as an emergency kit in our fight against post-truth. We begin by loading the first tools into our box, and we will add more as we go along. 

In this chapter, we present evidence (observations and experiments) and discuss the role of human imagination in the process of obtaining factual answers. This approach can be applied to any topic related to what the world around us is like, although the examples we discussed were mostly related to topics that are typically considered scientific and do not generally "trigger" post-truth. We will see in the next chapter that this evidence-seeking approach can also be useful in medicine, where post-truth arises more frequently, and where it can pose a danger to people. 

POCKET SURVIVAL GUIDE #1 

HOW CAN WE DECIDE WHETHER OR NOT TO TRUST A FACTUAL STATEMENT?

Is it indeed a factual statement, i.e., does it refer to the real world? 
If this statement speaks of the existence of something, is it supported by evidence (observational and/or experimental)? 
Does what we identify as evidence answer a question? 
Faced with several possible explanations, is the principle of parsimony (Occam's razor) being used? 
In the absence of evidence, can it be obtained, bearing in mind that the burden of proof is on the assertion that something exists? 
If this statement speaks of the non-existence of something, could it be an ad hoc explanation that can neither be confirmed nor refuted?