Why do we believe nonsense? Why does any nonsense that someone tells us or posts somewhere influence our thinking?

The neuroscientist and media psychologist Maren Urner, whom we met in the last column, says that anything claimed in a credible environment sticks in our brains, even if it is false and immediately retracted. If it is repeated, it sticks all the more deeply. So I shouldn’t write: “That Olaf Scholz has green hair is nonsense.” Because what image is in your head now?

The British knight Sir John Mandeville traveled in the fourteenth century to distant India and onward across Africa, and returned to London with the news that he had discovered trees in whose fruits lambs grew; he had eaten the lamb fruit himself, and it tasted fantastic. From then on, the story of the sheep that grew on trees was considered proven truth.

Daniel Kahneman, the psychologist, describes us as lazy creatures: we get through life well enough on System 1 of our brain, but System 1 is impulsive, spontaneous, and superficial; it falls for prejudice because prejudice is not taxing, it simplifies, and it can be deceived, for example by the peer pressure of a herd. There is also System 2, which can think logically and is capable of learning because it is able to analyze. But we prefer not to use System 2, because that would be tiring.

The American philosophers James Owen Weatherall and Cailin O’Connor write that what we believe today rests only to a small extent on our own experiences and far more on the stories and experiences of others. These secondhand experiences are all too often fake, because we live in an age of disinformation (“spin, marketing, and outright lies,” the two write), shaped by the force of lying machines like Facebook and the professionalism of those who deceive us. Even the smartest analysts get carried away when they are surrounded by persuasive analysts who also appear smart. In their experiments, O’Connor and Weatherall show that skillfully faked data and just one or two agents arguing cleverly were enough to talk a whole community of researchers out of a correct result and convince them of its opposite.

Trump was cheated out of his election victory? Climate change is not proven? Ukraine threatened Russia? If a reputable source says so…

All of us, scientists of course included, tend to wish for results that match what we already know, and therefore to take more seriously the results of colleagues to whom we feel personally close. “Confirmation bias” means: we gladly accept information that confirms our prior beliefs, and we ignore the taxing other information that contradicts what we want to be true.

The philosopher Kevin Zollman says that scientists working on an important problem should, at the crucial stage of their work, communicate as little as possible with other scientists in order to remain intellectually independent. There are no people without a political and cultural background, without emotions and prejudices, write Weatherall and O’Connor.

For social beings strive for applause and fear exclusion; they simply want to belong and be liked.