The sky is cloudy. Every minute the number of clouds doubles, and in 100 minutes the sky will be overcast. How many minutes will it take for clouds to fill half the sky? 50 minutes. That is the answer we usually hear to this version of the riddle. But if the clouds double every minute, then when they cover the whole sky, one minute earlier they covered only half of it. The correct answer is therefore 99 minutes.
Fast and slow thinking
The puzzle was not complicated. But to solve it, you had to think slowly. In his excellent book "Thinking, Fast and Slow", Daniel Kahneman, a father of behavioral economics, describes the two ways in which we process information:
One fast, emotional and intuitive, which gets stuck on problems that require evaluation and logic. Known professionally as System 1; colloquially, Homer.
One slow and rational, requiring a greater expenditure of energy. Known as System 2, or Mr. Spock to friends.
We are Homer by default, and that is enough for everyday life. With Spock in charge, it would take hours to carry out simple operations such as buying food or choosing the color of a tie.
System 1 is efficient and consumes less energy. In return, it takes a series of shortcuts that lead to mental traps. For example, how many times do you think you can fold a sheet of paper in half? It sounds like a simple task, but I bet you couldn't do it more than 12 times. Try it.
In fact, it was thought impossible to fold a sheet more than 8 times until, in January 2002, Britney Gallivan explained how to reach 12 in her booklet How to Fold Paper in Half Twelve Times.
Incredible, isn't it? If the math didn't assure you it was true, you wouldn't believe it. It's not something you can picture: it takes an act of faith in science.
We are deceived
Imagine we found a way to fold it more than 12 times, say 20. Which would be thicker: the pipe of a pipeline or our sheet?
A sheet of paper is about 0.1 mm thick. Fold it in half and we have 0.2 mm; fold again and we have 0.4 mm. By the seventh fold, the thickness is similar to that of a notebook. At 23 folds, we approach 1 km. At 42, our folded sheet reaches the moon; at 52, the sun. At 86 folds it would be the size of the Milky Way, and at 103, of the universe. Do the math.
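The doubling is easy to check in a few lines of Python. A quick sketch, assuming a 0.1 mm sheet (the function name is mine, purely illustrative):

```python
# Thickness of a 0.1 mm sheet folded in half n times: 0.1 mm * 2**n
SHEET_MM = 0.1

def thickness_m(folds: int) -> float:
    """Thickness in metres after the given number of folds."""
    return SHEET_MM * 2 ** folds / 1000  # mm -> m

print(thickness_m(7))   # ~1.3 cm, roughly a notebook
print(thickness_m(23))  # ~839 m, approaching a kilometre
print(thickness_m(42))  # ~4.4e8 m, past the Moon (384,400 km away)
```

Exponential growth is exactly the kind of pattern System 1 underestimates: each line of output is just "double the last one", yet the totals defy intuition.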
We cannot picture it. The math says it's true, but the mind resists. Only with experience, knowledge and the right tools will we know when to engage System 2 and reach sound conclusions.
Confused by chance
You will agree with me that a good trade is one you would repeat over and over whenever the same conditions are met, regardless of how any particular trade turns out. This implies that an operation, however well planned, can end badly. In other words, investing in the financial markets forces us to deal with large doses of randomness.
And the bad news is that System 1's deceptions multiply in activities whose outcomes are influenced by probability. In fact, Homer thinks he can influence it.
A few years ago, a BBC reporter showed that at many Manhattan crossings there is no connection between pressing the pedestrian button and how long the light takes to change. The New York Times corroborated it, noting that the same happens in other cities, such as London. Because pedestrians feel they control the situation, they are less likely to cross against a red light.
This trap is known as the "illusion of control": we believe we can influence things over which we have no control. For example, when we blow on our fist or shake the dice vigorously before rolling them. Or when we attribute winning trades to our superior analysis and losing ones to bad luck (which is itself another mental trap, known as "attribution bias").
In this vein, a 2003 study by Fenton-O'Creevy et al. showed that traders more prone to the illusion of control had lower performance, poorer analysis and poorer risk management.
Correlation, causation and chance
This need for control leads us to look for cause-and-effect relationships to explain random phenomena. Unfortunately, Homer is no scientist when it comes to identifying patterns, and few places are as fertile as the financial markets for finding ridiculous ones. Thus, hundreds of published books are little more than collections of false correlations.
It is important to understand that correlation does not imply causation, and that the fact that a system has worked is not enough to extrapolate it into the future. The system, besides being useful, must make sense.
Chalmers, inspired by Bertrand Russell, explained it well with his story of the inductivist turkey. On its first morning, a turkey received food at 9 o'clock. Being a scientific turkey, it did not assume this would always happen, and waited until it had collected enough observations. It recorded cold days and hot ones, rainy and sunny, until it finally felt entitled to infer that it would be fed every day at 9 o'clock. Then Christmas Eve came, and the turkey became the meal.
In 1956, Neyman (subsequently corroborated by Höfer, Przyrembel and Verleger in 2004) showed that there is a significant correlation between the stork population of a region and its birth rate. The cause?
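A spurious correlation like this is easy to manufacture whenever a hidden variable (here, plausibly the size of each region, which brings both more storks and more people) drives both series. A minimal sketch with made-up, purely illustrative numbers:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical regions ordered by size: larger regions host more
# storks AND more people (hence more births). No causation needed.
storks = [4, 9, 15, 19, 26]
births = [120, 230, 310, 420, 480]
print(pearson(storks, births))  # close to 1 despite no causal link
```

The coefficient comes out near 1, yet removing the confounder (region size) would dissolve the relationship; this is exactly the kind of pattern Homer mistakes for cause and effect.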
It depends on the question
Most of our decisions are influenced by how the information is presented or how the question is framed. For example, we will be more willing to sell a stock trading at 50 euros if we bought it at 40 euros. If, on the other hand, it closed at 60 euros the day before, we will be more reluctant to do so.
Imagine you had to choose between these two options:
Lose €800 for sure.
Lose nothing with 50% probability, or lose €1,600 with 50% probability.
Although the expected value is the same (0.5 × -1,600 + 0.5 × 0 = -€800), most people choose the second option.
Let's put it another way:
Win €800 for sure.
Win nothing with 50% probability, or win €1,600 with 50% probability.
Most people will then choose the first option. Presenting the same exercise as a gain rather than a loss sends the mental process down a different path.
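The symmetry is easy to verify: within each frame the two options carry identical expected values, so only the wording differs. A small sketch (function and variable names are mine):

```python
def expected_value(lottery):
    """lottery: list of (payoff, probability) pairs."""
    return sum(payoff * prob for payoff, prob in lottery)

# Loss frame
sure_loss  = expected_value([(-800, 1.0)])
risky_loss = expected_value([(0, 0.5), (-1600, 0.5)])
# Gain frame
sure_gain  = expected_value([(800, 1.0)])
risky_gain = expected_value([(0, 0.5), (1600, 0.5)])

print(sure_loss, risky_loss)  # -800.0 -800.0
print(sure_gain, risky_gain)  # 800.0 800.0
```

A rational Spock would be indifferent within each frame; Homer flips from risk-seeking (losses) to risk-averse (gains) on wording alone.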
Perhaps because we don't know how to decide under uncertainty, we also don't know how to evaluate the decisions made by others. In a 1988 study, J. Baron and J. Hershey asked subjects to choose between:
Get €200 for sure.
Get €300 with 80% probability or €0 with 20% probability.
A priori, the most logical choice is the risky one, since its expected value is €240 (300 × 80% + 0 × 20%), higher than the sure €200. But the goal was not to assess the wisdom of the chooser; it was to see how others judged that choice. Once the outcome was known, different people were asked to rate the decision on a scale from -30 (worst) to +30 (best).
The rating was +7.5 when the subject took the risk and won, and -6.5 when he lost. In other words, the subject was judged not on whether he made the most logical decision, but on its result: an effect Baron and Hershey called "outcome bias".
When we make decisions, we are also influenced by what other people think, by what others expect of us, and even by what others order us to do.
Asch demonstrated how hard it is to go against the tide. In his classic study, he showed cards with three lines of different lengths to groups of students. All were confederates except one, the subject of the study, who was asked to pick the longest line. The confederates gave answers that were sometimes right, sometimes wrong. When the group answered correctly, subjects almost never failed. But when the group gave the wrong answer, subjects failed almost 40% of the time, even though the lines differed by several centimeters.
The notorious Stanford prison experiment shows the influence of what others expect of us. A group of young men was selected and randomly divided into prisoners and guards. Prisoners had to wear smocks and were referred to by number, not by name. The guards' only rule was that they could not use physical violence.
By the second day, the experiment had gone completely off the rails: prisoners received, and accepted, humiliating treatment from the guards.
Even more terrible is the study by Stanley Milgram of Yale University. The experiment involved three people: a researcher, a teacher (the subject) and a student (a confederate actor). The researcher instructed the teacher to ask the student questions and to punish each failure with an electric shock. The first shock was 15 volts, increasing with each failure through several levels up to 450 volts.
As the shocks grew stronger, the student simulated gestures and cries of pain. From 300 volts on, he no longer responded and simulated convulsions.
Typically, from 75 volts on, the teachers grew nervous and asked to stop the experiment. When that happened, the researcher refused up to four times, replying:
The experiment requires that you continue.
It is absolutely essential that you continue.
You have no choice. You must continue.
If the teacher refused a fifth time, the experiment stopped. Otherwise, it continued.
All subjects asked at some point to stop the study, yet none actually quit before 300 volts, and 65% of the participants, however uncomfortable, went all the way to 450 volts.
If you think you would never go along with such an experiment, remember that both studies have been repeated at different times, with various modifications, and yielded similar results.
What can we do?
You know it now: your mind deceives you and conspires against you. You can't help that, but you can avoid falling into its traps if you understand how you are deceived. There are hundreds of resources (books, articles, etc.); use them. And don't forget: to be brave, you first have to be afraid.