The world is complex, and it can tax our mental resources to process all the information that is coming our way.
To keep up with all the data our minds are processing, we come up with time-saving (and energy-saving) rules of thumb, called heuristics. We may apply these heuristics unconsciously, and they don’t even need to be rational. We simply need to believe them.
In Thinking, Fast and Slow, Nobel Prize winner and author Daniel Kahneman breaks down our decision-making process into two systems. To keep things simple he describes them as “System 1” and “System 2.”
System 1 is intuitive and emotional. It is fast and easy. “There was a shark attack last week, I am never going to the beach again.”
System 2 is deliberative and logical. It is also slow and requires effort. “What are the chances of getting attacked by a shark? Are they higher today than they were last week? Is swimming in the ocean more dangerous than swimming in a community pool?”
If I were to ask you, “What is 7 times 3?” you could access System 1 and respond immediately with “21.”
If I were to ask you, “What is 277 times 53?” you could still arrive at the correct answer, but you would likely need to tap into System 2 first.
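For the curious, one way System 2 might work through that calculation is to break it into smaller, System 1-friendly steps:

277 × 53 = (277 × 50) + (277 × 3) = 13,850 + 831 = 14,681

Even with the shortcut, holding those intermediate numbers in your head takes deliberate effort.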
System 2 requires a deeper level of thinking, and it requires a near-exclusive devotion to the problem-solving effort.
It would be nearly impossible for someone to multiply 277 by 53 in their head while writing out instructions on how to make banana bread - even if they had a great recipe for banana bread stored in their memory. That’s because System 2 thinking requires concentration to solve the problem.
The interesting thing about System 2 is that it can morph into System 1 when we spend hundreds or thousands of hours training our mind. Here are two examples:
My younger son got his license back in February. When he first started driving, it took all of his concentration to remember where to put his hands on the steering wheel, the rules of the road, and the process of looking ahead, behind, and to the sides of the road to identify threats. When we first start driving, we go into “System 2” to concentrate on the task at hand.
As time goes by, those basic driving and awareness skills become second nature, and driving can move from a System 2 process to a System 1 process. I often listen to podcasts while driving on the interstate. I can do this because System 1 takes over, and driving is second nature.
Another example of this would be Garry Kasparov, who retired from professional chess after being ranked as the world’s top chess player for 20 years. If Garry were to come and play 10 amateur chess players, he could play them all at the same time. We could line them up and Garry could move from player to player, knowing immediately what his next move would be.
The next move on a chess board is second nature for Garry because he has played thousands of games. He knows the implications of moving the rook or moving the queen. He doesn’t need to access System 2 to beat an inexperienced player.
The challenge with our System 1 and System 2 thinking arises when we “think” we are an expert, or when a life experience has left a deep impression on us.
This perception of “expertise” or the impact of life experience can shape our decision-making process in an equally profound way – to our benefit and to our detriment.
That’s because these experiences build the heuristics we use as shortcuts to make decisions. Some of these heuristics are helpful, but some are not. So how do we distinguish between the two?
Understanding how System 1 and System 2 work together is critical because we cannot always trust our System 1 intuitions. And a big part of our challenge is that System 1 is so much easier to operate from, because we have all the data to reinforce our personal biases.
How do we determine which of these shortcuts lead to better outcomes?
Over the next few weeks we will explore how to limit the downside of our System 1/System 2 thinking and how to use these two systems to make better investment decisions.