A few weeks ago I heard a TV producer on a talk show say that people make about 35,000 decisions a day. 35,000! As you can imagine, they are mostly not life-changing decisions, but decisions like: do I snooze or do I get up, do I hold my fart in or do I let it go, do I give 2 or 4 kisses when greeting someone in France, stuff like that. These decisions of course do not need extensive research and reasoning.
But what happens if the decisions are about merging two company cultures, having another baby, deciding whether or not to start or keep deforesting the jungles in Brazil, or whatever decision you have to make that is bigger than yourself? I hope we can all agree that when complex issues like these come up, tossing a coin to make a decision won’t do. We all hope (I hope) that people gather all the objective information they can and make a decision according to what they’ve found. And still… company mergers fail because of culture clashes. Poor people keep having babies without being able to pay for their upbringing. And the flora and fauna in Brazil (and elsewhere) are being destroyed.
So why do these kinds of things happen anyway?
In earlier pieces Michel and I wrote about shortcomings and limitations in human behavior and our control, or lack thereof, over it. I’m no better than the rest of us, let alone Michel! We are all prone to cognitive biases and heuristics that prevent us from seeing the world objectively, as it is, at any given moment. Humans have a tendency to want the stuff they want right now, not in 10, 20, or 30 years. And lastly, we often fail to take into account the complete system surrounding the problem at hand.
This means that we don’t even interpret the imperfect information we do have correctly. We misperceive risk, thinking that some things are far more dangerous, and others far less, than they really are. We also pay too much attention to recent experience and too little attention to the past, focusing on current events rather than long-term behavior.
Just today I ran into a guy whom I had convinced to invest in cryptocurrency. He told me he had sold his coins after the price dropped a few dollars. If he’d done a little analysis, gathered some more information, and held on to them, he would have doubled his initial deposit by today. Confirmation bias determines what kind of news we let in and what we keep out (indeed, we do not entirely decide this ourselves). All of this means that, overall, we do not make optimal decisions for ourselves, let alone for the greater good. This phenomenon is called Bounded Rationality.
Bounded rationality is the idea that when individuals make decisions, their rationality is limited by the tractability of the decision problem, the cognitive limitations of their minds, and the time available to make the decision. ~ Wikipedia
Before we start blaming people for the stupid decisions they make, I have a little thought experiment for you, one I found in the book ‘Thinking in Systems’ by Donella H. Meadows (highly recommended), where I also first learned about the concept of Bounded Rationality.
Suppose for a moment you are, for some reason, lifted out of your usual place in society and put in the place of someone whose behavior you have never understood. Having been a firm critic of government, for example, you suddenly become a member of parliament. Or, as a donor to Greenpeace or the WWF who hates Shell and the like, you suddenly become the person at Shell who makes the decisions regarding the environment.
In your new position, you’ll experience the information flows, the incentives and disincentives, the goals and discrepancies, the pressures – the bounded rationality – that go with that position. You will very likely base your decisions on the information you have in that position. If you became very poor, you’d probably see the short-term rationality, the hope, the opportunity that having many children would bring. As a farmer with a family to feed, a house to pay for, and incomplete information about the state of the Amazon, you’d probably take down those trees.
Don’t believe this? I’d suggest you look up the Stanford Prison Experiment by Zimbardo on YouTube. You’ll be shocked by how fast you might forget who you were and how little control people really have over themselves. As Dr. Meadows says, this is no excuse for narrow-minded behavior; it just provides an understanding of it.
To change this narrow-minded behavior, it is first and foremost necessary to take a step back and try to get an overview of the complete system. That way it might be possible to restructure the information flows, the goals, and the incentives so that bounded rational actions add up to desired results. Knowledge is potential power: the more complete your information, the better the decisions you’ll be able to make.
The bounded rationality of each actor in a system – determined by the information, (dis)incentives, goals, stresses, and constraints impinging on that actor – may or may not lead to decisions that further the welfare of the system as a whole. If they do not, putting new actors into the same system will very likely not improve the system’s performance. What makes a difference is redesigning the system to improve the information, (dis)incentives, goals, stresses, and constraints that affect specific actors.
Did you like it?! Then like it! And more importantly: complete, adjust, or diminish our ‘timeless wisdom’… Do you feel left out in the open? Teach us! (And don’t forget to like us on FB, thanks!)
A lot of this article was borrowed from the book “Thinking in Systems: A Primer” by Donella H. Meadows, which I highly recommend to anyone who runs into problems to solve once in a while. For a taste of Systems Thinking, read here.