Have you ever suffered a mental illness? Do you have a phobia? What about an anxiety disorder? Or have you ever made a mind-bogglingly stupid decision?
Any inkling of a ‘yes’ and I warmly welcome you to the glorious dysfunction that is humanity. You’d be surprised how many wonderful people have just ticked ‘all of the above’. The driver for many of these ‘failings’ is subconscious and automatic – a powerful internal force. It’s us, but we can’t control it very well.
As individuals we’re frail, flawed and more irrational than we’d care to admit. One strategy to mitigate our individual failings is to act collectively. As a collective, surely our failings balance out and we make better calls. We embrace democracy on this basis.
But look at the nutters, dullards and tyrants that we tend to elect or tolerate – Stalin, Mao, a couple of Bushes, Franco, Mussolini, al-Bashir, Al-Assad, Saddam Hussein, Mugabe, Idi Amin, Gaddafi … and don’t get me started on South America. Most baffling of all were the disciplined, rational, unemotional Germans who, under an MMP-type system, democratically elected Hitler and for a decade thought him a first-rate leader.
Sometimes the collective approach just magnifies the scale of our irrationality. And yet we forget the lessons of history and think it will be different this time around. We like to think we live in an enlightened age of science and reason, but there is increasing evidence that we’re kidding ourselves.
‘System 1’ acts first
Recent MIT studies suggest that a primitive part of the brain, the basal ganglia, has a more powerful role in decision-making than we’d previously thought. That is, we’ve been working on the basis that we aren’t Neanderthals anymore, but it turns out we are.
The New Scientist magazine (4 April 2015) suggests “intuition, biases and gut instinct” are still our primary drivers. We develop ‘belief systems’ that provide us with an immediate position on something, without actually having to think about it. Nobel Prize-winning psychologist Daniel Kahneman labels this automatic, intuitive process ‘System 1’, and it’s something we’re often not conscious of.
When Ports of Auckland announced they were seeking to reclaim some of the harbour to build a couple of gigantic new piers, the first we (including the mayor of Auckland) heard of it was on the national news. I bet, like me, you immediately had an opinion on whether it was a good idea or not. This opinion wasn’t based on reading the business case, the environmental impacts report or the financial forecasts – it was based on gut feel. Kahneman’s System 1 is ruling the roost.
From here we can find plenty of evidence to reinforce our impetuous opinion. Nassim Taleb, Distinguished Professor at NYU, has written extensively on this phenomenon. He calls it ‘confirmation bias’: the way we look for ‘comfortable’ evidence to corroborate our existing beliefs. When confronted with opinions that we agree with, our brains register barely a flicker of activity. But when we’re confronted with a contrary view, our brain activity goes crazy. I’m sure you’ve felt this phenomenon. Mostly we don’t like to confront the alternative view because, to us, it’s wrong.
Confirmation bias is even more disturbing when you hold the orthodox opinion or the prevailing societal view, because you’ll find a constant supply of supportive ‘evidence’ to reinforce your System 1 perspective. So it’s a world where we’re inclined to make hasty decisions and then look for evidence to justify them.
Our region has a number of vexing issues on the table right now. Local body amalgamation and the Ruataniwha Dam are at the forefront, but there’s also lots of chatter about exporting our water to China, GE Free Hawke’s Bay, fluoride, fracking and global warming. How do ‘normal’ citizens develop a sensible position on these things?
Enter ‘System 2’
The solution is ‘System 2’ thinking – that is, the rational consideration that occurs in our conscious minds. The problem, however, is that System 2 is slow and cerebrally taxing. It turns out the lame-sounding excuse – ‘I don’t have time to think about it’ – is right on the money. Apparently System 2 is a recent evolutionary development and our capacity for it is extremely limited.
There has been some excellent debate on the Ruataniwha Dam, with mostly rational, well-constructed arguments on both sides. Sadly, given how our brains work, I’d wager very few people have changed their minds. To do so people would have to achieve ‘unbelief’ and this isn’t so easy. First you need belief, probably based on System 1, then a great deal of System 2 rational turmoil that converts you to the opposite belief. This ‘road to Damascus’ experience is possible, but not so common. Most people are not as open to the contrary view as they’d like to think.
Even if you do have time for some System 2 thinking on, say, an important regional issue, the scientists and economists that should assist your thinking, often don’t agree. That’s hardly surprising as these professions are based on belief too. You won’t study science or become a research scientist unless you believe it’s very important. Science needs belief every bit as much as religion needs it. You can see many instances where the same data leads scientists to different conclusions, and their beliefs are quite possibly to blame. Belief is a filter through which hard data can become distorted.
Politicians also work on the assumption that 95% of citizens won’t bother with System 2 thinking. All you need politically is for System 1 to say ‘sounds like a great idea’ and you’re away. If you assume the public will mostly just employ the intuitive System 1, then detailed transparency and disclosure don’t warrant so much effort.
People of higher status, like politicians, have one additional flaw which amplifies their errors – overconfidence. They tend to overestimate their knowledge and underestimate the risks. This can lead them to thinking councils can profitably run tourist attractions, for instance. The evidence indicates they cannot.
Napier councillors’ foray into art deco buses was a prime example of their failure to override their System 1 brains. It’s now difficult to believe that councillors thought this venture could be successful. Similarly, Splash Planet has never made a profit and probably never will, at least while the Hastings District Council runs it. They do show signs of having worked this out and they’ve made at least one exceptional call. The Hastings Top 10 Holiday Park, while HDC ran it, consistently made a loss that burdened ratepayers. Now it’s leased to a private sector operator and generates income for ratepayers.
The good news is that System 2 appears to be able to program System 1. So if you’ve been, say, an environmentalist for 20 years, your System 1 brain will, in most cases, instinctively reach conclusions a typical environmentalist would agree with. So we needn’t allow our brains to terrorise us, believing System 1 is an autonomous, omnipotent force. We can control it to some degree.
The secret to programming your System 1 ‘beast’ lies in self-awareness. We need to be mindfully aware of our entrenched beliefs and catch our System 1 brain at the very moment it cries out its conclusion. So next time you feel this moment – lurching towards an opinion that’s based on little information – tug the reins. ‘Hang on a minute. I’m going to think about this. I’m not sure you’re right.’ This shouldn’t be all that difficult, because it’s the very process we worked on as children: suppressing our natural reactions in preference for something out of the prefrontal cortex.
We may never be able to control our primitive brains, but hopefully, with effort, we can become more rational and open-minded about critical issues. Basically, what the psychologists tell us we need to work hard at is: believing our beliefs less strongly.
Paul Paynter is our resident iconoclast and cider maker. Sometimes he grows stuff at Yummyfruit.