Images of a fallen global leader awaiting arraignment for alleged sexual molestation paint a stark picture of power gone astray.
Apart from the disturbing moral and criminal issues, the allegations that toppled Dominique Strauss-Kahn, the former head of the International Monetary Fund, underscore a stunning capacity for very stupid actions by very smart people who ought to know better.
It occurs with numbing frequency. A U.S. Congressman seeks trysts over the internet; a prominent scientist fabricates data; millionaires cheat on taxes; a top general undermines the Commander-in-Chief in Rolling Stone magazine; hapless investors turn over life savings to swindler Bernard Madoff.
Good judgment, if not plain common sense, flees at immense cost. Consequences demolish reputations and careers. Some lead to prison.
Catastrophic failures of judgment share common threads, from inflated egos and unchallenged arrogance to lack of empathy for loyal customers and hard-working chambermaids. Sadly, the same unrestrained drive that pushes people to succeed can also push them too far.
For a better understanding of motives behind spectacular bad decisions, I spoke with psychologist Madeleine Van Hecke, author of “Blind Spots: Why Smart People Do Dumb Things.” She and colleagues tap brain research to explore self-destructive behaviour in the upper echelons of authority and influence.
Van Hecke lists seven blind spots that trip people smart enough to know better:
1. He didn’t stop to think. Instead of pausing to think in a rational manner, we let emotional brain circuits hasten decisions.
The “thinking” circuits of our brain that resolve conflicting information require conscious effort. When the reaction is visceral, emotional brain circuits respond first.
To investigate the difference, psychologist Drew Westen of Emory University scanned the brains of staunch Republicans and Democrats as they reacted to conflicting political statements. For both groups, Westen found that only the emotional circuits of the brain reacted to the statements. He reported results in “The Political Brain: The Role of Emotion in Deciding the Fate of the Nation.” Ask yourself: is your rational side engaged?
2. He just didn’t see himself. Often we fail to see our own behaviour as others see it.
A classic example of such blindness emerged from Princeton Theological Seminary. Students were assigned to write a sermon on the Good Samaritan, the biblical tale of passers-by who ignored the plight of a man left beaten in the road. In the middle of preparing, the students were summoned to another building to deliver the sermon ahead of schedule. On the way, each passed a man slumped in a doorway, apparently ill, planted there by the researchers. Only one in 10 of the students about to preach on the virtues of helping others stopped and offered to help.
Under pressure of self-interest, they overlooked the gap between their behaviour and the message they intended to share. Ask yourself: do you practice what you preach?
3. She didn’t know what she didn’t know. It might be apocryphal, but supposedly British King George III reported in his diary on July 4, 1776, that “nothing happened today.”
And reportedly, back in the late 1960s, Keuffel & Esser, the preeminent maker of slide rules, polled scientists to learn what lay ahead. No one predicted the handheld calculators that would soon displace slide rules. In a similar vein, the movie industry of the mid-1970s worried about video cassette recorders and brandished copyright law to bar them from homes. But far from putting Hollywood out of business, video rentals went on to generate gigantic profits by extending the life of movies beyond a few weeks in theatres. Who knew?
From personal experience, I might also add a roster of newspaper and magazine publishers who did not foresee how new online media would turn their business upside down. Ask yourself: what’s afoot already that will change your outlook?
4. He is often wrong, but never in doubt. In his book, “On Being Certain,” neurologist Robert Burton describes how our brains are wired to build a sense of certainty. That’s why we say “the answer is on the tip of my tongue.” We know we know the answer but can’t express it.
Unfortunately, Burton finds, the same wiring can invent a false sense of certainty when we’re wrong. Nobel laureate John Nash, subject of the film “A Beautiful Mind,” offered a rather surprising reason for turning down a prestigious full-professorship at the Massachusetts Institute of Technology: he was “scheduled to become emperor of Antarctica.” Asked how he could believe such an unhinged idea, Nash answered that the same feeling of certainty applied to his economic breakthroughs and an impending post at the South Pole.
To him, the brilliant and the flaky ideas both “felt right.” Ask yourself: could you be wrong?
5. She failed to see the other perspective. A radio host once polled a studio audience before the show started. He invited them to demonstrate how to voice answers for listeners when he posed a question during the radio program.
“What if I asked how many of you have never been in the audience of a radio program before?” Almost everyone in the audience raised a hand. They apparently forgot that a show of hands conveyed nothing to listeners.
It shows how we get caught up in our own points of view at the expense of a balanced perspective. Ask yourself: should you step into someone else’s shoes?
6. He jumped to conclusions. From riding bicycles to creating spreadsheets, our brains learn patterns that work well most of the time and automate them. This promotes efficiency.
But automatic responses also encourage leaps to conclusions. We arrive at decisions before scrutinizing evidence. We make assumptions instead of reasoned decisions.
Science writer Jonah Lehrer gives a great example of an assumption that led people astray in his book “How We Decide.” Lehrer describes research in which rats learned to turn right in a maze, a strategy that led them to the food 60 per cent of the time. Yale students, placed in the same situation, proved less successful than the rats at finding the food.
Why? The Yale students assumed the placement of the food followed some meaningful but complicated pattern. In trying to discover that pattern, they never adopted the simpler, more successful strategy the rats had found. Ask yourself: have you studied enough evidence?
7. She missed the bigger picture. Narrowly focused on the immediate moment, or on the role we play, we often miss the bigger picture.
History supplies a nearly endless list of bubbles and business catastrophes, not least the recent financial crisis, caused by failure to see the big picture. Leading up to 2008, it was indeed true that the US had not suffered a nationwide collapse in home prices since the Great Depression; it did not follow that the mechanisms in place would prevent one. Even after prices peaked in 2007, homeowners spent with abandon, bankers wrote mortgages for anyone who could cloud a mirror, and Wall Street piled on leverage, confident that all the relevant facts were in plain sight. Whoops.
Back to Strauss-Kahn: if the allegations prove true, then blame three blind spots, says Van Hecke. He did not pause to think before a precipitous action, much less weigh its consequences; he succumbed to a delusion that women are subservient to men with influence; and he lacked empathy for the victim’s perspective.
The French may play by different rules, but in the U.S. it’s three strikes and you’re out.