Dismantling the disaster in the making


Read Part 1 of this two-part article here


"A beautiful theory killed by an incontrovertible fact."

—Thomas Huxley


Have you ever approached a problem multiple times using the same set of tools and procedures, only to fail consistently? I'd bet money that everyone who looks at their past will answer yes. People are stupid. The good news is that you came upon your stupidity honestly. Our brains were never designed to behave in the logical, structured manner we like to assume they do.

Why does everything in life have to be so haaaaaard?

“Social scientists in the 1970s broadly accepted two ideas about human nature: first, people are generally rational, and their thinking is normally sound; second, emotions such as fear, affection, and hatred explain most of the occasions on which people depart from rationality.” This quote is from the book Thinking, Fast and Slow, written by Nobel Laureate Daniel Kahneman. Kahneman is a psychologist who, along with Amos Tversky, laid down a foundation of research that has been revolutionizing our understanding of how people think. Though you could probably read this book in a few days or weeks, it took me three years due to Kahneman’s many references to other scientists and their research. I often stopped reading, followed up on the research he was citing, and then moved on to a tangent of his work before returning to the book. It was a bit like an advent calendar: every day I read a page and received a new brain treat.

One of the first tangents in reading the book was Kahneman's reference to one of his idols, Paul Meehl, and a book Meehl wrote in 1954 titled Clinical versus Statistical Prediction. This is not something you will find in Oprah's book club. In it, Meehl argues that simple algorithms or checklists can outperform recognized experts at certain types of decision making. The book was scandalous for its time and was quickly discredited by many “experts.” As time passed, however, the validity of Meehl's work was confirmed by study after study.

My favourite supporting study looked at medical diagnoses. A group of radiologists were asked to explain what they looked for on an x-ray to determine whether a patient had an ulcer or stomach cancer. The radiologists all gave a series of indicators about the shape, size and texture of the lesions that might identify each condition. They were then given a series of x-rays and asked to diagnose which condition they believed was present. Not only did they often fail to make the correct diagnosis, but the stack of x-rays contained multiple copies of the same x-ray, and the radiologists often gave different diagnoses for the same image. However, when the indicators for distinguishing an ulcer from stomach cancer were written up in a simple checklist, untrained volunteers armed with just that piece of paper outperformed the radiologists. Checklists have become standard in many professions today, including medicine. This methodology is also the birthplace of the Avaluator (AST 1), trip plans, and run lists (which are used by guides in mechanized skiing). These tools are not perfect, but they provide a structure that forces us to ask the right questions and guides us to an informed answer.
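To make the idea concrete, here is a minimal sketch of what a paper checklist looks like when expressed as one of Meehl's simple decision rules. The questions and threshold below are invented for illustration only; they are not the actual criteria from the radiology study, the Avaluator, or any validated tool.

```python
# A checklist reduced to a simple decision rule, in the spirit of
# Meehl's "statistical prediction": answer fixed yes/no questions,
# count the red flags, and compare against a threshold. No expert
# judgment, no case-by-case improvisation.
# NOTE: these questions and the threshold are hypothetical examples.

CHECKLIST = [
    "Is the avalanche danger rating considerable or higher?",
    "Is the slope steeper than 30 degrees?",
    "Are there signs of recent avalanche activity?",
    "Has there been significant new snow or wind loading?",
    "Is the group unable to avoid the slope entirely?",
]

def assess(answers: list[bool], threshold: int = 2) -> str:
    """Return a go/no-go recommendation from yes/no checklist answers."""
    score = sum(answers)  # each "yes" adds one point of concern
    if score >= threshold:
        return "NO-GO: too many red flags"
    return "Proceed with normal caution"

if __name__ == "__main__":
    # One observer's answers for a hypothetical slope:
    answers = [True, True, False, True, False]
    for question, answer in zip(CHECKLIST, answers):
        print(f"{question} -> {'yes' if answer else 'no'}")
    print(assess(answers))
```

The point is not these particular questions; it is that a fixed, explicit rule asks every question every time, which is exactly what a tired or overconfident brain fails to do.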

Another factor that impacts our decision making is energy. It turns out that our brains are very high maintenance. Every time we write a difficult test, learn new information, teach others, write another article for the ACC, or navigate a complex route or situation, our brain is working in high gear. And just like muscles, the brain has limits to how long it can function under increased demand. In an effort to reduce the load, we develop shortcuts that help us make decisions quickly, simply, and with little mental effort, saving brain power for when we really need it. This is essential for everyday life: how would we manage to get out the door each morning if we had to analyze every piece of clothing and every possible combination of shoes, socks, pants, shirts, and jackets before getting dressed?

Here’s an example in the form of a quick quiz: a bat and a ball together cost $1.10. If the bat costs a dollar more than the ball, how much is each item? If you answered $1.00 for the bat and $0.10 for the ball, you made a quick decision and took a shortcut. But with that answer, the bat costs only $0.90 more than the ball. The correct answer is $1.05 for the bat and $0.05 for the ball. In hindsight, this is a simple question, but it took me a surprisingly long time to figure it out. It takes mental energy to solve this problem, and it is a much easier problem than determining whether the snow bridge across a crevasse will hold your weight, and what the consequences are if it fails. So shortcuts can be very useful, but we often use them when more intensive analysis should be applied.
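For anyone who wants the arithmetic spelled out, let the ball cost x dollars, so the bat costs x + 1.00:

\[
x + (x + 1.00) = 1.10 \quad\Rightarrow\quad 2x = 0.10 \quad\Rightarrow\quad x = 0.05
\]

The ball costs $0.05, the bat costs $1.05, and the difference between them is exactly one dollar.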

As part of the AST 1 curriculum, instructors are required to teach a group of cognitive biases that may be linked to poor decision making. In theory, this sounds like the Holy Grail solution to the problem of human error. After three years of reading Kahneman's work and its associated references, I was certainly ready to believe it. My bubble burst, though, when I had the chance to talk with Pascal Haegeli, an associate professor at Simon Fraser University. Pascal leads many of the avalanche research projects in Canada, and when asked for his thoughts on decision making, he quickly pointed out that being aware of these biases and shortcuts does not make us any less susceptible to them.

Fortunately, not all is lost. Many researchers, including Daniel Kahneman, have distilled many of the challenges of poor decision making down to overconfidence and the illusion of knowledge. We often assume we are more capable than we really are, or we think we have information that we don't. Perhaps a more extreme variation of this phenomenon is the Dunning-Kruger Effect, named after the two researchers who described the behaviour (read their paper, “Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments”). CBC Radio also has a good web page that delves into the subject (“How ignorance makes us cocky,” The Sunday Magazine).

Personally, I found the Dunning-Kruger Effect a very enjoyable and sometimes humorous subject to investigate. It also hits very close to home and stands as a warning to us all.

Another flaw in our brains is selective attention, as demonstrated in the YouTube video “The Monkey Business Illusion.”

If you recall, this was your homework assignment from my last article in the Summer 2022 Gazette. Chances are good that you missed a few items in the video (I have not yet met anyone who caught every change). I have experienced this effect on multiple occasions. Once, while guiding through a crevasse field, I was so focused on finding a safe route that I failed to notice a size two avalanche releasing on a nearby face. Thankfully, a second guide following my tracks had a much broader perspective of the situation and alerted me to the event. No one person can be relied upon to identify everything occurring in a complex environment.

This is just the tip of a giant human-behaviour iceberg we are only beginning to understand. Expect everything I have described here to expand, change, and possibly be debunked in the future. We are at the beginning of a long road.

Words to live by

In the meantime, we will continue to travel in beautiful but dangerous places. It is our responsibility to manage those dangers to the best of our abilities. To that end, here are the best tools I have found to help overcome the limitations of our brains:

  • Use checklists. When used properly, tools like the Avaluator and trip plans can go a long way to addressing many of the shortcomings or biases in our brains.

  • Everyone needs a voice in decisions that affect their safety, and anyone can veto a decision. About a decade ago, I started discussing the plan with my clients: outlining the goals for the day, the concerns I had, and the reasons for my decisions. The purpose was to give them the opportunity to assess my decision-making, add to the available information, and express any concerns of their own. The clients also had the right to veto my decisions, and in the event of a conflict, we would default to the safer option. I believe this practice has served everyone well.

  • Whenever there is uncertainty or concern about a decision, all members of a group must be free to question and challenge the process. This matters because it is one of the few tools that can expose overconfidence and the illusion of knowledge. My favourite word in this regard is “why.” Why is it dangerous? Why is it safe? Based on the leader's answer, break the information down further. If the reasoning is solid, it will peel back like the layers of an onion, with information or knowledge supporting each layer of the decision. If the answers quickly break down into vague generalities, or elements of frustration or intimidation appear, then there may be problems.

  • Share information with everyone in the group: route information, weather, field observations, anything that may be of value. When a potential or unexpected threat arises, point it out; you may be the first to see it.

  • Slow down the decision-making process. Take the time to analyze the situation. One useful exercise is to have half the group assess why a route is safe and the other half assess why it is dangerous.

These are fairly simple practices, but they can make a significant difference in how we address and avoid hazards, threats, and rising risk on any given trip.