## [The dark side of expertise](https://lwn.net/Articles/809556/)
By Jake Edge January 15, 2020
---
Everyone has expertise in some things, which is normally seen as a good thing to have. But Dr. Sean Brady gave some examples of ways that our expertise can lead us astray, and actually cause us to make worse decisions, in a keynote at the 2020 [linux.conf.au](https://linux.conf.au/). Brady is a forensic engineer who specializes in analyzing engineering failures to try to discover the root causes behind them. The talk gave real-world examples of expertise gone wrong, as well as looking at some of the psychological research that demonstrates the problem. It was an interesting view into the ways that our brains work—and fail to work—in situations where our expertise may be sending our thoughts down the wrong path.
Brady began his talk by going back to 1971 and a project to build a civic center arena in Hartford, Connecticut in the US. The building was meant to hold 10,000 seats; it had a large roof that was a "spiderweb of steel members". That roof would be sitting on four columns; it was to be built on the ground and then lifted into place.
[![\[Dr. Sean Brady\]](https://static.lwn.net/images/2020/lca-brady-sm.jpg "Dr. Sean Brady" =150x300)](https://lwn.net/Articles/809599/)
As it was being built, the contractors doing the construction reported that the roof was sagging while it was still on the ground. The design engineers checked their calculations and proclaimed that they were all correct, so building (and raising) should proceed. Once on the columns, the roof was bending and sagging twice as much as the engineers had specified, but after checking again, the designers said that the calculations were all correct.
Other problems arose during the construction and, each time the contractors would point out some problem where the design and reality did not mesh, the designers would dutifully check their calculations again and proclaim that all was well. After it went up, Hartford residents twice contacted city government about problems they could see with the roof; but the engineers checked the calculations once again and pronounced it all to be fine.
In 1978, the first major snowstorm since the construction resulted in an amount of snow that was only half of the rated capacity of the roof—but the roof caved in. Thankfully, that happened in the middle of the night; only six hours earlier there had been 5,000 people in it for a basketball game.
So, Brady asked, what went wrong here? There were "reasonably smart design engineers" behind the plans, but there were also multiple reports of problems and none of those engineers picked up on what had gone wrong. In fact, there seemed to be a reluctance to even admit that there was a problem of any kind. It is something that is seen in all fields when analyzing the causes of a failure; it turns out that "people are involved". "We have amazingly creative ways to stuff things up."
#### Expertise
Before returning to finish the story about the arena, Brady switched gears a bit; there are lots of different human factors that one could look at for failures like that, he said, but he would be focusing on the idea of expertise. Humans possess expertise in various areas; expertise is important for us to be able to do our jobs effectively, for example. We also tend to think that more expertise is better and that it reduces the chances of mistakes. By and large, that is correct, but what if sometimes it isn't? "The greatest obstacle to knowledge is not ignorance ... it is the illusion of knowledge", he said, quoting (or paraphrasing) a [famous quote](https://quoteinvestigator.com/2016/07/20/knowledge/).
![\[Müller-Lyer optical illusion\]](https://upload.wikimedia.org/wikipedia/commons/f/fe/M%C3%BCller-Lyer_illusion.svg "Müller-Lyer optical illusion" =x200)
Before digging further in, he wanted to show "how awkward your brain is". He did so with a variant of the [Müller-Lyer optical illusion](https://en.wikipedia.org/wiki/M%C3%BCller-Lyer_illusion) that shows two lines with arrows at the end, one set pointing out and the other pointing in (a version from Wikipedia can be seen on the right). The straight line segments are the same length, which he demonstrated by placing vertical lines on the image, even though that's not what people see. He asked the audience to keep looking at the slide as he took the lines away and restored them; each time the vertical lines were gone, the line with inward pointing arrows would return to looking shorter than the other. "It's like you learned absolutely nothing", he said to laughter. Your brain knows they are the same length, but it cannot make your eye see that.
#### Mann Gulch
A similar effect can be seen in lots of other areas of human endeavor, he said. He turned to the example of the [Mann Gulch forest fire](https://en.wikipedia.org/wiki/Mann_Gulch_fire) in 1949 in the US state of Montana. A small fire on the south-facing side of a gulch (or valley) near the Missouri River was called in and a team of smokejumpers was dispatched to fight it before it could really get going.
Unfortunately, the weather conditions (abnormally high temperatures, dry air, and wind) turned the small fire into an enormous conflagration in fairly short order. Less than an hour after the smokejumpers had gathered up the equipment dropped from the plane (and found that the radio had not survived the jump due to a malfunctioning parachute), the firefighters were overrun by the fire and most of them perished.
The foreman of the team, Wagner Dodge, followed the generally accepted practices in leading the men to the north-facing slope, which was essentially just covered with waist-high grass, and then down toward the river to get to the flank of the fire. From what they knew, the fire was burning in the heavily timbered slope on the other side of the gulch. As it turned out, the fire had already jumped the gulch and was burning extremely quickly toward them, pushed by 20-40mph winds directly up the gulch into their faces. Once he recognized the problem, Dodge realized that the team needed to head up the steep slope to the top of the ridge and get over to the other side of it, which was complicated by the presence of a cliff at the top that would need to be surmounted or crossed in some fashion.
When they turned back and started up the ridge, the fire was 150 yards away and moving at 3mph; in the next 12-14 minutes it completely overtook the team. The men were carrying heavy packs and equipment so they were only moving at around 1mph on the steep slope. Dodge gave the order for everyone to drop their equipment and packs to speed their way up the slope, but many of the men seemed simply unable to do that, which slowed them down too much.
It took many years to understand what happened, but the fire underwent a transformation, called a "blow up", that made it speed up and intensify. It was burning so hard that a vacuum was being created by the convection, which just served to pull in even more air and intensify it further. It was essentially a "tornado of fire" chasing the men up the slope and, by then, it was moving at around 7mph.
Once Dodge realized that many of them were not going to make it to (and over) the ridge, he had to come up with something. For perhaps the first time in a firefighting situation, he lit a fire in front of them that quickly burned a large patch of ground up and away from the team. His idea was that the main fire would not have any fuel in that area. He ordered the men to join him in that no-fuel zone to hunker down and cover themselves as the fire roared past them, but none could apparently bring themselves to do so. Only the two youngest smokejumpers, who had reached the ridge and miraculously found a way through a crevice in the cliff in zero visibility, survived along with Dodge. Thirteen men died from the fire.
There are two things that Brady wanted to focus on. Why did the men not drop their tools and packs? And why didn't they join Dodge in the burned-out zone? If we can answer those questions, we can understand a lot about how we make decisions under pressure, he said.
#### Priming
In order to do that, he wanted to talk about a psychology term: [priming](https://en.wikipedia.org/wiki/Priming_(psychology)). The idea is that certain information that your brain takes in "primes" you for a certain course of action. It is generally subconscious and difficult to overcome.
There was a famous experiment done with students at New York University that demonstrates priming. The students were called into a room where they were given a series of lists of five words that they needed to reorder to create four-word sentences, thus discarding one word. The non-control group had a specific set of words that were seeded within their lists; those words were things that normally would be associated with elderly people.
They then told the students to go up the hallway to a different room where there would be another test. What the students didn't know was that the test was actually in the hallway; the time it took each participant to walk down the hall was measured. It turned out that the students who had been exposed to the "elderly words" walked more slowly down the hall. Attendees might be inclined to call that "absolute crap", Brady suggested, but it is not, it is repeatable and even has a name, "the Florida effect", because Florida was used as one of the words associated with the elderly.
It seems insane, but those words had the effect of priming the participants to act a bit more elderly, he said. So to try to prove that priming is real, he played a different word game with the audience; it is called a "remote associative test". He put up three words on the screen (e.g. blue, knife, cottage) and the audience was to choose another word that went with all three (cheese, in that case). The audience did quite well on a few rounds of that test.
But then Brady changed things up. He said that he would put up three words, each of which was followed by another in parentheses (e.g. dark (light), shot (gun), sun (moon), answer: glasses); he told everyone to not even look at the parenthesized words. When he put the first test up, the silence was eye-opening. The words in parentheses, which no one could stop themselves from reading, of course, would send the brain down the wrong path; it would take a lot of effort to overcome the "negative priming" those words would cause. It is, in fact, almost impossible to do so.
The tests were designed by "evil psychologists" to send your brain down the wrong solution path, he said; once that happens, "you cannot stop it". "We are not nearly as rational as we think we are". If he repeated the test later without the extra negative-priming words, people would be able to come up with the right solution because their brain had time to forget about the earlier path (and the words that caused it). This is the same effect that causes people to find a solution to a problem they have in the shower or on a walk; the negative-priming influence of their work surroundings, which reinforce the non-solution path they have been on, is forgotten, so other solution paths open up.
"So at this point you might say, 'hang on Sean, those are some fancy word games, but I'm a trained professional'", he said to laughter. He suggested that some in the audience might be thinking that their expertise would save them from the effects of negative priming. Some researchers at the University of Pittsburgh wanted to test whether our expertise could prime us in the way that the parenthesized words did. They designed a study to see if they could find out.
They picked a control group, then another group made up of avid baseball fans, and did a remote associative test with both groups. Instead of putting words in parentheses, though, they allowed the baseball fans to prime themselves by using words from common baseball phrases as the first word in the test; that word was deliberately chosen to send them down an incorrect solution path.
For example, they would use "strike", "white", and "medal"; a baseball fan would think of "out", which works for the first two, but not the last, and they would get stuck at that point. Those who don't have baseball expertise will likely end up on the proper solution, which is "gold". As might be guessed, the baseball fans "absolutely crashed and burned" on the test. Interestingly, when they were asked at the end whether they had used their baseball knowledge, they said: "No, why would I? It had nothing to do with baseball." The expertise was being used subconsciously.
In another test, they would warn the baseball fans ahead of time that the test was meant to mess with their head and use their baseball knowledge against them, so that they should not use that knowledge at all for the test. They still did just as poorly relative to the control group, which showed that the use of expertise is not only subconscious, but also automatic.
#### Back to the fire
Brady then circled back to the forest fire; the men in Mann Gulch "can no sooner drop their firefighting expertise than the baseball fans could". They could not drop their physical tools and they could not drop their mental tools that told them they had to get to the ridge. They also could not accept new tools, he said; when Dodge showed them the ash-covered area that the new fire had created, they did not accept it as a new tool, instead they "defaulted to their existing expertise and worked with the tools they had".
There is a name for this, he said, it is called "The Law of the Instrument": "When all you have is a hammer, everything looks like a nail." We are all running around with our respective hammers looking for nails to hit. "We see the world through the prism of our expertise and we cannot stop ourselves from doing so."
After Mann Gulch, firefighters were told that if they got into a situation of that sort, they should drop their tools and run, but that still did not work. There were fatalities where firefighters were close to their safe zones but found with their packs still on and holding their chainsaws. The next step was to properly retrain them by having them run an obstacle course with and without their gear, then showing them how much faster they could run without it. It sounds silly, Brady said, but it worked because it gave them a new tool in their mental toolbox.
The one exception at Mann Gulch, though, is Dodge, who dropped both his physical and mental tools. He came up with a new tool on the spot; "escape fires" became part of the training for firefighters after Mann Gulch. How did that happen? Psychologists have a term for this as well: "creative desperation". When their backs are truly to the wall, some people will recognize that their expertise is not working and will not solve the problem at hand. At that point they drop their tools and see the facts for what they are, which allows them to find a solution outside of the path their expertise was leading them down.
Brady then returned all the way to the beginning and the Hartford civic center roof collapse. Even though there were repeated warnings that something was wrong with the design of the roof, the engineers defaulted to their expertise: "Our calculations say it's OK, so it must be OK."
This was the early 1970s, he said, why were these engineers so confident in their calculations? As guessed by many in the audience, the reason for that was "computers". In fact, when they won the bid, they told the city of Hartford that they could save half a million dollars in construction costs "if you buy us this new, whiz-bang thing called a computer". It turned out that the computer worked fine, but it was given the wrong inputs. There was an emotional investment that the engineers had made in the new technology, so it was inconceivable to them that it could be giving them the wrong answers.
He concluded by saying that no matter what field we are in, we will all encounter situations where our expertise is not a perfect fit for the problem at hand. It is important to try to recognize that situation, drop the tools that we are trying to default to, and see the facts for what they are, as Dodge did in Mann Gulch. He ended with a quote from Lao Tzu: "In pursuit of knowledge, every day something is acquired. In pursuit of wisdom, every day something is dropped."
It was an engaging, thought-provoking talk, which is generally the case with keynotes at linux.conf.au. Brady is a good speaker with a nicely crafted talk; there is certainly more that interested readers will find in the [YouTube video](https://www.youtube.com/watch?v=Yv4tI6939q0) of his presentation.