The Undoing Project: The science of how bias impacts our decisions


  • THE MU SIGMA TIMES
  • August 6th, 2019

Prologue: Chris Glenn from Mu Sigma presents “The Undoing Project”, written by bestselling American author and financial journalist Michael Lewis. In an engaging narrative, Chris provides us with takeaways based on his understanding of the concepts and thoughts expressed by Lewis in the book.

“No one ever made a decision because of a number. They need a story.”

On Friday, January 27, author Michael Lewis visited the Redmond campus of one of Mu Sigma’s clients, a global technology major, as part of a promotional tour for his new book, The Undoing Project.

Amos Tversky, left, and Daniel Kahneman in the back garden of their first house in Stanford, California, in the late 1970s. Photograph: Penguin Random House


The Undoing Project is the story of psychologists Daniel Kahneman and Amos Tversky, and how the relationship forged between the two pioneered the science of human judgment and decision-making (which, many would say, is the foundation of behavioral economics). Mr. Lewis, best known for books like The Big Short and The Blind Side (not to mention their commercially and critically acclaimed film adaptations), began The Undoing Project after reading an academic review of Moneyball by Dr. Richard Thaler, a review Lewis described as polite criticism.

Mr. Lewis explained that when he wrote Moneyball, he was fascinated by markets (particularly large ones) and how they can miss value. Moneyball examines one such “high-stakes” market, loaded with talent, “experts,” money, and plenty of historical data, yet still fundamentally flawed in how it assesses value. If these scouts, coaches, and owners can miss value, miss “the point,” at the highest levels… who else is missing the point? (Lewis joked that if the B students in certain industries are “missing the point,” what are the C students in places like Wall Street doing?)


Indeed, if there is a common thread in Michael Lewis’ books, it is how the masses, with their conventional wisdom, can miss value. That is generally the theme of his stories, while a colorful protagonist or two goes “against the grain” to more accurately assess the value of, say, speed in the electronic stock market, the danger of collateralized debt obligations, or the importance of an offensive left tackle in American football. Most of Mr. Lewis’ books have spoken to how we humans missed the point. This story speaks the most to why.

Dr. Thaler’s criticism, as Mr. Lewis would tell it, was that Lewis himself missed the point in Moneyball. The point was not that baseball’s conventional wisdom was flawed, but why human beings misjudge markets, or make systematically flawed decisions, in the first place. The biases that are, for lack of a better description, built into the human condition were, in Dr. Thaler’s view, the real story left untold in Moneyball.

Through a personal connection, Mr. Lewis set out to learn about this “Lennon and McCartney” of psychology (he had actually taught one of their sons during a brief teaching stint). He said that as he got to know Amos and Danny, he became convinced the subject matter was too complex for him to conquer. He was so sure he couldn’t write the story well that he sent a note to his friend, the popular author Malcolm Gladwell, telling him he had “the perfect story” for Gladwell’s next book. He never heard back from Gladwell, so he wrote the story himself.

The book does what Lewis does best: telling a complicated story (one that, in many other hands, would probably be very boring) through colorful characters, in a way that is essential to understanding how we, as flawed human beings, “missed it” in a very big way.


One example of these biases takes the form of a study in which Amos and Danny split doctors into two independent samples. Each sample was given the same scenario: a dying patient, with seven years left to live, can undergo a new surgery that could save his or her life. Doctors in “Pool A” were told that with the surgery, the patient has a 90% chance of living. Doctors in “Pool B” were told that with the surgery, the patient has a 10% chance of dying.


Guess which pool of doctors was significantly more inclined to recommend the surgery? (Keep in mind, these are not laymen; statistical proficiency is a required skill for doctors, and even a layman can see that this is the same information told two different ways.)
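To make the equivalence concrete, here is a minimal sketch in Python. It is purely illustrative: the loss-weighted scoring rule is a hypothetical reading of the two frames, not the model from Amos and Danny’s study. The two frames encode identical odds, yet a reader who weights bad outcomes about twice as heavily (roughly the loss-aversion factor discussed next) scores the mortality frame lower:

```python
# Illustrative sketch only -- not the study's actual model.
# The two frames carry identical information, yet a reader who
# weights bad outcomes more heavily scores them differently.

p_live = 0.90        # "Pool A" frame: 90% chance of living
p_die = 1 - p_live   # "Pool B" frame: 10% chance of dying (same fact)

# Hypothetical weight: bad outcomes loom ~2x larger (see loss aversion below).
LOSS_WEIGHT = 2.0

score_survival_frame = p_live                    # focus on the gain:  0.90
score_mortality_frame = 1 - LOSS_WEIGHT * p_die  # penalize the loss: ~0.80

print(f"survival frame score:  {score_survival_frame:.2f}")
print(f"mortality frame score: {score_mortality_frame:.2f}")
# Same surgery, same odds -- but the mortality frame "feels" worse.
```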

How a simple situation is framed can drastically change its impact. Another example speaks to what is now known as “Loss Aversion”. One of Amos and Danny’s studies found that (to grossly oversimplify) the psychological pain of knowing we just lost $100 is roughly double the joy of knowing we just won $100. This has implications ranging from the highest-level negotiations to what we choose to eat for dinner.
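Kahneman and Tversky later formalized this asymmetry in prospect theory’s value function. The sketch below uses the median parameter estimates they published in 1992 (alpha ≈ 0.88, lambda ≈ 2.25); the lambda factor is where the “roughly double” figure comes from:

```python
# Prospect-theory value function (Tversky & Kahneman, 1992),
# using their published median parameter estimates.

ALPHA = 0.88    # diminishing sensitivity to both gains and losses
LAMBDA = 2.25   # loss aversion: losses loom ~2.25x larger than gains

def subjective_value(x: float) -> float:
    """Felt value of gaining (x > 0) or losing (x < 0) an amount x."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** ALPHA

print(subjective_value(100))    # joy of winning $100:  ~57.5
print(subjective_value(-100))   # pain of losing $100: ~-129.5

# A fair coin flip for +/- $100 therefore *feels* like a losing bet,
# which is why many of us refuse it even at better-than-even odds:
print(0.5 * subjective_value(100) + 0.5 * subjective_value(-100))  # < 0
```

This is also why the 2-to-1 gamble mentioned below can feel unattractive even though its expected value is positive.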

We think that something we already own is somehow more valuable than a comparable object we don’t own (search “The Endowment Effect” for more). We tend to stick with what we have because we “know” it, or because “it’s mine,” even when a clearly superior option is out there (search “Status Quo Bias”). We trek through a lethal typhoon to see a movie because we already paid for the ticket online and that money is gone forever (“Sunk Cost Fallacy”). I’m not suggesting that you go out and gamble your paycheck, even at 2-to-1 odds; but knowing that these tendencies are built into you can help guide your actions (or at least your objectivity).

Another example Lewis discussed involves teachers at a top-tier business school who asked students to write down the last digits of their phone number, then asked them how many African countries are members of the United Nations. The phone digits introduced a statistical bias into the answers to this totally unrelated question: students with higher phone digits guessed a higher number of countries, and students with lower digits guessed fewer. This concept, known as “Anchoring”, is a bias that we all exhibit daily.
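A toy simulation makes the effect easy to see. Everything here is made up for illustration (the baseline guess, the drift fraction); the point is only that when estimates drift even partway toward an irrelevant anchor, the anchor and the final answers correlate:

```python
import random

random.seed(0)

# Toy anchoring model (illustrative parameters, not the classroom data):
# each "student" forms a noisy private guess, then drifts a fraction of
# the way toward an irrelevant anchor -- their phone's last two digits.

BASELINE_GUESS = 35   # hypothetical average unanchored guess
ANCHOR_PULL = 0.3     # hypothetical fraction of drift toward the anchor

anchors, answers = [], []
for _ in range(1000):
    anchor = random.randint(0, 99)              # last two phone digits
    private = random.gauss(BASELINE_GUESS, 10)  # unanchored estimate
    answers.append((1 - ANCHOR_PULL) * private + ANCHOR_PULL * anchor)
    anchors.append(anchor)

def correlation(xs, ys):
    """Pearson correlation, computed by hand to stay dependency-free."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Phone digits say nothing about the UN, yet they predict the answers:
print(f"correlation(anchor, answer) = {correlation(anchors, answers):.2f}")
```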

Part of you could say that this is interesting in theory, but how does it affect me? I would argue that it affects you more than anyone else on the planet. Even as companies build algorithms to eliminate human error, important decisions will still need to be made by humans (and I’m no futurist), but at a larger scale and, potentially, with higher stakes. The fact that we’re collaborating across continents on projects with enormous impact for Fortune 100 companies is a marvel in and of itself, let alone the waves those decisions will make. It has never been more important to make those decisions based on science, fact, and objectivity. Sometimes that requires the courage to recognize your own humanity and to keep your “gut” in check.

As humans might say of their own biases, the first step toward solving any problem is acknowledging that you have one. You might catch something big that we all missed.