By Mike Golden
Future Nobel Prize winner Daniel Kahneman was ten years old, living in a chicken coop in Cagnes-sur-Mer, France. He was on the run from the Nazis. It was early 1944, and his father was dying; as a Jew in German-occupied territory, he couldn’t come out of hiding to get treatment for his diabetes. Sadly, the Allies liberated France too late for Danny’s father; he died six weeks before D-Day.
Sleeping in that chicken coop, Danny never imagined he would grow up to teach flight instructors in the Israeli Air Force.[1] But a couple of decades later, he gave those instructors now-famous advice that applies as much to trial advocacy students as to flight school cadets. Kahneman told the instructors that complimenting their students for doing things right would improve performance more than criticizing them for doing things wrong.
The instructors disagreed. “Dr. Kahneman,” one frustrated instructor said, “on many occasions I have praised flight cadets for clean execution. The next time they usually do worse. On the other hand, I have often screamed into a cadet’s earphone for bad execution, and in general he does better on the next try.”[2] Kahneman spent the rest of his life teaching the world about the shortcuts in our brains that led to that instructor’s erroneous conclusion.
Trial advocacy is teaching. Trial lawyers teach their decisionmakers (juries, judges, arbitrators) the theory of their case, just as the flight instructors taught pilots and Daniel Kahneman taught generations of psychologists (and economists).[3] Understanding how people process information is essential to persuading them.
We know that not every exhibit or snippet of testimony is received the same way. Experienced trial lawyers use demonstratives because they reach the unconscious parts of the brain more effectively than words alone. Too many law students discount the critical importance of pathos in the courtroom; Kahneman provides a scientific foundation for teaching the effectiveness of emotional persuasion.
Kahneman identified two ways thoughts come to mind.[4] One type of thought feels instantaneous: “that person is angry,” “vomit is gross,” or even “the number of people seated at counsel table is two.” Kahneman labeled these thoughts—fast, unconscious, and automatic—as System 1 thinking. System 1 thoughts have a visual component: when you read those examples, your brain produces an image.
The second type of thought is distinguished by its absence. If asked, “How many hours are there in a year?” you have no automatic answer, no immediate image. Generating an answer requires work—slow, conscious, effortful. This is System 2. And Kahneman called it “System 2” to illustrate a point. People wrongly believe that we (and our judges and jurors) make most of our decisions through deliberate thought and consideration; in truth, for most decisions we rely on our “gut” or “instinct.” Kahneman called those processes System 1 because they happen before (and sometimes instead of) System 2.
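For illustration, the System 2 work looks something like this: 365 days × 24 hours per day = 8,760 hours (8,784 in a leap year), an answer you can compute and verify but never simply see.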
It would be impossible to use System 2 all the time. If you’ve taken high school algebra and driven a car, you know this already. Imagine the lengthy, tedious effort to answer that SAT question about when two cars traveling down a road at different speeds will meet. But when you’re driving down the highway, you know immediately (and even viscerally) when the car ahead of you is suddenly too close. As a result, we make most of our life decisions—even really important ones—using System 1, and only occasionally power up System 2 to double-check that those decisions are correct. Effective trial lawyers exploit this cognitive laziness through re-enactments, demonstrative exhibits, looping testimony, analogies, and rhetorical questions.
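To feel the System 2 effort yourself, work a hypothetical version of that SAT problem (the numbers here are invented): one car is 30 miles ahead traveling 45 mph while another chases at 60 mph. The gap closes at 60 − 45 = 15 mph, so the cars meet in 30 ÷ 15 = 2 hours. Nothing in that computation is automatic.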
Kahneman explained that System 1 works by using shortcuts he called “heuristics” (sometimes also labeled “fallacies” or even “biases”). In the June issue of Brain Lessons, Grant Rost explained one of these heuristics: the Anchoring Effect. Here are three others to integrate into your teaching:
People make judgments using the information that comes to mind most easily and quickly. This Availability Heuristic (1) drives the power of primacy, giving outsized importance to what you say first. The Halo Effect is similar: what we learn first about a person disproportionately influences everything else we learn about them. Also, because personal experiences, pictures, and vivid examples are very “available,” they are more likely to drive decisions than mere words or statistics.
The Substitution Heuristic (2) is where we replace a hard question with an easy one, using what Kahneman called “intensity matching” to translate the easy answer back onto the hard question’s scale. This can be powerful in closing arguments on damages: perhaps you’ve seen a lawyer argue that the pain the plaintiff suffered has to be worth more than the price of a cup of Starbucks coffee per day.[5] The Affect Heuristic is another example of substitution, replacing the hard question “What do I think about it?” with an easy one: “How do I feel about it?”[6]
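The per-diem argument works because it hands the jury an easy computation in place of an impossible valuation. With invented numbers: $5 per cup × 365 days × 40 remaining years comes to $73,000, a figure System 1 accepts because the arithmetic feels like an answer to the hard question.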
Kahneman argued that knowing our own biases rarely helps us avoid succumbing to them. But, critically for trial lawyers and trial teachers, understanding the heuristics and shortcuts others use can help us communicate with them more effectively. And Kahneman’s own flight school experience offers an example:
The frustrated flight instructor favored criticism over compliments because trainees did better after the instructor yelled and worse after he praised. But Danny knew the instructor had fallen for an after-the-fact causal story, the same kind of sense-making that drives Hindsight Bias (3). In fact, the reason trainees did better after criticism of poor performance and worse after praise of good performance was simple regression to the mean: after an exceptionally good maneuver, a pilot’s next maneuver was likely to be worse, and after an exceptionally bad maneuver, the next was likely to be better. The criticism and the improvement were correlated but not causally connected—it was just the law of averages.
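A minimal simulation sketch makes the point (the skill and luck numbers are invented for illustration; this is not Kahneman’s data). Each maneuver score in the model is a cadet’s fixed skill plus random luck, and no feedback of any kind exists in the model, yet the “next try” still looks worse after great maneuvers and better after bad ones:

    import random

    random.seed(1)  # reproducible illustration

    SKILL = 70      # every cadet's true average score (invented)
    LUCK = 10       # random variation on any single maneuver (invented)
    TRIALS = 10_000

    scores = [random.gauss(SKILL, LUCK) for _ in range(TRIALS)]

    # The maneuver immediately following an extreme one, with no feedback given:
    after_great = [scores[i + 1] for i in range(TRIALS - 1) if scores[i] > SKILL + 15]
    after_awful = [scores[i + 1] for i in range(TRIALS - 1) if scores[i] < SKILL - 15]

    print(sum(after_great) / len(after_great))  # ~70: "worse" than the great maneuver
    print(sum(after_awful) / len(after_awful))  # ~70: "better" than the awful one

Both averages come back to the cadet’s true skill, which is exactly the pattern the instructor misread as the effect of praise and screaming.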
When jurors see what happened after a crime or catastrophic accident, System 1 immediately goes to work explaining how the result was the obvious, easily anticipated consequence. As Kahneman put it, “Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”[7] Hindsight bias happens automatically because our brain is so good at constructing stories. Kahneman was once asked whether we engage in hindsight bias because it feels easier to live in a world that makes sense; he answered that it is not easier, it is inevitable: “We have to make sense of things. We can’t do otherwise. We are sense-making organisms.”[8] Your students and your juries seek stories that make sense. Anticipate the shortcuts they will use to get to that sense, and help keep them on the path.[9]
[1] For starters, neither the Israeli Air Force nor Israel itself existed in 1944.
[2] Daniel Kahneman, Thinking, Fast and Slow 175 (2011) (cleaned up).
[3] Although Kahneman was a psychologist, he won the 2002 Nobel Prize in Economics because the field of behavioral economics was built on his research into decision making.
[4] Thinking, Fast and Slow 19-24.
[5] This technique is so effective that some courts are trying to rein it in. See, e.g., Gregory v. Chohan, 670 S.W.3d 546, 558 (Tex. 2023) (holding that “unsubstantiated anchors” such as the price of an F-18 fighter jet were inappropriate points of reference for plaintiff’s counsel to use when discussing compensatory damages in closing argument).
[6] See Dan Simon, A Third View of the Black Box: Cognitive Coherence in Legal Decision Making, 71 U. Chi. L. Rev. 511, 513 (2004) (“[T]he mind shuns cognitively complex and difficult decision tasks by reconstructing them into easy ones, yielding strong, confident conclusions.”).
[7] Thinking, Fast and Slow 201.
[8] Shankar Vedantam, The Transformative Ideas of Daniel Kahneman, Hidden Brain, at 25:34 (Apr. 2024).
[9] Many thanks to my colleague at the University of Texas, David Gonzalez, who introduced me to Kahneman’s work a dozen years ago.