December 2016 Issue

How Two Trailblazing Psychologists Turned the World of Decision Science Upside Down

After his book Moneyball became a best-seller, Michael Lewis learned that many of the ideas it presented to the general public had actually been introduced decades earlier by a pair of Israeli psychologists: Daniel Kahneman and Amos Tversky. In an adaptation from his new book, Lewis investigates their story, and the intense bond between these radically different men.
Amos Tversky and Daniel Kahneman toast to their partnership in the 1970s. Courtesy of Barbara Tversky.

Back in 2003, I published a book called Moneyball, about the Oakland Athletics’ quest to find new and better ways to value baseball players and evaluate baseball strategies.

The team had less money to spend on players than other teams did, and so its management, out of necessity, set about rethinking the game. In both new and old baseball data—and the work of people outside the game who had analyzed that data—the Oakland front office discovered what amounted to new baseball knowledge. That knowledge allowed them to run circles around the managements of other baseball teams. They found value in players who had been discarded or overlooked, and folly in much of what passed for baseball wisdom. When the book appeared, some baseball experts—entrenched management, talent scouts, journalists—were upset and dismissive, but a lot of readers found the story as interesting as I had. A lot of people saw in Oakland’s approach to building a baseball team a more general lesson: If the highly paid, publicly scrutinized employees of a business that had existed since the 1860s could be misunderstood by their market, who couldn’t be? If the market for baseball players was inefficient, what market couldn’t be? If a fresh analytical approach had led to the discovery of new knowledge in baseball, was there any sphere of human activity in which it might not do the same?

In the past decade or so, a lot of people have taken the Oakland A’s as their role model and set out to use better data, and better analysis of that data, to find market inefficiencies. I’ve read articles about Moneyball for Education, Moneyball for Movie Studios, Moneyball for Medicare, Moneyball for Golf, Moneyball for Farming, Moneyball for Book Publishing, Moneyball for Presidential Campaigns, Moneyball for Government, Moneyball for Bankers, and so on. But the enthusiasm for replacing old-school expertise with new-school data analysis was often shallow. When the data-driven approach to high-stakes decision-making did not lead to immediate success—and, occasionally, even when it did—it was open to attack in a way that the old approach to decision-making was not. In 2004, after aping Oakland’s approach to baseball decision-making, the Boston Red Sox won their first World Series in nearly a century. Using the same methods, they won it again in 2007 and 2013. But in 2016, after three disappointing seasons, they announced that they were moving away from the data-based approach and back to one where they relied upon the judgment of baseball experts. (“We have perhaps overly relied on numbers,” said owner John Henry.)

The writer Nate Silver for several years enjoyed breathtaking success predicting U.S. presidential-election outcomes for The New York Times, using an approach to statistics he learned writing about baseball. For the first time in memory, a newspaper seemed to have an edge in calling elections. But then Silver left the Times and failed to predict the rise of Donald Trump—and his data-driven approach to predicting elections was called into question . . . by The New York Times!

I’m sure some of the criticism of people who claim to be using data to find knowledge, and to exploit inefficiencies in their industries, has some truth to it. But whatever it is in the human psyche that the Oakland A’s exploited for profit—this hunger for an expert who knows things with certainty, even when certainty is not possible—has a talent for hanging around. It’s like a movie monster that’s meant to have been killed but is somehow always alive for the final act.

And so, once the dust had settled on the responses to my book, one of them remained more alive and relevant than the others: a review by a pair of academics, then both at the University of Chicago—an economist named Richard Thaler and a law professor named Cass Sunstein. Thaler and Sunstein’s piece, which appeared on August 31, 2003, in The New Republic, managed to be at once both generous and damning. The reviewers agreed that it was interesting that any market for professional athletes might be so screwed up that a poor team like the Oakland A’s could beat most rich teams simply by exploiting the inefficiencies. But—they went on to say—the author of Moneyball did not seem to realize the deeper reason for the inefficiencies in the market for baseball players: they sprang directly from the inner workings of the human mind. The ways in which some baseball expert might misjudge baseball players—the ways in which any expert’s judgments might be warped by the expert’s own mind—had been described, years ago, by a pair of Israeli psychologists, Daniel Kahneman and Amos Tversky. My book wasn’t original. It was simply an illustration of ideas that had been floating around for decades and had yet to be fully appreciated by, among others, me.

That was an understatement. Until that moment I don’t believe I’d ever heard of either Kahneman or Tversky, even though one of them had somehow managed to win a Nobel Prize in economics.

How did this pair of Israeli psychologists come to have so much to say about these matters of the human mind that they more or less anticipated a book about American baseball written decades in the future? What possessed two guys in the Middle East to sit down and figure out what the mind was doing when it tried to judge a baseball player, or an investment, or a presidential candidate? And how on earth does a psychologist win a Nobel Prize in economics?


Tversky in 1970.

Courtesy of Barbara Tversky.

The dozen or so graduate students in Danny Kahneman’s seminar at Hebrew University, in Jerusalem, were all surprised when, in the spring of 1969, Amos Tversky turned up. Danny never had guests: The seminar, called Applications of Psychology, was his show. Amos’s interests were about as far removed from the real-world problems in Applications of Psychology as a psychologist’s could be.

Amos himself seemed about as far removed from Danny as he could be. Danny had spent years of his childhood hiding in barns and chicken coops in France, from the Nazis who hunted him. Amos was born and raised in a society intent on making sure no Jewish child ever again would need to hide from those who wished to kill him. Israel had made him a warrior. A Spartan. Danny was deeply, painfully uncertain about himself. “His defining emotion is doubt,” said one of his students. “And it is very useful. Because it makes him go deeper and deeper and deeper.” Amos was the most self-assured human being anyone knew.

The people who knew Amos and Danny best couldn’t imagine them getting along with each other. “It was the graduate students’ perception that they had some sort of rivalry,” said one of the students in the Applications of Psychology seminar. “They were clearly the stars of the department who somehow or other hadn’t gotten in sync.” And yet for some reason Danny had invited Amos to come to his seminar to talk about whatever he wanted to talk about. And, for some reason, Amos had accepted.

Danny was a little surprised that Amos didn’t talk about his own work—but then Amos’s work was so abstract and theoretical that he probably decided it had no place in the seminar. Those who stopped to think about it found it odd that Amos’s work betrayed so little interest in the real world, when Amos was so intimately and endlessly engaged with that world, and that, conversely, Danny’s work was consumed by real-world problems, even as he kept other people at a distance.

Amos was now what people referred to, a bit confusingly, as a “mathematical psychologist.” Non-mathematical psychologists, like Danny, quietly viewed much of mathematical psychology as a series of pointless exercises conducted by people who were using their ability to do math as camouflage for how little of psychological interest they had to say. Mathematical psychologists, for their part, tended to view non-mathematical psychologists as simply too stupid to understand the importance of what they were saying. Amos was then at work with a team of mathematically gifted American academics on what would become a three-volume, molasses-dense, axiom-filled textbook called Foundations of Measurement—more than a thousand pages of arguments and proofs of how to measure stuff. On the one hand, it was a wildly impressive display of pure thought; on the other, the whole enterprise had a tree-fell-in-the-woods quality to it. How important could the sound it made be, if no one was able to hear it?

After the seminar, Amos and Danny had a few lunches together but then headed off in separate directions. That summer Amos left for the United States, and Danny for England, to continue his study of human attention. He had all these ideas about the possible usefulness of this new interest of his. In tank warfare, for instance. Danny was now taking people into his research lab and piping one stream of digits into their left ear and another stream of digits into their right ear, to test how quickly they could switch their attention from one ear to the other, and also how well they blocked their minds to sounds they were meant to be ignoring. “In tank warfare, as in a Western shootout, the speed at which one can decide on a target and act on that decision makes the difference between life and death,” said Danny later. He might use his test to identify which tank commanders could best orient their senses at high speed—who among them might most quickly detect the relevance of a signal, and focus his attention upon it, before he got blown to bits.

Dual Personalities

By the fall of 1969, Amos and Danny had both returned to Hebrew University. During their joint waking hours, they could usually be found together. Danny was a morning person, and so anyone who wanted him alone could find him before lunch. Anyone who wanted time with Amos could secure it late at night. In the intervening time, they might be glimpsed disappearing behind the closed door of a seminar room they had commandeered. From the other side of the door you could sometimes hear them hollering at each other, but the most frequent sound to emerge was laughter. Whatever they were talking about, people deduced, must be extremely funny. And yet whatever they were talking about also felt intensely private: Other people were distinctly not invited into their conversation. If you put your ear to the door, you could just make out that the conversation was occurring in both Hebrew and English. They went back and forth—Amos, especially, always switched back to Hebrew when he became emotional.

The students who once wondered why the two brightest stars of Hebrew University kept their distance from each other now wondered how two so radically different personalities could find common ground, much less become soulmates. “It was very difficult to imagine how this chemistry worked,” said Ditsa Kaffrey, a graduate student in psychology who studied with them both.

Danny was always sure he was wrong. Amos was always sure he was right. Amos was the life of every party; Danny didn’t go to the parties. Amos was loose and informal; even when Danny made a stab at informality, it felt as if he had descended from some formal place. With Amos you always just picked up where you left off, no matter how long it had been since you last saw him. With Danny there was always a sense you were starting over, even if you had been with him just yesterday. Amos was tone-deaf but would nevertheless sing Hebrew folk songs with great gusto. Danny was the sort of person who might be in possession of a lovely singing voice that he would never discover. Amos was a one-man wrecking ball for illogical arguments; when Danny heard an illogical argument, he asked, What might that be true of? Danny was a pessimist. Amos was not merely an optimist; Amos willed himself to be optimistic, because he had decided pessimism was stupid. When you are a pessimist and the bad thing happens, you live it twice, Amos liked to say. Once when you worry about it, and the second time when it happens. “They were very different people,” said a fellow Hebrew University professor. “Danny was always eager to please. He was irritable and short-tempered, but he wanted to please. Amos couldn’t understand why anyone would be eager to please. He understood courtesy, but eager to please—why?” Danny took everything so seriously; Amos turned much of life into a joke. When Hebrew University put Amos on its committee to evaluate all Ph.D. candidates, he was appalled at what passed for a dissertation in the humanities. Instead of raising a formal objection, he merely said, “If this dissertation is good enough for its field, it’s good enough for me. Provided the student can divide fractions!”

Beyond that, Amos was the most terrifying mind most people had ever encountered. “People were afraid to discuss ideas in front of him,” said a friend—because they were afraid he would put his finger on the flaw that they had only dimly sensed. One of Amos’s graduate students, Ruma Falk, said she was so afraid of what Amos would think of her driving that when she drove him home, in her car, she insisted that he drive. And now here he was spending all of his time with Danny, whose susceptibility to criticism was so extreme that a single remark from a misguided student sent him down a long, dark tunnel of self-doubt. It was as if you had dropped a white mouse into a cage with a python and come back later and found the mouse talking and the python curled in the corner, rapt.

Kahneman (left) receives the Nobel Prize in Economic Sciences, 2002.

By Jonas Ekstromer/AFP.

But there was another story to be told, about how much Danny and Amos had in common. Both were grandsons of Eastern European rabbis, for a start. Both were explicitly interested in how people functioned when they were in a “normal” unemotional state. Both wanted to do science. Both wanted to search for simple, powerful truths. As complicated as Danny may have been, he still longed to do “the psychology of single questions,” and as complicated as Amos’s work may have seemed, his instinct was to cut through endless bullshit to the simple nub of any matter. Both men were blessed with shockingly fertile minds. And both were Jews, in Israel, who did not believe in God. And yet all anyone saw were their differences.

The most succinct physical manifestation of the deep difference between the two men was the state of their offices. “Danny’s office was such a mess,” recalled Daniela Gordon, who had become Danny’s teaching assistant. “Scraps on which he’d scribbled a sentence or two. Paper everywhere. Books everywhere. Books opened to places he’d stopped reading. I once found my master’s thesis open on page 13—I think that’s where he stopped. And then you would walk down the hall three or four rooms, and you come to Amos’s office . . . and there is nothing in it. A pencil on a desk. In Danny’s office you couldn’t find anything because it was such a mess. In Amos’s office you couldn’t find anything because there was nothing there.” All around them people watched and wondered: Why were they getting along so well? “Danny was a high-maintenance person,” said one colleague. “Amos was the last one to put up with a high-maintenance person. And yet he was willing to go along. Which was amazing.”

Danny and Amos didn’t talk much about what they got up to when they were alone together, which just made everyone else more curious about what it was. In the beginning they were kicking around Danny’s proposition—that when people judged uncertain situations, they didn’t rely on probability or statistics. Whatever human beings did when presented with a problem that had a statistically correct answer, it wasn’t statistics. But how did you sell that to an audience of professional social scientists who were more or less blinded by theory? And how did you test it? They decided, in essence, to invent an unusual statistics test, give it to the scientists, and see how they performed. Their case would be built from evidence that consisted entirely of answers to questions they’d put to some audience—in this case, an audience of people trained in statistics and probability theory. Danny dreamed up most of the questions, such as:

The mean I.Q. of the population of eighth-graders in a city is known to be 100. You have selected a random sample of 50 children for a study of educational achievement. The first child tested has an I.Q. of 150. What do you expect the mean I.Q. to be for the whole sample? (This test was meant to explore how new information affects decision-making.)
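The statistically correct answer, for what it is worth, is 101, not 100: by linearity of expectation, the remaining 49 children are still expected to average 100, so the expected mean of the whole sample is (150 + 49 × 100) / 50. A minimal sketch of that arithmetic in Python (the variable names are mine, not part of the original question):

```python
# Expected mean I.Q. of the 50-child sample, given that the first child scored 150.
# Assumes the remaining 49 children are an unbiased draw from a population whose
# mean I.Q. is 100 -- exactly the setup stated in the question.
first_child = 150
population_mean = 100
sample_size = 50

expected_sample_mean = (first_child + (sample_size - 1) * population_mean) / sample_size
print(expected_sample_mean)  # 101.0 -- not 100, and not a full "correction" back to 100
```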

At the end of the summer of 1969, Amos took Danny’s questions to the annual meeting of the American Psychological Association, in Washington, D.C., and then on to a conference of mathematical psychologists. There he gave the tests to roomfuls of people whose careers required fluency in statistics. Two of the test takers had written statistics textbooks. Amos then collected the completed tests and flew home with them to Jerusalem.

There he and Danny sat down to write together for the first time. Their offices were tiny, so they worked in a small seminar room. Amos didn’t know how to type, and Danny didn’t particularly want to, so they sat with notepads. They went over each sentence time and again and wrote, at most, a paragraph or two each day. “I had this sense of realization: Ah, this is not going to be the usual thing, this is going to be something else,” said Danny. “Because it was funny.”

When Danny looked back on that time, what he recalled mainly was the laughter—what people outside heard emanating from the seminar room. “I have the image of balancing precariously on the back legs of a chair and laughing so hard I nearly fell backwards.” The laughter may have sounded a bit louder when the joke had come from Amos, but that was only because Amos had a habit of laughing at his own jokes. (“He was so funny that it was O.K. he was laughing at his own jokes.”) In Amos’s company Danny felt funny, too—and he’d never felt that way before. In Danny’s company Amos, too, became a different person: uncritical. Or, at least, uncritical of whatever came from Danny. He didn’t even poke fun in jest. He enabled Danny to feel, in a way he hadn’t before, confident. Maybe for the first time in his life Danny was playing offense. “Amos did not write in a defensive crouch,” he said. “There was something liberating about the arrogance—it was extremely rewarding to feel like Amos, smarter than almost everyone.” The finished paper dripped with Amos’s self-assurance, beginning with the title he had put on it: “Belief in the Law of Small Numbers.” And yet the collaboration was so complete that neither of them felt comfortable taking the credit as the lead author; to decide whose name would appear first, they flipped a coin. Amos won.

When they wrote their first papers, Danny and Amos had no particular audience in mind. Their readers would be the handful of academics who happened to subscribe to the highly specialized psychology trade journals in which they published. By 1972 they had spent the better part of three years uncovering the ways in which people judged and predicted—but the examples that they had used to illustrate their ideas were all drawn directly from psychology, or from the strange, artificial-seeming tests that they had given high-school and college students. Yet they were certain that their insights applied anywhere in the world that people were judging probabilities and making decisions. They sensed that they needed to find a broader audience. “The next phase of the project will be devoted primarily to the extension and application of this work to other high-level professional activities, e.g., economic planning, technological forecasting, political decision making, medical diagnosis, and the evaluation of legal evidence,” they wrote in a research proposal. They hoped, they wrote, that the decisions made by experts in these fields could be “significantly improved by making these experts aware of their own biases, and by the development of methods to reduce and counteract the sources of bias in judgment.” They wanted to turn the real world into a laboratory. It was no longer just students who would be their lab rats but also doctors and judges and politicians. The question was: How to do it?

In 1972, Irv Biederman, then a visiting associate professor of psychology at Stanford University, heard Danny give a talk about heuristics and biases on the Stanford campus. “I remember I came home from the talk and told my wife, ‘This is going to win a Nobel Prize in economics,’ ” recalled Biederman. “I was so absolutely convinced. This was a psychological theory about economic man. I thought, What could be better? Here is why you get all these irrationalities and errors. They come from the inner workings of the human mind.”

They couldn’t help but sense a growing interest in their work. “That was the year it was really clear we were onto something,” recalled Danny. “People started treating us with respect.” But by the fall of 1973 it was fairly clear to Danny that other people would never fully understand his relationship with Amos. The previous academic year, they’d taught a seminar together at Hebrew University. From Danny’s point of view, it had been a disaster. The warmth he felt when he was alone with Amos vanished whenever Amos was in the presence of an audience. “When we were with other people we were one of two ways,” said Danny. “Either we finished each other’s sentences and told each other’s jokes. Or we were competing. No one ever saw us working together. No one knows what we were like.” What they were like, in every way but sexually, was lovers. They connected with each other more deeply than either had connected with anyone else. Their wives noticed it. “Their relationship was more intense than a marriage,” said Tversky’s wife, Barbara. “I think they were both turned on intellectually more than either had ever been before. It was as if they were both waiting for it.” Danny sensed that his wife felt some jealousy; Amos actually praised Barbara, behind her back, for dealing so gracefully with the intrusion on their marriage. “Just to be with him,” said Danny. “I never felt that way with anyone else, really. You are in love and things. But I was rapt. And that’s what it was like. It was truly extraordinary.”

And yet it was Amos who worked hardest to find ways to keep them together. “I was the one who was holding back,” said Danny. “I kept my distance because I was afraid of what would happen to me without him.”

An Israeli tank during the 1973 Yom Kippur War.

By David Rubinger/The Life Images Collection/Getty Images.

The Psychology of War

It was four in the morning California time on October 6, 1973, when the armies of Egypt and Syria launched their attack upon Israel. They’d taken the Israelis by surprise on Yom Kippur. Along the Suez Canal, the 500-man Israeli garrison was overwhelmed by 100,000 or so Egyptian troops. From the Golan Heights, 177 Israeli tank crews gazed down upon an attacking force of 2,000 Syrian tanks. Amos and Danny, still in the United States trying to become decision analysts, raced to the airport and got the first flight possible to Paris, where Danny’s sister worked in the Israeli Embassy. Getting into Israel during a war wasn’t easy. Every inbound El Al plane was crammed with fighter pilots and combat-unit commanders who were coming in to replace the men killed in the first days of the invasion. That’s just what you did if you were an Israeli capable of fighting in 1973: You ran toward the war. Knowing this, Egyptian president Anwar Sadat had promised to shoot down any commercial planes attempting to land in Israel. As they waited in Paris for Danny’s sister to talk someone into letting them onto a flight, Danny and Amos bought combat boots. They were made of canvas—lighter than the leather boots issued by the Israeli military.

When the war broke out, Barbara Tversky was on the way to an emergency room in Jerusalem with her eldest son. He had won a contest with his brother to see who could stick a cucumber farther up his own nose. As they headed home, people surrounded their car and screamed at Barbara for being on the road. The country was in a state of panic: fighter jets screamed low over Jerusalem to signal all reserves to return to their units. Hebrew University closed. Army trucks rumbled all night through the Tverskys’ usually tranquil neighborhood. The city was black. Streetlamps remained off; anyone who owned a car taped over its brake lights. The stars could not have been more spectacular, or the news more troubling—because, for the first time, Barbara sensed that the Israeli government was withholding the truth. This war was different from the others: Israel was losing. Not knowing where Amos was, or what he planned to do, didn’t help. Phone calls were so expensive that when he was in the United States they communicated only by letter. Her situation wasn’t unusual: there were Israelis who would learn that loved ones living abroad had returned to Israel to fight only by being informed that they had been killed in action.

To make herself useful, Barbara went to the library and found the material to write a newspaper article about stress and how to cope with it. A few nights into the conflict, around 10 o’clock, she heard footsteps. She was working alone in the study, with the blinds lowered, to avoid letting the light seep out. The kids were asleep. Whoever was coming up the stairs was running; then suddenly Amos bounded from the darkness. The El Al flight that he had taken with Danny had carried as passengers no one but Israeli men returning to fight. It had descended into Tel Aviv in total darkness: There hadn’t even been a light on the wing. Once again, Amos went into the closet and pulled down his old army uniform, which he had worn in the 1967 Six-Day War, now with a captain’s insignia on it. It still fit. At five o’clock the following morning he left.

He had been assigned, with Danny, to the psychology field unit. The unit had grown since the mid-1950s, when Danny had redesigned the selection system. In early 1973 an American psychologist named James Lester, sent by the Office of Naval Research to study Israeli military psychology, wrote a report in which he described the unit Danny and Amos were about to join. Lester marveled at the entire society—a country that had at once the world’s strictest driving tests and the world’s highest automobile accident rates—but seems to have been struck especially by the faith the Israeli military placed in their psychologists. “Failure rate in the officer course is running at 15–20%,” he wrote. “Such confidence does the military have in the mysteries of psychological research that they are asking the Selection Section to try to identify these 15% during the first week in training.”

The head of Israeli military psychology, Lester reported, was an oddly powerful character named Benny Shalit. Shalit had argued for, and received, a new, elevated status for military psychology. His unit had a renegade quality to it; Shalit had gone so far as to sew an insignia of his own design onto its uniform. It consisted of the Israeli olive branch and sword, Lester explained, “topped by an eye which symbolizes assessment, insight, or something along those lines.” In his attempts to turn his psychology unit into a fighting force, Shalit had dreamed up ideas that struck even the psychologists as wacko. Hypnotizing Arabs and sending them to assassinate Arab leaders, for instance. “He actually did hypnotize one Arab,” recalled Daniela Gordon, who served under Shalit in the psychology unit. “They took him to the Jordanian border, and he just ran off.”

A rumor among Shalit’s subordinates—and it refused to die—was that Shalit kept the personality assessments made of all the Israeli-military big shots, back when they were young men entering the army, and let them know that he wouldn’t be shy about making them public. Whatever the reason, Benny Shalit had an unusual ability to get his way in the Israeli military. And one of the unusual things Shalit had asked for, and received, was the right to embed psychologists in army units, where they might directly advise commanders. “Field psychologists are in a position to make recommendations on a variety of unconventional issues,” Lester reported to his U.S. Navy superiors. “For example, one noticed that infantry troops in hot weather, stopping to open soft drinks with their ammunition magazines, often damaged the stock. It was possible to redesign the stock so that a tool for opening bottles was included.” Shalit’s psychologists had eliminated the unused sights on submachine guns, and changed the way machine-gun units worked together, to increase the rate at which they fired. Psychologists in the Israeli Army were, in short, off the leash. “Military psychology is alive and well in Israel,” concluded the United States Navy’s reporter on the ground. “It is an interesting question whether or not the psychology of the Israelis is becoming a military one.”

Tversky and Kahneman in Tversky’s backyard.

By Maya Bar-Hillel.

What Benny Shalit’s field psychologists might do during an actual battle, however, was unclear. “The psychology unit did not have the faintest idea what to do,” said Eli Fishoff, who served as Benny Shalit’s second-in-command. “The war was totally unexpected. We were just thinking, Maybe it’s the end of us.” In a matter of days the Israeli Army had lost more men, as a percentage of the population, than the United States military lost in the entire Vietnam War. The war was later described by the Israeli government as a “demographic disaster” because of the prominence and talent of the Israelis who were killed. In the psychology unit someone came up with the idea of designing a questionnaire to determine what, if anything, might be done to improve the morale of the troops. Upon his arrival at the psychology unit Amos seized upon it, helped to design the questions, and then used the entire exercise more or less as an excuse to get himself closer to the action. “We just got a jeep and went bouncing around in the Sinai looking for something useful to do,” said Danny.

Their fellow psychologists who watched Danny and Amos toss rifles into the back of a jeep and set out for the battlefield thought they were out of their minds. “Amos was so excited—like a little child,” recalled Yaffa Singer, who worked with Danny in the Israeli Army’s psychology unit. “But it was crazy for them to go to the Sinai. It was so dangerous. It was absolutely crazy to send them out with those questionnaires.” The risk of running directly into enemy tanks and planes was the least of it. There were land mines everywhere; it was easy to get lost. “They didn’t have guards,” said Daniela Gordon, their commanding officer. “They guarded themselves.” All of them felt less concern for Amos than for Danny. “We were very worried about sending Danny on his own,” said Eli Fishoff, head of the field psychologists. “I wasn’t so worried about Amos—because Amos was a fighter.”

The moment Danny and Amos were in the jeep roaring through the Sinai, however, it was Danny who became useful. “He was jumping off the car and grilling people,” recalled Fishoff. Amos seemed like the practical one, but Danny, more than Amos, had a gift for finding solutions to problems where others failed even to notice that there was a problem to solve. As they sped toward the front lines, Danny noticed the huge piles of garbage on the roadsides: the leftovers from the canned meals supplied by the U.S. Army. He examined what the soldiers had eaten and what they had thrown out. (They liked the canned grapefruit.) His subsequent recommendation that the Israeli Army analyze the garbage and supply the soldiers with what they actually wanted made newspaper headlines.

Israeli tank drivers were just then being killed in action at an unprecedented rate. Danny visited the site where new tank drivers were being trained, as quickly as possible, to replace the ones who had died. Groups of four men took turns in two-hour shifts on a tank. Danny pointed out that people learn more efficiently in short bursts, and that new tank drivers might be educated faster if the trainees rotated behind the wheel every 30 minutes. He also somehow found his way to the Israeli Air Force. Fighter pilots were also dying in unprecedented numbers because of Egypt’s use of new and improved surface-to-air missiles provided by the Soviet Union. One squadron had suffered especially horrific losses. The general in charge wanted to investigate, and possibly punish, the unit. “I remember him saying accusingly that one of the pilots had been hit ‘not only by one missile but by four!’ As if that was conclusive evidence of the pilot’s ineptitude,” recalled Danny.

Danny explained to the general that he had a sample-size problem: the losses experienced by the supposedly inept fighter squadron could have occurred by random chance alone. If he investigated the unit, he would no doubt find patterns in behavior that might serve as an explanation. Perhaps the pilots in that squadron had paid more visits to their families, or maybe they wore funny-colored underpants. Whatever he found would be a meaningless illusion, however. There weren’t enough pilots in the squadron to achieve statistical significance. On top of it, an investigation, implying blame, would be horrible for morale. The only point of an inquiry would be to preserve the general’s feelings of omnipotence. The general listened to Danny and stopped the inquiry. “I have considered that my only contribution to the war effort,” said Danny.
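Danny’s sample-size point lends itself to a small simulation; this is a sketch with invented numbers, not data from the war: give several identical squadrons the same per-sortie loss probability, and chance alone will make one of them look far worse than the others.

```python
# Hypothetical illustration of the sample-size argument (all numbers invented):
# identical squadrons, identical per-sortie loss probability, yet the "worst"
# squadron can suffer several times the losses of the "best" purely by chance.
import random

random.seed(42)
SQUADRONS = 8          # assumed number of squadrons
SORTIES = 40           # assumed sorties flown per squadron
LOSS_PROB = 0.05       # assumed probability of being hit on any one sortie

losses = [sum(random.random() < LOSS_PROB for _ in range(SORTIES))
          for _ in range(SQUADRONS)]

print(losses)                    # losses per squadron: typically quite uneven
print(max(losses), min(losses))  # the spread reflects no difference in skill
```

An investigation of the unluckiest squadron would still turn up “patterns,” as Danny warned the general, but they would explain nothing.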

The actual business at hand—putting questions to soldiers fresh from combat—Danny found pointless. Many of them were traumatized. “We were wondering what to do with people who were in shock—how even to evaluate them,” said Danny. “Every soldier was frightened, but there were some people who couldn’t function.” Shell-shocked Israeli soldiers resembled people with depression. There were some problems he didn’t feel equipped to deal with, and this was one of them.

He didn’t really want to be in the Sinai anyway, not in the way Amos seemed to want to be there. “I remember a sense of futility—that we were wasting our time there,” he said. When their jeep bounced once too often and caused Danny’s back to go out, he quit the journey—and left Amos alone to administer the questionnaires. From their jeep rides he retained a single vivid memory. “We went to sleep near a tank,” he recalled. “On the ground. And Amos didn’t like where I was sleeping, because he thought the tank might move and crush me. And I remember being very, very touched by this. It was not sensible advice. A tank makes a lot of noise. But that he was worried about me.”

Later, the Walter Reed Army Institute of Research undertook a study of the war. “Battle Shock Casualties During the 1973 Arab-Israeli War,” it was called. The psychiatrists who prepared the report noted that the war was unusual in its intensity—it was fought 24 hours a day, at least at the start—and in the losses suffered. The report also noted that, for the first time, Israeli soldiers were diagnosed with psychological trauma. The questionnaires Amos had helped to design asked the soldiers many simple questions: Where were you? What did you do? What did you see? Was the battle a success? If not, why not? “People started to talk about fear,” recalls Yaffa Singer. “About their emotions. From the War of Independence until 1973 it hadn’t been allowed. We are supermen. No one has the guts to talk about fear. If we talk about it, maybe we won’t survive.”

For days after the war, Amos sat with Singer and two other colleagues in the psychology field unit and read through the soldiers’ answers to his questions. The men spoke of their motives for fighting. “It’s such horrible information that people tend to bury it,” said Singer. But caught fresh, the soldiers revealed to the psychologists sentiments that, in retrospect, seemed blindingly obvious. “We asked, Why is anyone fighting for Israel?” said Singer. “Until that moment we were just patriots. When we started reading the questionnaires it was so obvious: They were fighting for their friends. Or for their families. Not for the nation. Not for Zionism. At the time it was a huge realization.” Perhaps for the first time, Israeli soldiers spoke openly of their feelings as they watched five of their beloved platoon-mates blown to bits or as they saw their best friend on earth killed because he turned left when he was supposed to turn right. “It was heartbreaking to read them,” said Singer.

Right up until the fighting stopped, Amos sought risks that he didn’t need to take—that in fact others thought were foolish to take. “He decided to witness the end of the war along the Suez,” recalled Barbara, “even though he knew full well that shelling continued after the time of the cease-fire.” Amos’s attitude toward physical risk occasionally shocked even his wife. Once, he announced that he wanted to start jumping out of airplanes again, just for fun. “I said, ‘You are the father of children,’ ” said Barbara. “That ended the discussion.” Amos wasn’t a thrill-seeker, exactly, but he had strong, almost child-like passions that, every so often, he allowed to grab hold of him and take him places most people would never wish to go.

In the end, he crossed the Sinai to the Suez Canal. Rumors circulated that the Israeli Army might march all the way to Cairo, and that the Soviets were sending nuclear weapons to Egypt to prevent them from doing so. Arriving at the Suez, Amos found that the shelling hadn’t merely continued; it had intensified. There was now a long-standing tradition, on both sides of any Arab-Israeli war, of seizing the moment immediately before a formal cease-fire to fire any remaining ammunition at each other. The spirit of the thing was: Kill as many of them as you can, while you can. Wandering around near the Suez Canal and sensing an incoming missile, Amos leapt into a trench and landed on top of an Israeli soldier.

Are you a bomb? asked the terrified soldier. No, I’m Amos, said Amos. So I’m not dead? asked the soldier. You’re not dead, said Amos. That was the one story Amos told. Apart from that, he seldom mentioned the war again.

You Can Lead a Horse to Water

In late 1973 or early 1974, Danny gave a talk, which he would deliver more than once, and which he called “Cognitive Limitations and Public Decision Making.” It was troubling to consider, he began, “an organism equipped with an affective and hormonal system not much different from that of the jungle rat being given the ability to destroy every living thing by pushing a few buttons.” Given the work on human judgment that he and Amos had just finished, he found it further troubling to think that “crucial decisions are made, today as thousands of years ago, in terms of the intuitive guesses and preferences of a few men in positions of authority.” The failure of decision-makers to grapple with the inner workings of their own minds, and their desire to indulge their gut feelings, made it “quite likely that the fate of entire societies may be sealed by a series of avoidable mistakes committed by their leaders.”

Before the war, Danny and Amos had shared the hope that their work on human judgment would find its way into high-stakes real-world decision-making. In this new field, called decision analysis, they could transform high-stakes decision-making into a sort of engineering problem. They would design decision-making systems. Experts on decision-making would sit with leaders in business, the military, and government and help them to frame every decision explicitly as a gamble, to calculate the odds of this or that happening, and to assign values to every possible outcome.

If we seed the hurricane, there is a 50 percent chance we lower its wind speed but a 5 percent chance that we lull people who really should evacuate into a false sense of security: What do we do?

In the bargain, the decision analysts would remind important decision-makers that their gut feelings had mysterious powers to steer them wrong. “The general change in our culture toward numerical formulations will give room for explicit reference to uncertainty,” Amos wrote in notes to himself for a talk of his own. Both Amos and Danny thought that voters and shareholders and all the other people who lived with the consequences of high-level decisions might come to develop a better understanding of the nature of decision-making. They would learn to evaluate a decision not by its outcomes—whether it turned out to be right or wrong—but by the process that led to it. The job of the decision-maker wasn’t to be right but to figure out the odds in any decision and play them well. As Danny told audiences in Israel, what was needed was a “transformation of cultural attitudes to uncertainty and to risk.”
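To make that framing concrete, here is a minimal sketch of the kind of expected-value calculation a decision analyst might have walked a leader through, applied to the hurricane question above. The 50 percent and 5 percent figures come from the passage; the remaining probability is filled in only so the outcomes sum to one, and the damage values assigned to each outcome are invented purely for illustration.

```python
# A sketch of decision analysis: frame the choice as a gamble, attach odds to
# each outcome, assign a value to each outcome, and compare expected values.
# Probabilities 0.50 and 0.05 are from the article's hurricane example; the
# remaining probability and all damage figures are hypothetical.
options = {
    "seed the hurricane": [
        (0.50, -200),   # wind speed lowered, damage reduced (hypothetical value)
        (0.05, -800),   # false sense of security, people fail to evacuate
        (0.45, -500),   # no effect, damage as it would have been anyway
    ],
    "do not seed": [
        (1.00, -500),   # baseline expected damage (hypothetical value)
    ],
}

for name, outcomes in options.items():
    expected_value = sum(p * v for p, v in outcomes)
    print(f"{name}: expected value {expected_value:.0f}")
```

Under these invented numbers seeding comes out ahead, but the point of the method was the explicit bookkeeping of odds and values, not any particular answer.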

Exactly how some decision analyst would persuade any business, military, or political leader to allow him to edit his thinking was unclear. How would you even persuade some important decision-maker to assign numbers to his “utilities” (that is, the value he personally placed on an outcome, as opposed to its objective value)? Important people didn’t want their gut feelings pinned down, even by themselves. And that was the rub.

Later, Danny recalled the moment he and Amos lost faith in decision analysis. The failure of Israeli intelligence to anticipate the Yom Kippur attack led to an upheaval in the Israeli government and a subsequent brief period of introspection. They’d won the war, but the outcome felt like a loss. The Egyptians, who had suffered even greater losses, were celebrating in the streets as if they had won, while everyone in Israel was trying to figure out what had gone wrong. Before the war, the Israeli intelligence unit had insisted, despite a lot of evidence to the contrary, that Egypt would never attack Israel as long as Israel maintained air superiority. Israel had maintained air superiority, and yet Egypt had attacked. After the war, with the view that perhaps it could do better, Israel’s Ministry of Foreign Affairs set up its own intelligence unit. The man in charge of it, Zvi Lanir, sought Danny’s help. In the end, Danny and Lanir conducted an elaborate exercise in decision analysis. Its basic idea was to introduce a new rigor in dealing with questions of national security. “We started with the idea that we should get rid of the usual intelligence report,” said Danny. “Intelligence reports are in the form of essays. And essays have the characteristic that they can be understood any way you damn well please.” In place of the essay, Danny wanted to give Israel’s leaders probabilities, in numerical form.

In 1974, U.S. secretary of state Henry Kissinger had served as the middleman in peace negotiations between Israel and Egypt and between Israel and Syria. As a prod to action, Kissinger had sent the Israeli government the C.I.A.’s assessment that, if the attempt to make peace failed, very bad events were likely to follow. Danny and Lanir set out to give Israeli foreign minister Yigal Allon precise numerical estimates of the likelihood of some very specific bad things happening. They assembled a list of possible “critical events or concerns”: regime change in Jordan, U.S. recognition of the Palestine Liberation Organization, another full-scale war with Syria, and so on. They then surveyed experts and well-informed observers to establish the likelihood of each event. Among these people, they found a remarkable consensus: there wasn’t a lot of disagreement about the odds. When Danny asked the experts what the effect might be of the failure of Kissinger’s negotiations on the probability of war with Syria, for instance, their answers clustered around “raises the chance of war by 10 percent.”

Danny and Lanir then presented their probabilities to Israel’s Foreign Ministry. (“The National Gamble,” they called their report.) Foreign Minister Allon looked at the numbers and said, “Ten percent increase? That is a small difference.”

Danny was stunned: if a 10 percent increase in the chances of full-scale war with Syria wasn’t enough to interest Allon in Kissinger’s peace process, how much would it take to turn his head? That number represented the best estimate of the odds. Apparently, the foreign minister didn’t want to rely on the best estimates. He preferred his own internal probability calculator: his gut. “That was the moment I gave up on decision analysis,” said Danny. “No one ever made a decision because of a number. They need a story.” As Danny and Lanir wrote, decades later, after the U.S. Central Intelligence Agency asked them to describe their experience in decision analysis, the Israeli Foreign Ministry was “indifferent to the specific probabilities.” What was the point of laying out the odds of a gamble if the person taking it either didn’t believe the numbers or didn’t want to know them? The trouble, Danny suspected, was that “the understanding of numbers is so weak that they don’t communicate anything. Everyone feels that those probabilities are not real—that they are just something on somebody’s mind.”

In the history of Danny and Amos, there are periods when it is difficult to disentangle their enthusiasm for their ideas from their enthusiasm for each other. The moments before and after the Yom Kippur War appear, in hindsight, less like a natural progression from one idea to the next than two men in love scrambling to find an excuse to be together. They felt they had finished exploring the errors that arose from the rules of thumb people use to evaluate probabilities in any uncertain situation. They’d found decision analysis promising but ultimately futile. They went back and forth on writing a general-interest book about the various ways the human mind deals with uncertainty; for some reason, they could never get beyond a sketchy outline and false starts of a few chapters. After the Yom Kippur War—and the ensuing collapse of the public’s faith in the judgment of Israeli government officials—they thought that what they really should do was reform the educational system so that future leaders were taught how to think. “We have attempted to teach people to be aware of the pitfalls and fallacies of their own reasoning,” they wrote, in a passage for the popular book that never came to be. “We have attempted to teach people at various levels in government, army etc. but achieved only limited success.”

Adapted from The Undoing Project: A Friendship That Changed Our Minds, by Michael Lewis, to be published in December by W. W. Norton & Company; © 2016 by the author.