What Military Strategists Can Learn From Buddha

Ditching common concepts and thought patterns can help avert disaster.

Illustration by Noah Hickey. (Photos via Unsplash/Getty Images)

Strategists, by and large, are not Buddhists. But maybe more of them should consider becoming one. Some recent revelations about intelligence in the lead-up to the Israeli attack on Iran explain why.

As the Wall Street Journal reported last month, deep in an article about how Israel pulled off its surprise attack on Iran:

The key to the deception, said a security official familiar with the planning of the operation, was the idea implanted in the minds of the Iranians that Israel wouldn’t strike without U.S. authorization and participation. As long as the U.S. wasn’t mobilizing its forces and was engaged in negotiations, Israel could threaten to attack and even mobilize its troops in plain sight of Iran without giving away the element of surprise.

How is it that a massive Israeli decapitation attack—code-named Operation Red Wedding in reference to a famous incident in Game of Thrones—found a cluster of Iran’s most senior military leaders meeting together in one place, when the most basic precaution in a pre-war setting would have been to keep them widely dispersed? The answer is that the Iranians did not believe war was imminent. And they did not believe that because they were paying greater attention to a strategic concept—a product of mind—than to the brute facts available to their eyes.

That is exactly the mistake that Buddhist philosophy warns against.

Strategists in the modern era—whether they are military planners or corporate executives—have a deep devotion to frameworks, models, and concepts. That’s understandable. The strategist’s job is to frame hard choices and trade-offs in the service of making better decisions. Any management consultant will tell you that the discipline is fundamentally about abstracting from facts—going “up a level” as the McKinsey jargon puts it. Concepts are mental constructs that group facts into patterns. Patterns then fit into frameworks that make it easier for people to debate and arrive at decisions. 

But a basic tenet of Buddhist thought is that patterns and concepts built inside the human mind deceive at least as much as they inform. The more time and energy you spend with these constructions of mind, the less you become attuned to the “sense doors”—the place where facts on the ground enter awareness.

If the religious connotations of Buddhist philosophy of mind make you uneasy, you can follow essentially the same line of reasoning through behavioral economics. Here the language is “confirmation bias” or “the assimilation of new evidence to previously existing beliefs,” but the impact on decisions is largely the same.

When decisions respond to a mental model more than they do to the real-world situations that the mental model represents—imperfectly—the outcome is surprising, and often bad. And sometimes nearly existential, as it was for Israel in 1973. Right there—and not coincidentally alongside the birth of modern behavioral economics—lies the great irony of Operation Red Wedding.

Go back to late 1960s Israel. At that point, Israeli military intelligence had spent two decades developing “Ha-Konseptzia” (Hebrew for “The Concept”) as its foundational strategic doctrine to guide the Israel Defense Forces and political decision-making.

The core discipline was simple (as good concepts typically are) and had three main tenets. 

First, Arab leaders were rational actors who understood they could not win a war against Israel and therefore would not start one until specific conditions were fulfilled—chief among them, Egypt believing it had come close to military parity with Israel, particularly in air power. Second, Egyptian leadership was focused on economic development more than on war with Israel, which made the runway to anything approaching military parity even longer. Third, there would be sufficient strategic warning of any planned attack. Arab states weren’t capable of meaningful surprise—their militaries would require extensive preparation to launch a full-scale war, and Israeli intelligence would detect those preparations weeks or even months in advance, giving Israel plenty of time to prepare and—as in 1967—preempt if that were the best option.

The Concept—embedded in military intelligence, IDF leadership, and the Mossad—grew stronger over time because it worked. As a small developing economy, Israel needed to maintain a relatively lean standing army and rely heavily on reserves that could be mobilized from civilian life when necessary, and The Concept said that would work. With a much smaller population than its primary enemies, Israel had to rely on technological and intelligence superiority to stay ahead, and The Concept said that would work. And because nothing cements a mental model in strategists’ minds more than a dramatic success, the 1967 Six-Day War seemingly proved the point: The belief that Arab forces were threatening but not yet ready for war led to a bold Israeli decision to launch a preemptive attack, destroying much of the Egyptian air force while its planes sat on the ground and occupying the West Bank, the Gaza Strip, the Sinai, and the Golan Heights. In six days.

Concepts work until they don’t. And on Yom Kippur 1973, The Concept failed catastrophically. The mental model said that Egypt would not attack Israel until it had received Scud missiles and long-range attack aircraft from the Russians in numbers adequate to neutralize, or at least contend with, the Israeli air force.

A highly placed Mossad asset (Ashraf Marwan, codename “The Angel”) had warned for months that The Concept had become outdated and that Egyptian President Anwar Sadat’s preparations for war were real. But a few inaccurate “boy who cried wolf” messages from him about an imminent attack in the months before October had the expected, dysfunctional effect of reinforcing IDF confidence that The Concept was in fact intact. It was only about a day before the Egyptians and Syrians launched their massive attack that Prime Minister Golda Meir reversed course, discarded The Concept, and ordered an emergency mobilization.

Those few hours probably saved the country from what might have been a massive and possibly existential defeat. In its wake, two young Israeli academic psychologists—Daniel Kahneman and Amos Tversky—were tasked with figuring out how a crucial, bet-the-farm strategic framework had gone so terribly wrong. That inquiry was a motivating force behind the accelerated development of what became known as behavioral economics. Every credible intelligence community in the world now builds tools and trains its analysts to avoid repeating that same set of mistakes. But Operation Red Wedding is yet more evidence that it remains very hard to do, and the Iranians seem to have fallen prey to exactly the same kind of failure.

What we don’t know—or at least the public does not know—is this: Did the Israelis actively conduct intelligence operations to implant this latter-day concept in the minds of the Iranian leadership? Did the Israelis know it was already there and do everything they could to reinforce it with seemingly confirmatory “evidence”? Did the Israelis and Americans collaborate to exploit the Iranians’ concept together? After all, President Donald Trump posted on Truth Social that “we remain committed to a diplomatic resolution to the Iran Nuclear Issue!” literally as the Israeli strike force planes were taking off for the first wave of attacks.

In the end it doesn’t matter that much—deception is often an essential part of strategy, whether military or corporate. Having a deep understanding of the adversary’s “concept” is arguably one of the most valuable pieces of intelligence that any strategist could wish for, because that understanding makes deception easier and more powerful.  

But what would be even more valuable—and much harder to achieve—is a deep understanding of your own concept. Because that is where the greatest damage to your interests can be done, and where your adversary has the greatest capacity to deceive you. Yet like the Buddhist conundrum of a higher-order sense called “awareness” trying to observe one’s own mind, it is a very hard discipline to adopt. Behavioral economists know well that decision-makers who have studied and trained to avoid confirmation bias still fall prey to it more often than not.

There are ways to restrain or partially reverse a pathological overreliance on strategic concepts. Buddhists have a practice called “beginner’s mind.” To consciously adopt beginner’s mind is to learn to temporarily drop the concepts, models, and frameworks that the mind constructs about the world and to focus directly on what is coming in through the sense doors. Imagine doing this with a hardwood surface mounted on four cold steel legs—without the concept of a table guiding your experience. Or reading an intelligence report in Tehran on IDF movements—without trying to fit those facts into a pre-existing model.

What might you see in the much more extensive possibilities of what your mind today short-circuits into a “dining room table”? What would you conclude as possible, or perhaps even likely, from an unfiltered awareness of what the Israeli leadership was saying and doing in the run-up to the attack?

None of this is easy—in fact, it might be one of the most difficult tasks of leadership, whether in statecraft or corporate strategy. It’s not just Buddhists who work at it. Red-teaming can help a bit—the explicit tasking of an alternative analysis team that starts from a different set of assumptions and often tries to mirror the concepts it believes the adversary is locked onto. Having a legitimate, protected, designated “naysayer” in a decision process—as Lyndon Johnson sometimes used George Ball—can help a bit. AI tools can help, or will soon be able to—so long as strategists have the courage to prompt the AI with something like this: “Tell me how I could be entirely wrong in my analysis, and show me what data points would line up behind a radically alternative concept that might better account for what I’m seeing.”

The key point is that doing battle with concepts, frameworks, confirmation bias, and the bad decisions these tricks of mind leave individuals and organizations vulnerable to is a practice, not a formula. And practice literally means practice—doing it every day, and working to get better regardless of how skilled you think you are or how successful you’ve been in the past. Tools and systems and organizational processes can help in that practice, but they won’t on their own overcome the continual tendency of the human mind, and of the bureaucracies it creates, to slip back.

Red Wedding won’t be repeated in precisely the same way, but the general story will happen again. The best strategists are those who can make it happen less frequently.

Steven Weber is a partner at Breakwater Strategy, an advisory firm based in Washington, D.C., and a professor of the Graduate School at the University of California, Berkeley.

