
Steven A. Yetiv. National Security through a Cockeyed Lens: How Cognitive Bias Impacts U.S. Foreign Policy. Baltimore: Johns Hopkins University Press, 2013. 155 pp. $24.95 (paper), ISBN 978-1-4214-1125-5.
Reviewed by William W. Newmann (Virginia Commonwealth University)
Published on H-Diplo (July, 2014)
Commissioned by Seth Offenbach (Bronx Community College, The City University of New York)
The study of national security decision making is one of the most important areas in the field of international relations. This is even more obvious in light of recent policy-making mistakes, such as the failure to detect al-Qaeda’s plans for the September 11 attacks, the invasion of Iraq to destroy Saddam Hussein’s weapons of mass destruction that turned out not to exist, the surprise of the George W. Bush administration when post-Saddam Iraq collapsed into civil war, or a similar surprise within the Barack Obama administration as the civil war in Syria began to destabilize Iraq. This brings back an age-old question: how can the “best and the brightest” get it wrong? Perhaps the question can be tweaked a little: why does this happen so often and what can we do to remedy the problem?
As an answer to this second set of questions, Steve Yetiv’s National Security through a Cockeyed Lens: How Cognitive Bias Impacts U.S. Foreign Policy serves as a seminal work, instructive for scholars and decision makers alike. Following the Vietnam debacle of the 1960s, volumes were written to try to explain why U.S. policy had gone so horribly wrong. Many of them focused on institutional problems, but some of the best were rooted in social psychology and aimed at understanding how misperception can lead a nation into a war it did not understand. At root was the hope that an autopsy of decision making might prevent the United States from ever making the same mistakes again. Yetiv’s book is rooted in that social psychology approach and is one of the best postmortems of national security decisions relating to Afghanistan, Iran, al-Qaeda, and Iraq. Again, the issue is how we get it so wrong, so consistently.
In an age when rational choice decision-making models dominate scholarly literature, Yetiv’s most important contribution may be to remind everyone that human beings are not all that rational when they make decisions, either as individuals or as groups of individuals. We carry a briefcase full of biases when we enter a room to make a decision; national security policy making can often be a showcase for those biases. In short, decision makers are only quasi-rational. Our cognitive biases prevent us from seeing problems and solutions clearly; they dominate our thought processes and lead us to make huge analytical mistakes that in hindsight seem obvious.
While the detailed case studies are excellent, the approach to the case studies is innovative and elevates the importance of the book. Each case study is examined as an example of a specific cognitive bias that inhibited solid policy analysis, option formation, and ultimately policy choice. Yetiv also includes case studies of Soviet and al-Qaeda decision making to illustrate that these types of biases are not just pathologies of U.S. policy making in the context of international and institutional pressures; they are biases that can affect oligarchic decision making at the apex of an authoritarian state and even a small group of decision makers seeking to launch a revolution.
The first case study looks at biases related to “intention and threat perception.” In short, decision makers mistakenly believe that their actions, taken only as defensive measures with obviously transparent motives, could not be seen as threatening to others. Following a coup and countercoup in Afghanistan in 1978-79, Moscow made the decision to invade, hoping to stabilize the pro-Soviet regime. To Soviet decision makers the issue was securing an unstable border. Their cognitive biases prevented them from understanding how others might perceive the policy. For the United States, this was a Soviet attempt to use the crises in Afghanistan and Iran as stepping stones to the seizure of Persian Gulf oil, a direct threat to the West and a potential opening gambit in what might escalate into world war. Soviet inability to see its actions as others might perceive them was a bias that had severe consequences: the end of détente, a new U.S. focus on the Middle East, a Sino-American-Pakistani-Saudi-Egyptian partnership to aid any groups willing to fight the Russians, and ultimately the chaotic conditions ripe for the birth of a radical group such as al-Qaeda. Amazingly, the bias prevented the Soviets from recognizing a concept that is typically explained in the first week of any undergraduate course in international relations: the security dilemma.
The bias “focus feature” frames the second case study. When one aspect of the decision dominates the thinking of the decision makers, weighing in so prominently that all other considerations become secondary, issues become simplified, tradeoffs go unrecognized, and potential consequences are never fully evaluated. The decision of the Reagan administration to sell arms to Iran in the mid-1980s is used as the example of this cognitive bias. In its genesis the arguments for or against the arms sales were a complex mix of geopolitical considerations (the Iran-Iraq War, the regional balance of power between Saudi Arabia, Iran, Israel, and Iraq) and U.S. executive-legislative competition (who has the power to direct U.S. arms sales and when can legislative powers be circumvented on national security grounds). The decision, however, devolved into a focus on one aspect of the problem--freeing hostages. All other considerations became secondary. Again, the bias led to much greater consequences than the president expected (even though he had been warned by several advisers). The Iran-Contra scandal tainted the Reagan administration, and arguably the balance of power in the Iran-Iraq War was tipped at a critical moment.
Al-Qaeda’s perception of the United States is explained as an example of “confirmation bias.” In this third case, al-Qaeda saw all U.S. actions in the world in the 1980s and 1990s through the lens of its already established judgments on the superpower. Through this biased lens, U.S. actions of 1990-91 in defense of Kuwait and Saudi Arabia were seen as an invasion of Iraq and an occupation of Saudi Arabia. Al-Qaeda’s narrative was one of unrelenting Western hostility toward Islam dating back to the Crusades. Nothing that challenged that narrative ever slid through al-Qaeda’s cognitive filters. U.S. intervention to provide food aid to Somalia was viewed as an invasion and U.S. support for Bosnian Muslims in the mid-1990s was ignored. This bias leads decision makers to simplify the parameters of a situation, ignoring the complexity of reality.
The fourth case study focuses on the U.S. decision to invade Iraq in 2003 and the cognitive bias of overconfidence. Here, Yetiv examines how subjective confidence can override objective accuracy. He identifies two types of overconfidence: an overestimation of capabilities and an overestimation of the chances of success. Since military historians have been studying exactly these two issues at least since Thucydides, it is especially shocking to see a modern case in which a group of experienced decision makers is so completely in the grip of such a cognitive bias. The case study is very familiar by now, but it will always be an important one. Yetiv rightly points out that there was debate, within the administration and in public, but all counterarguments were ultimately pushed aside. Typically, scholars focus on how intelligence was ignored or fell prey to confirmation bias (Yetiv could have used the 2003 invasion of Iraq as an example of nearly all of the cognitive biases he explores); however, Yetiv sees overconfidence as the root problem during the Bush administration’s decision. All other considerations, complexities, and uncertainties were overshadowed by the belief within the Bush administration that the overthrow of Saddam Hussein and the democratization of Iraq would be easy; why worry so much about a “walk in the park”? The results of the cognitive biases of the Bush administration are still echoing throughout the region.
The final case study looks at the bias of short-term thinking. It is a slightly different type of case study. Yetiv looks at why the United States does not have a comprehensive energy policy that outlines a long-term plan for perhaps the most critical economic and national security issue the country faces. Short-term bias affects every aspect of U.S. national security, but focusing on energy allows Yetiv to choose a case study related to all the other cases in the book. This case also brings the electoral politics element of U.S. national security into play. Politicians want to win reelection, and that places them in a time horizon of two to six years. Voters want low energy prices and very rarely think about the energy needs of their grandchildren when they step into a voting booth. Long-term thinking about energy (or arguably anything) becomes a political liability. In this case, the bias is built into the political system, and policy makers who try to educate the public about the importance of long-term issues take substantial risks by, in effect, defying the biases of an entire nation.
Yetiv’s final chapter analyzes twelve different strategies for debiasing. Here is where the book moves from descriptive to prescriptive, from an excellent scholarly study to something that future national security advisers might want to assign as required reading for anyone hoping for a position on the National Security Council staff. One of the great dilemmas for U.S. national security is that presidents are generally elected for their knowledge of domestic affairs. They are national security amateurs when they step into the Oval Office. They also possess a powerful ego, a prerequisite for a person who believes that he or she should be the most powerful person on the planet. Too often they enter their role with one other powerful bias: the belief that they are different, that they cannot make the same mistakes made by other presidents, and will not face the same tradeoffs that other presidents have faced. They feel they are so unique that the past has little bearing on their tenure in office. If presidents and their advisers can recognize that bias and toss it aside, Yetiv’s volume could be one of the key books for them to read before they begin making decisions, a kind of reminder: you’ve never made decisions like this before, but others have. Here is where they went wrong. Organizational memory in the presidency is a huge problem. Yetiv’s book can be one way of preserving that memory.
If there is additional discussion of this review, you may access it through the network, at: https://networks.h-net.org/h-diplo.
Citation:
William W. Newmann. Review of Yetiv, Steven A., National Security through a Cockeyed Lens: How Cognitive Bias Impacts U.S. Foreign Policy.
H-Diplo, H-Net Reviews.
July, 2014.
URL: http://www.h-net.org/reviews/showrev.php?id=41168
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.