Why Do Hawks Always Have the Upper Hand?


Why are hawks so influential? The answer may lie deep in the human mind. People bring countless biases to their decisions, and almost all of those biases favor confrontation over concession. This article examines why the tough guys win even when they should not. People readily overrate their own abilities: about 80 percent of us believe our driving skills are better than average (which raises the question of what "average" would then mean). Both sides of a conflict tend to see their own behavior as a reaction to the other's provocation, a familiar feature of marital quarrels that also appears in conflicts between nations. And people would rather gamble on a potential loss than accept a sure one, even at the risk of losing far more.
Why are hawks so influential? The answer may lie deep in the human mind. People have dozens of decision-making biases, and almost all favor conflict rather than concession. A look at why the tough guys win more than they should.
National leaders get all sorts of advice in times of tension and conflict. But often the competing counsel can be broken down into two basic categories. On one side are the hawks: They tend to favor coercive action, are more willing to use military force, and are more likely to doubt the value of offering concessions. When they look at adversaries overseas, they often see unremittingly hostile regimes who only understand the language of force. On the other side are the doves, skeptical about the usefulness of force and more inclined to contemplate political solutions. Where hawks see little in their adversaries but hostility, doves often point to subtle openings for dialogue.
As the hawks and doves thrust and parry, one hopes that the decision makers will hear their arguments on the merits and weigh them judiciously before choosing a course of action. Don’t count on it. Modern psychology suggests that policymakers come to the debate predisposed to believe their hawkish advisors more than the doves. There are numerous reasons for the burden of persuasion that doves carry, and some of them have nothing to do with politics or strategy. In fact, a bias in favor of hawkish beliefs and preferences is built into the fabric of the human mind.
Social and cognitive psychologists have identified a number of predictable errors (psychologists call them biases) in the ways that humans judge situations and evaluate risks. Biases have been documented both in the laboratory and in the real world, mostly in situations that have no connection to international politics. For example, people are prone to exaggerating their strengths: About 80 percent of us believe that our driving skills are better than average. In situations of potential conflict, the same optimistic bias makes politicians and generals receptive to advisors who offer highly favorable estimates of the outcomes of war. Such a predisposition, often shared by leaders on both sides of a conflict, is likely to produce a disaster. And this is not an isolated example.
In fact, when we constructed a list of the biases uncovered in 40 years of psychological research, we were startled by what we found: All the biases in our list favor hawks. These psychological impulses—only a few of which we discuss here—incline national leaders to exaggerate the evil intentions of adversaries, to misjudge how adversaries perceive them, to be overly sanguine when hostilities start, and overly reluctant to make necessary concessions in negotiations. In short, these biases have the effect of making wars more likely to begin and more difficult to end.
None of this means that hawks are always wrong. One need only recall the debates between British hawks and doves before World War II to remember that doves can easily find themselves on the wrong side of history. More generally, there are some strong arguments for deliberately instituting a hawkish bias. It is perfectly reasonable, for example, to demand far more than a 50-50 chance of being right before we accept the promises of a dangerous adversary. The biases that we have examined, however, operate over and beyond such rules of prudence and are not the product of thoughtful consideration. Our conclusion is not that hawkish advisors are necessarily wrong, only that they are likely to be more persuasive than they deserve to be.
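One way to make that rule of prudence concrete (a stylized decision-theoretic sketch added here for illustration; it is not part of the original article, and the costs are hypothetical) is to compare the cost of trusting a promise that is then broken, \(C_b\), with the cost of rejecting a promise that would have been honored, \(C_m\). If the adversary is sincere with probability \(p\), accepting the promise has the lower expected cost only when
\[
(1 - p)\,C_b \;<\; p\,C_m
\quad\Longleftrightarrow\quad
p \;>\; \frac{C_b}{C_b + C_m}.
\]
If betrayal is, say, nine times as costly as a missed opportunity, the rational threshold is \(p > 0.9\), far stricter than 50-50.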
VISION PROBLEMS
Several well-known laboratory demonstrations have examined the way people assess their adversary’s intelligence, willingness to negotiate, and hostility, as well as the way they view their own position. The results are sobering. Even when people are aware of the context and possible constraints on another party’s behavior, they often do not factor it in when assessing the other side’s motives. Yet, people still assume that outside observers grasp the constraints on their own behavior. With armies on high alert, it’s an instinct that leaders can ill afford to ignore.
Imagine, for example, that you have been placed in a room and asked to watch a series of student speeches on the policies of Venezuelan leader Hugo Chávez. You’ve been told in advance that the students were assigned the task of either attacking or supporting Chávez and had no choice in the matter. Now, suppose that you are then asked to assess the political leanings of these students. Shrewd observers, of course, would factor in the context and adjust their assessments accordingly. A student who gave an enthusiastic pro-Chávez speech was merely doing what she was told, not revealing anything about her true attitudes. In fact, many experiments suggest that people would overwhelmingly rate the pro-Chávez speakers as more leftist. Even when alerted to context that should affect their judgment, people tend to ignore it. Instead, they attribute the behavior they see to the person’s nature, character, or persistent motives. This bias is so robust and common that social psychologists have given it a lofty title: They call it the fundamental attribution error.
The effect of this failure in conflict situations can be pernicious. A policymaker or diplomat involved in a tense exchange with a foreign government is likely to observe a great deal of hostile behavior by that country’s representatives. Some of that behavior may indeed be the result of deep hostility. But some of it is simply a response to the current situation as it is perceived by the other side. What is ironic is that individuals who attribute others’ behavior to deep hostility are quite likely to explain away their own behavior as a result of being “pushed into a corner” by an adversary. The tendency of both sides of a dispute to view themselves as reacting to the other’s provocative behavior is a familiar feature of marital quarrels, and it is found as well in international conflicts. During the run-up to World War I, the leaders of every one of the nations that would soon be at war perceived themselves as significantly less hostile than their adversaries.
If people are often poorly equipped to explain the behavior of their adversaries, they are also bad at understanding how they appear to others. This bias can manifest itself at critical stages in international crises, when signals are rarely as clear as diplomats and generals believe them to be. Consider the Korean War, just one example of how misperception and a failure to appreciate an adversary’s assessment of intentions can lead to hawkish outcomes. In October 1950, as coalition forces were moving rapidly up the Korean Peninsula, policymakers in Washington were debating how far to advance and attempting to predict China’s response. U.S. Secretary of State Dean Acheson was convinced that “no possible shred of evidence could have existed in the minds of the Chinese Communists about the non-threatening intentions of the forces of the United Nations.” Because U.S. leaders knew that their intentions toward China were not hostile, they assumed that the Chinese knew this as well. Washington was, therefore, incapable of interpreting the Chinese intervention as a reaction to a threat. Instead, the Americans interpreted the Chinese reaction as an expression of fundamental hostility toward the United States. Some historians now believe that Chinese leaders may in fact have seen advancing Allied forces as a threat to their regime.
CARELESSLY OPTIMISTIC
Excessive optimism is one of the most significant biases that psychologists have identified. Psychological research has shown that a large majority of people believe themselves to be smarter, more attractive, and more talented than average, and they commonly overestimate their future success. People are also prone to an “illusion of control”: They consistently exaggerate the amount of control they have over outcomes that are important to them—even when the outcomes are in fact random or determined by other forces. It is not difficult to see that this error may have led American policymakers astray as they laid the groundwork for the ongoing war in Iraq.
Indeed, the optimistic bias and the illusion of control are particularly rampant in the run-up to conflict. A hawk’s preference for military action over diplomatic measures is often built upon the assumption that victory will come easily and swiftly. Predictions that the Iraq war would be a “cakewalk,” offered up by some supporters of that conflict, are just the latest in a long string of bad hawkish predictions. After all, Washington elites treated the first major battle of the Civil War as a social outing, so sure were they that federal troops would rout rebel forces. General Noel de Castelnau, chief of staff for the French Army at the outset of World War I, declared, “Give me 700,000 men and I will conquer Europe.” In fact, almost every decision maker involved in what would become the most destructive war in history up to that point predicted not only victory for his side, but a relatively quick and easy victory. These delusions and exaggerations cannot be explained away as a product of incomplete or incorrect information. Optimistic generals will be found, usually on both sides, before the beginning of every military conflict.
If optimism is the order of the day when it comes to assessing one’s own chances in armed conflict, however, gloom usually prevails when evaluating another side’s concessions. Psychologically, we are receptive not only to hawks’ arguments for war but also to their case against negotiated solutions. The intuition that something is worth less simply because the other side has offered it is referred to in academic circles as “reactive devaluation.” The very fact that a concession is offered by somebody perceived as hostile undermines the content of the proposal. What was said matters less than who said it. And so, for example, American policymakers would likely look very skeptically on any concessions made by the regime in Tehran. Some of that skepticism could be the rational product of past experience, but some of it may also result from unconscious—and not necessarily rational—devaluation.
Evidence suggests that this bias is a significant stumbling block in negotiations between adversaries. In one experiment, Israeli Jews evaluated an actual Israeli-authored peace plan less favorably when it was attributed to the Palestinians than when it was attributed to their own government. Pro-Israel Americans saw a hypothetical peace proposal as biased in favor of Palestinians when authorship was attributed to Palestinians, but as “evenhanded” when they were told it was authored by Israelis.
DOUBLE OR NOTHING
It is apparent that hawks often have the upper hand as decision makers wrestle with questions of war and peace. And those advantages do not disappear as soon as the first bullets have flown. As the strategic calculus shifts to territory won or lost and casualties suffered, a new idiosyncrasy in human decision making appears: our deep-seated aversion to cutting our losses. Imagine, for example, the choice between:
Option A: A sure loss of $890
Option B: A 90 percent chance to lose $1,000 and a 10 percent chance to lose nothing.
In this situation, a large majority of decision makers will prefer the gamble in Option B, even though the other choice is statistically superior. People prefer to avoid a certain loss in favor of a potential loss, even if they risk losing significantly more. When things are going badly in a conflict, the aversion to cutting one’s losses, often compounded by wishful thinking, is likely to dominate the calculus of the losing side. This brew of psychological factors tends to cause conflicts to endure long beyond the point where a reasonable observer would see the outcome as a near certainty. Many other factors pull in the same direction, notably the fact that for the leaders who have led their nation to the brink of defeat, the consequences of giving up will usually not be worse if the conflict is prolonged, even if they are worse for the citizens they lead.
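A quick expected-value check (a simple calculation added here for illustration, not part of the original article) shows why Option A is the statistically better choice:
\[
E[\text{Option A}] = -\$890,
\qquad
E[\text{Option B}] = 0.9 \times (-\$1{,}000) + 0.1 \times \$0 = -\$900.
\]
Option B loses \$10 more on average, yet most people choose it anyway because it preserves a 10 percent chance of escaping the loss entirely.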
U.S. policymakers faced this dilemma at many points in Vietnam and today in Iraq. To withdraw now is to accept a sure loss, and that option is deeply unattractive. The option of hanging on will therefore be relatively attractive, even if the chances of success are small and the cost of delaying failure is high.
Hawks, of course, can cite many moments in recent history when adversaries actually were unremittingly hostile and when force produced the desired result or should have been applied much earlier. The clear evidence of a psychological bias in favor of aggressive outcomes cannot decide the perennial debates between the hawks and the doves. It won’t point the international community in a clear direction on Iran or North Korea. But understanding the biases that most of us harbor can at least help ensure that the hawks don’t win more arguments than they should.
Daniel Kahneman is a Nobel laureate in economics and Eugene Higgins professor of psychology and professor of public affairs at Princeton University’s Woodrow Wilson School of Public and International Affairs.
Jonathan Renshon is a doctoral student in the Department of Government at Harvard University and author of Why Leaders Choose War: The Psychology of Prevention (Westport: Praeger Security International, 2006).