Judging success by results: the "outcome bias" that can lead to catastrophe
Imagine a pilot is taking a familiar flight along a known route, during which the weather takes a turn for the worse. She knows that flying through the storm comes with some serious risks – and according to her training, she should take a detour or return. But she has flown the same route before, in similar weather – and she didn't experience any problems then. Should she continue? Or should she turn back?
If you believe that she is safe to fly on, then you have fallen for a cognitive quirk known as the “outcome bias”. Studies have shown that we often judge the quality of a decision or behaviour by its endpoint, while ignoring the many mitigating factors that might have contributed to success or failure – and that this can render us oblivious to potentially catastrophic errors in our thinking.
In this example, the decision to take the previous flight was itself very risky – and the pilot may have only avoided an accident through a combination of lucky circumstances. But thanks to the outcome bias, she might ignore this possibility and assume that either the dangers had been overrated, or that it was her extraordinary skill that got her through, leading her to feel even happier taking the risk again in the future. And the more she does it, the less concerned about the danger she becomes.
Besides leading us to take ever greater risks in our decision-making, the outcome bias can lead us to ignore incompetence and unethical behaviour in our colleagues. And the consequences can be truly terrifying, with studies suggesting that it has contributed to many famous catastrophes, including the crash of Nasa's Columbia shuttle and the Deepwater Horizon oil spill.
The end, not the means
Like much of our understanding of human irrationality, the outcome bias was first observed in the 1980s, with a seminal study of medical decision-making.
Participants were given descriptions of various scenarios, including the risks and benefits of the different procedures, and then asked to rate the quality of the doctors’ judgement.
The participants were told about a doctor’s choice to offer a patient a heart bypass, for instance – potentially adding many more years of good health, but with a small chance of death during the operation. Perhaps predictably, the participants judged the doctor’s decision far more harshly if they were told the patient subsequently died than when they were told that the patient lived – even though the benefits and risks were exactly the same in each case.
The outcome bias is so deeply ingrained in our brains that it's easy to understand why the participants would feel that the doctor should be punished for the patient's death. Yet their reasoning is not logical, since there would have been no better way for the doctor to have weighed up the evidence – at the time of making the decision, there was every chance the operation would be a success. Once you know about the tragedy, however, it's hard to escape the nagging feeling that the doctor was nevertheless at fault – leading the participants to question his competence.
“We just have a hard time dissociating the random events that, along with the quality of the decision, jointly contribute to the outcome,” explains Krishna Savani at the Nanyang Technological University in Singapore.
The finding, published in 1988, has been replicated many times, showing that negative results lead us to blame someone for events that were clearly beyond their control, even when we know all the facts that excuse their decision-making. And we now know that the opposite is also true: thanks to the outcome bias, a positive result can lead us to ignore flawed decision-making that should be kept in check, giving people a free pass for unacceptable behaviour.
In one experiment by Francesca Gino at Harvard Business School, participants were told a story about a scientist who fudged their results to prove the efficacy of a drug they were testing. Gino found that the participants were less critical of the scientist’s behaviour if the drug turned out to be safe and effective than if it turned out to have dangerous side effects. Ideally, of course, you would judge both situations equally harshly – since an employee who behaves so irresponsibly could be a serious liability in the future.
Such flawed thinking is a serious issue when considering things like promotion. It means that an investor, say, could be rewarded for a lucky streak in their performance even if there is clear evidence of incompetent or unethical behaviour, since their boss is unable to disconnect their decision-making from their results. Conversely, it shows how a failure can subtly harm your reputation even if there is clear evidence that you had acted appropriately based on the information at hand.
“It’s a big problem that people are either being praised, or being blamed, for events that were largely determined by chance,” says Savani. “And this is relevant for government policy makers, for business managers – for anyone who's making a decision.”
The outcome bias may even affect our understanding of sport. Arturo Rodriguez at the University of Chile recently examined pundits' ratings of footballers on Goal.com. In games that had to be decided by penalty shootouts, he found that the results of those few short minutes at the end of the game swayed the experts' judgements of the players' performance throughout the whole match. Crucially, that was true even for the players who hadn't scored any goals. "The result of the shoot-out had a significant impact on the individual evaluation of the players – even if they didn't participate in it," Rodriguez says. They could simply bask in the victory of others.
Near misses
The outcome bias’s most serious consequences, however, concern our perceptions of risk.
One study of general aviation, for instance, examined pilots’ evaluations of flying under perilous weather conditions with poor visibility. It found that pilots were more likely to underestimate the dangers of the flight if they had just heard that another pilot had successfully made it through the same route. In reality, there is no guarantee that their success would mean a safe passage for the second flight – they may have only made it through by luck – but the outcome bias means that the pilots overlooked this fact.
Catherine Tinsley, at Georgetown University, has found a similar pattern in people’s responses to natural disasters like hurricanes. If someone weathers one storm unscathed, they become less likely to purchase flood insurance before the next disaster, for instance.
Tinsley's later research suggests that this phenomenon may explain many organisational failings and catastrophes too. The crash of Nasa's Columbia shuttle was caused by foam insulation breaking off an external tank during the launch, creating debris that struck a hole through the wing of the orbiter. Foam had broken off the insulation on many previous flights – but thanks to lucky circumstances it had never before created enough damage to cause a crash.
Inspired by these findings, Tinsley's team asked participants to consider a hypothetical mission involving a near miss and to rate the project leader's competence. She found that emphasising factors like safety and the organisation's visibility meant that people were more likely to spot the event as a warning sign of a potential danger. The participants were also more conscious of the latent danger if they were told they would have to explain their judgement to a senior manager. Given these findings, organisations should emphasise everyone's responsibility for spotting latent risks and reward people for reporting them.
Savani agrees that we can protect ourselves from the outcome bias. He has found, for instance, that priming people to think more carefully about the context surrounding a decision or behaviour can render them less susceptible to the outcome effect. The aim should be to think about the particular circumstances in which it was made and to recognise the factors, including chance, that might have contributed to the end result.
One way to do this is to engage in counter-factual thinking when assessing your or someone else’s performance, he says. What factors might have caused that different outcome? And would you still rate the decision or process the same way, if that had occurred?
Consider the case of the scientist who fudged their drug results. Even if the drug was safe in the end, imagining the worst-case scenario – with patient deaths – would make you more conscious of the risks they were taking. Similarly, if you were the pilot who chose to fly in unsuitable conditions, you might look back at each flight to examine any risks you took and to think through how they might have played out in different circumstances.
Whether you are an investor, a pilot or a Nasa scientist, these strategies to avoid the outcome bias will help prevent a chance success from blinding you to dangers in front of your eyes. Life is a gamble, but you can at least stack the odds in your favour, rather than allowing your mind to lull you into a false sense of security.