Question    A newspaper cannot publish for 174 years without some mistakes. This one has made its share. We thought Britain was safe in the European exchange-rate mechanism just weeks before it crashed out; we noted in 1999 that $10 oil might reach $5; and in 2003 we supported the invasion of Iraq. For individuals, like publications, errors are painful—particularly now, when the digital evidence of failure is both accessible and indelible. But they are also inevitable. The trick is to err well: to recognise mistakes and learn from them. Worryingly, humanity may be getting worse at owning up to its goofs.
    Few enjoy the feeling of being caught out in an error. But real trouble starts when the desire to avoid a reckoning leads to a refusal to grapple with contrary evidence. Economists often assume that people are rational. Yet years of economic research illuminate the ways in which human cognition veers from rationality. Studies confirm that people frequently disregard information that conflicts with their view of the world. Why should that be? Last year Roland Benabou and Jean Tirole presented a framework for thinking about the problem. In many ways, beliefs are like other goods: people spend time and resources building them, and derive value from them. Some beliefs are like consumption goods, prized for the satisfaction they provide in themselves. Others provide value by shaping behaviour: the conviction that one is a good salesman may help generate the confidence needed to close sales.
    Because beliefs are not simply tools for making good decisions, but are treasured in their own right, new information that challenges them is unwelcome. People often engage in "motivated reasoning" to manage such challenges, which Mr. Benabou classifies into three categories. In "strategic ignorance", a believer avoids information that offers conflicting evidence. In "reality denial", troubling evidence is rationalised away: house-price bulls might conjure up fanciful theories for why prices should behave unusually, and supporters of a disgraced politician might invent conspiracies. And lastly, in "self-signalling", the believer creates his own tools to interpret the facts in the way he wants: an unhealthy person might decide that going for a daily run proves he is well.
    Motivated reasoning is a cognitive bias to which better-educated people are especially prone. Not all the errors it leads to are costly. But when biases are shared, danger lurks. Motivated reasoning helps explain why viewpoints polarise even as more information is more easily available than ever before. That it is easy to find convincing demolitions of climate-change myths, for example, has not curbed misinformation on the topic. But the demand for good (or bad) information is uneven. Polling shows, for example, that Democrats with high levels of scientific knowledge are more concerned about climate change than fellow partisans with less scientific background. Even, or especially, sophisticated news consumers look for what they want to find.
    Work by Mr. Benabou suggests that groupthink is highest when people within groups face a shared fate: when choosing to break from a group is unlikely to spare an individual the costs of the group’s errors. If a politician’s fortunes rise and fall with his party’s, breaking from groupthink brings little individual benefit (but may impose costs). The incentive to engage in motivated reasoning is high as a result. Even as the facts on a particular issue converge in one direction, parties can still become polarised around belief-sets. That, in turn, can make it harder for a party member to derive any benefit from breaking ranks. Indeed, the group has an incentive to delegitimise independent voices. So the unanimity of views can be hard to escape until it contributes to a crisis.
    Lowering the cost of admitting error could help defuse these crises. A new issue of Econ Journal Watch, an online journal, includes a symposium in which prominent economic thinkers are asked to provide their "most regretted statements". Held regularly, such exercises might take the shame out of changing one's mind. Yet the symposium also shows how hard it is for scholars to grapple with intellectual regret. Some contributions are candid; Tyler Cowen's analysis of how and why he underestimated the risk of a financial crisis in 2007 is enlightening. But some disappoint, picking out regrets that cast the writer in a flattering light or using the opportunity to shift blame.
    Public statements of regret are risky in a rigidly polarised world. Admissions of error both provide propaganda for ideological opponents and annoy fellow-travellers. Some economists used to seethe when members of the guild acknowledged that trade liberalisation could yield costs as well as benefits. In the long run, such self-censorship probably eroded trust in economists’ arguments more than it built support for trade. It is rarely in the interest of those in the right to pretend that they are never wrong.
Explain the sentence "This one has made its share." (para. 1) with examples.


Answer    The sentence is used to illustrate the topic sentence "A newspaper cannot publish for 174 years without some mistakes" and introduces a list of the mistakes the magazine has made: wrong predictions over the past decades concerning Britain's place in the European exchange-rate mechanism, the fall in oil prices, and America's invasion of Iraq. On all these issues the magazine made wrong or misleading predictions, and the author also admits that such mistakes are "inevitable".

