Thinking like a scientist is really hard, even for scientists. It requires putting aside your own prior beliefs, evaluating the quality and meaning of the evidence before you, and weighing it in the context of earlier findings. But parking your own agenda and staying objective is not the human way.
Consider that even though scientific evidence overwhelmingly supports the theory of evolution, a third of Americans think the theory is “absolutely false”. Similarly, the overwhelming scientific consensus is that human activity has contributed to climate change, yet around a third of Americans doubt it.
We Brits are just as blinkered. In a recent survey, over 96 per cent of teachers here said they believed pupils learn better when taught via their preferred learning style, even though scientific support for the concept is virtually non-existent. Why is it so hard to think like a scientist? In a new chapter in the Psychology of Learning and Motivation book series, Priti Shah at the University of Michigan and her colleagues have taken a detailed look at the reasons, and here we pull out five key insights:
We’re swayed by anecdotes
When making everyday decisions, such as whether to begin a new treatment or sign up to a particular class at uni, most of us are influenced more powerfully by personal testimony from a single person than by impersonal ratings or outcomes averaged across many people. This is the power of anecdote to dull our critical faculties. In a study published last year, Fernando Rodriguez and his colleagues asked dozens of students to evaluate scientific news reports that drew inappropriate conclusions from weak evidence. Some of the reports opened with an anecdote supporting the inappropriate conclusion, while others lacked an anecdote and acted as a control condition. Regardless of their level of university training or knowledge of scientific concepts, the students were less competent at critically evaluating the reports when they opened with an anecdote. “Anecdotal stories can undermine our ability to make scientifically driven judgements in real-world contexts,” the researchers said. Of course, much health and science news in the mainstream media is delivered via anecdotes, increasing the likelihood that news consumers will swallow any claims whole.
We’re overconfident
Confronted with a scientific claim, another reason many of us find it hard to reflect on it scientifically is that we overestimate our comprehension of the science. A study from 2003 asked hundreds of university students to read several science news stories, to interpret them and rate their understanding. The students made many interpretative errors – for example, mistaking correlation for causation – even though they thought they had a good understanding. This is redolent of a survey from the 1980s of thousands of British and American citizens: nearly 60 per cent stated they were moderately or very well-informed about new science findings, and yet far fewer were able to answer easy questions about elementary science. Part of the problem seems to be that we infer our understanding of scientific text from how well we have comprehended the language used. This means that popular science stories written in layman’s language can contribute to false confidence. This “fluency bias” can also apply to science lectures: a recent study found that students overestimated the knowledge they’d derived from a science lecture when it was delivered by an engaging speaker.
We’re biased by our prior beliefs
This obstacle to scientific objectivity was demonstrated by a now-classic study from the 1970s in which participants were asked to evaluate scientific research that either supported or conflicted with their prior beliefs. For instance, one of the to-be-evaluated studies supposedly showed that murder rates tended to be lower in US states with the death penalty. Participants demonstrated an obvious bias in their evaluations. For example, if they supported capital punishment, they tended to evaluate the death penalty study favourably, whereas if they were against capital punishment, they were more likely to see the study’s flaws. Scientific skills offer little protection against this bias; in fact, they can compound it. A 2013 study asked participants to evaluate a piece of research on gun control. Participants with greater numeracy skills were especially biased: if the findings supported their existing beliefs, they were generous in their evaluation, but if the findings went against their beliefs, they used their skills to (in the words of Shah et al) “slam” the findings – a phenomenon dubbed “identity-protective cognition”.