We are easy to fool because we are lazy. That may be an oversimplification, and an overly cynical one at that. But consider the rumours that have lately swept through WeChat Moments (朋友圈): many defy common sense and collapse under the slightest scrutiny, yet they spread unchecked, like wildfire. Why? Because we are too lazy to vet information that comes from people we know, and even after a rumour has been debunked, we are too lazy to change our minds. A rumour may sound alarmist yet seem harmless, but it acts like a slow poison: it distorts facts, misleads people, and breeds endless trouble. We have to guard against it.
If you ever need proof of human gullibility, cast your mind back to the attack of the flesh-eating bananas. In January 2000, a series of chain emails began reporting that imported bananas were infecting people with “necrotizing fasciitis”—a rare disease in which the skin erupts into livid purple boils before disintegrating and peeling away from muscle and bone.
According to the email chain, the FDA was trying to cover up the epidemic to avoid panic. Faced with the threat, readers were encouraged to spread the word to their friends and family.
The threat was pure nonsense, of course. But by 28 January, the concern was great enough for the US Centers for Disease Control and Prevention to issue a statement decrying the rumour.
Did it help? Did it heck. Rather than quelling the rumour, the CDC’s statement only poured fuel on the flames. Within weeks, the CDC was hearing from so many distressed callers it had to set up a banana hotline. The facts became so distorted that people eventually started to quote the CDC as the source of the rumour. Even today, new variants of the myth occasionally reignite those old fears. The banana apocalypse may seem comical in hindsight, but the same cracks in our rational thinking can have serious, even dangerous, consequences.
Why do so many false beliefs persist in the face of hard evidence? And why do attempts to deny them only add grist to the rumour mill? It’s not a question of intelligence—even Nobel Prize winners have fallen for some bizarre and baseless theories. But a series of recent psychological advances may offer some answers, showing how easy it is to construct a rumour that bypasses the brain’s deception filters.
One, somewhat humbling, explanation is that we are all “cognitive misers”—to save time and energy, our brains use intuition rather than analysis.
As a simple example, quickly answer the following questions:
“How many animals of each kind did Moses take on the Ark?”
“Margaret Thatcher was the president of what country?”
Between 10% and 50% of study participants presented with these questions fail to notice that it was Noah, not Moses, who built the Ark, and that Margaret Thatcher was the prime minister, not the president—even when they have been explicitly asked to note inaccuracies. Known as the “Moses illusion”, this absentmindedness illustrates just how easily we miss the details of a statement, favouring the general gist over the specifics. Rather than checking the facts, we normally just judge whether a statement “feels” right or wrong before accepting or rejecting its message.
Based on the research to date, Eryn Newman at the University of Southern California suggests our gut reactions swivel around just five simple questions:
Does a fact come from a credible source?
Do others believe it?
Is there plenty of evidence to support it?
Is it compatible with what I believe?
Does it tell a good story?
Crucially, our responses to each of these points can be swayed by frivolous, extraneous details that have nothing to do with the truth.
Consider the questions of whether others believe a statement or not, and whether the source is credible. We tend to trust people who are familiar to us, meaning that the more we see a talking head, the more we will begrudgingly start to believe what they say. “The fact that they aren’t an expert won’t even come into our judgment of the truth,” says Newman. What’s more, we fail to keep count of the number of people supporting a view; when that talking head repeats their idea on endless news programmes, it creates the illusion that the opinion is more popular and pervasive than it really is. Again, the result is that we tend to accept it as the truth.
Sticky nuggets
Then there’s the “cognitive fluency” of a statement—essentially, whether it tells a good, coherent story that is simple to imagine. “If something feels smooth and easy to process, then our default is to expect things to be true,” says Newman. This is particularly true if a myth easily fits with our expectations. “It has to be sticky—a nugget or soundbite that links to what you know, and reaffirms your beliefs,” agrees Stephan Lewandowsky at the University of Bristol in the UK.
In light of these discoveries, you can begin to understand why the fear of the flesh-eating bananas was so infectious. For one thing, the chain emails were coming from people you inherently trust—your friends—increasing the credibility of the claim, and making it appear more popular. The concept itself was vivid and easy to picture—it had high cognitive fluency. If you happened to distrust the FDA and the government, the thought of a cover-up would have fitted neatly into your worldview. That can also help explain why those attempts to correct a myth have backfired so spectacularly, as the CDC found to their cost. Lab experiments confirm that offering counter-evidence only strengthens someone’s conviction. “In as little as 30 minutes, you can see a bounce-back effect where people are even more likely to believe the statement is true,” says Newman.
Fraying beliefs
As a result of these frailties, we are instantly drawn to the juicier details of a story—the original myth—while forgetting the piddling little fact that it’s been proven false. Worse still, by repeating the original myth, the correction will have increased the familiarity of the claim—and as we’ve seen, familiarity breeds believability. Rather than uprooting the myth, the well-intentioned correction has only pushed it deeper.
A debunked myth may also leave an uncomfortable gap in the mind. Lewandowsky explains that our beliefs are embedded in our “mental models” of the way the world works; each idea is interlinked with our other views. It’s a little like a tightly bound book: once you tear out one page, the others may begin to fray as well. “You end up with a black hole in your mental representation, and people don’t like it.” To avoid that discomfort, we would often rather cling to the myth than let our whole belief system start unravelling.
Fortunately, there are more effective ways to set people straight and make the truth stick. For a start, you should avoid repeating the original story (where possible) and try to offer a complete alternative narrative to patch up the tear in people’s mental models. For instance, when addressing the fear that the MMR vaccine may be linked to autism, it would be better to build a narrative around the scientific fraud that gave rise to that fear—rather than writing the typical “myth-busting” article that unwittingly reinforces the misinformation. Whatever story you choose, you need to increase its cognitive fluency with clear language, pictures, and good presentation. And repeating the message, a little but often, will help to keep it fresh in people’s minds. Soon, it begins to feel as familiar and comfortable as the erroneous myth—and the tide of opinion should begin to turn.
At the very least, staying conscious of these flaws in your thinking will help you to spot when someone may be deceiving you. It’s always worth asking whether you have thought carefully about the things you are reading and hearing. Or are you just persuaded by biased feelings rather than facts? Some of your dearest opinions may have no more substance than the great banana hoax of the year 2000.