So when someone doesn’t believe something you hold to be indisputably true, you tend to think of several possibilities. One is that people holding the contrary belief may not have all the facts. Or you may think they are not smart enough to draw the correct conclusion from the facts. Or you may think they are in denial, that they simply don’t want to admit the truth because of some vested interest or political agenda.
But my experience with unsuccessfully trying to “convert”
9/11 truthers suggested to me it might be more complicated than that. I began to search out research into why smart
people believe illogical things.
It turns out there is a lot of scientific study of this
question, especially in the area of “heuristics,” cognitive rules-of-thumb that
are ingrained in our brains to help us make everyday decisions without having
to overthink every problem.
Science writer Wray Herbert, whose book On Second Thought is about overcoming
our natural heuristics to make smarter, better judgments, says heuristics are
normally helpful, indeed critical to getting through the myriad decisions to be
made every day. “Heuristics are amazing time savers, which makes them essential
to our busy lives,” he writes. “We don’t want to deliberate every minor choice
we make every day, and we don’t need to.” But, he adds, “there are always risks
when we stop deliberating.”[1]
Heuristics, handy though they are, are also imperfect and often irrational. They can sometimes replace dispassionate, clear-eyed analysis. We tend to rely on our instincts, to “trust our gut.” That works fairly well for most decisions we make in the course of our daily lives, but it can also lead to an element of self-delusion.
One way the gut overrides the brain is by substituting “believability” for careful fact-weighing. If something sounds right, it is more likely to be accepted than something that actually is right but doesn’t ring true. In fact, one study of the persuasiveness of fiction and non-fiction literature found that believability is tied not so much to the factual nature of a story as to the extent to which it aligns with the reader’s general worldview.
Researchers Melanie Green and John Donahue, psychologists at the University of North Carolina, cited that research in their paper, “Persistence of Belief Change in the Face of Deception.” They note prior findings that “individuals often respond to stories on the basis of their plausibility rather than their truth status, so if a story presents convincing characters or situations, individuals may not care as much about whether the events actually took place.”[2]
Building on that previous research, Green and Donahue conducted an experiment in which they gave subjects an updated and condensed version of “Jimmy’s World,” Janet Cooke’s Pulitzer Prize-winning but fabricated account of an eight-year-old heroin addict named Jimmy and his mother, who worked as a prostitute to support her own heroin addiction.
When the subjects learned that the primary character of the story was not real, it did not change their opinion about the overall accuracy of the story. The authors’ conclusion:
The
present research suggests that belief change from a story can remain unaltered
even if the source of the persuasion has accidentally or intentionally misled
the recipient of the persuasive message. The derogation of a lying source does
not extend to correction of story-based beliefs. Similarly, evaluations of the
characters in a story can remain unaltered even in the face of an author’s
deception. The … results provide suggestive evidence that individuals attempt
to correct for false information, but that they do not do so effectively.[3]
I found this result particularly interesting in light of my own personal experience. In the documentary Loose Change, and in many other forums, my words had been deliberately edited to create the false impression that I was reporting there was no evidence a plane hit the Pentagon. I assumed that when I provided corrective information documenting the deception to people who had seen and believed it, their beliefs would change. I was surprised when my explanation had no effect, just as in Green and Donahue’s experiment. They explained it this way, again citing previous research:
Research
on belief perseverance … suggests that when individuals have created a causal
structure to support their beliefs, they retain those beliefs even if they are
informed that the initial information was incorrect. Indeed, comprehending
information often leads to automatic belief in that information, which then
requires motivation and ability to correct.
Participants
clearly recognized the manipulation, as shown by their lower ratings of the
author and their identification of more false notes, but this recognition did
not extend to belief correction.[4]
This may suggest a “heuristic effect”: the subjects felt the story “rang true” even if a key fact was wrong. Their gut told them the story was plausible, and it likely fit with their previously held beliefs and stereotypes about the lifestyles of the inner-city poor.
Stereotyping is one common type of heuristic that people
employ unconsciously to make snap judgments. That person looks dangerous.
Politicians lie. The news media are biased. Those kinds of stereotypes, true as
they may be at times, can color our thinking in ways that are often invisible
to us and, as we’ve seen, once a snap judgment is made it is not easily
changed.
In his 2014 book Kidding Ourselves, Pulitzer Prize-winning journalist Joseph Hallinan argues that “false beliefs can be remarkably hardy,” taking on a life of their own, “wandering the landscape like zombies we just can’t kill.”
…
consider some of the perceptions surrounding the current U.S. president Barack
Obama. Tens of millions of Americans believe that the president of the United
States isn’t even a citizen of the United States. In 2010, some two years after
Obama released a copy of his official birth certificate from the state of
Hawaii, a CNN Opinion Research Poll found more than a quarter of the public had
doubts about his citizenship.[5] Even after Obama released
the so-called long-form of his birth certificate, substantial numbers of voters
still weren’t convinced.[6]
Again, this is a case where what would appear to be definitive corrective factual information had almost no effect, at least not initially. As Hallinan asks, “If facts don’t change minds, what does?”[7]
It turns out quite a few researchers have studied this question, as well as an interesting related phenomenon, the so-called “backfire effect,” wherein people presented with more facts actually become more entrenched in their false beliefs and misperceptions. Popular Mechanics’ Debunking 9/11 Myths would seem to be a prima facie example of the backfire effect, having provoked a counter-book that fought back twice as hard.
Hallinan cites the work of two academics, Brendan Nyhan of
Dartmouth and Jason Reifler of Georgia State, who conducted an experiment using
actual news articles followed by corrections. They wanted to find out whether
the corrections had the desired effect on people with strong partisan beliefs.
The researchers found precisely the opposite, namely that correcting the record
not only doesn’t help, it can sometimes hurt – making people even more certain
they are right, even when they are not.
The experiments documented another important reason why factual misperceptions about politics are so persistent: the subjects’ ideological and political views on the topics in question.
“As a result,” they write, “the corrections fail to reduce
misperceptions for the most committed participants. Even worse, they actually
strengthen misperceptions among ideological subgroups in several cases.”[8]
The authors draw a distinction between citizens who are uninformed and those who are misinformed, that is, citizens who base their policy preferences on false, misleading, or unsubstantiated information that they believe to be true, a condition often directly tied to political preferences.
Nyhan and Reifler cite research showing that after the U.S. invasion of Iraq in 2003, the belief that Iraq had possessed weapons of mass destruction before the invasion was closely associated with support for President Bush. In reviewing the corrective effect of providing relevant facts to people holding misperceptions, they found that subjects were receptive to what was termed “authoritative statements of fact, such as those provided by a survey interviewer to a subject.”
However,
such authoritative statements of fact … are not reflective of how citizens
typically receive information. Instead, people typically receive corrective
information within ‘‘objective’’ news reports pitting two sides of an argument
against each other, which is significantly more ambiguous than receiving a
correct answer from an omniscient source. In such cases, citizens are likely to
resist or reject arguments and evidence contradicting their opinions—a view
that is consistent with a wide array of research.[9]
In his book You Are Now Less Dumb, journalist David McRaney describes a result similar to my own experience with 9/11 truthers when he sums up his attempts to debunk myth with fact on the Internet. McRaney puts his finger on the maddeningly frustrating aspect of the backfire effect: any debate merely convinces both sides that they are even more right.
What
should be evident from the studies on the backfire effect is you can never win
an argument online. When you start to
pull out facts and figures, hyperlinks and quotes, you are actually making your
opponent feel even surer of his position than before you started the debate. As he matches your fervor, the same thing
happens in your skull. The backfire effect pushes both of you deeper into your original beliefs.[10]
When people are not open to objective evidence that would
contradict their current beliefs, they are exhibiting what author Rolf Dobelli
calls “the mother of all misconceptions,” namely confirmation bias.[11] He defines this as “the
tendency to interpret new information so that it becomes compatible with our
existing theories, beliefs, and convictions. In other words we filter out any
new information that contradicts our existing views.”
The irony of confirmation bias is that both sides in the
9/11 conspiracy battle accuse the other of this classic thinking error. After
my interview with 9/11 truther Victor Thorn, his criticism of me centered
almost entirely on his argument that, in having accepted the official version
of what happened, I never actively searched for “disconfirming” evidence. He
was, in effect, charging me with confirmation bias: having accepted the official story, I had stopped looking for other explanations. My counterargument was that once a fact is established beyond any doubt, the search for disconfirming evidence becomes a fool’s errand. Once we have determined the
Earth is round, there really is no need to search for disconfirming evidence of
its possible flatness. But in the case of 9/11 truthers, we can’t yet agree the
world is round.
An interesting new study by Yale Law School professor Dan Kahan tackles the question a different way.[12] Kahan wanted to figure out whether a lack of understanding was the reason large segments of the population, particularly those with firm religious views, reject settled science on questions such as climate change and human evolution.[13]
If, for instance, people were simply unaware of the strength of the evidence for climate change or evolution, or didn’t understand the science behind it, they could, in theory, be given the correct information and might then be convinced of the scientific consensus. But Kahan found there was no significant difference between more religious and less religious people when it came to understanding the basic science; subjects were simply unwilling to endorse the consensus when it conflicted with their religious or political views.
That suggests the problem is not so much a lack of accurate information, or scientific illiteracy, as an aversion to endorsing a belief that runs counter to a sense of identity. In other words, we have a strong emotional attachment to our beliefs because we believe they say something important about who we are. This is one of the factors that makes debunking false beliefs so problematic. For 9/11 truthers to concede that their cause was misguided and mistaken would seriously diminish their self-image as crusaders for truth against evil and corrupt forces in the government and the media.
Historian Richard Hofstadter identified this phenomenon in
his classic 1964 Harper’s magazine
essay, “The Paranoid Style in American Politics”:
The
paranoid spokesman sees the fate of conspiracy in apocalyptic terms—he traffics
in the birth and death of whole worlds, whole political orders, whole systems
of human values. He is always manning the barricades of civilization…. As a
member of the avant-garde who is capable of perceiving the conspiracy before it
is fully obvious to an as yet unaroused public, the paranoid is a militant
leader.[14]
Writing about the implications of Kahan’s study in the New York Times, researcher Brendan Nyhan (whose own study is cited earlier) argued that the findings suggest a need “to try to break the association between identity and factual beliefs on high-profile issues”:
…for
instance, by making clear that you can believe in human-induced climate change
and still be a conservative Republican like former Representative Bob Inglis or
an evangelical Christian like the climate scientist Katharine Hayhoe. But we
also need to reduce the incentives for elites to spread misinformation to their
followers in the first place. Once people’s cultural and political views get
tied up in their factual beliefs, it’s very difficult to undo regardless of the
messaging that is used. [15]
The study, as well as other research in the field, underscores another well-known aspect of basic human nature: once we latch on to a belief, we are loath to let it go. As Kathryn Schulz describes in her book Being Wrong, the great majority of us are often mistaken but rarely in doubt:[16]
A whole lot of us go through life assuming we are basically right, basically all of the time, about basically everything: about our political and intellectual convictions, our religious and moral beliefs, our assessment of other people, our memories, our grasp of facts. As absurd as it sounds when we stop to think about it, our steady state seems to be one of unconsciously assuming we are very close to omniscient.[17]
Schulz observes that we envelop ourselves in a pleasantly delusional fog of certitude, regarding our surefootedness as our default setting; conversely, the idea that we could often be wrong seems, she writes, “rare and bizarre, an inexplicable aberration in the normal order of things.” But she says both self-assessments are essentially overly optimistic delusions,
writing: “Our tricky senses, our limited intellects, our fickle memories, the
veil of emotions, the tug of allegiances, the complexity of the world around
us: all of this conspires to ensure we get things wrong again, and again.”[18]
History is full of examples of brilliant scientists and thinkers who, when confronted by incontrovertible evidence of their mistakes, refused to accept or admit their errors. Mario Livio’s book Brilliant Blunders is a case study of colossal mistakes made by five towering scientific giants of their time: Albert Einstein, Charles Darwin, physicist Lord Kelvin, chemist Linus Pauling, and cosmologist Fred Hoyle.[19] His focus is on how “blunders are not only inevitable, but also an essential part of science.”[20] But his book also shows how smart people, in this case brilliant geniuses, will hold on to false beliefs long after the weight of evidence and the consensus of science have gone against them. In fact, he argues, they fall victim to their own intelligence and prior success, which can result in unwarranted overconfidence.
Here’s how Lord Kelvin’s obstinate refusal to face facts
was summed up by science writer Marcia Bartusiak, who reviewed Brilliant Blunders for the Washington Post:
William
Thomson (later known as Lord Kelvin) was simply stubborn. After achieving
worldwide fame for formulating the laws of thermodynamics in the mid-19th century,
Kelvin went on to estimate the age of the Earth based on the time needed for a
primordial molten planet to cool to its current temperature. He figured 400
million years at most. Biologists and geologists were already estimating ages
far older – billions of years – but Kelvin stuck to his guns for decades, even
when a former pupil matched the geological age with a better physical model of
the Earth and the discovery of radioactivity introduced a new source of energy
for our cooling planet.[21]
The point is that once someone becomes emotionally invested in a belief, often no amount of fact, logic, or reason can persuade even the most intelligent of our species. To paraphrase an old joke: “How many facts does it take to change a person’s mind? Only one, but the person has to really want to change.”
And the primary reason for our stubborn refusal to face facts is that we are hard-wired not to, by our heuristics and biases. One of our biggest self-delusions (aside from thinking ourselves basically right about basically everything) is that we believe we are basically rational creatures who make decisions in a logical way, weighing pros and cons, facts and counter-facts, before coming to a well-reasoned judgment. But that is
not how we typically make important decisions. Think about one of the biggest
decisions in your life: how you chose your mate. Did you make a list of your
prospective partner’s good and bad points? Did you request financial data,
medical reports, school transcripts, and a psychological profile? No, most
likely you went with your gut. Maybe you decided it was fate that brought you
together, your one soul-mate from the billions of humans on the planet. How
illogical and unscientific.
If we are honest with ourselves, we can admit that we are a race of magical thinkers. Setting aside the sensitive and intensely personal
question of religion and its supernatural implications, many of us hold
mystical or illogical beliefs. And even if we are not fully invested in them
and insist we don’t engage in magical thinking, our actions belie that. We
check our horoscopes, as if the alignment of the stars at our birth has a
bearing on our lives. Hotels are built without 13th floors. We resort to
psychics to find missing children. We wear our lucky shirt to help our favorite
sports team win. “It’s only weird if it doesn’t work,” a Bud Light beer
commercial intones. The ad agency that came up with the campaign says: “We know
NFL fans believe that their superstitions – no matter how esoteric or nonsensical
– have real-world consequences on the outcome of the game.”[22]
Luck? Karma? Fate? All magical concepts in which
subjectivity outweighs objectivity, a cognitive bias that even the most
intelligent people fall prey to. Former Psychology
Today Editor Matthew Hutson argues that magical thinking, while illogical
and sometimes outright dangerous, can actually provide benefits by offering a
sense of control and meaning that makes life richer, more comprehensible and
less scary. “Often the biologically modern deliberative system is powerless to
restrain the ancient associative system it’s built on,” he writes. “It makes no
difference how clever you are or how reasonable you try to be: research shows
little correlation between people’s level of rationality or intelligence and
their susceptibility to magical thinking. I ‘know’ knocking on wood has no
mystical power. But my instincts tell me to do it anyway, just in case, and I
do."[23]
Rolf Dobelli argues in The Art of Thinking Clearly that sometimes it is perfectly fine to let your intuition take over. “Thinking is tiring,” he writes. “Therefore, if the potential harm is small, don’t rack your brains; such errors won’t do lasting damage.” But, he says, “in situations where the possible consequences are large (i.e., important personal or business decisions), I try to be as reasonable and rational as possible.”
It is important, Dobelli says, to recognize the difference
between rational thinking and intuitive thinking; the latter is fraught with
subconscious cognitive errors.
The failure to think clearly, or what experts call a “cognitive error,” is a systematic deviation from logic – from optimal, rational, reasonable thought and behavior. By “systematic” I mean these are not occasional errors, but rather routine mistakes, barriers to logic we stumble over time and time again, repeating patterns through generations and through the centuries.[24]
Among the examples of common cognitive errors cited by
Dobelli are our tendencies to overestimate our knowledge, to give too much weight
to anecdotes, to fear losing something more than not gaining the same
thing. “The errors we make,” he says,
“follow the same pattern over and over again, piling up like dirty laundry.”
I have a shelf of books that detail the major mistakes our
brains make on a daily basis. Many cite the same anecdotes to illustrate
various fallacies and flawed thinking. But I’m going to pick out a few from
Dobelli’s book because it is one of the most comprehensive and concise guides
to all the various ways we trick ourselves. In many ways, one could argue
that’s what is happening with many 9/11 conspiracy theorists: they are fooling
themselves into thinking nonsense makes sense. And in the process they are
falling victim to many of the following basic thinking errors, outlined in The Art of Thinking Clearly:
Confirmation Bias – As mentioned earlier, this is the mother of all thinking
errors and probably the best known and understood. We know if we only seek out
and pay attention to information that supports what we already think, we’re
unlikely to change our minds. “Why be informed, when you can be affirmed,” as
the saying goes.
Social Proof – This is sometimes called the “herd instinct” or
“groupthink.” You think you’re behaving the right way when you are doing what
everyone else is doing. As Dobelli points out, this is what drives bubbles and
stock market panic, as well as lesser evils such as fashion, management
techniques, and fad diets. As he says, “If 50 million people say something
foolish, it’s still foolish.”
Authority Bias – Authorities these days are just not all that authoritative. Many are self-promoters who appear on cable networks because they have an inflammatory or outrageous opinion that will stoke controversy and possibly draw viewers. From Dr. Oz to Dr. Phil, many who claim to have special expertise are simply entertainers, sporting white coats or other trappings of authority.
Clustering Illusion – We see patterns everywhere. We see faces in the clouds. We see the man in the moon. The world is one big Rorschach test. After September 11, some people thought they saw the face of Satan in the smoke billowing from one of the twin towers.[25] We see patterns where none exist, and we have trouble accepting that such events are happenstance.
Overconfidence
Effect – This is the difference between what
people know and what they think they know. Dobelli says, “We systematically
overestimate our knowledge and our ability to predict on a massive scale.” For
instance, we all think we are above average drivers, which is impossible if you
think about it. Just like the kids in Garrison Keillor’s mythical Lake Wobegon,
where all the children are above average.
Coincidence – Though unlikely events are inevitable given enough opportunities, people are always amazed by coincidence. “What are the chances of that?” they ask. You get on a plane and the person seated next to you went to your high school or shares your birthday. Coincidence? They don’t think so. But when you consider the large universe of people on planes, it is inevitable that at some point two people with the same birthday will sit next to each other. We’re not good at probabilities, and we tend to read meaning into coincidences that are in fact expected random events. (A rough back-of-the-envelope calculation follows this list.)
Base Rate Neglect – We often overlook the basic truth that the most obvious explanation is the most likely, and that exotic or fantastic scenarios should be considered only after the more probable ones have been ruled out. In medical school, they teach, “When you hear hoofbeats, don’t expect a zebra.” That might be rephrased, “When witnesses see a plane hit the Pentagon, don’t think missile.” (A simple worked example also appears after this list.)
Cognitive
Dissonance – When facts show you were
wrong or failed in some way, you simply reinterpret them retroactively to
conclude you were right. The term “cognitive dissonance” was introduced in 1957
by Stanford psychology professor Leon Festinger. “Festinger’s seminal
observation: The more committed we are to a belief, the harder it is to
relinquish, even in the face of overwhelming contradictory evidence. Instead of
acknowledging an error in judgment and abandoning the opinion, we tend to develop a new attitude or belief that will justify retaining it.”[26]
Association Bias – In his book, Dobelli quotes Mark Twain for the most
trenchant example: “We should be careful to get out of experience only the wisdom
that is in it – and stop there; lest we be like the cat that sits down on a hot
stove-lid. She will never sit down on a hot stove lid again – and that is well;
but also she will never sit down on a cold one anymore.” Conspiracy theorists
often rely on experts who have an “association bias.”
Intuitive Logic Traps – Quick! If a store sells a bat and ball for $11, and the bat costs $10 more than the ball, how much does the ball cost? Did you think $1? That’s the intuitive answer. It sounds right, but the answer is 50 cents. Thinking is harder than sensing; sometimes things that sound right are wrong. (The arithmetic is spelled out after this list.)
Affect Heuristic – This is the classic snap judgment based on emotion, on how we feel about something rather than on a considered evaluation. It is the “like” or “dislike” on Facebook, an automatic impulse registered without taking the time to consider the facts.
Introspection Illusion – We tend to think that introspection, examining and reviewing our own beliefs, helps to refine and increase our self-knowledge. But science suggests otherwise: when we soul-search, we fool ourselves into thinking our introspections are more reliable than they actually are.
The Sleeper
Effect – This is a phenomenon whereby we
are initially unpersuaded by an argument, advertisement, or propaganda because
of what would seem to be an obvious agenda or lack of objectivity of the
source, but later we find ourselves more receptive to the message. Why? Our
memory of the discredited source fades, while the message endures.
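A note on the coincidence entry above. What follows is a rough back-of-the-envelope sketch of my own, not drawn from any of the books cited here, and the figure for how many seated pairs fly in a day is purely an assumption chosen for scale. The point is simply that an event that is rare for any one of us is all but guaranteed to happen to somebody.

```python
# Rough back-of-the-envelope sketch (illustrative assumptions, not airline data):
# the chance that YOUR seatmate shares your birthday is small, but across many
# seated pairs the chance that it happens to SOMEONE is effectively certain.

p_match = 1 / 365                 # chance one particular seatmate shares your birthday
pairs_per_day = 1_000_000         # assumed number of adjacent passenger pairs per day

p_you = p_match                                    # about 0.27 percent for any one person
p_someone = 1 - (1 - p_match) ** pairs_per_day     # at least one match somewhere today

print(f"Chance your seatmate shares your birthday: {p_you:.4f}")
print(f"Chance it happens to at least one pair today: {p_someone:.6f}")
```

The first number is the one our gut fixates on; the second is the one that governs how often we hear such stories.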
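The base rate point can be made the same way. This is a minimal Bayes’ rule sketch with made-up numbers (the prior probability for “zebra” is purely illustrative): when the evidence, hoofbeats, is equally consistent with both explanations, the rare explanation remains exactly as rare as it started.

```python
# Minimal Bayes' rule sketch with illustrative numbers: hoofbeats are equally
# likely whether the animal is a horse or a zebra, so hearing them should not
# move us toward the exotic explanation.

p_zebra = 0.001                      # assumed prior: zebras are rare around here
p_horse = 1 - p_zebra
p_hoofbeats_given_zebra = 1.0        # both animals produce hoofbeats
p_hoofbeats_given_horse = 1.0

posterior_zebra = (p_hoofbeats_given_zebra * p_zebra) / (
    p_hoofbeats_given_zebra * p_zebra + p_hoofbeats_given_horse * p_horse
)
print(f"P(zebra | hoofbeats) = {posterior_zebra:.4f}")   # still about 0.001
```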
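Finally, the bat-and-ball arithmetic, spelled out. Writing x for the price of the ball, so that the bat costs x + 10:

```latex
\[
x + (x + 10) = 11 \;\Longrightarrow\; 2x = 1 \;\Longrightarrow\; x = 0.50
\]
```

The intuitive answer of $1 would put the bat at $11 and the total at $12, which is why it only sounds right.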
This is just a short list of the many pathways humans can
take to arrive at the wrong destination and thereby remain resistant to belief
modification. There are in fact many
more cognitive errors that could factor into conspiratorial beliefs held by
9/11 truthers. But the research offers a reassurance of sorts: many misperceptions, false beliefs, and myths endure not because of evil intent or malicious efforts, but because of simple human nature, the natural way our brains are wired.
The studies on how we fool ourselves comport with my
personal interactions with doubters of the plane narrative. In discussions and debates they came across
as sincere and reasonable.
But
not all had fooled themselves. Some were
deceived by others.
[1] Wray Herbert, On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits (New York: Crown, 2010), 5.
[2] Melanie Green and John K. Donahue, “Persistence of Belief Change in the Face of Deception: The Effect of Factual Stories Revealed to Be False,” Media Psychology 2011, no. 3: 312-331.
[3] Ibid.
[4] Ibid.
[5] “CNN Poll: Quarter doubt Obama was born in U.S.,” CNN.com, August 4, 2010. http://politicalticker.blogs.cnn.com/2010/08/04/cnn-poll-quarter-doubt-president-was-born-in-u-s/
[6] Joseph Hallinan, Kidding
Ourselves: The Hidden Power of Self-Deception (New York: Penguin Random
House, 2014), 97.
[7] Hallinan, Kidding Ourselves,
103.
[8] Brendan Nyhan and Jason Reifler, "When Corrections Fail: The
Persistence of Political Misperceptions," Political Behavior, 2010
(2): 303-330.
[9] Nyhan and Reifler, “When Corrections Fail,” 303-330.
[10] David McRaney, You Are Now
Less Dumb (New York: Gotham, 2013),149.
[11] Rolf Dobelli, The Art of
Thinking Clearly (HarperCollins, 2013), 19.
[12] Dan M. Kahan, Climate Science Communication and the Measurement
Problem (June 25, 2014). Advances Pol. Psych. (forthcoming). Available at
SSRN: http://ssrn.com/abstract=
[13] “Public’s Views on Human Evolution,” PewForum.org, December 30, 2013. http://www.pewforum.org/2013/12/30/publics-views-on-human-evolution.
The Pew Research Center found that 33 percent of the public believes “Humans
and other living things have existed in their present form since the beginning
of time” and 26 percent think there is not “solid evidence that the average
temperature on Earth has been getting warmer over the past few decades.”
[14] Richard Hofstadter, “The Paranoid Style in American Politics” Harper’s Magazine, November 1964.
http://harpers.org/archive/1964/11/the-paranoid-style-in-american-politics/
[15] Brendan Nyhan, “When Beliefs and Facts Collide,” New York Times, July 5, 2014.
[16] Kathryn Schulz, Being Wrong: Adventures in the Margin of Error (New York: HarperCollins, 2010), 4.
[17] Ibid, 5.
[18] Ibid, 9.
[19] Mario Livio, Brilliant Blunders: From Darwin to Einstein — Colossal Mistakes by Great Scientists That Changed Our Understanding of Life and the Universe (New York: Simon & Schuster, 2013).
[20] Ibid, 10.
[21] Marcia Bartusiak, “ ‘Brilliant Blunders’ by Mario Livio, on scientists’
breakthrough mistakes,” Washington Post,
June 6, 2013.
[22] “It’s Only Weird if It Doesn’t Work” campaign for Bud Light. http://www.translationllc.com/iframe/?type=work&id=875
[23] Matthew Hutson, The 7 Laws
of Magical Thinking: How Irrational Beliefs Keep Us Happy, Healthy, and Sane
(Hudson Street Press, 2012).
[24] Dobelli, The Art of Thinking
Clearly, xvi.
[25] “Faces in the Cloud,” snopes.com,
2001. http://www.snopes.com/rumors/wtcface.asp
[26] Robert A. Burton, M.D., On Being Certain: Believing You Are Right Even When You’re Not (New York: St. Martin’s Press, 2008).