The facts, studies, and arguments reviewed in this thesis
tend to support several broad conclusions:
1. Conspiracy theories, myths, and other false beliefs have always been – and will likely always be – with us.
The general consensus of those who have spent time studying
conspiracy theories and their believers is that they are not going away.
Richard Hofstadter, whose book Anti-Intellectualism in American Life was awarded the Pulitzer
Prize for General Nonfiction in 1964, wrote in his seminal Harper’s essay:
“This glimpse across a long span of time emboldens me to make the conjecture –
it is no more than that – that a mentality disposed to see the world in this
way may be a persistent psychic phenomenon, more or less constantly affecting a
modest minority of the population.” He ended his essay with this wistful observation: “We are all sufferers from history, but the paranoid is a double sufferer, since he is afflicted not only by the real world, with the rest of us, but by his fantasies as well.”[1]
Author Jesse Walker, the books editor of Reason magazine, summed it up this way
in his own 2013 book, The United States
of Paranoia:
The conspiracy theorist will always be with us, because he will always be us. We will
never stop finding patterns. We will never stop spinning stories. We will
always be capable of jumping to conclusions, particularly when dealing with
other nations, factions, subcultures, or layers of the social hierarchy. And
conspiracies, unlike many of the monsters that haunt our folklore, actually
exist, so we won’t always be wrong to fear them.
Which
leads to another fundamental conclusion of this research, namely:
2. Belief in
conspiracies is not, in and of itself, irrational.
The predisposition to believe the worst about the government, the media, large corporations, or other groups with vested interests is not necessarily a sign of irrational paranoia. It can often be a sign of healthy, well-founded skepticism. Michael Shermer, the founding publisher of Skeptic magazine, writing in Scientific American, notes:
Conspiracies
do happen, of course. Abraham Lincoln was the victim of an assassination
conspiracy, as was Austrian archduke Franz Ferdinand, gunned down by the
Serbian secret society called Black Hand. The attack on Pearl Harbor was a
Japanese conspiracy (although some conspiracists think Franklin Roosevelt was
in on it). Watergate was a conspiracy (that Richard Nixon was in on). How can
we tell the difference between information and disinformation? As Kurt Cobain,
the rocker star of Nirvana, once growled in his grunge lyrics shortly before
his death from a self-inflicted (or was it?) gunshot to the head, “Just because you’re paranoid don’t mean they’re not after you.”[2]
3. What we think we know from history can be
wrong.
Our present beliefs are often filtered through the lens of what we believe has happened in the past. The common maxim, “History is written by the
victors,” is often attributed to Winston Churchill, but quote researcher Ralph
Keyes says it is actually an old idea given modern expression by many people,
including Napoleon, Nehru, and Stalin.[3]
A pithier and more
cynical view of history is rendered by Ambrose Bierce in his classic Devil’s Dictionary: “History, n. An account mostly false, of events
mostly unimportant, which are brought about by rulers mostly knaves, and
soldiers mostly fools.”[4] But a more scholarly examination of “things we know that ain’t so” is offered by American University Professor W. Joseph Campbell in his survey of what he labels “ten of the greatest misreported stories in American journalism.” He blames not fringe
conspiracy theorists for the persistence of many false beliefs, but mainstream
news organizations and well-respected historians for perpetuating what he terms
“media myths.” Campbell debunks what he calls “some of the most treasured
stories in American journalism,” such as the belief that President Lyndon
Johnson, after watching a broadcast by CBS news anchor Walter Cronkite,
experienced a sudden epiphany, saying, “If I’ve lost Cronkite, I’ve lost middle
America.”[5]
Media
myths also tend to minimize or negate complexity in historical events and present
simplistic and misleading interpretations instead. Edward Murrow no more took
down Joseph McCarthy than Walter Cronkite swayed a president’s views about the
war in Vietnam. Yet those and other media myths endure, in part because they
are reductive: they offer unambiguous, easily remembered explanations of
complex historic events.[6]
The corrosive effect of such false accounts of historic
events is that they erode trust in what purports to be factual, accurate
reporting. If the accepted view of history is wrong, how reliable is the
accepted view of anything? And the media myths also tend to support negative
stereotypes. Campbell cites what he labels “the widely misreported pandemic of
crack babies in the late 1980s and early 1990s, and the exaggerated reports of
violence in the aftermath of Hurricane Katrina in 2005” as two examples that
falsely confirmed the “worst pathologies associated with inner-city poor
people.”[7]
Those who do not remember history may be condemned to
repeat it, as the adage goes, but those who misremember history are also at
risk of learning the wrong lessons and perpetuating damaging stereotypes.
4. There can be
honest disagreement about what is a fact and what constitutes truth.
Is Jesus the son of God? For many Christians this is a matter
of faith, not subject to any scientific, evidence-based test. But a rational, fact-driven argument can be constructed as well. Paul Little does a commendable
job in his book Know Why You Believe.[8]
And because the scientific method leaves room for doubt, and in fact encourages
doubt and the search for disconfirming facts, it is by its nature a process
that accommodates new ideas and concepts that may have at one time seemed
unthinkable or improbable. All human knowledge is subject to revision if new facts
or evidence emerge. There is a difference between what is “settled science” and what may be an immutable law of the universe. Truths we hold to be
“self-evident” may not always be true or self-evident.
5. Human nature
is not predisposed to unemotional rationality.
In this thesis I have explored many ways we humans fool
ourselves into thinking we are behaving rationally and acting logically when in
fact we are not. I have examined how heuristics – the mental shortcuts that
help us navigate life’s decisions without having to burden our brains with
exhaustive analysis – also make us more likely to accept false beliefs and faulty logic. And I have concluded that this is not necessarily a failing. We simply could not function if we had to do an exhaustive analysis before adopting any belief. We trust our instincts, and that mostly works for us. I do think it can be very beneficial to be more cognizant of when we are relying on intuition rather than reason. That increased awareness of the subconscious ways our brains fool us can, by itself, improve our judgments and decision-making.
6. Efforts to
correct false beliefs are often ineffective and sometimes counterproductive.
This paper also documents the strong emotional attachment
we have to our beliefs. Satirist Stephen Colbert is widely credited with the
tongue-in-cheek observation, “I'm not a fan of facts. You see, the facts can
change, but my opinion will never change, no matter what the facts are.” I
spent an inordinate amount of time searching for the time and place Colbert originally made that comic riff, including scouring an entire book about his life, but never found the source.[9] Maybe he never really said it. I cite it anyway because it so perfectly encapsulates our natural tendency toward cognitive dissonance. After reviewing hundreds of pages of research and dozens of books on the subject, if I had to distill the reason we believe things that are not true, in defiance of all logic and reason, it would be this: we believe untruths mostly because we want to.
7. It’s
complicated.
Some subjects are just very complex and difficult to
understand, and not just for the layman. Some may require special knowledge in
areas such as science, mathematics, medicine, or even history. And we all have
gaps in our knowledge. We simply can’t know everything there is to know about
everything. The result is that we may defer to experts, who themselves can be confused, or, more often, just rely on our instincts, which the research shows
can mislead us in myriad ways.
What’s the Answer?
There are essentially two areas of pursuit when it comes to
correcting false beliefs: debunking the purveyors and/or convincing the
believers. It may seem, at first blush, that the two areas are in fact the
same. That is to say that the first goal, “debunking the purveyors,” is simply
a means of achieving the second goal, “convincing the believers.”
But they are in fact two very different things. If someone
is spreading a false rumor about you, you can try to get the gossipmonger to
stop, or you can try to convince people the rumor is untrue.
As we have seen when it comes to the false narrative about
the September 11 attack on the Pentagon, efforts to dissuade the purveyors of
the misinformation ultimately proved fruitless. To the extent there was any
success in containing the spread of the Pentagon attack “inside job” conspiracy
theory, it came not from curbing the selling
of the story, but from discouraging the buying
of it.
Simply offering an alternative product – in the form of a
higher-quality, more authoritative, better-reasoned, fact-based account – also
had limited effectiveness among those who were predisposed to accept the false
account.
In his book, On
Rumors, Harvard Law professor Cass Sunstein posits – for the sake of
argument – that society could be thought of as consisting of two groups: the
“sensibles and the unreasonables.” Both groups, he argues, are likely to
process information in a biased manner, accepting those materials that fortify
what they thought before and rejecting those that contradict their original
views as “implausible, incoherent, ill-motivated, and probably a bit crazy.”[10]
Sunstein cites two factors in the biased assimilation of information: strong prior belief and skewed trust. He cites the research of Kahan et al. in asserting: “If you want to move people away from their prior
convictions, it is best not to present them with the opinions of their usual
adversaries, whom they can dismiss, but instead the views of people with whom
they can closely identify.” [11]
So while Thomas Patterson argues in Informing the News that the solution is training better journalists, giving them a more solid grasp of their subject matter and a knowledge of how to use knowledge, I would argue that the solution, to the extent there is one, is to expand that argument to education as a whole.
A more savvy news and information consumer is the best defense against the flim-flam artists peddling pseudo-facts to an all-too-gullible public. One can marshal all the facts, logic, and reason in the world, but a closed mind will remain an impenetrable barrier.
Researchers Green and Donahue recognized this factor in
their study of the effect of corrective information on people exposed to
fictional narratives. For the new facts to change beliefs, the subjects had to
be open to the information, as well as have the thinking and reasoning skills
that would allow them to properly process the information. “People also have to
possess certain qualities of mind: critical reasoning skills essential
to drawing valid inferences from evidence; a faculty of cognitive perception
calibrated to discerning when a problem demands such reasoning; and the
intrinsic motivation to perform the effortful information processing such
analytical tasks entail.” [12]
In 2012, Charles Negy, Associate Professor of Psychology,
University of Central Florida, encountered what he considered a deficit in
critical thinking skills in his Cross-Cultural Psychology lecture class of
almost 500 students. The issue arose when a small number of Christian students asserted that their religious views were the “most valid,” and one student urged others not to engage in an exchange of ideas because, presumably, the validity of Christianity was not debatable. It was more than Professor Negy could stand.
After the class, Negy composed an email to his students decrying what he labeled as “let's just put our fingers in our ears so we will
not hear what we disagree with,” “anti-intellectualism,” and “religious
bigotry,” and in particular the unwillingness of college students to seriously
consider other points of view. “Some
students,” he wrote, “erroneously believe a university is just an extension of
high school, where students are spoon-fed ‘soft’ topics and dilemmas to
confront, regurgitate the ‘right’ answers on exams (right answers as deemed by
the instructor or a textbook).”[13]
Negy’s email went on to explain that the purpose of a
university, and his course in particular, is to struggle intellectually with
some of life's most difficult topics, while noting that “it is not the case that all views are equally valid; some views are more defensible than others”:
Critical
thinking is a skill that develops over time. Independent thinking does not
occur overnight. Critical thinkers are open to having their cherished beliefs
challenged, and must learn how to “defend” their views based on evidence or
logic, rather than simply “pounding their chest” and merely proclaiming that
their views are “valid.” One characteristic of the critical, independent
thinker is being able to recognize fantasy versus reality; to recognize the
difference between personal beliefs which are nothing more than personal
beliefs, versus views that are grounded in evidence, or which have no evidence.[14]
The body of research on false beliefs and magical thinking
firmly demonstrates that they are not a manifestation of ignorance or psychological problems. Rather, they are a normal and common part of our psychology, a coping mechanism that helps humans process and understand a world that can seem disordered and even sinister.
In his book, Believing
in Magic, Connecticut College psychology professor Stuart Vyse argues,
“Without making criminals out of believers, we must adopt policies that
encourage people to choose reason over unreason. We must provide alternate
methods of coping with life’s uncertainties, and promote other more rational
systems of belief.” But he concedes, “It will not be an easy task.”[15]
Professor Vyse teaches a class that I would think should be
required at all universities, in all disciplines. It is a freshman seminar
called “Psychology and Critical Thinking,” and he describes it as basically
“how to tell a good argument from a bad one.” His assigned reading is an essay
by American philosopher Charles Sanders Peirce, in which the 19th-century pragmatist outlines four basic ways of attaining knowledge: tenacity,
authority, a priori, and the scientific method.[16]
Vyse argues that tenacity, holding onto an idea out of “stubborn loyalty,” is the poorest source of knowledge. Authority is not much better, he says, because authorities are often wrong, especially when they “resort to their powerful status as support for the validity of their ideas.”[17] A priori knowledge is essentially our intuitive sense: something that makes sense or seems to fit. Earlier in this thesis, I outlined the shortcomings of relying on intuition over reason, and why it is prone to error. Finally, there is the “scientific method,” which Vyse actually divides into two methods, “empiricism” and “rationalism.”[18]
It would take another thesis entirely to explore the ways
scientific methods differ and how the self-correcting process of scientific
inquiry can result in theories that are modified or discarded over time.
But Vyse makes the overarching case, one I would make as well, that it is important to teach the skills of critical thinking early, not waiting until students get to college but beginning in elementary school.
Students
could be taught to evaluate the authorities they encounter on television and
elsewhere. They can learn how to determine whether the views of the authority are based on empirical inquiry (good), personal experience (not so good), or yet another authority (bad). Does the authority attempt to convince her audience by
an appeal to the evidence or to her personal status and power? Does the
authority have a vested interest in a particular view? What is the quality of
the evidence given?[19]
I would suggest that journalists seeking to find and further the truth, as well as engaged citizens at every level, would do well to ask and
answer those questions.
This thesis took me on a journey of inquiry that, over a period of several years, circled back close to where I started: finding myself at the center of a conspiracy theory and wondering how so many smart people could believe dumb things. I thought then, as I think now, that we all have to learn to think more critically; only now I have a much better understanding of how challenging that can be, and why for some people it is not likely to happen.
I have concluded from my research and from my personal
experiences as detailed in this thesis that there is no quick fix, no antidote
to human nature and all its cognitive errors, no way to marshal facts, logic
and reason that can’t and won’t be met with pseudo-facts, faulty logic, and
emotion. But that does not mean it is
not a fight worth fighting.
[1] Richard Hofstadter, “The Paranoid Style in American Politics,” Harper’s Magazine, November 1964. http://harpers.org/archive/1964/11/the-paranoid-style-in-american-politics/
[2] Michael Shermer, “Paranoia Strikes Deep,” Scientific American, September 2009, Vol. 301, Issue 3.
[3] Ralph Keyes, The Quote
Verifier (New York: St. Martin’s, 2006), 90.
[4] Ambrose Bierce, The Devil’s Dictionary (1911), 57, as cited in Leonard Roy Frank, Random House Webster’s Quotationary (New York: Random House, 1999).
[5] W. Joseph Campbell, Getting It Wrong: Ten of the Greatest Misreported Stories in American Journalism (Berkeley: University of California Press, 2010), 5.
[6] Ibid., 4.
[7] Ibid., 5.
[8] Paul E. Little, Know Why You
Believe (Illinois: Inter-Varsity Press, 1971).
[9] Lisa Rogak, And Nothing But
the Truthiness: The Rise (And Further Rise) of Stephen Colbert (New York:
St. Martin’s Griffin, 2011).
[10] Sunstein, On Rumors, 51.
[11] Ibid., 52.
[12] Melanie Green and John K. Donahue, “Persistence of Belief Change in the Face of Deception: The Effect of Factual Stories Revealed to Be False,” Media Psychology, no. 3 (2011): 312-331.
[13] Charles Negy, Associate Professor of Psychology, University of Central Florida, email to students, published by Huffington Post, August 16, 2012. http://www.huffingtonpost.com/2012/08/16/charles-negy-reddit-letter-to-students_n_1789406.html
[14] Ibid.
[15] Stuart Vyse, Believing in Magic: The Psychology of Superstition, Updated Edition (New York: Oxford University Press, 2014), 255.
[16] Charles S. Peirce, “The Fixation of Belief,” Popular Science Monthly 12 (November 1877): 1-15. http://www.peirce.org/writings/p107.html
[17] Vyse, Believing in Magic,
256.
[18] Ibid., 255.
[19] Ibid.