Robert Kirvel

Believe You Me


     What’s the most outlandish statement you’ve heard a person offer as an opinion the individual nevertheless believes is true? A Midwesterner in my clan, after dismissing the value of books, followed up by insisting, “History never happened,” and defended the belief. Fantastic assertions that appear to be expressed in earnest invite two questions. What’s really going on here? Can science provide any insight? Explanations may lie in understanding some of the emotional triggers and neurological foundations underpinning and reinforcing belief.
     Science, unlike expressions of personal opinion or belief, generates consensus assertions only after accumulating and analyzing evidence. Empirical evidence (data observed through experiments or direct observation) and conclusions from such data are deemed credible in science to the extent the information under peer review meets objective standards for reliability and validity. Reliability is a measure of how consistent and repeatable findings are. Validity is the extent to which procedures measure what they claim to measure and the results can be generalized.
     Beliefs—from relatives or anyone else—are more slippery concepts than data, arising as they do between the fragments of thought and experience. Contemporary philosophers describe a belief as a “propositional attitude” or mental stance. It’s frequently a state of mind or attitude involving trust, faith, or confidence about what’s not there, or the unknown. Science is about understanding what’s really there, as in real data. Science rests on facts or evidence then, whereas people thinking as they please can say or believe any serious or silly thing they want to believe, and they can deny or discount factual evidence that contradicts what they think. The problem with refusing to change one’s mind in the face of counterevidence is that such stubbornness threatens interpersonal relationships and sometimes society as a whole, especially in situations where human cooperation is crucial.
     The distinction between scientific fact and personal belief is reasonably clear-cut then. Scientists value data, and credulous people cherish some of their notions too, especially when it comes to beliefs central to personal identity, but the gullible have no monopoly on balderdash. What are we to make of the following statements attributed to genuine experts, including scientists, over the years?
     The Swiss-born naturalist Jean-André de Luc, who first proposed the term “geology” and advanced the modern concept of geochronology (dating geological events), thought granite boulders came to rest on unlikely spots, such as plateaus in the mountains, after being shot from caverns by compressed air. Picture a large underground blowgun, and you have the idea. Linus Pauling, the only person to be twice awarded an unshared Nobel Prize, was convinced until his death from cancer that vitamin C cures cancer. These are remarkable claims, but other scientists have made even weirder statements about what they believe.
     One of the men responsible for unraveling the double-helical structure of DNA and honored with a Nobel Prize for the work, Francis Crick, maintained without evidence our planet was “deliberately seeded with life by intelligent aliens.” William Herschel, the highly regarded astronomer who discovered Uranus and several of Saturn’s moons, believed aliens living on the sun have giant heads so their noggins don’t explode. Another of the greatest scientific minds of the twentieth century, British astronomer Fred Hoyle, who was knighted in 1972, suggested human nostrils evolved with the openings facing the ground so that pathogens floating down from outer space wouldn’t fall into them.
     Perhaps it’s less amusing when a politician tells the public a free press is the enemy of the people. It may be depressing or even shocking to hear a leader suggest good people don’t go into government or that individuals can never be too greedy. The problem is that once we adopt a belief as our own, we carry a representation of the belief in our heads—a sort of “belief box” or mental dialogue as some psychologists think of it—and the belief can be taken out of the box and implemented to play an important role in shaping behavior. Accompanying a belief is the potential for real behavior arising from it, but if a belief is false, then action and reaction can become problematic.
     What we conclude about a person’s beliefs begins to make more sense when we consider the viewpoint of the individual holding the belief, but intentions are not always obvious. When an otherwise reputable scientist claims boulders are shot out of the Earth like cannonballs, or the ancestors of mankind are from outer space, the claimant might just be seeking attention or notoriety. When an expert claims noses point downward to protect against cosmic debris, the objective might be nothing other than to provoke folks with sheer speculation. When a politician makes ridiculous statements in public, knowing the press will have a field day, the purpose could be simply to instigate, distract, or garner publicity.
     Other interpretations, however, come to mind. It’s possible a person is genuinely convinced what he or she believes must be true. Is it so far-fetched to imagine boulders being hurled from the bowels of the Earth? After all, fragmented rocks the size of compact cars have been documented to fly into the air from explosive volcanoes. Is it possible for aliens to have seeded the Earth with the fabric of life delivered in meteors or spaceships? Certainly. A hypothesis about life being transferred between planets or star systems dates back to the ancient Greeks and is dubbed panspermia. It’s also possible the Moon’s core is made of cottage cheese or macaroni, though such “hypotheses” are not scientific propositions because they are not experimentally testable or measurable, at least at the moment. A subject-matter expert might actually believe in an idea that appears on the surface to be ludicrous, but when an authority figure states an unsupported belief, one conclusion is indisputable. A scientist mouthing unsupported personal opinion is no longer speaking as a scientist or from reliable and valid scientific evidence. A politician spewing outrageous ideas is no longer speaking on behalf of the nation as a whole or necessarily in the best interest of humanity.
     Beliefs are generally underwritten by feelings. Virtually any emotion—from lust or love to fear and rage—can shape a given belief. For example, we might subconsciously opt to believe in an eternal after-life to cope with emotional pain associated with loss of a loved one, or to reduce our dread over the certain knowledge one day we too shall die. Throw into the mix one’s age, education, political bent, gullibility, ego, aspirations and passions, religious conviction or its absence, anxieties and preoccupations, and it becomes apparent that gut feelings can reinforce or erode opinions we hold dear. 
     When my relative asserted, “History never happened,” the conversation had already been politically charged. Context, emotional or otherwise, is not just relevant but usually key to understanding what appear to be nonsensical statements of belief. Here, the unexpected statement was in response to a suggestion the individual might profit from reading a book on a contentious topic. “I don’t need to read books,” he replied. “Anyone can say or write anything in a book these days. Besides, anything written in books about things happening, say, a hundred years ago is phony. No one knows about any of that stuff. History never happened.”
     Now to be sure, I like exaggeration in myself far more than in others, as most do, but these are among the most jarringly anti-intellectual remarks I’ve encountered in recent memory. Will and Ariel Durant (The Story of Civilization) would surely take exception to the idea that history books are worthless for documenting cultural knowledge, but there’s little doubt my relative has confidence in what he says. Belief, in short, can be and often is about thin air floating on a foundation of thinner air. In contrast to belief, the only thing that legitimately drives science is empirical evidence, but of course scientists are human too. To the extent a scientist’s emotions or ulterior motives affect science, the process is rendered corrupt and unscientific in direct proportion.
     A person who renounces book-reading along with the very existence of history appears on the surface to be illogical, even delusional. One might as well claim the universe doesn’t exist, but the individual hawking gobbledygook in this case was radically politicized rather than solipsistic. It’s one thing if a book-hater deploys the belief by advocating book burning; however, perspective changes entirely when considering some likely underlying motivations. If an individual feels insecure about a lack of book reading or education in general, or feels threatened by opposing viewpoints—in other words, experiences emotions arising from uncertainty or conflict—one of the most common and effective responses is to resort to a defense mechanism, such as denial. In the present context, the internal voice of denial might announce itself along the following lines in response to a perceived threat:
     “No, I’m not wrong, you are wrong. If some book says I’m wrong, then books are worthless and so is history.”
     The argument that anybody can write anything in a book also entails rationalization, which is another common defense mechanism for responding to a perceived threat. Rationalization often involves justifying some attitude or behavior (making excuses) to avoid or hide the real explanation or motive. On factual grounds, a free hand in writing anything one pleases might apply to love letters and blogs and some fiction, but not to historical writing of scholastic merit, or to annotated and peer-reviewed science writing with its emphasis on reliability and validity. It’s fair to ask whether the claim made about books being worthless, along with history, is actually about something else. 
     If a person says, for example, “Rats eat garbage and live in it, and they do just fine,” to justify neglect of the homeless, the argument is not about rats or garbage. Similarly, statements about books and history being rubbish are about neither books nor history, but about carpet-bombing a conversation to win at any cost. Discounting the value of all books in this case, and rejecting any new information available in them without even considering the content, is an example of rationalization to maintain a sense of social and political identity. 
     The same argument can be made about skeptics of science in general or those who appear to reject any science-derived and -validated conclusions. From the examples of celebrated scientists such as Pauling, Crick, and Herschel, it’s obvious not all scientists are faultless in their reasoning, and science itself as a discipline is hardly immune to criticism. Indeed, science welcomes criticism. But an individual who disparages the scientific method in general, or dismisses all scientific conclusions as worthless, just to defend a particular conviction, is not making a valid argument against science. Is a rejection of science more likely to arise from an intimate familiarity with the scientific method or its opposite? If the latter, then the criticism has little to do with science. Rather, what is being verbalized is a misconception about what someone imagines science to be, or it’s about intellectual insecurity along with an ignorance of science. 
     Beyond psychological concepts such as insecurity or denial and rationalization, brain researchers are discovering neural sites and pathways in the brain active in maintaining belief when a person is challenged. In a paper published in Scientific Reports, Kaplan, Gimbel, and Harris showed altered activity in the prefrontal and orbitofrontal cortex as well as in emotional and “feeling” (e.g., threat) centers, such as the insula and amygdala, when deeply held political beliefs of test subjects were challenged. Such research verifies an emotional, neurological component underlying certain beliefs.
     The Irish writer, teacher, and language activist Máirtín Ó Cadhain said (in translation) the year before he died, “We know more about the stars in the firmament than about what’s going on under that small skull beside you.” Although we are only at the frontier of figuring out the brain’s role in maintaining or modifying beliefs, neurophysiological research shows considerable promise in unraveling what’s going on at the physical level of neurons. 
     So what led up to the declaration about books being worthless and history never happening? The trigger had been the idea of splitting, yet another ego defense mechanism. Psychological splitting is sometimes described as polarized or black-and-white thinking in which aspects of the world are split into right and wrong or good and bad, with no allowance for nuances. A person who believes all Republicans are righteous and all Democrats are demons, or vice versa, might hold such a belief to simplify unsettling complexities or to reinforce a sense of self-worth. In response to a suggestion that reading something about splitting might be useful, my relative dismissed both books and history: all books, and all of history predating the lifetimes of people who have experienced actual events. The irony—if it can be called irony—is the mention of one defense mechanism, splitting, was perceived as an attack rather than a path to understanding, initiating the expression of two more defenses: denial and rationalization. 
     When encountering what appear to be outlandish statements made in earnest, each of us has options. As tempting as it might feel to launch a hot or cold rebuttal, the wiser course can be to show restraint in interpreting such remarks literally. What’s really going on may not be so much about the words spoken, but rather what prompts or underlies the words, including emotional triggers and neurological foundations underpinning and reinforcing belief. What a person says, especially regarding an inflexible or nonsensical belief, can sometimes reveal more about aspects of underlying character and neurobiology unavailable directly to eye and ear than any literal interpretation of words spoken aloud.



© 2015 by William Ray