Here’s a book I read recently, just a week or so after returning from the hospital in late June, and which I’ve skimmed again in the past week to take notes and write up this summary. It’s one of two I read around that time by authors who are also media personalities. Shankar Vedantam runs a podcast called The Hidden Brain, and for a while (though I haven’t heard him there recently) was a special correspondent on NPR, IIRC in the mornings on All Things Considered. (The other book, which I’ll cover shortly, is by Fareed Zakaria, host of a CNN show and a guest columnist for the Washington Post.)
Vedantam’s book is another in a line of books in recent years about how our perceptions and cognitions are molded by the evolutionary priority of survival, and not accurate perception of the “real world.” It’s also a book that resembles various pop-psychology books, like those of Malcolm Gladwell, in that it presents a thesis upfront and then illustrates it with various case studies, some of them anecdotes about individuals, some of them results of psychological experiments.
This book is especially true to that form. In fact, the thesis of the book is contained entirely in the introduction, over about 10 pages, and the rest of the book is a series of examples to justify it, some of them at great length. So I’ll summarize the introductory thesis in some detail, and then summarize the 10 chapters of the book more briefly.
Useful Delusions: The Power & Paradox of the Self-Deceiving Brain (Norton 2021), by Shankar Vedantam & Bill Mesler.
Introduction
- SV was on a road trip to meet Donald Lowry, a notorious con artist who’d scammed hundreds of men via a direct-mail scheme. What was odd was that many of his victims *defended* him. How could this be?
- Exploring this case caused SV to reconsider his worldview. He’d always figured that avoiding mental errors and biases was a good thing; that rationality should prevail. So what was going on? Perhaps the victims were gullible rubes? Or could Lowry’s “Church of Love” have provided some kind of valuable service?
- Could self-deception have good outcomes? Could it be functional?
- He came to realize that foregoing self-deception is a sign of privilege. “If you don’t believe in Santa Claus or the Virgin Birth, it’s because your life does not depend on your believing such things. … But should your circumstances change for the worse, were the pillars of your life to buckle and sway, your mind, too, would prove fertile ground for the wildest self-deception.” (page xviii)
- We need hope in order to function, but the world offers mostly despair and dysfunction. Human history is a tiny sliver of cosmic history; the earth is just one planet among hundreds of billions of planets in our own galaxy alone. Some of us see this and wonder; others see these things as suggesting that humans are insignificant in the large scheme of things. But this is not a useful attitude for survival.
- Thus all human cultures have beliefs that tell people their lives have purpose and meaning; religions offer reassurances about what happens to you after you die. “Poking holes in these claims is easy, because they are often illogical and far-fetched.”
- [[ Passing thought: thus the need for such beliefs is the price of thinking and self-awareness; humans still need to survive, and thus allow instincts to prevail; the problem is reconciling such thinking with the instincts. Dumb animals don’t need such things; they just follow instinct to reproduce and survive. Of course the distinction isn’t crisp. ]]
- Thus if deception is functional, it will survive. Life doesn’t care about what’s true; it cares about what works.
- The eye, for example, receives far more information than it passes on to the brain; yet the brain gives you the illusion that you’re seeing everything. “Our eyes and brain are not in the truth business; they are in the functionality business…” “Your brain has been designed [[ well no this should be ‘has evolved’; designed only in a passive sense ]] to help you survive, to forage for opportunities, to get along with mates and friends, to raise offspring to adulthood, and to avoid feelings of existential angst. From the perspective of evolution, objective truth is not only not the goal, it is not even the only path to the goal.” (both p. xxi)
- Freud recognized that the mind has layers; newer layers, like those in the city of Rome, do not wipe out the older ones. Thus the most recent mental faculties that humans have evolved enable us to override what feels true with what we know to be true. But that does not mean, it turns out, that logic and rationality are all that matter. That would be thinking that the past does not matter and plays no role in shaping current reality.
- “This book argues that across many domains today, and especially where we see the forces of culture and reason and logic besieged by unreason, tribalism and prejudice, we are really seeing projections of a war raging inside our own heads. Conflicts between tradition and modernity have parallels inside the brain…” The old city and the new speak different languages.
- The argument is “that just because self-deception can lead us to ruin, it does not necessarily follow that it has no role to play in ensuring our well-being.” (p. xxiv).
- “Our minds are not designed to see the truth, but to show us selective slices of reality, and to prompt us toward predetermined goals. Even worse, they are designed to do all this while giving us the illusion that we are seeing reality.” (p. xxiv)
This dovetails with my thesis that people not only aren’t rational most of the time, they don’t *need to be* rational to get along and survive; they can rely on the instinctive protocols of basic survival, which value tribal identities and superstitious hypotheses about the world, rather than rational understanding of the world.
Personally, I’m interested in sidestepping the mind’s illusions, peeking around the corner so to speak, and seeing what reality is really like. (And this, at its best, is what science fiction is about.)
Of course these themes have run through prior books over the past decade or more. Jonathan Haidt contrasts the “elephant” and the “rider,” where the latter is the rational part of the mind yet has only limited control over the irrational part, which makes decisions on emotional grounds. Matthew Hutson’s book The 7 Laws of Magical Thinking invites us to recognize the various kinds of magical thinking we are subject to, and to welcome them, even use them judiciously, to make us happier.
For the rest, here are relatively brief summaries of the remaining chapters. (Which, as I type, I can’t help expanding to include salient points and pertinent quotes.)
- Ch1, Hot Air. Example of a “professional liar” who is in fact a hotel concierge, having to constantly calm hotel guests and be nice to them (chart of sample phrases, p9), no matter how rude they are. Thus the idea that we often say what we do not think. Trump, unlike Obama, had no filter. King Lear; Dickinson’s “Tell all the truth but tell it slant.” Can we try not lying for a few days? In fact we all commit lies of politeness, and especially to the people closest to us. To avoid lies is to care more about the truth than about outcomes.
- Ch2, Everything Is Going to Be OK. Americans value honesty—except when children are involved. Thus Santa Claus, and George Washington and the cherry tree (an invented story). And coaches to athletes. And to friends about their illnesses (“you look great!”). Reading the Bible makes some people feel much better. Example of Dan Ariely*, burn victim, who didn’t want the truth about his condition. And in fact studies have shown that patients who want to realistically confront their conditions – the example is about AIDS victims – die earlier.
- Ch3, The Theater of Healing. Examples of medical fraud and placebos. In Paris, 1784, Mesmer claimed “magnetic healing”. In 1994 a surgeon named Bruce Moseley realized that his knee surgery patients felt better whether or not he actually did the surgery. Is there a difference between using placebos and what witch doctors do? Yet deliberate use of placebos is contrary to medical ethics.
- Ch4, The Invisible Hand. About Penn & Teller; the placebo effect; and the stories behind what we buy. Advertising. Sometimes *raising* the price increases sales. Perceived quality of wine, and of gym drinks, is affected by price. Logos vs. mythos. The story of Joshua Bell playing in a subway station and attracting no notice; changing the story changes the music.
- Part II, chapters 5 through 7. These are all about the scammer Donald Lowry, who ran a direct-mail scam targeting lonely men, pretending to be lonely women. His depictions of the women were convincing enough that he could ask the men for favors, for money, which the men were happy to provide. And yet, the broad point is, even when he was exposed, his “victims” were happy to support Lowry, because he’d provided something valuable in their lives.
- P65: “In recent years, psychologists and neuroscientists have shown that the human brain is designed [[ again: has evolved ]] to make a number of errors in perception and judgement. These ‘bugs’—distortions, shortcuts and other cognitive cross-wiring—produce slanted pictures of reality. They exist for a reason: Evolution found that, on average, the bugs lead to a greater likelihood of survival and reproduction. It’s a grave mistake to think that evolution is remotely interested in helping us perceive reality accurately.”
- And p69, how this relates to the film, loved by Roger Ebert, about pet cemeteries, *Gates of Heaven*.
- Part III: The Tribe.
- Ch8, Walking through Fire. About rituals, in Rwanda, the Ghost Dance at Wounded Knee, and their function as commitment to a group.
- P142.4: “While bulletproofing is an extreme example of the vast array of ritual practices we see throughout the world, it provides important clues to why billions of human beings perform an array of seemingly senseless acts every day. Rituals encompass everything from hazing at university fraternities and the consumption of bread and wine at the Eucharist to shaking hands when we greet each other and celebrations by soccer players when they score a goal. …”
- P 150m: “Taking part in painful or difficult rituals—what experts call ‘costly rituals’—is particularly useful in *signaling our commitment* to groups, ideologies and causes, and in eliciting trust and commitment from others.”
- Ch9, Something Worth Dying For. About commitment to one’s country, commitment to founding myths and national truths.
- “These self-deceptions are responsible for creating some of the crowning glories of human civilization.”
- Without the story of the US, we wouldn’t have Neil Armstrong walking on the moon, Apple and Google, the Yankees or the Eagles.
- Yet the Muslims also have their videos of sacred causes.
- P170: “Our minds are vulnerable to myths, falsehoods and fiction not merely because we are dumb or stupid, but because we are frail, flawed, and easily afraid.”
- P171: “In the future, should nations no longer be the principal way people organize themselves on the planet, it’s a safe bet we will develop new stories and myths to make whatever systems we come up with seem like eternal truths.”
- Ch10, The Grand Illusion. About the fear of death, and terror management. The Egyptian pyramids, the capacity for self-deception.
- P180, the mortality paradox: “We understand that one day we will die, but we cannot really imagine being dead.”
- P181, Thus humans have invented “immortality narratives”. And as societies grew, these became religions, with punitive gods and moral codes. What would be the converse? And in fact religious belief does extend life spans, if only through having extended social support networks.
- P192: “religious faith might be the canonical example of how beliefs that are unprovable, or even demonstrably false, can sometimes be good for us.”
- Epilogue. P 198: “This book has attempted to reveal a fundamental paradox in our relationship with the truth.”
- P199: “What psychological benefit does holding a false belief confer on the people who hold it? What underlying needs does it address? Are there other ways to address those needs? … Throwing evidence and data against passionately held false beliefs is important, but often futile. Many people hold false beliefs not because they are in love with falsehoods, or because they are stupid—as conventional wisdom might suggest—but because those beliefs help them hold their lives together in some way. Perhaps the delusion provides comfort against anxiety, or a defense against insecurity. Perhaps it draws the approval of their groups. Just as beliefs about an omniscient, angry God fall away when a functioning state provides us with infrastructure, laws and public safety, and when market economics provide us with consumer goods, entrepreneurial opportunities and good jobs, so also the way to root out self-deception is by compassionately asking what people lack, and exploring how we might help replace what is missing.”
- So, p200, “As a pragmatist, I say we should care much less about what’s true, and much more about what works.” Concerning climate change. So can we use religion to solve these large problems?
- Finally the author invites you to consider the last time you read a great novel. Those incidents, those characters, likely moved you, even as you knew they weren’t literally true. It doesn’t matter that they are not true; it matters what those stories do for us.
Ch4 has as its epigraph the famous Shakespeare line: “There is nothing either good or bad, but thinking makes it so.” (p.45)
Inspired by this book, I found my copy of Sam Harris’ short book Lying, and have now breezed through it once. He makes points that contradict Vedantam’s. I’ll reread it more closely and post about it here soon.
*Dan Ariely wrote one of the earliest books about self-deceptions and mental biases, Predictably Irrational: The Hidden Forces That Shape Our Decisions (2008), which I summarized here back in 2019.