Catching up on links and comments from this past week.
First, a couple of book reviews in last Sunday’s NY Times Book Review.
One is a review by Adrian Chen of Suspicious Minds: Why We Believe Conspiracy Theories, by Rob Brotherton. The review’s subtitle: “We are hard-wired to believe that nothing happens by accident”.
George Washington entertained conspiracy theories about the Illuminati.
Brotherton attacks the stereotype, which he says was popularized by the historian Richard Hofstadter in his influential essay “The Paranoid Style in American Politics,” of conspiracy theorists as a small band of tinfoil-adorned loonies — the paranoid fringe. Brotherton’s main argument is that we all possess a conspiracy mind-set to some extent, because it is hard-wired into our brains. “Suspicious Minds” details the various psychological “quirks and shortcuts” that make us susceptible to conspiracy theories.
For example, psychologists have discovered that we possess an “intentionality bias,” which tricks us into assuming every incidental event that happens in the world is the result of someone’s intention.
And especially this:
Paradoxically, the illusion of an evil, all-powerful conspiracy guiding events can be more comforting than the reality that humans are rarely in control.
\\
And this review, by Robert A. Burton (author of On Being Certain: Believing You Are Right Even When You’re Not, a book on my shelves), of two books, Black Box Thinking: Why Most People Never Learn From Their Mistakes — but Some Do by Matthew Syed, and Failure: Why Science Is So Successful, by Stuart Firestein.
More about cognitive biases, Daniel Kahneman, Steven Pinker, and whether we can overcome those biases. Can there be a program for self-improvement?
This problem becomes particularly acute when a book both outlines our deeply rooted behavioral inclinations and simultaneously suggests that they might be overcome. The better your argument for our inherent limitations, the weaker become your bootstrap suggestions for self-improvement.
\\\\
Last week Michael Krasny, host of the nationally syndicated radio program Forum, ran a segment asking guests and callers What Have You Changed Your Mind About…And Why?
What’s notable in this program is that the political and scientific issues people have changed their minds about invariably move from conservative/denialist positions to liberal/reality-based positions. Just saying. Note that the show is broadcast cross-country, so it’s not as if all the callers are from the relatively liberal Bay Area…
\\\\
The Atlantic: All Stories Are the Same: From Avatar to The Wizard of Oz, Aristotle to Shakespeare, there’s one clear form that dramatic storytelling has followed since its inception, by John Yorke.
An essential essay about how stories work and why they are appealing.
Also this, from The New Yorker: How Stories Deceive, by Maria Konnikova.
More and more discussion of how humans think in terms of *story*, and how hard it is for us to understand aspects of reality that do not conform to a familiar kind of story.
My follow-on thought: while I’d like to think of science fiction as a way to explore reality outside the usual parameters of human culture, I have to admit that SF still consists of a body of *stories*, stories which succeed more or less depending on their appeal to human biases — so how can it be trusted to reveal anything new?
But actually, SF has to some extent anticipated this. More on that in future posts.