Novella et al: THE SKEPTICS’ GUIDE TO THE UNIVERSE

Novella, Dr. Steven, et al. The Skeptics’ Guide to the Universe: How to Know What’s Really Real in a World Increasingly Full of Fake. Grand Central, 2018. ***

I read this shortly after publication, nearly three years ago now, and haven’t gotten around to writing it up here mainly because my notes were so long! But I’ll dump them here anyway, with one pass to clean up and trim, and add a summary on the NF reviews page.

“The skeptical, critical thinking, and scientific principles outlined in this book are like the rungs of a long ladder that humanity has used to climb laboriously out of the swamp of superstition, bias, hubris, and magical belief. We tend to look back now at medieval beliefs and congratulate ourselves for being born in a later age. But not every individual or even every institution has followed our best thinkers out of the muck. We all need to climb the ladder for ourselves.”

(P.S. Novella, a neurologist at Yale, is founder and executive editor of Science-Based Medicine, and frequently writes on its site.)

Gist

Compiled by the producer and host of the podcast of the same name (aka SGU), with four co-authors, this is an encyclopedic guide, from core concepts of skepticism (how our senses deceive us; motivated reasoning; logical fallacies, cognitive biases, and heuristics), to examples of science vs. pseudoscience (p-hacking, conspiracy theories) and cautionary tales from history (Clever Hans, quantum woo, pyramid schemes). Then examples of applying skepticism to topics like GMOs and claims of free energy. Media issues (fake news, false balance). How pseudoscience can kill (naturopathy; exorcism; denial of medical treatments). And a final section about being willing to change one’s mind when evidence compels you to. Don’t tell people what to think, but how.

Take

It’s long, almost 450 pages of text, and encyclopedic, and frankly was less fun to read than some other books on similar topics, e.g. those by McRaney, Sagan, and Pinker. The structure of chapters seems odd at times; I suspect different chapters were written by different authors without careful editing for balance. (In fact, some of the later chapters describing detailed examples *are* credited to different authors.) But the book is thorough; think of any topic in these areas and it’s likely covered.

Detailed Summary

Intro

How the author and his brothers grew up watching the TV show In Search Of, narrated by Leonard Nimoy; were fascinated by science; and were raised Catholic. Until they realized these narratives were not sustainable. You have to think about what to believe and what to doubt. This is a never-ending journey. It’s not about doubting everything, or being cynical. We build a process for evaluating claims. Part of this is science; logic and philosophy are also involved.

Cosmos made a big difference, as did encounters with creationism. It’s not enough to point out facts.

He and friends formed a skeptics group in New England, and started the podcast in 2005. This book is a guide for skeptics, “because the world is actively trying to deceive you and fill you with stories and lies. The forces of ignorance, conspiracy thinking, anti-intellectualism, and science denial are as powerful as ever.” xvii.b

Section 1: Core Concepts Every Skeptic Should Know

1, Scientific skepticism

It’s about accepting claims in proportion to logic and available evidence. Not the same as philosophical skepticism, e.g. Descartes, because we have a legacy of centuries of science to draw on.

There are several facets:

  • Respect for knowledge and truth
  • Promotion of science
  • Promotion of reason and critical thinking
  • Science vs. pseudoscience (i.e. recognizing the difference)
  • Ideological freedom and free inquiry
  • Neuropsychological humility
  • Consumer protection

Neuropsychological Humility and Mechanisms of Deception

P8: aside about sightings of airplanes in the years before the Wright Brothers…

2, Memory Fallibility and False Memory Syndrome

Our vivid childhood memories are mostly, or entirely, false. People recall incidents differently. Different kinds of memory, especially long-term memory, are malleable, subject to fusion and confabulation (e.g. Hillary Clinton, or memories created by suggestion). Memories are personalized, contaminated, distorted. People are overconfident in the accuracy of their memories, even when tests show they are not.

False Memory Syndrome arose in the 1980s when therapists supposed, despite lack of evidence, that women especially were suppressing memories of abuse. The evidence has not borne the idea out. In 1990 a man was convicted of murder based on the ‘recovered memory’ of his daughter…

3, Fallibility of Perception

Perception is constructed, and can be tricked, e.g. optical illusions. What we see is interpreted by the brain using assumptions based on past experience. Other senses can be disrupted, even our sense of inhabiting a body.

There’s also inattentional blindness, as when we don’t see something we don’t expect, e.g. the gorilla [cf. https://en.wikipedia.org/wiki/The_Invisible_Gorilla], or where the nearest fire extinguisher is. And we don’t notice small changes if they occur out of sight or if we blink. Or we misinterpret something unexpected as something that isn’t there—flying saucers; a missile hitting Flight 800.

4, Pareidolia

The process of perceiving images in random noise, like faces in clouds. More generally, apophenia. This arises from the way we perceive (previous chapter) and from the fact that we process in parallel. Expectation plays a big role. We’re especially skilled at seeing human faces; we’re a social species.

Famous example: the face on Mars, 1976. Conspiracy theorist Richard Hoagland has made a career promoting such examples. Others: the Virgin Mary in tree bark.

5, Hyperactive Agency Detection

The tendency to interpret events as the deliberate intent of a conscious agent, rather than the product of natural forces or chaotic events. The genesis of conspiracy theories about JFK, 9/11, the moon landings. It’s about how we detect whether things in the environment are alive and have agency, or not. In evolutionary terms it makes more sense to wrongly assume that something is alive than to ignore a clue and be eaten by a predator. Hence HADD: the hyperactive agency detection device. Assuming agency is the default, and the tendency is more active in uncertain situations; thus superstitions, and of course, the development of religion. Skepticism is the process of testing HADD.

6, Hypnagogia

This is the state in which dreaming and waking are fused, producing experiences often mistaken for paranormal ones. Common when just waking, or when trying to fall asleep. Related to sleep paralysis, when the muscle paralysis of dreaming persists into waking. Studies suggest some 8% of the population experiences it. In the past such incidents caused people to think they were possessed by demons; these days, visited by aliens. Whitley Strieber.

7, Ideomotor Effect

An involuntary subconscious muscle movement driven by expectation, creating the illusion of some external force. Thus dowsing. The idea arose in the 1800s. Ouija boards. Instances of fraud, or self-deception.

Metacognition – thinking about thinking

Intro note about the irony of a skeptic convention in Vegas, a city whose main industry depends on fallacies and biases.

8, Dunning-Kruger Effect

The overestimation of one’s abilities.

Like the characters in The Office, too incompetent and full of themselves to realize how incompetent they are. Everyone thinks they’re above average. The ignorant are full of irrelevant and misleading ideas that give them the sense of actual knowledge, p47. It relates to confirmation bias. It applies to everyone, even to experts outside their own fields. The response is to exercise systematic self-doubt, and to realize that we are as ignorant as the average person in areas where we are not expert.

9, Motivated Reasoning

How we defend a belief that we hold with emotional investment. Most people are mostly rational in a Bayesian sense; we update our beliefs as new information comes to our attention. But not about beliefs that support our sense of identity or ideology; sacred cows. And then we defend those beliefs in various ways, 51. It arises from cognitive dissonance. There are organizations that prepackage such rationalizations and misinformation—about evolution, vaccine effectiveness, etc, 53.4 Also much information requires judgment or interpretation. Applies especially to political opinions. People pursue both a directional goal and an accuracy goal, and the combination can produce the ‘backfire’ effect. Different parts of the brain are involved, as MRIs show.

Try to focus on accuracy reasoning; realize that people who disagree are just people with different perspectives; but focus on logic and evidence.

10, Arguments and Logical Fallacies

A logical fallacy is an invalid connection between a premise and a conclusion. (Tyson quote about how logical thinking is not natural to the human mind, p57)

An argument isn’t adversarial; it’s an attempt to identify premises and reach a valid conclusion. Statements without evidence are just assertions. If the argument is logically valid and the premises are true and complete, then the conclusion must be true. Arguments cannot resolve feelings or judgments.

The goal isn’t to tear down the other side, or to win; but to find the most valid premises and logic.

Examine the premises. Some premises are just false, e.g. that there are no transitional fossils. A premise might be an unwarranted assumption. Some arguments are rationalizations for conclusions already desired. Some premises are hidden—e.g. using different definitions of ‘transitional’, or a notion of how many there should be. A fourth problem is when a premise contains a subjective judgment, leading to circular reasoning.

There are also fallacies of logic. The examples below are informal fallacies; that is, their validity depends on context; they’re not invalid all the time.

  • Non sequitur
  • argument from authority (but consider individual rogue scientists vs scientific consensus)
  • argument from final outcome, e.g. teleological thinking
  • post hoc ergo propter hoc—assuming that since B follows A, A caused B.
  • confusing correlation with causation; there are always four possible explanations for a correlation (A causes B, B causes A, something else causes both, or it’s coincidence). Note the website Spurious Correlations. Easily abused. Requires investigation, e.g. multiple correlations might suggest the same causal relationship. Cigarettes and lung cancer.
  • Special pleading, or ad hoc reasoning. Typically an arbitrary premise introduced to support the conclusion, e.g. ESP and Bigfoot. Sagan’s dragon in his garage.
  • Tu quoque—“you too.” Deflecting an argument by accusing the other side of something else. E.g. alternative medicine advocates.
  • Ad hominem. Attacking the person rather than addressing the argument. But not all name-calling is a logical fallacy. Closed-minded vs. open; skeptical vs. gullible, p72.
  • Ad ignorantiam – an argument must be true because we don’t know that it isn’t true. UFOs; intelligent design’s ‘god of the gaps’. But absence of evidence *is* evidence of absence, to a degree; you can’t prove a negative, but…
  • Confusing currently unexplained with unexplainable. Often used to justify paranormal or spiritual explanations.
  • False continuum. The idea that two things are the same because there’s no definite boundary, e.g. cults and religion. Similar to false equivalency.
  • False dichotomy. As if there are only two possibilities, not many.
  • False analogy. Because things are similar in some way, they must be so in other ways. An effect of our pattern recognition brains.
  • Genetic fallacy. Arguing against something because of where it came from.
  • Applying different standards, e.g. about herbal drugs, or intelligent design’s ‘information’ (Dembski). Acupuncture.
  • Naturalistic fallacy. Confusing what is with ought to be. E.g. moral judgments vs what happens in nature.
  • Nirvana fallacy. If something isn’t perfect, it’s no good, e.g. vaccines, creationist arguments.
  • No true Scotsman. Changing a definition to make an argument true.
  • Reductio ad absurdum. A legitimate argument in logic, abused when stretching a valid argument into an invalid one.
  • Slippery slope. Rejecting a moderate position because the extreme position must be rejected.
  • Straw man. Implying an argument concludes something that it does not. Creationist arguments.
  • Circular reasoning; the conclusion is the premise. Begging the question; the bible is the word of god because the bible says so.
  • Texas sharpshooter fallacy. Choosing criteria for success after knowing the outcome. Conspiracy theories.
  • Moving the goalposts. Changing the criteria for acceptance to stay out of range of current evidence. Anti-vaxxers.
  • Fallacy fallacy. Just because an argument is unsound doesn’t mean the conclusion isn’t true. This happens when accusing someone of a fallacy they haven’t committed, or when detecting a genuine fallacy and declaring the conclusion therefore wrong.

Once aware of these, you’ll encounter them everywhere.

11, Cognitive Biases and Heuristics

These are flaws in how our brains process information, and rules of thumb that are not reliably true. Nice summary para 86-87. (This is where I first noticed that the structure of the chapters seems odd.)

Cognitive biases include our tendency to oversimplify; our right-hand bias. We understand the world in terms of physical relationships. Another: framing bias. We’re risk averse about some things, risk-seeking about others. Gambler’s fallacy.

We’re motivated toward our in-group, and to ourselves. We minimize cognitive dissonance. We suffer projection bias, and consensus bias. Hindsight bias; political pundits. And confirmation bias—see own chapter.

Heuristics are ‘90% rules.’ These include the availability heuristic. The representativeness heuristic. The unit bias, p92, where we reduce everything to a single quality, e.g. processor speed of a computer. Megapixels for a digital camera.

And the anchoring heuristic, common in marketing. You should always make the opening bid.

12, Confirmation Bias

The tendency to seek out new information that supports previously held beliefs; e.g. remembering the successes and forgetting the misses.

This is the one bias to rule them all. We notice and accept information that supports an existing belief, and ignore, distort, explain away, or forget information that seems to contradict an existing belief, 95.4. It’s more unconscious than motivated reasoning.

Example: anecdotes about vaccines build up to a solid belief. Only objective numbers reveal whether vaccines are useful or not.

Remembering the hits and forgetting the misses. Many examples. Bigotry. How it fits with other biases, including desirability bias. Polling data. Beware counter-claims against mature science; how could all the dedicated researchers have missed this? Example about MS and auto-immune diseases. Also beware ‘preliminary studies’, which often support the researcher’s hypothesis. Control experiments. Conclusions about toupees, or sports cars.

P104: Wason Selection Task. About the four cards and the hypothesis [[ Pinker covers this too. ]] Core lesson: try to prove your hypothesis wrong; don’t just look for evidence that it’s right.
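
A minimal sketch of the task (my illustration; the card set is the classic textbook version, not necessarily the book’s exact example). Rule to test: “if a card shows a vowel on one side, it has an even number on the other.” Most people flip the vowel and the even number; the only cards that can actually falsify the rule are the vowel and the odd number:

    # Which of four cards (letter on one side, number on the other) must be flipped
    # to test the rule "a vowel on one side implies an even number on the other"?
    VOWELS = set("AEIOU")

    def could_falsify(visible_face):
        """A card can falsify the rule only if it might hide a vowel paired with an odd number."""
        if visible_face.isalpha():
            return visible_face.upper() in VOWELS   # hidden number might be odd
        return int(visible_face) % 2 == 1           # hidden letter might be a vowel

    cards = ["E", "K", "4", "7"]
    print([c for c in cards if could_falsify(c)])   # -> ['E', '7']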

13, Appeal to Antiquity, p106

The idea that ancient wisdom must be valid. We’re fascinated by old monuments. And old ideas linger, like astrology. And acupuncture—though the current version is only 100 years old. People argue that these ideas have stood the test of time. But this is invalid; time doesn’t always test such ideas. They linger even when not true. Beliefs can linger for thousands of years – like bloodletting. Marketers like to say that science discovers what the ancients knew all along—but this simply isn’t always so. [[ Haidt’s first book examines this theme explicitly. ]]

14, Appeal to Nature

The idea that things that are natural are inherently superior to things that are manufactured. Similar to but distinct from the naturalistic fallacy. But obviously not everything natural is good for humans. Nor can we usefully define natural. Nature is full of poisons that would kill us. Our food supply is mostly the result of thousands of years of tinkering. Even the FDA has no strict definition of natural; they allow its use as long as there is no food coloring, etc.

15, Fundamental Attribution Error

How we attribute other people’s actions to internal factors, and our own to external factors beyond our control. We blame our own errors on circumstance, but others’ on their character or gullibility. Onlookers can’t know what led up to a moment of lost self-control; screenwriters know this.

It’s easy to assume others are gullible. Perhaps they had a powerful personal experience, or are influenced by someone close to them. We make assumptions about people in social media all the time. Misunderstanding can lead to conspiracy theories. So: withhold judgment; give other people the benefit of the doubt.

[[ this aligns with comment below: people believe weird things, including religions, because those around them do, not because they evaluated evidence and came to their own conclusions. ]]

16, Anomaly Hunting

Looking for anything unusual, assuming it can’t be explained, and therefore is evidence for one’s pet theory. [[ the so-called “god of the gaps” argument ]] E.g. the umbrella man when JFK was shot.

True anomalies can lead to new discoveries about nature, e.g. Mercury’s orbit. But anomalies occur all the time, given incomplete understanding of what is known. The error is assuming they’re all significant. Similar to the lottery fallacy. Most of the time prosaic explanations can be found for apparent anomalies. E.g. the apparent discovery that neutrinos traveled faster than light. Pseudoscientists look for anomalies, assume that if they’re unexplained, they’re unexplainable, and then argue from ignorance that their pet theory [religion] must be true. Thus UFOs are alien spacecraft.

Examples. Crop circles. Ghost hunting. JFK and umbrella man. The assumption anything unusual must be part of a plan. The umbrella was a protest against Joseph Kennedy. Moon landing hoax—no evidence for it, just supposed anomalies, all explainable. Conspiracy theorists dismiss supporting evidence as fabrications. Flat-earthers, citing anomalies. Humans have an almost unlimited capacity to deny the obvious and convince themselves of almost any claim… p128

[[ And why does this happen? The book doesn’t go into the psychology, which I would speculate (based on other books) is a combination of: lack of a savvy understanding of how the world actually works; and the need to feel or be special, by being in possession of a truth unknown to most. ]]

17, Data Mining

The process of sifting through large sets of data looking for any kind of correlation. Which may lead to hypotheses, but not confirmation.

Example: finding correlations between accident data and horoscope signs. We’re prone to pattern recognition, and confuse correlation with causation. Even true randomness looks like patterns. We’re hypersensitive to seeking patterns – but need ‘reality testing’ [science] to see if they are meaningful. This is about misunderstanding statistics; looking for any correlation, not a particular one. In science such correlations are often reported as preliminary, without noting that follow-up studies need to be done. Support for astrology consists almost entirely of data mining, never confirmed.

18, Coincidence

The chance alignment of two events that seem to be independent. E.g. dreams that ‘predict’ an event of the next day.

Again, we have a tendency to see patterns, and we’re poor at ideas of probability and very large numbers. Birthdays among a group of people; particular sequences of flipping coins. We remember unusual experiences and forget routine ones. We have so many experiences in our lives, it would be remarkable if there *weren’t* some remarkable coincidences. And our memories shift. Psychics make lots of predictions, gambling that at least one will hit and the others will be forgotten.
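
A quick worked example (my own, not from the book) of how bad our intuition about coincidence is: the birthday problem. The chance that at least two people in a group share a birthday climbs much faster than most people guess:

    # Probability that at least two of n people share a birthday
    def shared_birthday_prob(n, days=365):
        p_all_distinct = 1.0
        for i in range(n):
            p_all_distinct *= (days - i) / days
        return 1 - p_all_distinct

    for n in (10, 23, 50):
        print(n, round(shared_birthday_prob(n), 3))
    # -> 10 0.117, 23 0.507, 50 0.97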

Add magical thinking and you have superstition. Driven by a need for control and a feeling of not having it.

Famous example: The Monty Hall Problem, p137. Example of a problem in which even the intuition of experts is fooled. [[ Pinker covers this one too. ]]
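
Since intuition fails here even for experts, a brute-force check helps. A small Monte Carlo sketch (my own illustration) showing that switching doors wins about two-thirds of the time:

    import random

    def play(switch, trials=100_000):
        wins = 0
        for _ in range(trials):
            car = random.randrange(3)    # door hiding the car
            pick = random.randrange(3)   # contestant's first choice
            # host opens a door that is neither the pick nor the car
            opened = next(d for d in range(3) if d != pick and d != car)
            if switch:
                pick = next(d for d in range(3) if d != pick and d != opened)
            wins += (pick == car)
        return wins / trials

    print("stay:", play(False), "switch:", play(True))   # ~0.33 vs. ~0.67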

Science and Pseudoscience

ESP is an interesting case because real research is being done, and it’s not obvious at first why the results aren’t convincing. E.g., an experiment that seemed to follow all the rules, yet reached a conclusion that can’t be true (sensing future events). Rigorous replication tests were negative. This section reviews the nature of science and how its methods can be affected by bias and error. And we’ll see exactly the mistakes made in that study.

19, Methodological Naturalism and Its Critics

This is the idea that the universe follows natural laws and all effects have natural causes. Materialism is the idea that all effects have physical causes. There is no spiritual realm; the supernatural is untestable magic.

Methodological naturalism is the idea that nature is all that we can know – whether or not there actually is something else. Thus, we can dismiss the idea that the universe was created five seconds ago… 146.8 The key feature is that scientific ideas are testable. Ideas like ID or 5-second-ago are outside the realm of science, because they can’t be tested. Long passage from court ruling about ID, whose proponents want to change the ground rules of science to include supernatural explanations.

[[ The psychology here is about desire for belief, ignoring the everyday lessons of how superstitions have real-world explanations. God of the gaps. ]]

The ‘wedge’ documents show how the ID battle is ideological—proponents feel their supernatural worldview is threatened. Thus a kind of culture war. They pretend to do their own ‘science’.

But what if the supernatural exists? [[ then nothing is predictable, everything is random, and there’s no point in having developed our technological civilization. ]] That is, effects exist without natural causes. In such a universe, science would make no real progress. In fact, science is doing pretty well.

20, Postmodernism

This is the idea that science is just a cultural narrative with no special relationship to the truth.

Example: declaring evidence-based medicine to be a kind of fascism. The idea that all ideas are understood as human storytelling, narratives driven by culture and bias. But actually this is just a way to dismiss science you don’t like, a kind of sour grapes.

Thomas Kuhn introduced the idea of ‘paradigm’ in 1962, claiming that shifts from one to the next occurred for cultural reasons. That preference between paradigms is subjective. But this is a false dichotomy; change occurs along a continuum. Second, it doesn’t matter how ideas are thought up; what matters is how they are tested. Further, scientific theories are rarely replaced by others; once established, they are refined, not replaced. Like defining the exact shape of the earth, in greater and greater detail. [[ The recent book by Strevens explores this theme. ]]

And all science is about the same reality; different fields aren’t distinct, but have to agree. Wilson’s ‘consilience.’ Cultural biases may show up in science, but are eventually corrected.

21, Occam’s Razor

This is the idea that the simplest explanation should be preferred.

But the idea is rarely properly understood, e.g. when invoked to support the idea that UFOs are alien spacecraft. The simplest answer *isn’t* necessarily the best one. Thus, a very rare diagnosis should be avoided if several more common ones serve. It’s all about probability. Alien spaceships are a massive new assumption, and therefore very unlikely to be the correct explanation. UFO proponents need to invent endless new ideas to explain how spaceships account for various phenomena.

22, Pseudoscience and the Demarcation Problem

There is no sharp demarcation between rank pseudoscience and rigorous science.

There have been all sorts of crazy hypotheses, e.g. the list on p162m. It’s critical to tell the difference between valid claims and such nonsense. Science has wide support, so proponents of silly ideas like to claim scientific backing for them. We can identify several characteristics that distinguish the two.

Good science uses objective observations, tests hypotheses and tries to disprove them, always open to new data, does not confuse cause and effect, and is done in a transparent way, p163-4.

Many ways pseudoscience can work:

  1. Working backward from conclusions. Science actually works to disprove hypotheses; pseudoscientists offer lawyerly reasons why their ideas must be true.
  2. Hostility toward scientific criticism, claims of persecution. Science always involves criticism; pseudoscientists reject criticism as personal attacks or witch-hunts, comparing themselves to Galileo. These are excuses for not doing rigorous science.
  3. Making a virtue out of ignorance. It’s no longer possible to make significant advances without thorough education in current science and its practices. Some amateurs think lack of such training is an advantage. Example of claim that the earth is hollow.
  4. Reliance upon weak forms of evidence while dismissing more rigorous evidence. E.g. Andrew Weil, or UFO believers citing mountains of poor evidence in the absence of any good evidence.
  5. Cherry-picking data. Related to the previous. Choosing which ESP data to accept. Cheating.
  6. Fundamental principles are often based on a single case. Some ideas are based on single, unverified observations, e.g. chiropractic practice, and iridology.
  7. Failure to engage with the scientific community. The easy questions have been answered [[ an important point!! ]] It’s difficult for an individual to make substantial progress. Pseudoscientists tend to work alone, or with like-minded believers.
  8. Claims often promise easy and simplistic solutions to complex problems or questions. [[ another crucial principle, as in politics ]] E.g., ideas that all human disease has some single cause. A common psychological appeal, as with the supernatural and religious beliefs.
  9. Utilizing scientific-sounding but ultimately meaningless language. Example: Gwyneth Paltrow’s goop.
  10. Lack of humility—making bold claims on flimsy evidence. Pseudoscientists are always making claims about revolutionizing science or humanity, easily dismissing the entirety of established models.
  11. Claiming to be years or decades ahead of the curve. E.g. making a claim that would take years to validate, like being able to transplant a head.
  12. Attempts to shift the burden of proof away from themselves. The burden of proof is on the claimant. [[ very common – you can’t prove me wrong, therefore I’m right – especially in religion. ]] Argument from ignorance; if something can’t be explained, it must be supernatural [[ or God did it ]]
  13. Rendering claims non-falsifiable. E.g., inventing reasons why evidence doesn’t exist. Homeopaths; creationists. If a scientific hypothesis isn’t falsifiable, it’s “not even wrong”; it’s worthless.
  14. Violating Occam’s razor and failing to fairly consider all competing hypotheses. Pseudoscientists often ignore competing hypotheses, e.g. for UFOs. [[ there’s a bias here about the false dilemma, as if only two possible explanations exist – another one common in religion. ]] And for them no amount of disconfirming evidence will make them change their minds.
  15. Failure to challenge core assumptions. E.g., studies about details without verifying that the core phenomenon exists, such as tooth fairies. Feynman’s “cargo cult science.” Motivated reasoning; special pleading.

So, keeping these in mind, we can make reasonable judgments. At one end is evolutionary science, supported by multiple lines of evidence and debated transparently for decades; at the other end is the guy who thinks the world is hollow. And in the middle is the guy doing ESP research, sloppily.

23, Denialism

The motivated denial of accepted science using invalid strategies.

Denialism works backward from its desired conclusion, of course, and generally has ideological or political motivations: about climate change, evolution, germ theory, consciousness, mental illness, AIDS, vaccines. But it’s marked by strategies, not particular beliefs. These include:

  • Manufacture and exaggerate doubt. There’s always doubt in science, but deniers keep ‘just asking questions’ over and over, long after the answers have been provided. Science is always provisional, but that doesn’t mean it isn’t as robust as it can be at any given time.
  • Always ask for more evidence than exists or can exist. That is, move the goalposts. Scientists will change their minds based on evidence; deniers never do. E.g., gaps in the fossil record. When a gap is filled in, they focus on another gap. The real question is the progress of filling gaps—not whether any remain. Or deniers ask for evidence that could never exist, e.g. randomized trials comparing vaccinated and unvaccinated children. There’s special pleading here as well.
  • Use semantics to deny categories of evidence. E.g. arguing that science is only about experiments, so anything that happened in the past can’t be studied as science—evolution. Or those who deny mental illness, e.g. Scientologists.
  • Interpret disagreements about details as if they call the deeper consensus into question. As science progresses it focuses on deeper and deeper details. Thus, we know that DNA carries genes, even though there are details we don’t know about DNA. This argument is used by those who deny that consciousness is a function of the brain; and of course by evolution deniers.
  • Deny and distort the consensus. There are always disagreements among scientists; deniers magnify these. Thus the 97% consensus on climate change; or the consensus on evolution.
  • Appeal to conspiracy; question the motives of scientists. Thus, deniers of global warming mine emails and take statements out of context; GMO scientists are Monsanto shills; evolutionary biologists hate god.
  • Appeal to academic/intellectual freedom. E.g., academic freedom to teach creationism; parents’ right to choose not to vaccinate. But professions have standards, and universities are not obligated to teach nonsense.
  • Argument from consequences. Common: evolution will undermine belief in god; addressing global warming will result in government takeover of private industry. But this ties your moral or ethical position to a false scientific conclusion. Also called ‘solution aversion.’ Science requires courage – 192.7

24, P-Hacking and Other Research Foibles

Ways to bias scientific outcomes.

P-hacking is like changing the rules mid-game, or deciding that whatever the outcome, you’ve still won. There is a lot of research out there and some of it is shoddy. How do you tell? Example of a study that claimed TM (Transcendental Meditation) affects murder rates, using an artificial threshold and an arbitrary study period. [[ like focusing on only a portion of a temperature change graph. ]] Also called torturing the data until they confess.

The problem with p-values: a p-value is the probability of seeing data at least as extreme as what was observed, assuming the null hypothesis is true. It’s a way to tell interesting data from random noise. Examples with medical tests, which can yield false positives or false negatives. You need a Bayesian approach to assess all the possibilities. You can assess the correctness of scientific studies in the same way.
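
A minimal sketch of the Bayesian point (my own numbers, purely illustrative): how likely a hypothesis is after a positive result depends not just on the test’s accuracy but on how plausible the hypothesis was to begin with.

    def posterior(prior, sensitivity, false_positive_rate):
        """P(condition | positive test) via Bayes' theorem."""
        p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
        return sensitivity * prior / p_positive

    # A 99%-accurate test for a condition only 1 in 1,000 people have:
    print(round(posterior(prior=0.001, sensitivity=0.99, false_positive_rate=0.01), 3))
    # -> 0.09: a positive result still leaves only about a 9% chance the condition is real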

P-hacking. Researchers can choose when to start or stop taking data… thus affecting the p-value. As soon as it dips below 0.05, they publish. Thus many studies can’t be replicated.
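
A small simulation of that optional-stopping trick (my own sketch, with made-up parameters): peek at the p-value as data accumulate and stop as soon as it crosses 0.05. Even with no real effect at all, the false-positive rate ends up far above the nominal 5%.

    import random
    from statistics import mean, stdev
    from math import sqrt, erf

    def p_value(sample):
        # rough two-sided test of "true mean is 0" using a normal approximation
        t = mean(sample) / (stdev(sample) / sqrt(len(sample)))
        return 2 * (1 - 0.5 * (1 + erf(abs(t) / sqrt(2))))

    def peeking_experiment(start=10, max_n=100):
        data = [random.gauss(0, 1) for _ in range(start)]   # pure noise, no effect
        while len(data) < max_n:
            if p_value(data) < 0.05:
                return True              # "significant" -- stop and publish
            data.append(random.gauss(0, 1))
        return p_value(data) < 0.05

    runs = 2000
    print(sum(peeking_experiment() for _ in range(runs)) / runs)
    # well above the advertised 0.05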

These problems are all fixable. More rigorous replication studies, results that are significant, etc. – four criteria, p202. Homeopathy, acupuncture, and ESP fail such criteria on multiple grounds.

25, Conspiracy Theories

Claims that a powerful group is carrying out a deception against the public for their own ends.

The one ring of faulty thinking to rule them all. A get-out-of-jail-free card; the last refuge of the intellectual scoundrel. Yet they remain common.

More technically a ‘grand conspiracy.’ Three legs: the powerful organizations that carry it out; the theorists who see through it; and everyone else, the dupes.

The problem is that they’re insulated from refutation; they’re exercises in special pleading. Any evidence against the conspiracy is just part of the conspiracy. No one ever comes forth to admit being part of it. Eventually the entire world is involved. It’s always ‘they’. Who controls the research, e.g. for the cure for cancer? And the conspirators must be simultaneously powerful and stupid, making the obvious errors that the conspiracy theorists claim give the game away, e.g. about the moon landing.

Derives from pattern recognition, hyperactive agency detection, confirmation bias, anomaly hunting, fundamental attribution error, ad hominem attacks on anyone skeptical of the conspiracy; tautological arguments about who benefits.

A 2016 paper examined how long a vast conspiracy could go without failing, i.e. without someone on the inside giving the game away, given the number of people presumably involved. Using conservative numbers, the result was that such conspiracies would fail within about four years. Koontz quote, 211b.
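
A back-of-the-envelope version of that reasoning (my own simplification, not necessarily the paper’s exact model, and with hypothetical numbers): if each of N insiders has some small independent chance of leaking per year, the odds the secret survives shrink quickly.

    def still_secret(n_insiders, p_leak_per_year, years):
        # probability that nobody has leaked after the given number of years
        return (1 - p_leak_per_year) ** (n_insiders * years)

    # e.g. 400,000 people in on a faked moon landing, each with a
    # one-in-a-million chance of leaking in any given year (made-up numbers):
    for t in (1, 2, 4):
        print(t, round(still_secret(400_000, 1e-6, t), 2))
    # -> 1 0.67, 2 0.45, 4 0.2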

We’re all a bit prone, especially when a conspiracy supports our political notions. But some people are prone to believe any conspiracy theory that comes along. Examples of percentages, p212-3. Psychologically, belief relates to a desire for control and understanding, triggered by lack of control or a “crippled epistemology” 215t. A recent example is the Sandy Hook killings, and a Florida blogger who thinks it was a conspiracy. Anomaly hunting, and unethical journalism. How could it possibly have all been a conspiracy, among the entire town? [[ Conspiracy theories arise from lack of understanding of how the world actually works – savviness. E.g. scientific research and competition. ]]

26, Witch Hunts

An unjust investigation of a person or group in which the extreme nature of the crime is used to justify ignoring the usual rules of evidence.

The actual witch hunts in Salem in 1692. And for hundreds of years in Europe. A 1487 German book, the Malleus Maleficarum.

Six parts: accusation equals guilt; suspending normal rules of evidence; allowing spectral evidence (i.e. dreams and visions); methods of investigation as bad as the punishments (e.g. torture); encouraging accusations; accusations used as a weapon. Modern witch hunts: McCarthyism; preschool abuse; the use of ‘facilitated communication’ p224.

27, Placebo effects

An apparent response to a treatment that is due to something other than a biological response.

Radioactive tonics; radio wave diagnosis; recent bracelets made of rubber or plastic. Why would people swear by them?

Many factors. There’s a reporting bias among patients who want to feel better. People in clinical trials take better care of themselves and really do get better. And regression to the mean means that bad symptoms will be followed by moderate ones. It’s not mind-over-matter; it’s that mood and belief affect perception of pain. Most alternative medicine relies on these effects.
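
A tiny simulation of the regression-to-the-mean point (my own sketch, made-up numbers): if people reach for a remedy only when a naturally fluctuating symptom is at its worst, the next measurement tends to be better even when the remedy does nothing.

    import random
    random.seed(1)

    def symptom():                       # baseline severity 5, day-to-day noise
        return 5 + random.gauss(0, 2)

    before, after = [], []
    for _ in range(10_000):
        day1, day2 = symptom(), symptom()
        if day1 > 8:                     # bad enough to try the remedy
            before.append(day1)
            after.append(day2)

    print(sum(before) / len(before), sum(after) / len(after))
    # ~9 on the day "treatment" starts, back near 5 the next time, with no effect at all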

28, Anecdote

A story or experience offered as evidence, despite lack of control for bias. The weakest form of evidence; wishful thinking.

Defenders of alternative medicine, e.g., seem convinced by anecdotal evidence that it works. Scientists aren’t, after centuries of treatments based on anecdotes have been disproven. At best anecdotes are sources of hypotheses to test.

We use such evidence in our personal daily lives; we don’t have time to conduct scientific studies of everything. But these can’t be generalized, not even to stories from ‘credible’ witnesses. Anyone can be fooled.

If someone isn’t obviously a con artist, aren’t they telling the truth? Not necessarily; there are other motives: making life more interesting; pious fraud to ensure justice is done.

Iconic Cautionary Tales from History

Don’t miss the big picture. Santayana quote, p236.

29, Clever Hans effect

The Dr. Dolittle books were popular because of rising acceptance of the idea of evolution. Clever Hans is the classic example of the ‘observer-expectancy’ effect.

30, the Hawthorne effect

Simply observing something may alter its behavior. Things are always more complicated than you think. We tend to oversimplify things, p242b. The name derives from studies of employees at a factory in Hawthorne—whatever conditions were changed, employee performance improved. But most one-paragraph summaries aren’t correct. In this case, the effects are a mix of expectations by both observers and participants. Also the placebo effect.

31, Cold Reading

(etc.) Recall that Twilight Zone episode. Also a South Park episode, p254. Note p256.

32, Free Energy

Examples. It’s never a lone brilliant scientist who has succeeded where all others have failed… Lack of humility.

33, Quantum Woo

Brin quote. Past examples: electromagnetism. Then radiation. And now quantum mechanics. Three ways it’s weird, p265. Deepak Chopra.

34, Homunculus Theory

Examples, 271. See text

35, Intelligent Design

Michael Behe in 1996. Problem is, it’s not falsifiable. Discovery Institute. False dichotomy of assuming evolution is random, and that what appears designed must be caused by intelligence. Contrast an evolved city with a planned city. Does life display bottom-up or top-down design? Overwhelmingly the former. Examples of common descent. The young-earth creationists suppose that the creator designed life to *look* as if it had common descent, for some reason; nothing therefore could disprove their idea. Claims of irreducible complexity also rely on a false premise. Mousetrap. Bacterial flagellum. That, and argument from ignorance. ID is ultimately a ‘god of the gaps’ argument. ID has never been able to make a prediction of positive evidence that would show top-down design. Their real statement would be as on 283… that the creator made the universe to look exactly as if it were evolved.

36, Vitalism and Dualism

The idea of a ‘life force’, and the idea that the mind is separate from the physical brain.

Our brains do in fact sort the world into living and nonliving things. Every culture has a word for the supposed animating force. Children have naïve notions like this. But vitalism was a placeholder until knowledge of biology grew. Now such ideas are the realm of pseudoscience—‘energy’ and vibrations. Chiropractic; therapeutic touch. Kirlian photography.

Similar history with dualism. But we now know our brains are our minds. An example of a dualist is Jerry Fodor. Denialists are like creationists, who turn a dispute about the mechanism of evolution into a denial of evolution entirely, p290t: focusing on a snapshot of science rather than its progress over time.

Cartesian dualists; light fairies; the hard problem. Chalmers is a dualist who acknowledges progress in neuroscience, but dismisses those advances as the ‘easy problems’, not the hard problem: why do we experience our own existence? Dennett says it’s all easy problems. Chalmers’ qualia.

37, N-Rays

Discovered by Blondlot in 1903, supposedly. Debunked by Wood. Later, in 1988, a paper claiming a basis for homeopathy (water memory). Debunked by John Maddox et al.

38, Positive Thinking

Chopra to Oprah. The Secret. ‘Law of attraction’. All magical thinking.

39, Pyramid Scheme

Example.

Section 2: Adventures in Skepticism

Now we use these tools to do some deep dives into various topics. Everything is on the web, but we need to know how to assess what we find. Pointers:

  • What is the source of the information? Academic sites and group blogs by professionals are best; commercial sites less so. Worst are ideological sites promoting a specific narrative.
  • Look at a variety of sources; track information back to its original source.
  • Follow a discussion through to its end.
  • Examine the arguments of each side to see who has better arguments.
  • Check yourself against others you respect who have expertise.
  • Don’t cherry pick the answer you want.
  • All conclusions are tentative; be willing to incorporate new information

40, Motivated reasoning about genetically modified organisms

When GMOs came along, groups like Greenpeace rallied people against them; Monsanto was an evil corporation. So Steve set out to investigate.

Bottom line: the anti-GMO side is built on sand. Their claims don’t hold up. It’s a narrative by Greenpeace and the organic food lobby for ideological reasons. Challenge one of their claims, and they’ll switch to another. Examples of journalists who came to similar conclusions.

Virtually everything we eat has been artificially selected – via genetic changes—from original wild species. Forced mutations have been induced since the 1930s. Genetic modification involves inserting genes from one organism into another. Cisgenic, transgenic. Opposition seems to rest on the assumption that such things couldn’t happen in nature—but they do. Specific issues:

  • Health effects. The studies show they’re safe. To critics, there’s never enough testing (moving the goalpost). And critics find other reasons.
  • Environmental effects. Complex topic.
  • Indian farmer suicide. A myth. Terminator seeds. Saving seeds. Agent orange. Patenting life. Golden rice.

Still a contentious issue.

41, Dennis Lee and Free Energy

This is about perpetual motion machines, which are impossible. Dennis Lee is a famous modern-day promoter, with a long history of arrests and frauds, p337. In 2001 the authors attended one of his events—attended mostly by old people—and tried to hand out their own fliers, until they were challenged by security guards and forced to leave. Lee doesn’t sell products; he asks for investments, appealing to patriotism and religious faith. He plays to those susceptible to conspiracy theories.

42, Holly-woo

Southern California is infested with woo, from Scientology to Jenny McCarthy to Gwyneth Paltrow to Deepak Chopra. Example of ‘Aubrey’ and her susceptibility to all the latest health fads. Diets, colonics, CrossFit. Gwyneth’s Goop. All of this comes from not trusting the “mainstream medical establishment.” And so on.

43, The Singularity

People typically overestimate near-future technology and underestimate the deeper future. Examples about flying cars, cell phones, the Jetsons.

So consider the singularity. The idea of smart machines building even smarter ones. Kurzweil. Neal Asher’s Polity universe. Etc.

There’s a lot of potential. But: we don’t have self-aware AI yet. And there are many kinds of intelligence. So we just don’t know. We can be optimistic, and skeptical, at the same time.

44, The Warrens and Ghost Hunting

Ed and Lorraine Warren were well-known ghost hunters in New England. They claimed to have scientific evidence. They were nice people, but none of their evidence stood up to scrutiny. Ed linked understanding his research with faith in God. Asking for their best evidence resulted in changing the subject. Most of their evidence was photographs, with blobs of light. In almost every case, the ghost wasn’t seen at the time, only after the photo was developed. All photos were presented without discussion of possible causes—other than ghosts.

And they wouldn’t allow any of their evidence to be copied or shared.

There are many more ‘eyewitness’ testimonies by people who think a good story is all the evidence they need—with no idea of the unreliability of human memory and perception.

45, Loose Thinking about Loose Change

Quote from Alan Moore about how no one is in charge—the world is rudderless.

Loose Change was a documentary that claimed the 9/11 attacks were staged by the government. The authors talked about the film on their podcast. It seemed alluring—but only because of the emotional connection of having been near the event. So keep in mind:

  • If you’re only hearing one side of the story
  • If the volume of information is onerous—where there’s smoke, etc.
  • How easily cherry-picked information creates a false reality
  • How your own bias distorts your perception.

Section 3, Skepticism and the Media

About common problems with journalism in the age of the internet, the web, and social media.

46, Fake News

The internet is a war of ideas; it’s also a venue for fraud, lying, misinformation, and manipulation. The idea of fake news has, like the idea of being a skeptic, been subverted by the villains. Look at the varieties of news outlets: traditional journalism (within which, a spectrum of quality); biased or ideological journalism (on a scale orthogonal to quality); opinion outlets; satirical news; and fake news.

Social media– “The only real solution is for each individual to be a savvy and skeptical reader.” Echo chambers. Like sites dedicated to one topic, where dissenters are branded ‘trolls’…p380b

Astroturfing. This is fake grassroots activism – nonprofits, Facebook pages, letters to the editor, as if there is a genuine movement for some agenda. But this can also be used by the villains (example). It’s hard to know what’s real. Accusing someone of being a shill.

47, False Balance

Presenting both sides seems reasonable. But in science the weight of evidence is usually in one direction, not divided in two. When the media apply that ‘two sides’ framing to science, it’s known as false balance. Deepak; climate deniers.

48, Science Journalism

There’s a tension between science and journalism. The author does it every day on the SGU Facebook page. One study showed about 2/3 of medical news stories had major flaws. Sometimes the problem goes back to the press releases themselves. Some reporters write their pieces with a message in mind, before interviewing relevant experts. Or they ask leading questions.

In reading science news, pay attention to the source and the reputation of the reporter. Is there context?

49, Enter the Matrix

Example of bad reporting by the Daily Mirror and Discovery News, about brain-machine interface technology; what really happened; the journalistic failures; and how the press release was largely to blame.

50, Microbiomania

About the subject; recommends Ed Yong’s book, and a particular blog. But many articles are overhyped; again, many start from the press releases, by design. Then there are wellness cranks peddling pseudo-journalism—with products to sell. E.g. Dr. Mercola, with a book to sell, who thinks we shouldn’t bathe. Many of these show familiar red flags: simple solutions for complex problems; conspiracy-theory distrust of mainstream experts. And Deepak Chopra, the worst.

51, Reporting Epigenetics

Example of an Australian reporter – note how reporters love to suggest new findings that overturn established science, especially if they can invoke famous names, like Darwin’s.

Epigenetics means tweaks to the expression of genes in response to environmental factors. It doesn’t really affect evolution, and it’s certainly not inheritance of acquired characteristics. Long example…

Section 4: Death by Pseudoscience

Addressing questions like, what’s the harm if people believe in pseudoscience?

52, Death by Naturopathy

How a 32-year-old woman died of a heart attack because she’d been given intravenous infusions of… turmeric, the spice. Via a naturopath named Kim Kelly, who claims all sorts of effects without evidence.

53, Exorcism: Medieval Beliefs Yield Medieval Results

Nice quote p415: “The skeptical, critical thinking, and scientific principles outlined in this book are like the rungs of a long ladder that humanity has used to climb laboriously out of the swamp of superstition, bias, hubris, and magical belief. We tend to look back now at medieval beliefs and congratulate ourselves for being born in a later age. But not every individual or even every institution has followed our best thinkers out of the muck. We all need to climb the ladder for ourselves.”

And so there are those who still believe in demonic possession and perform exorcisms to cure it. There are guidelines in the Catholic Church, but amateurs try it on their own: examples of mothers murdering their children.

54, Death by Denial

Nice Feynman quote, p419

For example, those who, for whatever reason, think that HIV is not the cause of AIDS. In South Africa the president withdrew HIV treatment for years. P422: “When challenged about the lack of evidence to support beetroot and other such treatments, she [the health minister] again criticized ‘Western’ medicine for being ‘bogged down’ in clinical trials.” These policies lasted until 2005, causing an estimated 33,000 premature deaths.

55, Suffer the Children

Parents who ‘treat’ their children on the basis of religious beliefs or New Age thinking. Example of girl wrapped in a sheet until she died.

Section 5: Changing Yourself and the World

56, Being Skeptical

Remember, all of this applies to you, too. The author’s strategy is to be willing to change his mind, and to be proud of doing so when the evidence compels it. He resists telling other people what to believe. Be humble, but be courageous. Speak up. Remember we are all products of circumstance; p432, “most people will end up accepting the belief system they are born to.” Be polite with friends and family. Advice, 436: plant the seed; don’t compete or confront; find common ground; don’t attack head on; watch your tone; try to understand the other person’s narrative, 438.

Further discussion of raising children. Don’t tell people what to think, but how.

Epilogue

This has been a book for the listeners of the podcast…. But don’t trust them. Think for yourself.

\\\\

Early thoughts: [[ as written in early 2019; I’ve since developed these ]]

\\ I suspect things like being anti-vax are as much about cohering to one’s social group as about thinking things through for oneself – this should be a key point. To dispute the consensus of one’s community would be like declaring oneself an atheist—to some that means you can’t be trusted.

\\ recall Haidt’s description of human thinking as being lawyerly rather than rational. It’s about defending one’s tribe—a social function of group selection, perhaps. Not about accurate understanding of the world.
