Michael Shermer’s Believing Brain

This looks like it may well be a good year for scientific books on belief. My paperback “The Science of Superstition” is out later this month and Robert Park (author of Voodoo Science) has a new book, “Superstition: Belief in the Age of Science,” out in August. Moreover, one of the high priests of skepticism, Michael Shermer, also has a forthcoming book on belief, tentatively entitled, “The Believing Brain.”

Back in February, Shermer gave a TED talk offering a tantalizing 15-minute glimpse of what will be in his new book. In true Shermer tradition, it was a very entertaining presentation, and I was very pleased that he highlighted the ADE651 bomb-detector story from Iraq. I was even more delighted to see that he referred to my work in his Slant article.

I agree with Shermer’s main “patternicity” idea that we are inclined to ascribe agency everywhere. He also referred to Susan Blackmore’s seminal work on signal-to-noise thresholds in believers, which are strongly associated with a propensity for supernatural belief. In much the same way as I argued in “The Science of Superstition” (formerly known as SuperSense), Shermer supports the idea that the natural inclination in humans is to believe, and that skepticism and the scientific approach are unnatural. This, of course, is an old idea, and we are both indebted to the great 18th-century Scottish philosopher David Hume, who observed over 200 years ago,

“We find human faces in the moon, armies in the clouds; and by a natural propensity, if not corrected by experience and reflection, ascribe malice and good-will to everything, that hurts or pleases us.”

Michael uses the modern language of statistics to explain the human propensity to detect all manner of patterns, something we have documented so often in this blog, as Type I errors: incorrectly rejecting a true null hypothesis, or in other words, saying something is present when in fact it is not (a false positive). From a survival point of view, this is much better than making a Type II error: failing to detect a real signal that is in fact there (a false negative).
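To make the two error types concrete, here is a minimal sketch in Python; the threshold, noise level, and trial count are entirely hypothetical and not taken from Shermer’s talk. A simple detector decides whether a signal is present in a noisy reading, and every mistake it makes is either a false positive (Type I) or a false negative (Type II).

```python
import random

random.seed(1)

THRESHOLD = 0.5   # detector "sees" a pattern when the reading exceeds this
NOISE = 0.4       # spread of the random noise added to every reading
TRIALS = 10_000

type_i = type_ii = 0
for _ in range(TRIALS):
    signal_present = random.random() < 0.5                       # half the trials contain a real signal
    reading = (1.0 if signal_present else 0.0) + random.gauss(0, NOISE)
    detected = reading > THRESHOLD
    if detected and not signal_present:
        type_i += 1        # false positive: "saw" a pattern that was not there
    elif signal_present and not detected:
        type_ii += 1       # false negative: missed a pattern that really was there

print(f"Type I errors (false positives):  {type_i}")
print(f"Type II errors (false negatives): {type_ii}")
# Lowering THRESHOLD trades Type II errors for Type I errors,
# which is the direction Shermer argues our brains default to.
```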

Why are Type II errors a disadvantage, and how could a Type I bias have been selected for? Stewart Guthrie, in his book “Faces in the Clouds,” argues that our intuitive pattern processing biases us towards seeing faces, which leads us to assume that hidden agents surround us. Building on Hume’s observation that we find “human faces in the moon, armies in the clouds,” Guthrie presents the case that our mind is predisposed to see and infer the presence of others, which explains why we are prone to see faces in ambiguous patterns. If you are in the woods and suddenly see what appears to be a face, it is better to assume that it is one rather than ignore it. It could be an assailant out to get you. Why else would they be hiding in the shadows? In this case it is always better to err on the side of caution.
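The “better safe than sorry” logic can be captured with a toy expected-cost calculation. This is only a sketch with made-up costs and probabilities, not figures from Guthrie or Shermer, but it shows why, when a miss is far costlier than a false alarm, a strategy that leans towards false positives comes out ahead.

```python
# Illustrative numbers only: costs of each kind of error and how often
# an ambiguous "face" in the woods really is a threat.
COST_FALSE_ALARM = 1      # wasted effort fleeing from a shadow (Type I error)
COST_MISS = 100           # being caught by a real assailant (Type II error)
P_THREAT = 0.05           # probability the ambiguous face is genuinely a threat

# Strategy A: always treat the ambiguous face as real (all errors are false alarms)
cost_cautious = (1 - P_THREAT) * COST_FALSE_ALARM

# Strategy B: always dismiss it (all errors are misses)
cost_dismissive = P_THREAT * COST_MISS

print(f"Expected cost if you always assume a threat: {cost_cautious:.2f}")
print(f"Expected cost if you always dismiss it:      {cost_dismissive:.2f}")
```

With these hypothetical numbers the cautious strategy costs 0.95 per encounter against 5.00 for the dismissive one, which is the selective pressure Guthrie’s argument appeals to.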

Of course, there is still much further research to be done, such as why there are individual differences and why the bias towards Type I errors increases under certain circumstances. These are some of the questions that we are currently researching in our lab, but I look forward to reading Michael’s account, which I know will be eminently entertaining and engaging.
