Over on my other blog, yesterday I made fun of a bunch of Republican presidential candidates (Cain, Bachmann, Perry, Santorum) for believing that God had told them to pursue their political dreams.
But it would have been just as easy to make fun of myself -- or anyone -- because we are all prone to mistaking messages from the hidden parts of our own brains for guidance of cosmic import.
If you've ever said, "I think the universe is sending me a message" (I sure have), the difference between you and someone who believes that God is talking to them is merely a matter of which unseen entity is supposedly doing the communicating.
I've started reading Daniel Kahneman's book, "Thinking, Fast and Slow." He's the psychologist who gave a fascinating TED talk about our experiencing and remembering selves. The fast and slow aspects of the brain are a different, yet somewhat related, subject.
Here's what Kahneman says at the end of a chapter I finished this morning:
Some years ago, the psychologist Timothy Wilson wrote a book with the evocative title Strangers to Ourselves. You have now been introduced to that stranger in you, which may be in control of much of what you do, although you rarely have a glimpse of it.
System 1 [the fast brain] provides the impressions that often turn into your beliefs, and is the source of the impulses that often become your choices and your actions. It offers a tacit interpretation of what happens to you and around you, linking the present with the recent past and with expectations about the near future.
It contains the model of the world that instantly evaluates events as normal or surprising. It is the source of your rapid and often precise intuitive judgments. And it does most of this without your conscious awareness of its activities.
System 1 is also, as we will see in the following chapters, the origin of many of the systematic errors in your intuitions.
Thus it's easy to see how a deeply devout Christian presidential candidate can feel that God is speaking to him/her after prayerful requests for guidance have been made to the Almighty.
This isn't so different from me visiting Amazon when I hear about an interesting new book, reading the book's description and reviews, then pausing before I click on the order button, trying to discern what the correct course of action is.
Why, most of the time I hear a "Buy it!" in my brain. Must obey. The Lord of my book-buying inclinations has spoken.
Who cares if that entity happens to reside in my own unconscious? The command sure seems wise and authoritative to me. And we wouldn't be able to live our lives effectively if we ran every decision through the System 2 slow brain, the rational, analytic side of us.
Not that we have a choice. Kahneman writes:
Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2.
As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical. Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions.
The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. The premise of this book is that it is easier to recognize other people's mistakes than our own.
That's why it's so easy for me to recognize the ridiculousness of Republican presidential candidates trusting that God is telling them what to do, and so difficult for me to perceive my own unjustified intuitions.
Reason, rationality, reliance on evidence, and seeking outside opinions are ways to counteract our often-flawed System 1 "fast brain."
Religious dogma is a good example of how a notion about the world can be intuitively appealing, perhaps even seem like a revelation, yet how its truth value can begin to dim once the slower, more systematic side of our brain examines it carefully.
The lesson: trust your intuition, except when you shouldn't.