UPDATE: If you want to get the gist of Kurzban's book in a 17-minute video, rather than buying and reading the book, here's Kurzban explaining some central concepts about the brain that cause us humans to be hypocrites.
Having finished Robert Kurzban's book, Why Everyone (Else) Is a Hypocrite: Evolution and the Modular Mind, it's time to share some (possibly) final thoughts about the book before it gets shelved away.
The basic notion of brain modules has grown on me. Though computer analogies are risky when discussing how the brain works -- the human brain is what it is, not a computer -- Kurzban likens modules to subroutines that get called on to perform some special function.
These aren't physical parts of the brain. They're functions spread out across the brain's complex architecture.
Mindfulness practice recommends labeling: if I'm anxious about something, telling myself "worrying is happening" creates a certain distance between me and that emotion.
After reading this book, I've adapted that advice to "worry module is active," or words to that effect. For me, this works even better than "worrying is happening," because viewing myself as composed of modules reduces my sense of being a distinct, unified self.
Instead, I'm a collection of modular processes crafted by evolution. These modules mostly work unconsciously below the surface of awareness. What Kurzban calls the "press secretary" is the module most of us view as Me.
When the workings of a module become partly conscious, the press secretary describes what's going on both to the outside world and to our inside world -- those voices that speak within our head in the form of thoughts, images, and other forms of mental chatter.
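Kurzban's subroutine analogy can be caricatured in a few lines of Python. This is purely my illustration, not the book's: the module names, and the idea that only the "press secretary" produces output, are my own shorthand for his picture of mostly unconscious processing.

```python
# Toy sketch of "modules as subroutines" (names are illustrative).

def worry_module(situation):
    """Runs below awareness; flags potential threats."""
    return "threat detected" if "bear" in situation else None

def press_secretary(reports):
    """The one module that 'speaks': it narrates whatever partial
    reports from other modules reach consciousness."""
    conscious = [r for r in reports if r is not None]
    return conscious or ["all is well"]

# Most processing stays hidden; only the summary feels like "Me".
reports = [worry_module("a bear on the trail")]
print(press_secretary(reports))  # ['threat detected']
```

The point of the sketch is only that the narrating function and the detecting function are separate pieces of machinery; the narrator reports on work it did not do.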
Kurzban stresses that evolution cares about our genes being passed on to the next generation. Knowing the truth about ourselves, or feeling good about ourselves, is a lesser priority, if a priority at all.
Imagine a brain that when faced with a bear, instead of feeling all those unpleasant things like fear and terror, bathes itself in contentment. A bear... I think I will experience "flow" and be in the moment, me and the bear... I am one with the bear... ahhhh... AAAHHH!...
The bear food brain is no more plausible than the brain that arrives at various facts -- like Fred's belief that he's not going to die of cancer -- because doing so is "protective" or "feels good." Mechanisms whose function is to make someone feel good per se have no real function at all as far as evolution is concerned, since the feeling itself is invisible to selection.
For most of my life I've felt that something was wrong with me if I didn't feel good. But this assumes there is a "me" who doesn't feel good, and an "I" who worries about the inability of "me" to feel good. In line with Buddhist philosophy and modern neuroscience, the modular view of the brain undermines this assumption.
Which I find highly reassuring. It's akin to not worrying about the health of my entire iPhone when one app stops working as it should: the apps are separate entities, so a malfunction in one doesn't mean the whole device is messed up.
Similarly, an understanding of brain modules allows us to look upon our diverse thoughts, emotions, intentions, actions, and such as separate mental processes. If one module is doing something painful, distressing, or unproductive, this doesn't mean my whole psyche is messed up.
Here's Kurzban speaking about preferences.
The evidence reviewed here suggests that we shouldn't think of preferences as being things that are recorded in people's heads. Decisions are the result of the operation of different subroutines of the mind being brought to bear on individual decision problems.
For this reason, far from being consistent, choices change depending on the way different modules operate. Because modules work differently depending on context, state, and history, changing any of these things might change the decisions we observe.
...Indeed, various threads across different fields have been converging on the view that far from preferences being listed in a book in one's head, they are constructed on the fly as one is faced with different decisions.
In any case, if choices are the result of the activity of different modules, each of which operates by its own evolved logic, then it's no wonder that people's choices look so inconsistent, even reversing themselves.
It's because there's no particular reason that they would look consistent. If the context one's in, or the state one's in, turns certain modules on, then the preferences in those modules will drive one's performance. In a different context, the very same options will be evaluated by different modules, leading to the possibility of a different choice being made.
So when you ask what pen I want, you can't think that you're just asking "me" what pen I want. You're asking the modules that are active in the particular context in which you ask me.
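Kurzban's claim that choices are constructed on the fly by whichever modules the context activates, rather than read off a stored list of preferences, can be sketched in code. The module names and contexts below are my invention, chosen to echo his pen example:

```python
# Toy model: the same options evaluated by different modules,
# depending on which one the context switches on.

def status_module(options):
    # Prefers the flashy option.
    return max(options, key=lambda o: o["flashiness"])

def thrift_module(options):
    # Prefers the cheap option.
    return min(options, key=lambda o: o["price"])

ACTIVE_IN = {"on a date": status_module,
             "paying bills": thrift_module}

def choose(options, context):
    # No stored "preference" is consulted; the choice is computed
    # fresh by whichever module this context activates.
    return ACTIVE_IN[context](options)

pens = [{"name": "fancy pen", "price": 40, "flashiness": 9},
        {"name": "plain pen", "price": 2, "flashiness": 1}]

print(choose(pens, "on a date")["name"])     # fancy pen
print(choose(pens, "paying bills")["name"])  # plain pen
```

Same pens, same person, opposite choices: the "inconsistency" is just different subroutines being brought to bear on the same decision problem.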
Near the end of his book, Kurzban has a passage about hypocrisy -- an important concept, since it's in the title of the book.
Modularity explains why everyone is a hypocrite. Moral(istic) modules constrain others' behavior. The mob's moral sticks can be used to prevent an arbitrarily wide set of acts. At the same time, other modules advance our own fitness interests, often by doing the very same acts our moral modules condemn.
In this sense, the explanation for hypocrisy lies in the rather quotidian notion of competition. Organisms are designed to advance their own fitness interests, which entails harming others and helping oneself and one's allies. Hypocrisy is, in its most abstract sense, no different from other kinds of competition.
So if hypocrisy is nothing less -- but nothing more -- than a manifestation of competition, then why does it seem so offensive?
...The answer lies in the nature of morality. One might wonder why morality evolved in the first place, but a key feature of morality is that humans seem designed to accept -- even create -- rules that constrain their own behavior, as long as these rules constrain others' behavior as well.
Morality can be seen as the informal equivalent of a justice system. I'll agree to rules that specify that I can be punished for various deeds, but only as long as everyone else is subject to the same rules. This makes sense; we shouldn't expect evolved creatures to be designed to consent to limit their own options, but not others'.