Tuesday, August 14, 2007

Watts: Do unto others before they do unto you?

Peter Watts argued a couple of days ago that people are, at heart, selfish bastards who are only good when they think it will benefit them. It's an old argument, going back at least to the Socratic dialogues, and no doubt much further than that.

There is no doubt that peer pressure is an important influence on our morality. But I just can't believe it's the only influence; if it were, the world would be a far more horrible place than it is. Watts seems to think that outside of kin selection and immediate reciprocal altruism, there is no direct benefit to moral action.

This is an important issue for those of us who don't think Big Daddy God is always watching over our shoulders, ready to throw us into Hell for doing bad things. Why, after all, should we do good things if it doesn't directly benefit us? Despite the fact that there's not a shred of evidence that atheists and agnostics are any less moral than believers, we are constantly accused of it, because people just can't see any reason why we should be good.

I'd argue that the best evidence that there is a personal benefit to moral behavior, even when we're not aware of any immediate payback, is the fact that we have the urge to behave morally at all. Imagine a time when you had the opportunity to do something that you knew was immoral, and were almost certain you could get away with it. Whether you did it or not, you probably still had a sense of guilt that urged you in the direction of "moral" action. This urge to be moral is undeniable. Of course it's not as strong as our urge to eat or have sex, which is why it loses out so often. But the fact that it is there at all implies there is some evolutionary benefit to it.

But, you might point out, you didn't know for sure that no one would find out. If you could ever know with absolute certainty, you might feel nothing. But, I'd point out, that can't happen. The theoretical example of the opportunity to do bad and be absolutely certain no one will ever know remains completely theoretical. You can never know for sure whether, down the road, your immoral actions will reflect back on you negatively.

Imagine you were in a casino, playing roulette. The roulette wheel has purple and green numbers (I avoid red and black because those colors carry implicit moral associations). If the colors are split fifty-fifty, you have no more reason to pick one than the other. But if there were fifty-one purple numbers and forty-nine green, your only sensible bet would be to go purple every time, except in extraordinary circumstances (say, if someone will kill you for betting purple). In fact, even if purple had only a 0.0001 percent advantage, it would be to your benefit to go purple every time.
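The arithmetic behind the analogy can be sketched in a few lines. The setup below assumes the 51/49 wheel from the example and compares "always bet purple" against the tempting alternative of mixing your bets in proportion to the odds (a strategy psychologists call probability matching); the specific numbers are just the ones from the example, not real roulette odds.

```python
# Sketch of the expected win rates behind the roulette analogy.
# Assumed setup: 51 purple and 49 green numbers, one bet per spin.
p_purple = 51 / 100

# Strategy 1: always bet purple. You win whenever purple comes up,
# so your per-spin win probability is simply p_purple.
always_purple = p_purple

# Strategy 2: "probability matching" -- bet purple 51% of the time
# and green 49% of the time. You win only when your bet happens to
# coincide with the outcome.
matching = p_purple ** 2 + (1 - p_purple) ** 2

print(round(always_purple, 4))  # 0.51
print(round(matching, 4))       # 0.5002
```

Even with a tiny edge, committing to the favored color every time beats any mixed strategy, which is the post's point about always acting on the moral urge rather than sometimes defecting.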

Consequently, since it is always uncertain whether moral actions will reflect positively back on us from the rest of our species, we would have evolved an urge to act morally every time, subject of course to stronger urges that might overrule it. In fact, it seems obvious that this biological urge must have come before any religious or societal rules, or those rules wouldn't all be so similar.

And if people who developed the adaptation of this moral urge have survived in spite of the obvious short-term benefits of immoral behavior, it seems clear that a moral lifestyle is statistically the most likely to result in a happy life. Of course, this says nothing about what a moral lifestyle actually is, but let's face it: the important stuff is pretty obvious. The "final six" commandments, the part that doesn't involve man's relationship to God, sum up most of it.

Of course this hypothesis might be difficult to state in a falsifiable way. But then I'm not sure how falsifiable Watts' "we're all selfish bastards" hypothesis is either.

Interestingly enough, though, the connection between belief and morality is quite experimentally testable. As far as I know, most religions hold that there is a direct link between belief in a (their) deity and moral behavior. In the Abrahamic religions, this would be the belief that there is a direct correlation between the "first four" and "final six" commandments, or the "man to God" ("Thou shalt have no other God before me", etc.) and "man to man" (shalt not kill, bear false witness, etc.) commandments.

Anyone who's thinking straight should be able to think of an experiment that tests this correlation. For example, controlling for race, income, etc., you could take people who are in state penitentiaries for violations of the final six (killers, thieves, perpetrators of fraud) and a control group of people with no known offenses, then have them fill out a questionnaire about what religious beliefs they were raised with. (That's better than asking them what they believe now, since lots of people get born again in prison.) If there were a correlation between the first four and the final six, you should find a lot more believers among the non-offenders. Somehow I imagine that's unlikely.
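The analysis for an experiment like this is standard: a chi-square test of independence on a 2x2 table (offender vs. control, raised believer vs. not). Here's a minimal sketch; every count below is invented purely for illustration, not real data, and in practice you'd want a proper statistics library and the controls the comments below discuss.

```python
# Hypothetical sketch of the proposed prison study as a 2x2
# chi-square test of independence. All counts are made up.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    # Expected count per cell under independence:
    # (row total) * (column total) / n
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Rows: offenders, then controls; columns: raised believer, raised not.
stat = chi_square_2x2(80, 20,   # hypothetical offenders
                      82, 18)   # hypothetical controls

# With 1 degree of freedom, a statistic below ~3.84 (the p = 0.05
# critical value) means no detectable link between religious
# upbringing and offending -- the outcome the post predicts.
print(round(stat, 3))
```

If belief really drove moral behavior, the believer column would be lopsided toward the control group and the statistic would blow well past the critical value; near-identical proportions, as in these invented counts, would leave it near zero.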

This experiment would no doubt piss a lot of people off, and be quite contentious. To really test this, you'd need to approach it a lot of different ways. But if what I suspect turned out to be correct, that nonbelievers are no more or less moral than believers, it would be quite handy to throw in the face of the next person who implicitly accused me of being inherently immoral.

2 comments:

Denni said...

Interesting post, although the suggested experiment is a bit too simplistic because the majority of people in prison have mental health problems which would have to be allowed for.

Also, it's tricky to control for the belief systems people were raised in. Only one of the atheists I know has in fact been raised as an atheist (his father used to say: "godliness is next to clumsiness"). Most atheists were raised in some belief system.

Finally, the majority of people I know who aren't explicit atheists don't give religion a second thought, though it could still influence their moral decision-making, whereas the religious minority doesn't like to talk about it, and it probably doesn't influence their decisions any differently from the former.

Anonymous said...

'Whether you did it or not, you probably still had a sense of guilt that urged you in the direction of "moral" action.': Right on the spot.