Sunday, October 16, 2011

Thinking Clearly About Religion & Politics


When it comes to emotionally volatile hot topics, it is almost impossible to think clearly. We are prone to so many biases that I have come to think that the more certain someone is about their position, the more likely it is that they don’t know what they are talking about. I am not the first to make this observation, though. Bertrand Russell said, “The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts.” Charles Darwin also observed that “ignorance more frequently begets confidence than does knowledge.”

I first realized this several years ago, when I was asked to be the “token skeptic” on an evangelical Christian radio show out of San Diego. Very quickly, it became clear to me that the host knew one side of the argument and had absolutely no idea what he was talking about regarding the other. What bothered me so much was his certainty. He was absolutely 100% convinced he was right, and completely closed off to the possibility that he might not be familiar enough with the relevant topics to make an informed judgment.

Of course, I was convinced of the truth of the other side of the argument. However, I couldn’t even imagine being so certain. When thinking about my positions, I used words such as “likely”, “probable” and “suggested”, and I would always try to have in mind an example of something that, if true, could disprove my position. The host, on the other hand, spoke in terms of absolute certainty, and acted as if the fact that nothing could change his mind was evidence of how true his position was. It was then that I decided to NEVER let myself be so certain about something.

The theory of cognitive dissonance says that when we hold contradictory ideas, or when our behavior disagrees with our beliefs, we rationalize the conflict away so as to no longer experience the dissonance. For example, the more work you put into something, the harder it is to admit that you might have made a mistake. “I have devoted my life to this… there is no way I could be wrong! My colleagues are just idiots.” Science is aware of this problem, which is why there are safeguards such as peer review and the emphasis on independent replication of studies. Even filmmaking guards against it to a small extent. Ideally, the editor of a movie has no part in its production. He needs to be completely removed from the work that went into filming in order to make the best decisions while editing. If the director is editing, he might see a shot and say, “Ah! We spent 10 hours setting up that shot… we HAVE to use it,” even if it isn’t the best available option. The editor won’t be prone to those biases.

Now, you might be thinking “Sure, Zak. Some people are like that. But I know both sides of political/religious/etc. debates, and my opinions are firmly planted in reality.” Well, maybe. Let’s look at a relevant study:

A study by Geoffrey Cohen found that people would agree with a political policy as long as they thought their own party was proposing it. For example, Cohen took a welfare policy proposed by a Republican and presented it to a group of Democrats, telling them that the policy came from a Democratic leader. Did they agree with it? Yep. He then took a welfare policy proposed by a Democrat and presented it to a group of Republicans, telling them it came from a Republican leader. Did they agree with it? Absolutely.

But it gets better. Both groups stated that they would NEVER be influenced to agree with a policy simply because their party was proposing it. They didn’t realize it at the time, but of course that is exactly what they had just done. They did agree, however, that the OTHER party surely would be influenced by such tricks.

What this study (as well as others like it) shows is that our opinions of political policies have very little to do with the actual policy; they are mostly a product of social identity. You are part of party X, so you agree with whatever party X says… unless, of course, that same thing was said by party Y. THEN you are against it.

Here is a real-life example. Democrats agree with Obama; no surprise there. However, when Reagan says the same thing, Republicans agree with him. There is little doubt that those same Republicans would disagree with Obama’s similar statements, simply because they come from Obama. How is this possible? Social identity.

Lastly, a 2004 study placed “strong Democrats” and “strong Republicans” in a brain scanner while they listened to speeches by President Bush and presidential candidate John Kerry. Both groups listened to instances in which the candidates clearly contradicted themselves. The Democrats blamed Bush for his contradictions but let Kerry off the hook with some mental gymnastics; the Republicans did the same in reverse. What was most interesting, though, was what was happening in the brain at the time. The researcher concluded, “We did not see any increased activation of the parts of the brain normally engaged during reasoning. What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion, and circuits known to be involved in resolving conflicts. Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones.”

Do you hold the positions you hold because you have thought them through, or because they feel good? According to the study, the answer is the latter. You agree with a position for emotional reasons and THEN come up with reasons to support your stance.

So what now? Are we to fall back into a sea of postmodern relativism, where there is no truth and no way to make decisions because we are too overwhelmed by bias? Absolutely not. What needs to happen is for people to become aware of these biases and actively work to circumvent them. Here are some tips on how I think one should go about it.

  1. Whenever you hear a politician from the other party say something that you don’t like, stop for a second. Imagine your reaction if it had been stated by your favorite politician, and think about how you would feel about it (and vice versa-- do this when your favorite politician says something you like). I have used this tactic for several years, and it is amazing to see just how biased I really am when it comes to who says what. But this technique has also helped me define exactly where I stand on various issues.
  2. Realize where the biases in the media are. Research has shown that the most left of center news outlet is the CBS Evening News, while the most right of center is Fox. The three most neutral outlets were PBS’s NewsHour, CNN’s NewsNight and ABC’s Good Morning America. However, the most politically central news source of all was found to be USA Today. If watching the news, try and find a balance and take everything from the CBS Evening News and Fox with a grain of salt. Or, do as I do, and don’t even bother with them to begin with!
  3. Make an effort to understand and fairly represent the other side’s position. The mathematical psychologist Anatol Rapoport said that when it comes to criticizing your opponent, you need to be able to restate their position so clearly, vividly, and fairly that your opponent would absolutely agree with it, and would even thank you for stating it so well. Only once you have done this will your criticisms be valid. So when thinking about how foolish the other side’s position is, ask yourself, "If I were to describe my understanding of the other group’s position to them, would they agree with my assessment?" If the answer is no, you are biased and need to rethink your position more carefully.
  4. Avoid organizations, clubs, or groups that are solely dedicated to a political or religious idea. Nothing breeds bias more than being surrounded by people who already agree with you, who will pat you on the back for slamming the other side, who won’t offer you any criticism, and who will reward you for continuing to believe what everyone else already believes. Of course, I realize that this is basically saying “if you want to be an unbiased religious believer, stop going to church.” And I stand by that. Likewise, I avoid atheist and skeptic groups, since I have learned that they don’t help me think better, but only serve to reinforce beliefs I already have-- even if those beliefs are wrong. Ask yourself, “What is more important: fooling yourself into thinking you are right, or actually being right?” Being objective is hard, and very few people care much about it.
  5. Have friends on the other side of the fence, be it political or religious. It is easy to get caught up and think that “all Democrats are idiots” or “all Republicans are greedy.” We slip into these “us versus them” mindsets extremely easily, and it is important to realize that people on the other side are just as convinced of their position as you are. It is also important to put an actual face to the position you oppose, so as not to turn the other side into some abstract villain.
Of course, I don’t mean to come across as some perfectly rational machine. I surely am not. However, I am very interested in learning which biases affect people, and I realize that I am just as susceptible to them as everyone else. Still, knowing these biases exist absolutely helps me fight against them, in an attempt to be just a little more rational. To quote G.I. Joe, “knowing is half the battle.” The other half is putting that knowledge into practice.

Good luck!

6 comments:

  1. Yo, wake up and smell the coffee, man! We live in a very politically polarized world. Some of it can be attributed to our wonderful "pluralist" party system and how the media dictates the parameters of so-called "legitimate discourse". Yet some of it also originates from the innate human desire to be a part of a group. I think it's difficult sometimes to differentiate between those with biases and those who have decided to take an ideological stand. I think having a bias isn't inherently negative; rather, it just conveys your upbringing and where your interests lie. Biases can only be harmful if you haven't explored alternative/contrasting opinions or viewpoints. For instance, I'd rather watch the Red Sox play than the Yankees, even though they are both valid baseball teams with comparable talent, because I grew up in New England. I think we can both agree that biases don't necessarily impede one from making a sound judgment, but they can if you don't challenge them. It's almost impossible to separate bias from any decision anyone makes. I feel like throughout this essay you're stressing for people to challenge their biases constantly, implying that those who have biases can't make informed decisions, or struggle to. I agree to some extent.

    P.S:
    Also, it seems you've made the decision that a rational decision is better than one that is irrational. That's a bias right there! haha

  2. I am not exactly sure what me waking up and smelling the coffee is in reference to. My blog post is ABOUT the world being politically polarized and how to think more clearly in it.

    I didn't say all biases are bad. This blog post is about biases that can come into play when thinking about religion and politics... Hence the name of the blog post. Of course, some biases, like you preferring the Red Sox, don't matter at all. And that is why I didn't write anything about them.

    I disagree that biases don't necessarily impede one from making a sound decision. That's what a bias is-- something that gets in the way of making a sound decision. That is why it is important to know about the biases and work to correct them.

    The claim that my preferring rational decisions to irrational ones is itself a bias is incoherent. Trying to argue against rationality by presenting a rational argument? Nope, that doesn't work.

  3. I think it's a lot harder to discern one's biases than you suggest in this essay. And my argument about you favoring pragmatic thinking or rational solutions over irrational ones is valid.

  4. I agree, it is very difficult to see your own biases. Though, you have to start somewhere.

    No, your argument isn't valid. In fact, it's not even an argument, just an assertion. If you tried to make it into an argument, you would be undermining the whole thing, since you can't use a rational argument to argue against rationality.

    Plus, it's just silly. No one on earth would ever say they prefer an irrational decision to a rational one.

  5. I think your article here makes a good point. The difficulty comes in that no side can claim to be the "rational" one since it depends on the reasoning behind the idea - not the idea itself.

    To use the evolutionist/Christian debate as an example - I've met and discussed that topic with quite a few people over the last few years, and if I've learnt anything, it's that the ignorance and arrogance are general.

    The people who claim to be evolutionists rarely know anything about the topic beyond the fundamental concept behind the theory. Likewise, those on the Christian side know virtually nothing of evolutionary theory and, in many cases, know very little of biblical scripture.

    In many ways we seem to live in an age of arrogance - which is particularly galling given that information is so freely available now. People draw conclusions without bothering to research (something that I, like most, have been guilty of in the past - but I now keep that objective foremost in mind when broaching new ideas or researching existing ones). I think the true mark of an intellectual is their commitment to impartiality and study - which is especially difficult in topics of controversy.

    Anyway, those are my thoughts on the matter. I'll look forward to future posts.

  6. Thanks for the comment, Peter!

    You are right, there are many people who don't know what they are talking about, and who feel that simply because they have a strong opinion, that is good enough reason to hold forth on the topic. This effect has actually been studied quite a bit... it is called the Dunning-Kruger effect, and it can be summed up by quoting Darwin: "Ignorance more frequently begets confidence than does knowledge."

    I think part of it can be blamed on the internet (though I also think the internet has helped spread knowledge just as much). People will watch a YouTube video about X and suddenly think they are experts on it.
