Saturday, February 11, 2017

Convincing People You Are Right (And They Are Wrong) - Part Four

We have gone over a few methods for determining whether your views are true, and then discussed how to break down and analyze those views. In this post, we are going to briefly cover a variety of points to keep in mind when considering whether your viewpoints are true. The points will be somewhat scattered, touching on a number of loosely related topics. Once we've covered them, we can start trying to convince people they are wrong.

----
When was the last time you were wrong? Not about something small, like mixing up what time you were meeting your friends for dinner, but about something you had put a fair amount of thought into? Something you had defended and strongly believed in? When did you last change your mind on something you had a strong opinion about, something where you thought people who disagreed with you didn't know what they were talking about?

For most people, I suspect coming up with an example isn’t quite so easy. This could be for several reasons. One, you’ve never made a mistake and have never had to change your mind (unlikely). Two, you are too stubborn to admit when you have made a mistake (possible). Three, you made a mistake, changed your mind, and then forgot about it (probable).

To help battle this (probable) selective memory, I find it helpful to keep a list of all the big things I have been wrong about, as a reminder that just because I'm absolutely certain about something in no way means I am right. Unfortunately, certainty and reality are not as closely related as we would like to think. The list combats selective memory, and hopefully serves as a reminder to take a breath and ease up, because you might be wrong.[1]

----
What is something that is extremely important to you? Perhaps you identify as part of a certain religion, or political party, or some other group. If you were to fill in the blank, "I am a _______", what would you say? Now ask yourself: what is more important, advancing the cause of this group, or advancing truth? If you are a good member of your group, you will probably think "my group does have the truth… I wouldn't be part of it if that wasn't the case." Fair. But pause for a second. You've just decided that your group determines the truth. What about when it's wrong? If you equate truth with what your group says, how would you ever know when the group was in error? No group's identity, beliefs, or values ever stay the same; the conservatives of today are not the conservatives of a few decades ago. Heck, liberals of the past used to advocate for eugenics. Aligning yourself with a group and deciding that it has things right is very dangerous, as it puts you in a position where being part of the group is more important than what is true.

To battle this, you have to make a commitment to what is true, not to a group you currently identify with. Truth has to be more important to you than your religious beliefs, your political viewpoints, and so on. You might be thinking, "Zak, I'm intelligent enough to see when something my group identifies with isn't true." Maybe… but research suggests otherwise. The psychologist Geoffrey Cohen wanted to investigate the link between an individual's opinions and their group identity, so he rounded up two groups of volunteers who described themselves as either strong conservatives or strong liberals. Cohen asked the members of each group to review and give their opinion on a welfare reform policy. The members of the conservative group were given a policy proposed by a Republican leader, and the members of the liberal group were given a policy proposed by a Democratic leader. No surprise: the conservatives all agreed with the Republican policy, and the liberals all agreed with the Democratic policy.

Cohen then swapped the policies and asked each member of both groups to review them. Again, to no one's surprise, the members of both groups found the policy proposed by the other party to be terrible. The policies were described as immoral, impractical, unrealistic, and so on. Finally, Cohen asked the members of both groups whether they had reached their conclusions because of the details of the policy itself, or because of the group the policy was associated with. Everyone scoffed at the suggestion that their opinion of the policies had anything to do with the group associated with them; their conclusions were based solely on the details of the policy. Both groups agreed, however, that the OTHER group would definitely fall for such group-think silliness.

Well, surprise, surprise! The policy the conservatives had first reviewed and agreed with was NOT from a Republican, but from a Democrat. Likewise, the liberals had actually reviewed and approved of a policy from a Republican. Cohen had tricked everyone into doing exactly what they claimed they would never do: support a policy simply because of the group they believed it was associated with.

Your social identity is a HUGE source of bias. The more you associate with groups that confirm your beliefs, the more difficult it will be to change your mind when you come across disconfirming evidence. And the more strongly you identify with the group, the more likely you are to dismiss that evidence, rationalize it away, or simply ignore it.

There are two ways to try to prevent being misled in such a way. First, don't identify with a group; just be you. As soon as you decide you are part of a group, you will want to defend that group, and defending the group rather than defending what is true is a huge misstep.[2] Rather than being "Zak the liberal atheist", it's better to just be Zak, and acknowledge that atheism reflects certain views I hold, and that my political views tend to fall on the liberal side of things. Like the Sam Harris quote says, don't join a tribe. That brings us to the second way: if you do find yourself part of a tribe (group), such as being a registered Democrat, remind yourself of aspects of that group that you don't identify with (or even actively disagree with). This will help you maintain a healthy skepticism.

----
Next up is the concept of Illusory Superiority: the tendency for people to overestimate their own qualities and abilities relative to others. Compared to others, people tend to view themselves as healthier, better looking, better drivers, more popular, as having happier relationships, and so on. One study found that 90% of professors polled rated themselves as above-average teachers. Of that 90%, fully 65% rated themselves in the top 25% for teaching ability. Something is clearly askew![3]
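To see just how askew, it helps to run the numbers (reading the figures, as the wording suggests, so that the 65% is a share of the 90% who rated themselves above average): 0.90 × 0.65 = 0.585, so roughly 58.5% of all the professors polled placed themselves in the top 25%, a bracket that by definition can hold only 25% of them. Likewise, if "above average" means above the median, at most 50% of professors can truthfully claim it, yet 90% did.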

To make matters even worse, people also tend to view themselves as more rational and less prone to blind spots than others! In a study with over 600 participants, only one person stated that she felt she was more prone to biases than average. Some thought they were just as biased as everyone else, but a whopping 85% claimed they were less prone to bias than the average person. You might think, "I am calm, I don't freak out, I listen to the points being made, and I don't fall for rhetoric. I am absolutely more critical and less biased than the average person." Maybe you are. But it's more likely that you're falling for the Bias Blind Spot and just don't know it. Simply put, "I am not biased" is exactly what someone blind to their biases would say.

To add one more layer: when people differ in their opinions, each party accuses the other of being biased while viewing itself as the rational deliberator. "You believe in global warming!? That's just because you listen to libtards and shill scientists. Go watch Fox News, check out these Breitbart articles, and get off the fake news train." On the other hand: "You think vaccines are GOOD!? Of course the scientists in Big Pharma's pocket would say that. You should listen to what Natural News and the Food Babe say; they do their own research, and aren't corrupted by the system."

Everyone is biased. EVERYONE. It's just part of being human. But with a little practice, and some knowledge of how blind spots work, we can take a step toward being a little less biased than before.

----
To conclude this section, remember: keep a list of things you've changed your mind about, to remind yourself that you're not as omniscient as you'd like to think. Reject the urge to identify as part of a social group, especially groups with strong ideological foundations (political, social, religious, etc.); just be YOU. Lastly, always keep in mind how biased you are about everything!

----
[1] The two things I have been most certain about in my life, certain enough to bet my life on, both turned out to be wrong. Both instances involved women I thought I would eventually marry. It's no coincidence that I was so certain, seeing that I was heavily emotionally invested in both relationships. The more emotion is involved, the easier it is to be convinced of something, even when you shouldn't be.
Of course, one might argue that relationships are a whole different ball game, seeing as they are based primarily on emotion. That's fair. We talk about love in absolute terms, and I can attest that having those feelings completely makes you believe you will be together forever. Several years ago, the musician Katie Melua changed the lyrics to one of her songs to reflect more scientific accuracy. The results were quite amusing, simply because we never hear people speak about love in the language of science.

Don't take this to mean that you should be emotionless when it comes to decision making. Contrary to what was once thought, some emotion can be very helpful in making decisions. People who are sociopaths and don't let emotion factor into their reasoning are unable to make ethical decisions. Likewise, people with damage to the areas of the brain that regulate emotion struggle to make even the simplest of decisions, such as "when should I schedule my dentist appointment: Wednesday or Thursday?" They get bogged down in minor logical details (such as "what will traffic be like on each day? Will it be different?") and can't just say "Thursday will be fine."


[2] Getting people to align with groups, even arbitrarily defined ones, is extremely easy. Researchers have found that dividing people by shirt color, or even by a coin flip, will produce hostility toward the other group (this is known as the Minimal Group Paradigm). Research has also found that groups will do things that sharpen the distinction between themselves and other groups, even when doing so hurts their own group!


[3] Most of the research on Illusory Superiority has been done in the United States, and there is some evidence that the effect is driven by culture to some extent: studies of Asian participants, for example, tend to find that they rate their own abilities as lower than those of the rest of the population.
