Thursday, August 26, 2021

"Why The Other Side Won’t Listen to Reason"
by David Cain

At some point during your first year as a human being, the adults throw a real curveball at you. They expect you to start understanding what right and wrong mean. These lessons come in the form of mysterious reactions that follow certain things you do. After you pull all the books from the bottom shelf onto the floor, quite a feat for a one-year-old, they scold you for some reason. When you pee in the correct place, they praise you. It’s completely baffling, but over time you get a sense that adults are extremely preoccupied with classifying actions into two broad categories: okay and not okay, or good and bad.

You quickly gather this is how the world works. And there is some logic behind what’s rewarded and what’s punished: “bad” actions are usually (but not always) ones that hurt, annoy or inconvenience other people, and “good” actions usually (not always) help in some way, or at least don’t hurt anyone.

This classification system is so strongly emphasized by the adults that you develop a keen sense of it yourself. You see rights and wrongs everywhere, particularly where you stand to gain or lose something personally: in the fair distribution of treats, in acknowledgement for chores done, in which cartoon characters deserve to be happy (or in a police wagon) at the end of the episode. 

Seemingly everything is morally relevant. There are right and wrong ways to speak, play, fidget, ask for things, touch people, and express your feelings. The rules are endlessly detailed and idiosyncratic. There are right and wrong places to sit or stand, things to wear, things to stare at, even expressions to have on your face. Some acts are okay in one place and very bad somewhere else. The adults insist that navigating this sprawling bureaucracy is simple: just be good.

You make use of this system. You argue your case to your parents when your sibling takes something of yours, or plays with a coveted toy too long—if you feel slighted, there must be wrongdoing, and you say so, perhaps listing reasons why you’re right. You petition teachers to take action against other kids who are being greedy, annoying, or mean, and you defend yourself when you’re the one being accused.

There’s Something Fishy About the Way We Judge

By adulthood, morality has become such an intuitive part of our thinking that we barely realize when we’re making a moral judgment.

Hundreds or thousands of times a day we assess the character of another person. We feel we know enough to commend or condemn (usually condemn) a person from the way they park, a word they chose to use in their comment, the state of their front lawn, how they stand in a queue, what they laugh at, where and when they look at their mobile phones, how long they take to get to the point of their anecdote, or any of ten thousand other morally salient micro-actions.

Our moral sense works with great speed and force. Every news article - even the headline alone - gives us a strong, immediate, and seemingly unmistakable sense of who the good and bad parties are. Virtually every time we feel annoyed, we reflexively assert some wrongdoing on the part of another human being, even if it’s someone we’ve never seen. If service is slow, some employee is being lazy or inconsiderate. If traffic is crawling, it’s because the city always schedules construction work at such stupid times. If an item’s price is unexpectedly high, some greedy CEO is getting paid too much.

There’s something fishy about all this moralizing. We treat our moral feelings and judgments as though they’re truly all-important; seemingly, nothing deserves as much energy and attention as determining the right and wrong of everything done and said in the human world, and lamenting that world’s failure to meet our idea of what’s right. (For endless examples, just check Twitter.) Yet for all their importance, we’re extremely flippant with our moral judgments. We make them all day long, with ease and even a kind of pleasure, and very little second-guessing. Maddeningly, other people hold almost perfectly opposite positions on the same moral issues - drug policy, immigration, pornography, whether mayo belongs in guacamole - and they cast their judgments with all the same ease and certitude.

You’d think that if determining right and wrong were truly what’s important to us, we’d be far more careful about making judgments. We’d want to gather a lot of information before saying anything. We’d seek opposing viewpoints and try to understand them. We’d offer people the benefit of the doubt whenever possible. We’d be very wary of our initial emotions around the topic, and very interested in how our personal interests might be skewing our conclusions. We’d refrain from drawing conclusions at all if we didn’t need to.

In other words, we’d employ the same reserved, dispassionate, self-scrutinizing ethic we use to examine questions about anything else: physics, history, biology, engineering, business, or any other arena of understanding where premature conclusions can create a big problem. We’d have a keen, ongoing interest in learning how we might be wrong.

But we’re not like this at all. We draw moral conclusions freely, immediately, and without self-scrutiny, recruiting as much emotional tilt as possible. We dismiss counterpoints reflexively, as though it’s dangerous to even consider changing our minds. We only rarely admit that an issue is too opaque or complex to be sure what to think.

Why are we so smart and careful when it comes to figuring things out in most areas of inquiry, and so dumb and impulsive when it comes to moral questions, which are supposedly the most important ones to get right?

Why We’re So Stubborn

Social psychologist Jonathan Haidt sheds a lot of light on our confused moral psychology in his book, "The Righteous Mind: Why Good People Are Divided by Politics and Religion." It’s a fascinating read, but the main punchline is that our moral sensitivity didn’t evolve in order to make us good at determining right and wrong. It evolved to help us survive and thrive in highly social environments.

Our moral feelings are quick and reactive because they developed to aid us in real-time social interactions, not in careful, solitary periods of reflection. These feelings are often conflicting and illogical because they adapted to meet a number of different social goals:

• Our desire to protect the vulnerable, and our hatred for cruelty and carelessness, adapted to motivate us to keep children safe at all costs, and keep potentially dangerous people away
• Our resentment for cheating and unfairness adapted to help us avoid getting exploited by the rest of our group
• Our respect for loyalty, and our fear of betrayal, evolved to help us form coalitions, and identify disloyal people before they make trouble
• Our attitudes towards authority, and those who subvert it, conferred an advantage in positioning ourselves within social hierarchies
• Our moralizing around cleanliness and the sanctity of bodies, sex, and bodily functions, adapted to help us avoid infection and disease

It’s no wonder our moral intuitions are so strong, quick, and often thoughtless. They are essentially survival reflexes, conditioned by our upbringing and our instincts.

Our moral reasoning - our capacity to explain why something is right or wrong - comes only after our emotional intuitions, if at all, and is tuned for persuading others of our value to the tribe, not for helping us find the most sensible moral stances. Haidt describes our moral reasoning as working much like a press secretary or company spokesperson - its purpose is to justify positions and actions already taken, using any explanation that sounds passably good in the moment, true or not.

Note that none of the above social goals require our moral feelings to be fair or logically sound, and in fact, that can be disadvantageous - a tribe that viewed all outsiders as predators likely would have protected its children better than a tribe that was most concerned with never falsely accusing someone of being dangerous.

In other words, our moral intuitions are strongly tuned to make us groupish and tribal, not even-handed and insightful. And our moral reasoning is tuned more for soliciting approval from others than for actually discovering moral truths.

This explains why we’re so susceptible to rhetoric, prejudice, selective hearing, and fake news. It also explains why it’s strangely pleasurable to take hard moral stands, no matter how poor or nonexistent the reasoning behind them - hard stands, declared publicly, reliably generate a small flood of praise and approval from the tribe that shares those positions.

You can see what a powder keg this moral psychology is liable to create in an increasingly global, internet-connected society, composed of people from many different backgrounds, all of whom enjoy getting Retweeted, Liked, and Favorited.

It’s why, when it comes to politics, the other side simply doesn’t listen to reason. Of course, all of us are on someone’s other side.
