
Saturday, October 29, 2011

Miss? Excuse me, Miss? Someone wants you to "represent."

Is it a bit hasty to write about a movie you've not yet seen? Perhaps, but this one I'm eager to see, and it will be televised again on November 12th, so perhaps there will be an opportunity. Miss Representation is a documentary made by Jennifer Siebel Newsom on the subject of how women are portrayed in media, making a case that both on and off screen they are under- and misrepresented. Newsom is an actress herself, and described some of that experience to Mother Jones in a recent interview:
Something I've learned myself in making this film is sometimes people have a hard time listening to what we have to say because they're so concerned about how we look. I think that's a challenge that women in particular have in our culture. . . 
I started acting at the age of 28, and my agent told me to lie about my age and take my MBA off my résumé. I didn't do either, but that was like, whoa! I thought there was value in being smart. I thought there was value and wisdom in getting older. We're challenging the culture in Hollywood that is all about youth, youth, youth, beauty. Because not even that's healthy. I don't care how much plastic surgery people have: At some point, they're going to die. We're all going to age somewhere on our body, and we may as well accept that and embrace it. I mean, aging is a beautiful thing; wisdom is a beautiful thing. Frankly, as a woman who's getting older in our culture, I want to see stories about women who are before me, so I can be inspired—because someday I'll be there. 
Older women. Smarter women. Diverse women. More women-- I was rather shocked a few years ago to learn of the Bechdel Test, which first appeared way back in 1985 in the comic strip "Dykes to Watch Out For," in which a character lays out the rules of the author's friend for being willing to see a movie-- rules simple enough to put into code, as the sketch after the list shows:
  1. It has to have at least two women in it,
  2. Who talk to each other,
  3. About something other than a man.
  4. (Addendum: who have names)
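A quick aside for the programmers in the audience: the rules above are mechanical enough to write as a function. Here's a toy sketch in Python-- the movie record and its fields are invented for illustration, not drawn from any real database:

    def passes_bechdel(movie):
        """Toy check of the three rules plus the names addendum.
        `movie` is a hypothetical record: a list of conversations, each
        listing the named women taking part and the topic discussed."""
        for conversation in movie:
            two_named_women = len(conversation["named_women"]) >= 2  # rules 1-2 + addendum
            not_about_a_man = conversation["topic"] != "a man"       # rule 3
            if two_named_women and not_about_a_man:
                return True
        return False

    # One qualifying conversation anywhere in the film is enough to pass:
    alien = [{"named_women": ["Ripley", "Lambert"], "topic": "the alien"}]
    print(passes_bechdel(alien))  # True
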
Deceptively simple, right? You'd think that most movies would pass this easily. But not really:


Note that the word used here is not "conspiracy," but "systemic problem." The claim isn't being made that male directors, producers, screenwriters, and so on have a conscious agreement to keep women down. For that matter, Newsom's stated concern in making Miss Representation is not solely about women. She notes that 
it's really interesting now, being a mother raising both a son and a daughter, in a culture that objectifies women and sees women's value in their youth or beauty or sexuality, and not in their ability to lead. And in a culture that values a hypermasculinized version of what it is to be a man. At MissRepresentation.org, we're about creating a dialogue about how we empower women and men, or girls and boys, to find their own path and to find value in a plethora of attributes and possibilities for who they can be.
You can't say something about women without also saying something about men. Portraying women as helpless things that need to be rescued, for example, means portraying men as having to be powerful rescuers. Ideals of masculinity are constructed and enforced in media, too. It's fun to unplug your brain and enjoy an action movie (a genre which, needless to say, predictably fails the Bechdel Test), so long as you're fully aware of how distant the impressions in it are from reality. But not everybody is, and it's hard to expect them to be if they don't have other images to look at. And so far as I know, there's no law that says explosions and boobs have to go together. Just saying.

Speaking of that-- Newsom says she's fighting the objectification of women (boobs), and that can lead to some bad places. Conservatives and liberal feminists, as has often been noted, make strange bedfellows when they come together to decide that showing off bodies for aesthetic appreciation and titillation is inherently wrong. It would not encourage girls and women to "find their own path" and the "value in a plethora of attributes and possibilities for who they can be" if women who decide that their path is to show off their bodies are shamed and/or forbidden to do so. I believe that a feminist wagging her finger at an actress who opts to go topless in a film is no less moralizing than a fundamentalist who does it. So here's hoping Newsom doesn't intend to wag fingers so much as point to opportunities. We would be making progress if more women showed up in general, and if people were less inclined to think less of them according to how much clothing they have on. Slut-shaming is not a feminist pursuit.

Having said all of this, I am eager to see Miss Representation when possible. If possible. If you have the means, you can catch it on the Oprah Winfrey Network at 11am ET on November 12th.

Sunday, October 23, 2011

RPGs and skepticism (Sunday fun post)


If you really aren't interested in video games at all, you....probably won't bother reading this post. But if you're somewhat interested in them but don't know much about them, you might not know that this weekend has been BlizzCon, the annual convention held in Anaheim, California by Blizzard Entertainment. Blizzard's most famous and far-reaching games are Diablo, Starcraft, and most importantly for this topic, World of Warcraft.

A role-playing game, or RPG, is any game in which you're expected to adopt the role of a specific character and control him or her throughout, advancing him or her in ability by leveling-- accumulating experience points which make that character stronger, smarter, faster, wiser, etc. and therefore able to accomplish more difficult tasks and battle stronger adversaries. Dungeons and Dragons is the most famous table-top RPG, and World of Warcraft, I think it's safe to say, is the most famous video game RPG.

World of Warcraft is also an MMO (massively multiplayer online game), which means that your character is always interacting with those of other people in real time. In that sense the character represents you-- the faction, race, gender, class, and appearance you choose are all used as information about you. Having made all of those choices, you can decide whether to role-play (always speak and act in game as if you are actually your character) or whether to talk about your character with some degree of remove. Most people opt for the latter, whether by speaking explicitly in third person ("he/she," "my character/toon," or "(character's name)") or by speaking in first person but using game terms and clearly speaking as a player rather than a character. It's common to see the two combined, as with a person saying something like "Is it more important for my rogue to have attack power or a better critical hit chance? I'm trying to decide which pair of boots will get me a better bonus."
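If the mechanics are unfamiliar, the whole leveling loop fits in a few lines. A toy sketch in Python-- the XP numbers and the formula are invented for illustration, and bear no relation to WoW's actual math:

    class Character:
        """Toy RPG character: accumulates experience points, levels up."""
        def __init__(self, name, race, char_class):
            self.name, self.race, self.char_class = name, race, char_class
            self.level, self.xp = 1, 0

        def xp_needed(self):
            return 100 * self.level  # invented curve, not Blizzard's

        def gain_xp(self, amount):
            self.xp += amount
            while self.xp >= self.xp_needed():
                self.xp -= self.xp_needed()
                self.level += 1  # stronger, smarter, faster, wiser...

    rogue = Character("Stabby", "Troll", "Rogue")
    rogue.gain_xp(250)
    print(rogue.level, rogue.xp)  # level 2, with 150 xp banked toward level 3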

[Image: "Clearly for the Horde, this one"]
So obviously the degree of immersion varies a great deal. And it's not a new topic for RPG gamers at all-- it has been discussed to death, including for the purposes of armchair psychoanalysis: do people who play a character of a different gender secretly want to be that gender? Do they play a race that is more attractive (by human standards) because they want to be accepted, or an uglier one because they like being non-conformists? If they pick a plain ol' human to play rather than something like an orc, does it mean they lack imagination, or are people who play orcs afraid to be themselves? And of course-- are people who play races like human, elf, dwarf, or gnome (the Alliance faction in WoW) good, and people who play orcs, trolls, undead, and goblins (the Horde faction in WoW) bad?

"Class" is the term for the means by which your character defends him/herself and others against the world. Do your powers come mainly from armor and big scary weapons? From your ability to melt into the darkness and evade attacks against you?  Or perhaps from your ability to manipulate magic? Magic generally comes from two distinct sources-- arcane (from energy existing in the universe which can be focused and manipulated) or divine (from, quite simply, the gods). Again, people like to psychoanalyze this choice-- are you a rogue because you enjoy stabbing people in the back? A warrior because you're a control freak? A mage because you're physically weak and like the thought of summoning power from something else?

The magic aspect is what makes the Twitter exchange at the beginning of this post interesting. As you may be aware, PZ Myers and JT Eberhard are both atheist activists-- very outspoken ones. Given that atheists joke all of the time about being evil to mock the public perception that they are, it could be expected that they would be drawn to play the underdogs, the misunderstood: the commonly-perceived-as-evil Horde. And given their rejection of supernatural powers of any kind, they could be expected to have no attraction at all to a class like Priest, which uses divine energy to heal other players but also to attack enemies. In Dungeons and Dragons, when you play a priest-- a cleric, as they are called there-- you choose a god or goddess to serve, and those of us who have played remember fondly the book Deities and Demigods, which not only described and visualized countless gods, both from existing mythologies and created especially for the game, but gave them in-game attributes and abilities. So, for example, you could decide to serve the Egyptian god Ptah, creator of the universe (alignment: lawful neutral), or perhaps the Norse goddess Freya, representing love and fertility (alignment: neutral good). I'm sure I'm not the only one whose interest in mythology as a kid was encouraged by this book.

In WoW, by contrast, the powers of a priest fall under the general category of Holy, and their description is as follows:
Priests are devoted to the spiritual, and express their unwavering faith by serving the people. For millennia they have left behind the confines of their temples and the comfort of their shrines so they can support their allies in war-torn lands. In the midst of terrible conflict, no hero questions the value of the priestly orders.
So in one RPG we have the existence of gods asserted, and their attributes described quite explicitly, whereas in the other it's...well, a little more esoteric. WoW does have its own very complex assortment of demigods as well as some authentic deities, including Elune, goddess of the night elves. The races in WoW have their own cultural mythologies, but becoming a human priest (for example) does not require you to sign up for allegiance to anyone in particular. Nor does becoming a paladin (holy warrior), druid, shaman, or-- in the next expansion-- a monk. Mages and warlocks are also magic users, of course, but their powers come either from their own abilities specifically or from harnessing the (often unwilling) assistance of demons. In this world, it's more like a messy confluence of hierarchies of non-physical power....for basically everybody except warriors, rogues, and (for the most part) hunters, who rely on either their own brute strength and agility or that of their pets.

So strictly speaking, ought not a skeptic who is determined to remain a skeptic in-game be suspicious of most of these classes?  Priests and paladins are the ones who connect their abilities most directly to divine power (because in the WoW universe, "healing" = "holy"), but almost everybody's drawing on the supernatural in some way or another. The skeptic would, and should, ask: how do they know?

Well, it's a game. A fictional universe-- its terms are its own, and this game has gods, god dammit.

That's one answer. Another answer is that in this universe, the power of spells has been repeatedly tested and applied, and found to exist, in one form or another. A skeptic, upon observing this happen or (ideally) performing the rituals and observing the results for him/herself, would be compelled to believe in the existence of....something. And of course, that "something" is the tricky part. How much would a scientifically-minded denizen of Azeroth be able to confirm, assuming he/she had the luxury to think on the matter intently in between fighting off incursions from the Horde or the Alliance (depending), as well as the multiple itinerant tribes, beasts, demons, elementals, and constructs roaming the land? His/her main concern, of course, is going to be for what works-- what produces results. Most spells are performed either to damage an enemy or to provide a buff (protection, fortification) to oneself or others. If the only way to achieve that effect is by using reagents and/or incantations in a specific way, that can be tested and confirmed. Right?
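That procedure is just a controlled trial, and it's short enough to sketch. In Python, with a toy world whose "laws of Azeroth" I've invented outright-- the probabilities are illustrative, certainly not game data:

    import random

    def cast_fireball(reagent, incantation):
        # Hidden law of the toy universe: the spell fires 90% of the time
        # given both reagent and incantation; targets combust spontaneously
        # 2% of the time regardless. (Invented numbers, obviously.)
        works = reagent and incantation
        return random.random() < (0.90 if works else 0.02)

    def success_rate(trials, **conditions):
        return sum(cast_fireball(**conditions) for _ in range(trials)) / trials

    print(success_rate(1000, reagent=True, incantation=True))    # ~0.90
    print(success_rate(1000, reagent=True, incantation=False))   # ~0.02
    print(success_rate(1000, reagent=False, incantation=False))  # ~0.02 (control)

A real in-world skeptic would want bigger samples and better controls, but the shape of the procedure is the same.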

But it can't be confirmed as the result of divine power, and that's the rub. Even in a world where mysteriously powerful beings exist, the infiniteness of their abilities still can't be confirmed by finite beings. Which might be why, one could surmise, RPG designers (regardless of platform) don't spend a lot of time or space proclaiming the "omni-ness" of the gods involved in them.

Sunday, October 9, 2011

Biased != wrong

Let me say this, right from the start: I love biases. No, I don't love that they exist, but I think they're endlessly fascinating. I love thinking about them, identifying them, figuring out where they come from. Studying biases is how I came to the realization that the way we generally think about human reasoning is mistaken. Humans are not rational creatures who occasionally succumb to a bias which perverts their ordinarily sound, logical thought processes. We are creatures who are practically made of bias, for whom attempts at objectivity (or as close as we can get to it) are counter-intuitive and require effort. A 2003 paper by psychologists Martie Haselton and David Buss on biases in social judgment begins:
Humans appear to fail miserably when it comes to rational decision making. They ignore base rates when estimating probabilities, commit the sunk cost fallacy, are biased toward confirming their theories, are naively optimistic, take undue credit for lucky accomplishments, and fail to recognize their self-inflicted failures. Moreover, they overestimate the number of others who share their beliefs, demonstrate the hindsight bias, have a poor conception of chance, perceive illusory relationships between noncontingent events, and have an exaggerated sense of control. Failures at rationality do not end there. Humans use external appearances as an erroneous gauge of internal character, falsely believe that their own desirable qualities are unique, can be induced to remember events that never occurred, and systematically misperceive the intentions of the opposite sex
...to give just a few examples. Approaching the matter from an evolutionary standpoint, they then go on to suggest that these biases are not necessarily "design flaws" (maladaptive traits) but actually features. A suggestion they make in that paper, in agreement with psychologists Leda Cosmides and John Tooby, remains, in its simplicity, one of my favorite things to quote: "[An evolutionary perspective] suggests that the human mind is designed to reason adaptively, not truthfully or even necessarily rationally." What does that mean in practice? Well, that understanding the world as it really is, and thinking about it in the most logical possible way, is not necessarily the most efficient way to get your genes into the next generation. Rather, the specific lies we tell ourselves actually make it easier for us to get food, avoid being killed, find mates, and reproduce. If this is the case, we should expect to see people lying to themselves constantly...and we should expect to find ourselves doing the same.

That's kind of a discomfiting thought. But you get over it. Reading Mistakes Were Made (But Not by Me), for example, had me grinning (though occasionally ruefully) to notice examples of self-justification bias and other means of avoiding cognitive dissonance that I've been guilty of numerous times. It still doesn't remove the sting of being accused of bias by others, especially by people who believe that to be sufficient to discredit what you're saying. And that is what I've been getting at in this post.

See, Ben Radford has a very good essay up today at SheThought about the accusations of bias he has received on behalf of virtually every group he has written about, whenever someone found something objectionable in his articles. Most recently it has been complaints about his discussion of a soon-to-be-published book for children called Maggie Goes on a Diet, accusing him of bias for not denouncing the book (before reading it, by the way-- none of these commentators have had the opportunity to read it yet) as harmful to young girls' health and self-image. Radford remarks
I don’t mind the criticisms, it’s the bias accusations that annoy me, and it’s instructive to briefly analyze them. When I question claims about aliens and UFO photographs, critics assert that the only logical reason I would do so is because I have a bias or agenda as part of a government conspiracy to keep the truth from the public. When I question claims about alternative medicine and homeopathy, it’s not because I have researched it and know a lot about it, but because I’m being paid by Big Pharma. When I question claims made by psychics, critics say it’s because I have a bias toward protecting the scientific status quo—or that if I were to accept the reality of psychics it would devastate my worldview. And when I question claims about the links between media images and eating disorders, it can’t be because I know something about it—having studied it for years and written a book about the mass media—but because I hate fat people.
Whether Radford actually is biased against fat people, or whether Maggie Goes on a Diet is, is not the point here. As I said, the book isn't out yet, but you can read his article about the protest against it for Discovery here, and the rest of his reaction to criticisms at the link above.

The point here is that we all have biases. And there is no harm in pointing them out-- in fact, it's always instructive and useful to do so. However, the simple fact of having biases does not make someone wrong. It might provide some useful psychological information in terms of why they're wrong...or why they're right. But it doesn't tell you which one they actually are. From one of my favorite Ed Brayton posts:
Everyone is biased. If one's bias leads them to make fundamental errors in reasoning, then point out the errors in reasoning. If it leads them to ignore relevant data or distort the nature of the evidence, then point those things out specifically. If you can't do either of those things then the accusation of bias doesn't tell you anything about the validity of the claims being made. This is merely a cognitive shortcut to dismiss someone out of hand rather than engage the arguments being made. 
So here's the quote from CS Lewis that sums this up perfectly:  
"You must show that a man is wrong before you start explaining why he is wrong... Suppose I think, after doing my accounts, that I have a large balance at the bank. And suppose you want to find out whether this belief of mine is 'wishful thinking.' You can never come to any conclusion by examining my psychological condition. Your only chance of finding out is to sit down and work through the sum yourself... If you find my arithmetic correct, then no amount of vapouring about my psychological condition can be anything but a waste of time. If you find my arithmetic wrong, then it may be relevant to explain psychologically how I came to be so bad at my arithmetic..."
Spot on. And very useful. The point is that you must first engage the argument on its own terms. Once you've defeated the argument, then it's reasonable to point out that the inaccuracy of the claims may have been due to bias, or wishful thinking, or fear. But until you defeat the argument, you're not really saying much of anything.

Saturday, October 8, 2011

Spokespeople

[Image: "Not allowed to be right about anything."]
After posting a clip from Bill Maher's show in which the comedian mocks the Republican presidential candidates, Ed Brayton got some flak from readers complaining that they wouldn't watch it because Maher has established himself as having some pseudoscientific views, specifically being anti-vaccination. When Ed expressed confusion about why that would mean refusing to listen to Maher's comments on a completely unrelated issue, one response was:
Well Ed, sometimes people are annoyed to find that an idiot agrees with some of their positions. They feel, for whatever incredibly odd reason, that having a blithering idiot as a spokesperson is not the very best strategy they could hope for.  
Examples for me are Maher and Hitchens. Great to hear them when they support me — both have a way with words (Hitchens even writes his, maybe Maher does his as well) — but I will always feel that their effectiveness is undercut by the fact that on some major — really major — issues they are idiots.
My reaction to this is utter bafflement. First, that the options are apparently "supportive of my views" or "blithering idiot." But mainly because I don't understand why appreciating some things a person-- especially a comedian-- has to say means that you have somehow adopted him/her as your spokesperson. What a mantle of responsibility to lay on someone you've never met, let alone employed to speak for you! It's as though public figures aren't allowed to have views of their own.

The ad hominem fallacy is not, as many confusedly believe, a statement that an argument is fallacious when it involves an insult to someone. It's a statement that it's fallacious to believe that you have discredited someone's argument by disparaging something irrelevant about them, usually his/her character: Joe may sound like he has a good point about the feasibility of legalizing marijuana, but he's a thief-- don't listen to him. Ronnie has a lot to say about public healthcare, but he's a Republican and so can't be trusted. Bill Maher might have something funny/insightful to say about the presidential election, but he's an anti-vaxxer so I won't listen. What?

If someone has valuable thoughts to offer, those thoughts can be appreciated without adopting that person's worldview wholesale and allowing him/her to speak for you on every matter. It's important to realize this because no one in a position to advocate publicly for ideas that you hold dear is going to agree with you on everything. Nor are they infallible-- everyone is wrong about at least a few things. At least. They're not even infallible in whatever area you find them to be dead-on. Christopher Hitchens is remarkably politically savvy, but that doesn't mean he was correct in endorsing the invasion of Iraq. Nor, for that matter, did such an endorsement make him an idiot-- it simply made him (in my view) wrong. Richard Dawkins knows his arguments for and against the existence of God, but that doesn't mean he was right to chastise Rebecca Watson for speaking out about a creepy come-on at a conference for atheists. There's a certain irony in the fact that both Hitchens and Dawkins are considered icons of the skeptical movement, yet so many within that movement are reluctant to be skeptical of them. It's as though continuing to evaluate claims on their own merits rather than on the people making them is just too taxing, and so, after identifying some heroes of skepticism, people are content to turn their brains off and allow those heroes to do the thinking for them. Not very skeptical, that, whether you agree that Hitchens and Dawkins were wrong on these specific issues or not.

Having opted out in many realms of life so far, and being very accustomed to the idea, I realize that my repugnance for the practice of uncritically adopting public figures as spokespeople is uncommon. But it's something that should become more widespread, if ideas are really what matter rather than the mouths from which they come. A good idea is a good one no matter who is expressing it, and the same in reverse for a bad one. Let's not be afraid to criticize our heroes when they're wrong, or be too willing to dismiss everything a villain has to say as false and invalid. That's what intellectual honesty requires.

Tuesday, October 4, 2011

The phenomenon of the petty tyrant

Researchers at the Stanford Graduate School of Business decided to examine the relationship between status and power in how people treat each other. So they organized a study in which participants were told they would be working on a business exercise with another student, and each participant was randomly assigned a role in the project with a different rank, from a low-status "worker" role to that of a high-status "idea producer." In these exercises the participants were to give orders to their partners, with varying degrees of respect conveyed-- some orders being more demeaning than others. What the researchers found was that participants with the power to order their partners around, but comparatively low status, were more likely to issue demeaning orders than those with higher status:
The experiment demonstrated that "individuals in high-power/low-status roles chose more demeaning activities for their partners (e.g., bark like a dog three times) than did those in any other combination of power and status roles." 
According to the study, possessing power in the absence of status may have contributed to the acts committed by U.S. soldiers in the Abu Ghraib prison in Iraq in 2004. That incident was reminiscent of behaviors exhibited during the famous Stanford Prison Experiment with undergraduate students that went awry in the early 1970s. In both cases the guards had power, but they lacked respect and admiration in the eyes of others, and in both cases prisoners were treated in extremely demeaning ways. 
Fast said that he and his colleagues focused on the relationship between power and status because "although a lot of work has looked at these two aspects of hierarchy, it has typically looked at the isolated effects of either power or status, not both. We wanted to understand how those two aspects of hierarchy interact. We predicted that when people have a role that gives them power but lacks status — and the respect that comes with that status — then it can lead to demeaning behaviors. Put simply, it feels bad to be in a low-status position and the power that goes with that role gives them a way to take action on those negative feelings."
This reminds me of work done by social psychologist Roy Baumeister on the subject of self-esteem. He wanted to find out whether it's really low self-esteem that encourages people to bully each other, as the prevailing story went in the 1980s. The goal was to discover, as he put it, the relationship between self-esteem and "violence and oppressive actions that so often are tangential or even contrary to the rational pursuit of material self-interest." What he and his colleagues discovered was that in actuality, high self-esteem can cause this kind of violence-- if it is coupled with an artificially high sense of one's own status. When that impression of high status is challenged, a person's ego is threatened, and aggression against the challenger can be the result:
Our main argument . . . does not depict self-esteem as an independent and direct cause of violence. Rather, we propose that the major cause of violence is high self-esteem combined with an ego threat. When favorable views about oneself are questioned, contradicted, impugned, mocked, challenged, or otherwise put in jeopardy, people may aggress. In particular, they will aggress against the source of the threat. 
In this view, then, aggression emerges from a particular discrepancy between two views of self: a favorable self-appraisal and an external appraisal that is much less favorable. That is, people turn aggressive when they receive feedback that contradicts their favorable views of themselves and implies that they should adopt less favorable views. More to the point, it is mainly the people who refuse to lower their self-appraisals who become violent.  
One might surmise that people are not exactly keen to lower their self-appraisals. So it's natural to expect that many would attempt to maintain a favorable image of themselves by discouraging expressions that disagree with that view. This can be done by punishing those who have expressed such disagreement, or by instilling sufficient fear in them that they are unwilling to do so. Thus is a petty tyrant made: people with power but not much status, attempting to make up for the difference by exerting that power to punish all who oppose them! Or who say anything remotely negative about them. Or who fail to exercise the proper deference to their authority. Or who look at them funny.

Ken at Popehat used the Stanford business-school experiment yesterday to describe TSA workers who abuse their authority, and the unwillingness of so many to do or even say anything about it. Under the headline "Today's TSA: Even Petty Power Corrupts. Perhaps ESPECIALLY Petty Power," he writes:
TSA agents are poorly paid, work in nasty conditions, and have little status. Yet they have, within their petty fiefdoms, tremendous power to humiliate and demean. And God, do they ever use it. 
The fact that this is a recognized psychological phenomenon explains, but does not excuse, any more than it excuses police abuse and bureaucratic indifference. Nor does it excuse the leaders of the TSA and the Department of Homeland security, who have decreed a feckless facade of security theater that is calculated to lead to this result, all in the name of promoting unquestioning compliance.
How does one stop a petty tyrant, or prevent one from being created? Two ways, that I can see:
  1. Don't give them power, thus cancelling the "tyrant," or 
  2. Give them recognizable status to be respected, thus cancelling the "petty."
This is of course a chicken-and-egg problem. When people recognize appalling abuse of power exercised on a regular basis by those to whom it has been allocated, they will abandon recognition of any status for the group to which the abusers belong. When people who have been given this power observe that they are not being accorded status, they have a motivation to abuse it. The only way out is for those who have authority over the potential petty tyrants to both keep a tight rein on the means by which they may exert power and ensure that such power is only wielded for just causes. Crack down hard on the former student hall monitors who can be observed to be in it for the ability to wield power: don't hire them if possible, don't give them the opportunity to be abusive while employed, and fire them if they're caught doing so. When this procedure is not followed, it is to all of our detriment.

Nor does it help to simply instruct people to give respect-- the most that can be elicited is a grudging, fearful, and at most temporary silence. As soon as people are out of earshot and/or under cover of anonymity, the doubt and mistrust will return, now exacerbated. The reasonable person must instead be presented, consistently, with the impression that the power being asserted over him/her is appropriate, effectively used, and not open to abuse as punishment for lack of respect for the person wielding it. There is no helping the fact that some people will disdain anyone presuming to exert power over them, no matter what. But it remains the responsibility of those doing so to be consistent and fair, not personal, regardless. That is how respect for status is earned and maintained, and petty tyranny avoided.

Sunday, October 2, 2011

"Normal" is overrated

"We're all pretty bizarre. Some of us are just better at hiding it, that's all." -- Andrew, The Breakfast Club

The Pervocracy has a great post on the end of "normal" relationships in general, but I particularly liked the part about gender norms:
you don't have to be non-heterosexual to question what gender means to your relationship.  If "which one of y'all does the dishes?" is a stupid question to ask a gay couple, it ought to be an equally stupid assumption to make about a straight one.  The fact that assigned gender roles are available for a straight couple doesn't mean they ought to take them on without question. 
What kind of relationship you have is your choice, and one choice isn't better than another.  What's important is that you make a choice.  That even if you're monogamous, vanilla, and heterosexual--you're doing it because it's what you want and because you and your partner have agreed to it, not because that's what people do.  What's important isn't what path you take, but that you know there are paths. 
Paths?  Fuck, there's an entire open world out there once you get past "man buys dinner, woman agrees to missionary PIV until he ejaculates."  (Or rather, a world including "man buys dinner, woman agrees to missionary PIV until he ejaculates," because, hey, if that's your thing.)  There's a million goddamn ways to love, a billion things "partner" or "lover" or "fuckbuddy" or "spouse" can mean to you, and you get to decide.
How fucking cool is that?
Very. The emphasis in this post is on what it calls "consciousamory"-- the idea that no orientation or lifestyle is necessarily superior or more evolved, but that what matters is that people are aware there are options and feel free to choose from among them. There's a problem when consenting adults don't feel that-- when pressure from the outside, or from one partner only, determines the nature of their relationship rather than an agreement between them.

People write in to Savage Love all of the time asking whether the particular conditions of their relationship or the things that turn them on are normal, and every time the answer is the same: who gives a damn about "normal"?  If you like it, and your partner(s) like it, then it's right for you. If you're aware of the possibilities, have a preference, and all involved parties want it, that's what's important.  That means that your relationship, whatever it might be, is freely chosen.

Pervocracy goes on to point out that if everyone is aware that they don't have to be normal, people who engage in polyamory or any other non-standard relationship style are no longer abnormal. They're just different....in a perfectly normal way.

Coulda been, shoulda been, never woulda been

Apparently October 9th is National Pro Life Cupcake Day. Did you know? It's a day when pastries become political...poor pastries. Pressed into service on behalf of a highly controversial issue which has, so far as I can tell, no direct connection to wax paper wrappers and frosting. But, one might ask, how is this joyous holiday celebrated? Well...
Here’s how we celebrate: once a year, on October 9th, we would bake as many birthday cupcakes as humanly possible and hand them out for free wherever we can.  When people asked whose birthday it is, we tell them these cupcakes are for celebrating the birthdays of every person who never gets to have a birthday.  People respond in all ways – from refusing the cupcake, to sharing about abortions they’ve had in the past and the regret they carry, to just wanting to know more.
Amanda Marcotte offers up some lovely snark in response:
But really, they're selling the whole "never will get a birthday" thing short!  After all, there are many, many, many more potential people that never come into existence than just those who may have been but for an abortion. After all, there are children you never had because you use contraception (to be fair, anti-choice activists are also against that).  But there are also children you didn't have because you didn't have sex in the first place.  Not fucking is clearly murder in these cases. Every time you're ovulating and you elect to go to bed alone, you have deprived someone of a birthday!  So women like Lila Rose and Jill Stanek, who claim that contraception is a sin and therefore expect us to believe they simply use abstinence to keep from having babies, are also horrible deprivers-of-birthdays with all that abstaining. Stanek is in her 50s and has only one son, I do believe, meaning she's deprived approximately 400 children of their chance to have a birthday. That's a lot of cupcakes!
All I can think of is this quote from Richard Dawkins' book Unweaving the Rainbow:
We are going to die, and that makes us the lucky ones. Most people are never going to die because they are never going to be born. The potential people who could have been here in my place but who will in fact never see the light of day outnumber the sand grains of Arabia. Certainly those unborn ghosts include greater poets than Keats, scientists greater than Newton. We know this because the set of possible people allowed by our DNA so massively exceeds the set of actual people. In the teeth of these stupefying odds it is you and I, in our ordinariness, that are here.
 Have a cupcake.

Saturday, October 1, 2011

Is empathy enough?

David Brooks has an interesting essay in the New York Times called The Limits of Empathy. In it he discusses the wealth of research published lately on how empathy works as a psychological response, and makes a case that it can't and shouldn't be considered the true foundation for morality. This is because the reaction of empathy doesn't always kick in when it ideally should, to the extent that it should:
Empathy orients you toward moral action, but it doesn’t seem to help much when that action comes at a personal cost. You may feel a pang for the homeless guy on the other side of the street, but the odds are that you are not going to cross the street to give him a dollar. 
There have been piles of studies investigating the link between empathy and moral action. Different scholars come to different conclusions, but, in a recent paper, Jesse Prinz, a philosopher at City University of New York, summarized the research this way: “These studies suggest that empathy is not a major player when it comes to moral motivation. Its contribution is negligible in children, modest in adults, and nonexistent when costs are significant.” Other scholars have called empathy a “fragile flower,” easily crushed by self-concern.
And when it does kick in, it is shockingly biased:
Moreover, Prinz argues, empathy often leads people astray. It influences people to care more about cute victims than ugly victims. It leads to nepotism. It subverts justice; juries give lighter sentences to defendants that show sadness. It leads us to react to shocking incidents, like a hurricane, but not longstanding conditions, like global hunger or preventable diseases.
All of this is true. Our sense of affective empathy (empathy as an emotional reaction) is most easily provoked when we're confronted with the suffering of people who are like us and familiar to us. That group includes family most immediately, but can extend to members of virtually any group who are better known and more like us than those who are not. Neighbors over non-neighbors. People who go to the same church over those who don't, or don't go to church at all. People of the same color over another race; people from the same town/state/country before foreigners. Bros before hos*. Preferential empathy isn't antipathy, it's important to note...but it can turn into it, given that allegiances with some people tend to create enemies out of the others.

Still, I find that a kind of odd criticism of empathy-- that it isn't all-encompassing, therefore it can't be a good moral foundation. Psychologist Simon Baron-Cohen, best known for his work on autism, has written that having a deficient theory of mind (the term for our capacity to recognize and understand the thoughts and goals of others) makes it harder for people with autism to experience affective empathy. But that certainly doesn't make them into psychopaths. Instead, it can lead to the creation of a more explicit, removed form of empathy-- one based on broad notions of justice rather than on being moved by the suffering of someone specific. I find it entirely fitting to use "empathy" as a term for this, because the belief that it's wrong to punish or reward people unequally for the same acts (for example) requires a sense of fairness, and a sense of fairness comes out of an ability to put oneself in the place of someone who is treated unfairly. This is called the simulation theory of empathy-- understanding what a person is thinking and feeling by approximating their situation as best you can, drawing on your own experiences. When your theory of mind is just that-- a theory-- this is how empathy works for you: cognitively, rather than as an intuitive response. This way of thinking might have the advantage of provoking people toward a consistent theory of justice, one which isn't as subject to the biases discussed above.

Brooks concludes:
Think of anybody you admire. They probably have some talent for fellow-feeling, but it is overshadowed by their sense of obligation to some religious, military, social or philosophic code. They would feel a sense of shame or guilt if they didn’t live up to the code. The code tells them when they deserve public admiration or dishonor. The code helps them evaluate other people’s feelings, not just share them.
This is absolutely true. But from what I can tell, that still is empathy, if it starts with a consideration of how others must feel and think. We all build our own codes-- from scratch possibly, but for the vast majority of us something more like an amalgamation of those developed by people who came before us, cobbled together and modified as we've seen fit. If that codification is centered around being fair and not causing suffering, then it seems right to call it empathy-based.

*If ever an expression merited an immediate karmic punishment from the universe....