Daydreaming—my brain didn’t get that module

Until I read an article in Scientific American Mind this month about daydreaming (“Living in a Dream World” by Josie Glausiusz; not available for free as far as I can find), I wasn’t aware that I lack this mental activity. Definitions vary; one used in the article is that daydreams are “an inner world where we can rehearse the future and imagine new adventures without risk”. Another is “imagining situations in the future that are largely positive in tone”. I would add something to differentiate daydreaming from planning—perhaps that daydreaming includes emotional reactions.

It’s not clear in the article whether the research results apply only to positive fictional imaginings or to routine planning and review as well; the latter is much more common. The author also conflates daydreaming with the mind’s use of off-task time to solve problems non-consciously.

People’s daydreams can be extravagant adventures à la Walter Mitty, or more mundane imaginings of how good that hot bath will feel after work, or how happy one’s child will be when she receives her Christmas present. Most people, the author says, “spend about 30 percent of their waking hours spacing out, drifting off, lost in thought, woolgathering, in a brown study, or building castles in the air.” And daydreaming is important to our sense of self, our creativity, “and how we integrate the outside world into our inner lives”.

I remember, as a solitary child, pretending to be Superman or Tarzan, but not often; I read, instead. After the age of 10 or 12, I don’t think I had imaginary adventures at all. Not surprisingly, I’m also unable to visualize scenes: “Imagine yourself on a tropical beach” is impossible for me to do. I can think, okay, I’m on a tropical beach, it is warm and sunny, and so on, but there’s no sensory aspect to it, just words. Similarly, my memories of the past (mostly gone now due to fibromyalgia cognitive damage) are all just words, as if someone had described a scene to me rather than my having experienced it. There’s no “mind’s eye” in my mind. In novels I usually tune out while reading the descriptions of landscapes and people; no corresponding mental pictures rise in my mind.

Daydreaming can be escapism but it can also be a way of trying out different futures, and experiencing the associated emotions. I think this could also help motivate a person toward a chosen or hoped-for future, by allowing advance tastes of its rewards or of the misery of its alternative. I make decisions about future choices and I make plans but I don’t try them out mentally in advance, and I also (in jobs, for example) tend to stay where I am rather than striving for something different. I’ve thought of myself as lacking in ambition, but maybe it’s more that I don’t have a way of modelling the future choices with emotional content. Mostly I’ve stayed in jobs until they became intolerable, then moved on, sometimes with no replacement in mind. I can’t even really visualize ideas for a vacation or a trip, especially to someplace I’ve never been.

So, what do I do with that 30% of my waking hours that other people use for daydreaming? Not enough. Sometimes, for a couple of minutes, it seems nothing is going on in my mind, or merely observation, without commentary, of what’s happening around me; I have no idea how typical that is. But mostly the engine’s running, chewing over what’s in front of it. Why are things this way, how could this activity be done better, how does this work, that sort of thing. I used to do a lot of sequential thinking, as if working through thoughts with pen and paper, exploring ideas and putting things together, taking them apart, finding correlations and causes. I could continue working on different mental projects during intervals across days and days, and sometimes wrote that way—at the end of the mental work I’d have an outline and some exact wording to put down on paper. Then I’d revise and expand, but I could work out a lot of it mentally and recall it. No more, since fibromyalgia. Thinking is often slow and I can’t remember from one day to the next what I came up with. Sometimes thoughts flit through and are gone before I can even try to remember them. This is one of FM’s major losses, for me, both a loss of pleasure and a loss of what I can accomplish.

Maybe reading fills the role of daydreaming for me. I read a lot, about equal amounts fiction and non-, and if circumstances prevent me from reading for a couple of days I feel the deprivation. The article mentions non-daydreamers only in passing: “Cognitive psychologists are now also examining how brain disease may impair our ability to meander mentally”. If my impairment is due to a brain disease, it’s one I’ve had since early on.

Others, it turns out, suffer from the opposite disorder, daydreaming that is a compulsion or simply so enjoyable that real life takes a back seat. Some have a second life in an alternate world where continuing characters age just like people in the world the rest of us live in. They may fit this narrative into available mental down-time in their lives, or spend up to 90% of their time “away”.

I find it strange that it took me so long to discover that other people spend a third of their waking hours on a mental activity which I lack entirely. It goes to show how little exchange there is among us humans regarding how we think, how our individual minds work. Humans yo-yo between xenophobia—members of other groups are different, dangerous—and “we’re all really just alike”, but a study of psychological research found “significant psychological and behavioral differences between what the researchers call Western, Educated, Industrialized, Rich and Democratic (WEIRD) societies and their non-WEIRD counterparts across a spectrum of key areas, including visual perception, fairness, spatial and moral reasoning, memory and conformity.” Maybe in daydreaming as well. But nearly all psychological research is done on WEIRD subjects, for both practical and ethnocentric reasons, so who knows? Same for neuroscience; who’s going to airlift fMRI equipment to the lands of the Yanomamo and then persuade them to lie down with their heads inside?

[Image: scythe sharpening, Leighton]

Still, the article raised in my mind some questions we could look at right here in the post-industrial West. If people were prevented from daydreaming by some technological device probably not yet invented, how would they feel? (Recalling the familiar ‘fact’ about deprivation of night-time dreaming making people hallucinate, I looked to see if it was true, and apparently it is not.) What proportion of people don’t have imaginative daydreams, and is this always a sign of brain disease or dysfunction, or just a normal mental variation? We characterize one sort of excessive negative daydreaming as “catastrophizing”; what about individuals making deliberate use of negative or positive futures to influence their behavior? And how can “daydreaming” be more precisely separated out from other mental processes such as planning, brooding, brainstorming, and worrying?

Muse on it all, and see what your daydreaming mind comes up with.

[Image: King vulture]

King vulture photo by Tambako the Jaguar, flickr.

Christine O’Donnell, religion, and the human brain

Poor would-be senator Christine O’Donnell has been ridiculed for her comment about mice with human brains:

O’DONNELL: … these groups admitted that the report that said, “Hey, yay, we cloned a monkey. Now we’re using this to start cloning humans.” We have to keep…

O’REILLY: Let them admit anything they want. But they won’t do that here in the United States unless all craziness is going on.

O’DONNELL: They are — they are doing that here in the United States. American scientific companies are cross-breeding humans and animals and coming up with mice with fully functioning human brains. So they’re already into this experiment.

From transcript of O’Reilly show, Friday, November 16, 2007.

Why would Ms. O’Donnell (or someone who informed her) believe this?

Reports of mouse-brain research have been greatly exaggerated

It doesn’t take much to find some of the “evidence” that may have convinced her or her informant. As others have noted, there have been experiments in which human cells were injected into embryonic mice and became part of their brains. That is a bit different from “cross-breeding humans and animals and coming up with mice with fully functioning human brains”, but all rumors have to start somewhere.

Bad reporting may be to blame: here’s the headline and first line of the 2005 article on the National Geographic site:

[Image: screenshot of the National Geographic article on mice]

From nationalgeographic.com.

In case that last line is too small to read, it says “Researchers in California have created living mice with functioning human stem cells in their brains.”

Earlier that same year (2005), another article on the NatGeo site briefly referred to the same research (before it had occurred) this way: “And at Stanford University in California an experiment might be done later this year to create mice with human brains.” The title of this misleading article was Animal-Human Hybrids Spark Controversy. Yes, plenty of controversy, but the article describes no actual hybridization, only the use of stem cells to demonstrate their potential to be re-purposed. In biology, a hybrid is the offspring of two plants or animals of different species or varieties, such as a mule (a hybrid of a donkey and a horse), and that is the popular understanding as well. Few would consider a mouse with a few cells of human origin, all functioning as mouse cells, to be a hybrid.

[Image: mouse with a human head]

Christine, you need a smart friend; meet Clyven the mouse

But wait, it’s not all down to irresponsible journalism; perhaps Ms. O’Donnell got her information from this page, on the site of the prestigious RYT Hospital, about “Clyven: The World’s First Transgenic Mouse with Human Intelligence”:

[Image: RYT Hospital page about Clyven]

Here’s the explanatory text from that page.

Margaret A. Keyes, M.D., Ph.D., is a researcher in genetic medicine and Professor of Cell Biology and Genetics at RYT Hospital-Dwayne Medical Center. She is exploring the use of embryonic stem cells as a means to cure neurological conditions such as Alzheimer’s Disease and Creutzfeldt-Jakob Disease.

By implanting human brain cells (grown from a human embryo’s stem cells) into a mouse engineered to have Alzheimer’s, Dr. Keyes inadvertently made a remarkable and startling discovery: she not only cured the mouse’s Alzheimer’s Disease, but the animal soon developed the relative intelligence of a human being.

After extensive consideration by RYT Hospital’s Institutional Review Board (IRB) and Institutional Animal Care and Use Committee (IACUC), it was decided that this mouse would be placed under a new study led by Dr. Keyes’ lab.

As part of this new facility (pictured to the right), Clyven is placed inside a light-sensitive box with electrodes attached to him. He is able to communicate and interact with people via this neuro interface. He is able to communicate and interact with people via this neuro interface. [sic] To hear a live audio feed of Clyven inside his box, please click here.

© RYT Hospital-Dwayne Medical Center
Source.

Evidently RYT Hospital has a very understanding Ethics Committee; they approve of confining an animal with a human brain in a little box and punching his head full of electrodes. But Clyven is not bored—in fact, you are invited to chat with him or play a game against him.

[Image: “Chat with Clyven” screenshot]

Unfortunately, Clyven’s conversation is limited and repetitious—maybe he himself wrote that quoted passage above, the one with the repeated sentence. My best conversation with him was this one:

ME: Hi Clyven.
MOUSE: Yo.
ME: Do you know that a tea party candidate is talking about you?
MOUSE: That’s pretty silly.
ME: Yes.
MOUSE: Tell me more.
ME: I don’t want you to know how dumb people can get.
MOUSE: I’m sorry. I hope to get smarter as time goes on. Please be patient with me. I will be receiving additional neurological treatments next week.

On the other hand, don’t let him persuade you to bet money on that CheezeMaze contest, he’ll beat you paws down.

It’s surprising we haven’t heard more about this RYT Hospital, with the amazingly advanced and useful research that is being done there. Elsewhere on the site, you can read about a new drug, Revitalex

[Image: Revitalex]

about successful gene therapy for back pain, and about “NanoDocs”, nanobots that circulate throughout the body repairing tissues.

[Image: medical nanites]

Okay, so it’s not a real site but the project of an artist named Virgil Wong. He’s a painter, film-maker, and head of web design for two real hospitals.

Still, can’t you see how anybody might be taken in by the slick style, and accept that there really is a mouse with human intelligence, and nanobots that can tidy up your blood vessels?

No? You say anyone beyond the stage of believing in the Tooth Fairy should have seen through this? and through the distorted reports of growing human brains in mice?

I think so too.

Wherever Christine O’Donnell may have gotten her “information” about mice with human brains, the real problem is minds like hers that are unprepared to question claims most of us would find outlandish. Such minds also believe that Obama is Hitler, Stalin, and a Kenyan anti-colonialist, all at the same time! Which would explain why, as I have heard on good authority, Obama has three heads, a fact cleverly concealed by camera angles and good tailoring.

[Image: Eastern Newt]

Eastern Newt (Notophthalmus viridescens), Red Eft Stage. Etymological note: Notophthalmus from the Greek noto (a mark) and ophthalmus (eye), presumably in reference to the eye spots on the sides and back; viridescens from the Latin (slightly green), referring to the greenish color of the adults. Source.

One born every minute? or are they made?

Where do these credulous people come from? I don’t mean people like Newt Gingrich, who will repeat anything—no matter how preposterous—if it seems advantageous. No, demagogues use untruths consciously, with calculated intent. The power of the demagogue depends upon there being enough people who cannot distinguish between the likely, the possible, and the absurd, and therefore won’t laugh him off his soapbox. And where do they come from?

The beginning preparation for most credulous people of otherwise normal intelligence is, I think, being raised with a huge area of life and thought which is categorically excluded from rational examination. Now, every culture and sub-culture has some areas like that, because they are essential as part of the group’s self-definition. In this Land of the Unquestioned reside things like appropriate behavior (manners), kinship rules, dress codes, what we eat and how we cook it, all that sort of thing. That’s why our way of life seems so logical and natural, and other groups’ ways seem bizarre and senseless.

No problem when it’s a question of the relative merits of haggis or corn on the cob, but in the area of exclusion there are more significant topics also, such as attitudes to the “Other” (women, outsiders, those in your own group who don’t conform), and toward violence. That’s the cultural “Don’t think about these things” list. Then there’s religion and its list.

Religion is the really big no-fly zone for human reason. It covers a much wider area of life than ordinary cultural indoctrination, often upon a foundation of dogmatic zeal which asserts sole possession of truth, and enforces details of the dogma with extreme fervor.

Totalitarianism and extremist religions share two fundamental principles: there is only one true way, and everyone must be forced to acknowledge it. It is not enough for the non-believer to refrain from critical expression and deviant action: he or she must be made to believe. Hence the show trials held by the Soviets, the Chinese Cultural Revolution, and the Inquisition, in which tortured inmates confess their nonexistent sins; hence the death penalty for apostasy in Islam, and the roasting alive of unrepentant Christians by the Romans and doggedly heathen Native Americans by the Christians. The Other must be brought within the fold or die, and it should be done in a public and painful way to present a compelling example to everyone else.

Children are born enquirers (non-believers), and about the age of three they start to ask “Why?” about everything, with irritating persistence. Give an answer and they ask for more details or ask “Why?” again. (Offer a non-answer like “I don’t know” or “Be quiet” and they repeat the original question or say nothing; curiosity discouraged begins to shut down.) Their brains are making and pruning connexions, they’re constructing an internal model of the world, and they want and need to know more and to discuss their own thoughts. They are also learning how to learn, how to figure things out.

A child who gets yelled at for asking about talking snakes, or smacked for asking why the God of Love is such a bloody-handed war-approving tyrant in the Old Testament (see note 1), will learn to accept what he or she is told and not think about it. The lesson is to avoid questioning—especially the things in life that seem illogical, cruel, unfair, out of sync with reality. And that “respect for authority” (actually, it is only respect for power and avoidance of punishment) carries over into other parts of life. The more intensely the “No Questions Zone” is defended, the more timid the young mind’s reason becomes.

Curiosity is inborn, but logic is learned. When children are exposed to illogical conclusions, such as “You got a cold right after you ate that ice cream, so no more ice cream” or “I know the Bible is the Word of God because the preacher says so and the Bible says to follow what the preacher says” they won’t learn the basic rules of logic that help humans sort true from false, as well as “probably true” from “probably false”. Ignorance of logic is of course a good thing for those enforcing a monolithic belief system.

Our country’s culture has an equivocal position on learning. Along with its tradition of independence and individualism, the US also has a strong anti-intellectual tradition, because of its religious foundations and the pragmatic demands of survival on successive frontiers from New England to the Pacific coast. When book-larnin’ is seen as irrelevant, perhaps un-masculine, some will make a positive virtue of ignorance. Also, study is hard, ignorance is effortless. Entropy prevails.

Logic and critical thinking are not enough. In order to winnow the wheat from the chaff reliably, it’s necessary to have some actual knowledge. When a statement is made, the hearers check it against their relevant knowledge base. This process is usually instant and automatic. The new information may directly conflict with existing knowledge, or it may just appear quite unlikely based on what is already known. A certain stock of knowledge, reliable because it has been tested or was provided by a trusted authority, is needed to get through life. Yet even some of this knowledge may be false—blondes are dumb, bankers are trustworthy, a barking dog never bites—and individuals must also possess the willingness to re-examine beliefs based on new experience. Except in the No Thinking Zone, where the only safe course is to agree with authority and otherwise keep your mouth shut.

When politics is the subject, then history must have special prominence among relevant areas of knowledge. Just like more workaday fields of endeavor, political systems embody responses to real needs and problems. If I were re-designing the internal combustion engine, I would first need to know why each part had been designed as it was; what earlier mechanisms were tried for mixing the fuel or timing the ignition, and what were their flaws?

It is history which answers these questions in politics, and must be consulted before tinkering or throwing away parts. For example, decades of controversy about the constitutional provision in the First Amendment usually referred to as “separation of church and state” have distorted public understanding of the law’s intent by framing it as a dispute between agnostics or atheists and religious people. In fact it was enacted to defend all religions from government, and from a preference being shown for a single church, as well as to protect government (or non-religious persons) from religion. And the history of state-established religions illustrates the many repressions and disenfranchisements which are imposed upon members of the non-official religions, even including banishment and death. Only modern ignorance permits the discussion of this subject to be framed entirely as a conflict between religion and irreligion. [Christine O’Donnell, in a recent debate, was ignorant of the provision entirely. After the phrase “Government shall make no law respecting establishment of religion” was quoted to her, she asked “That is in the First Amendment?” Yes, it is, though the exact words are “Congress shall make no…”.]

Logic, general knowledge, critical thinking, history: how is the American public doing on these?

37% of Americans believe that houses can be haunted, and 25% believe in astrology, i.e. that the position of the stars and planets can affect people’s lives.

Fewer than a third can identify DNA as a key to heredity, only about 10% know what radiation is, and 20% think the Sun revolves around the Earth, an idea science abandoned by the 17th century.

50% of our fellow citizens believe in alien abductions, though happily only 7% say they or someone they know has been abducted.

39% of Americans could not name any of the freedoms in the First Amendment.

14% of Americans say President Barack Obama may be the Antichrist (24% of Republicans believe this). Almost 20% believe he is a Muslim. Does that add up to 34% or is there some overlap?

Two-thirds of 1,000 American adults polled couldn’t name a single current justice of the Supreme Court. In the same survey, more than a third did not know the century in which the American Revolution took place, and half of respondents believed that either the Civil War, the Emancipation Proclamation or the War of 1812 occurred before the American Revolution.

And 21% believe in witchcraft, so O’Donnell’s “I’m not a witch” ad did have its audience.

When you look through these and other poll results it seems that at least 10% to 25% of Americans believe in just about any unproven concept you can imagine. A larger percentage is very ignorant of history and public affairs.

If you’re reading this, and have been apathetic about getting to the polls, you’d better think again.

One final poll result: in 2009, 19% of Americans agreed that the First Amendment goes too far in the rights it guarantees, and 39% said the press has too much freedom.

[Image: Mr. Natural]

≈≈≈≈≈≈≈≈≈≈≈≈≈≈

NOTE 1: I cite only two examples, both from the same holy book, for the sake of brevity, but every religion seems to have its own set of magical events and unquestioned cruelties which must be accepted in order to belong. Belong, get along, go along.

Let’s be niggardly with the n-word, but…

but…sometimes it can and should be said.

niggardly
stingy, sparing, parsimonious, e.g. “serving out the rations with a niggardly hand”.

from niggard
mid-14c., nygart, of uncertain origin. The suffix suggests French origin (cf. -ard), but the root word is probably related to O.N. hnøggr “stingy,” from P.Gmc. *khnauwjaz (cf. Swed. njugg “close, careful,” Ger. genau “precise, exact”), and to O.E. hneaw “stingy, niggardly,” which did not survive in M.E. [etymonline.com]

nigger
1786, earlier neger (1568, Scottish and northern England dialect), from Fr. nègre, from Sp. negro (see Negro), from Latin nigrum (nominative form niger) “black,” of unknown origin (perhaps from Proto-Indo-European *nekw-t- “night”). From the earliest usage it was “the term that carries with it all the obloquy and contempt and rejection which whites have inflicted on blacks” [cited in Gowers, 1965]. But as black inferiority was at one time a near universal assumption in English-speaking lands, the word in some cases could be used without deliberate insult. More sympathetic writers late 18c. and early 19c. seem to have used black (n.) and, after the American Civil War, colored person. Also applied by English settlers to dark-skinned native peoples in India, Australia, Polynesia. The reclamation of the word as a neutral or positive term in black culture (not universally regarded as a worthwhile enterprise), often with a suggestion of “soul” or “style,” is attested first in the Amer. South, later (1968) in the Northern, urban-based Black Power movement. [“You’re a fool nigger, and the worst day’s work Pa ever did was to buy you,” said Scarlett slowly. … There, she thought, I’ve said ‘nigger’ and Mother wouldn’t like that at all.” (Margaret Mitchell, “Gone With the Wind,” 1936)] [etymonline.com]

Dr. Laura Schlessinger, radio dispenser of advice and moral judgments, is taking considerable heat for her response to a call involving the word nigger. I don’t even know if WordPress will allow me to spell that word out—and that is what I want to talk about. If you would like to see a transcript of the call and Dr. Laura’s remarks immediately after the call, it is here.

I will say that I think Dr. Laura should apologize, but not for saying a bad word on the radio. For whatever reason, she abandoned her professional role and lost the distance and composure essential to that role. Instead of asking elucidating questions and listening to the caller’s answers, she went off on a rant of her own. As Dear Abby and Ann Landers and others have often decreed, when one spouse hears relatives or friends insulting or taunting the other, he or she must let the relatives or friends know that this is not acceptable, and that neither member of the couple will stay to hear such insults and taunts. There are good practical reasons for this, and there is even Biblical justification: “For this reason a man shall leave his father and his mother, and be joined to his wife; and they shall become one flesh.” Genesis 2:24, New American Standard Bible (©1995).

When a black woman married to a white man complained that the word nigger was being used, the therapist-of-the-air should have asked: “How is it being used? Give us an example.” Probably the example will not be a disquisition having to do with Mark Twain’s use of the word in Huckleberry Finn, or the word’s etymology, or a quotation from a black comic using the word. Most likely the remarks are of this nature: “You know niggers, they always/never…” or “Some nigger robbed the convenience store over by where I work…”

These uses are insulting, hostile, and demeaning, like all the other dehumanizing terms used to set some group apart from the rest of us. English has terms like that for Arabs, Baptists, Catholics, Jews, fat people, smart people, stupid people, white people, Hispanic people, gay people, men, women, and people whose ancestors were from Ireland, Italy, Poland, Scandinavia, and so on. Other languages have similar terms for the same list, subtracting the words for the speaker’s religion, gender, sexual preference, appearance, and ethnic origin, and adding ones for us Americans. Sometimes there’s intra-group use of such terms; Dr. Laura went off on that tangent but I will not, it’s irrelevant. When outsiders use the term it is almost exclusively insulting and demeaning. Worst of all, these dehumanizing terms are the mental preparation for pogroms, lynchings, beatings, bombings, murders, war, and ethnic cleansing (a shocking euphemism which deserves its own examination, but not here).

No question, nigger is not a word for everyday use by non-blacks, same as the other group-based terms alluded to above.

But when someone wants to excise words from our language, all of us should resist.

When we begin killing words, where will it end?

Some may say, Better words than people! My reply is that manipulating language is the same as manipulating thought, which in turn changes how we act. Dehumanizing the Other is a preparation for war, violence and mistreatment, whether organized or individual. What’s necessary and positive is to continue to educate people not to use these category-based insulting demeaning words to other people. If you want to call someone lazy then do that, but don’t couple it with an insult to the person’s innate or historic self. I can argue with you about whether I am lazy or not, and if convinced that I am, I can choose to change it, but I will always be a black Scandinavian Catholic gay smart woman, if that is what I am.

If we must substitute the ridiculous circumlocution “the N-word” for nigger, then how do we discuss historical documents that used it? How do we read literature that used it? How do we talk about the word itself and its history that renders it sharp as a sword, clanking with manacles, reeking of hatred and suffering?

James Baldwin used nigger, and not just in the vocative sense (e.g., my example, as some use “man”: “Man, you know I’m…”, “Nigger, you know I’m…”). In fact he and Dick Gregory made a serious movie by that title, and it’s the title of Dick Gregory’s 1964 autobiography. Baldwin and Gregory believed that the word and its various meanings needed to be thought about, talked about, by whites and blacks.

Apparently “the N-word” attained popularity during the O.J. Simpson trial, in talking about recorded statements by Mark Fuhrman. Chris Darden, the black prosecutor, tried to save Fuhrman’s credibility by letting everyone know how awful he, Darden, thought the word was: “The prosecutor [Christopher Darden], his voice trembling, added that the ‘N-word’ was so vile that he would not utter it. ‘It’s the filthiest, dirtiest, nastiest word in the English language.’”

But the Bowdlers who gave us first “the F-word” and then “the N-word” won’t stop there. Broadcasters aid and abet them, probably feeling a little frisson of guilty pleasure at being able to allude unmistakably to words the FCC won’t permit them to utter. And some people make up new “[letter]-words” to dramatize their remarks or because they feel victimized. So we have “the B-word”, and words for C, D, E, F, G, H, J, K and on through the alphabet. Some of the words so represented are offensive, others are as inoffensive as “green”.

Let’s let this “N-word” thing stop and fade away. Many times it will be sufficient to say that a person used a “racial slur referring to black people”. If not, then just say the word. “Candidate Joe Smith singled out a black man in the audience for insult, calling him a nigger and a ‘mono’, Spanish for monkey.” Let’s be grown-ups about language and about how we treat others. Hiding behind alphabetical euphemisms makes us sound like giggling 8 year olds, and prevents us from thinking and talking about the issues that euphemisms cover up.

Listening to what people say: no victim “deserves it”

Recently I’ve noticed, in reports of crimes against persons, an abhorrent phrase that seems to be commonly accepted: people being quoted as saying that the victim “didn’t deserve this”. Who does deserve being beaten, raped, or murdered? Ah, but maybe this person did deserve a beating––but was murdered instead. No, too subtle.

Was I imagining it? I googled “didn’t deserve to die”, the strongest usage, and quickly came up with half a dozen different instances.

Then, on the front page of the Oregonian a week or so ago, I saw this one: a driver with a blood alcohol level “approaching .30” ran his car up onto a sidewalk in broad daylight and pinned a pedestrian against a utility pole. As the drunk tried to drive away he hit the pedestrian two more times. Oh yes, and the pedestrian was blind and carrying a white cane. The driver was chased and boxed in by other drivers. Since his arrest, he had been trying to make a good impression: visiting the badly injured man, publicizing his own past volunteer work (performed while he was a bank exec), all that sort of thing. The article reported on his appearance in court for sentencing, definitely an occasion to choose one’s words carefully. What did he say, in his attempt at an apology?

“He didn’t deserve it. It was all my fault.”

Good to know that the blind man didn’t actually deserve being run over three times, we were all wondering about that.

What’s going on here?

According to my unscientific survey, the phrase is used at least as often by the relatives of victims as by those accused of the crime in question. So I conclude that this represents a general societal attitude, which tacitly regards some people as deserving to be harmed or attacked by others.

The connexion that came up in my mind was with a shift in moral education over the past three decades or so, which changed the focus from the person acting to the person being acted upon, and from general principles of interpersonal behavior to principles regarding certain groups. In an effort to end harassment of minorities and those perceived as different, we started teaching children and adults to avoid ridiculing this or that sort of person––overweight or gay, for example. Something needed to be done to end these long-winked-at instances of bullying and cruelty, but how much better to emphasize a universal (and positive, rather than negative) approach of being polite and compassionate. Singling out groups creates assumptions that groups not named may be fair game. “Nobody told me not to call him names, he’s an Italian/left-handed/too skinny/a nerd!”

The general approach is better all around.

Some pragmatic reasons: It’s far easier, with no need to remember whom you’re supposed to be kind to this week. Like deciding that you are going to stop your car whenever a pedestrian is trying to cross, instead of having to make a judgment call on the fly each time. No type of person is accidentally omitted (though of course people who are dangerous, manipulative, etc., can and often must be treated differently). Those are points of persuasion for people not so much moved by moral considerations alone (to me it’s surprising how often there are practical reasons which could be used to bolster the “should/ought” arguments).

Moral arguments include: putting responsibility where it belongs, on the act-or instead of the act-ee; promoting human community rather than division; and generally strengthening a moral rule that makes human interchange run much more smoothly and harmoniously.

Then, from a different angle, there’s Shakespeare’s take on what the just deserts of a human being, that “poor, bare, forked animal”, may be:

[Image: Hamlet on just deserts]