On Collective Guilt

We can’t leave the Holocaust alone. That might be a good thing if we had the courage to view it honestly. We don’t, though. We insist that it’s a puzzle we must keep trying to solve, ostensibly so that we will know where to place blame and, in that way, also know how to ensure that it will never happen again. We refuse, however, to place blame where it really belongs, and so we keep turning it over and over, searching for something we will never find.

Why the Germans? Why the Jews? are the questions Götz Aly takes up in a new book whose title begins with them (Metropolitan Books, 2014). Aly’s theory, not particularly novel, is that the social and economic advances made possible for Jews in Germany by a series of legal reforms in the various German states in the eighteenth and nineteenth centuries made them objects of envy. “Not all Nazi voters,” acknowledges Christopher R. Browning in a review of Aly’s book, “were anti-Semitic, but they at least tolerated Nazi anti-Semitism” (“How Envy of Jews Lay Behind It,” The New York Review of Books, January 8, 2015).

“But how to explain,” Browning continues, “this ‘moral insensibility’ and ‘moral torpor’ of 1933-1944, which underpinned the ‘criminal collaboration’ between the German people and the Nazi regime?” The answer Aly first offered, in Hitler’s Beneficiaries (Metropolitan Books, 2005), was material gain. His new work supplements the motive of material gain with a “new morality” involving race theory that would justify such collaboration.

Many Germans remained unconvinced, however, by the new race theory. Many Germans were, in fact, untroubled by the legal reforms that had made possible the flowering of the Jewish middle class. Many Germans had even championed these reforms.

What happened to those people?

The journalist Ruth Andreas-Friedrich, who lived in Berlin during the war, gives us some insight into what happened to them in the diary she kept from 1938 to 1945. Initially, at least, they were not helping the Nazis. Her entry for November 10, 1938, the day after the infamous Kristallnacht, gives moving testament to that fact. At half past nine in the morning Andreas-Friedrich took a bus to her office. “The bus conductor looks at me,” she writes,

as if he had something important to say, but then just shakes his head, and looks away guiltily. My fellow passengers don’t look up at all. Everyone’s expression seems somehow to be asking forgiveness. The Kurfürstendamm is a sea of broken glass. At the corner of Fasanenstraße people are gathering–a mute mass looking in dismay at the synagogue, whose dome is hidden in a cloud of smoke.

            ‘A damn shame!’ a man beside me whispers … [W]e all feel that we are brothers as we sit here in the bus ready to die of shame. Brothers in shame; comrades in humiliation (Berlin Underground 1938-1945 [Paragon House, 1989]).

When she gets to the office, her editor, who, she observes, was “rumored to have a tinge of Nazism,” says “one doesn’t dare look people in the eye anymore” (p. 21).

“They’ve dragged them all away–all the Jewish men they could get hold of,” begins her entry for the next day.

Only those who were warned in time have escaped the raid. Thank Heavens, a good many were warned. Hundreds managed to disappear at the houses of friends; hundreds sought shelter with strangers and found it. One little seamstress took in two Jewish fugitives; she didn’t even know their names or where they came from. Workingmen in the Frankfurter Allee brought back to the Jewish shop-owners the merchandise that was scattered over the street. They didn’t say a word, just tugged sheepishly at their caps. The chief surgeon of a hospital is hiding a wounded rabbi in the back room from the bloodhounds of the Gestapo.

            While the SS was raging, innumerable fellow Germans were ready to die of pity and shame (p. 25).

The next line of the translation reads “Almost all our friends have people quartered on them.” If one goes to the original German edition of the diaries, however, the text continues

Women are dashing about the city today with mysterious bundles under their arms, meeting one another on street corners: Shaving articles for Doctor Weißmann. A clean shirt for Fritz Levy, night things for Jochen Cohn. One tries, as much as possible, to look after those in hiding. It isn’t advisable for them to come out of hiding yet. What happened yesterday could continue today (Der Schattenmann [The Shadow Man], Suhrkamp, 2nd ed. 2012, p. 38).

Then comes the line “Almost all our friends have people quartered on them.” There is no ellipsis to indicate material was omitted. One could argue it doesn’t matter because what makes it into the translation makes clear that the general reaction of Berliners to Kristallnacht was one of horror. Still, the omitted material makes even clearer how widespread among gentiles was sympathy for the plight of the Jews.

Interesting, eh? People running about the city collecting the necessary articles for friends, and in some cases even strangers, they’re protecting. Jews being given shelter by countless German gentiles. Workmen returning to Jewish shop-owners merchandise that had been scattered on the street. What happened to those countless Germans who were sympathetic to the plight of the Jews, to those countless “brothers in shame”?

What do you think happened to them? What happens to people who try to help others as it becomes increasingly clear what such assistance might eventually cost them? Some continue despite the danger; some join resistance groups such as “Uncle Emil,” the one with which Andreas-Friedrich became associated; but most do not.

Andreas-Friedrich “looks lovingly” at the man who whispers “A damn shame!” at the sight of the burning synagogue.

“It occurs to me,” she writes, “that this is really the time to call your neighbor ‘brother.’ But I don’t do it. One never does; one just thinks it. And if you really do pluck up the courage for a running start, in the end you just ask, ‘Pardon me, could you tell me the time?’ And then you are instantly ashamed of being such a coward” (p. 19).

Why couldn’t she do it? Why couldn’t she acknowledge to the man that she also condemned what had happened the night before? Why couldn’t any of the people on the bus who were hanging their heads in shame, in silent shame? Why doesn’t one do it?

Years ago I saw a nature program that focused on a litter of wolf cubs. There were three cubs in the den. One emerged, however, days before the other two. He was bold, he was courageous. He was eager to explore the outside world. Ah, I thought to myself, he will be the alpha wolf. He will grow up to be the leader.

One day, though, the brave little cub came home from his explorations with an injured foot. He left again the next day, undaunted by his grisly experience of the day before, but that evening, he did not return. He never returned again. Who knows what had gotten him, but something clearly had.

Several more days passed after the disappearance of the first little cub before the two remaining ones peeked out, trembling, bodies pressed together, from the mouth of the little den. Another day still passed before they had the courage actually to emerge fully from the shelter of their home.

And suddenly I understood why human beings are such a miserable craven lot. Natural selection has ensured that cowardly individuals have a higher survival rate than courageous ones. They live longer, produce more offspring. So it isn’t our fault, really, that we’re such a miserable, craven lot. It’s in our genes.

And yet it is our fault because cowardice isn’t the only thing that’s in our genes. We have somehow also evolved a conscience. We know, as Aristotle expressed it in the Nicomachean Ethics, that there are things we ought rather to “face death” than do (Book III 1). And yet few of us have the courage to face death to do the right thing. Few of us even have the courage to say “brother” to another who affirms the values we purport to hold dear.

Elizabeth Kolbert writes in the February 16th issue of The New Yorker that the Germans “failed miserably” to draw a line between the innocent and the guilty after the war. She writes, in fact, that to say they “failed miserably” would be “generous” (“The Last Trial”). That’s true, of course, though in a different sense, I think, than the one Kolbert meant, because the line, drawn properly, would encircle us all, all except for the few whose willingness to martyr themselves to do the right thing places them not outside the group, but above it.

We are all guilty of the cravenness that paved the way for the Holocaust, the glass through which we keep seeing darkly, which we keep turning over and over in a vain attempt to escape our own reflection. If we had the courage to recognize ourselves in it, then perhaps we could learn from it. But courage, sadly, is precisely what we lack.

(This piece is dedicated to my dear friend and German tutor of many years, Ebba Mørkeberg, 1924-2014. It originally appeared in the Feb 17, 2015 issue of Counterpunch.)

When Bad Things Happen to Good Academics

I wonder sometimes what makes people go bad. There doesn’t seem to be any logic to it. James Gilligan, a forensic psychiatrist who has worked with serial killers, writes that nearly all of them were abused as children. That makes sense to me. I’m inclined to think that people are like other animals: if they get what they need when they’re young, they grow up to be well-adjusted members of their species. We know how to make an animal, a dog for example, vicious: simply mistreat it. My understanding is that that works on pretty much any animal. If it gets what it needs when it’s young, it will turn out to be a fine adult. If it doesn’t, it won’t. It’s that simple.

I like this view, not simply because it’s humane, but also because it’s optimistic. It gives us a formula for wiping out cruelty and intolerance. We just need to work to ensure that people get what they need. We need to make sure that parents don’t have so many financial worries that they cannot be sufficiently attentive to their children, or worse, that they end up taking out their stress on their children. We need to make sure that every person, every job, is accorded respect, that people are treated with dignity, etc., etc., and eventually cruelty and inhumanity will become things of the past. That’s a tall order, of course, and perhaps it’s idealistic, but it’s something to aim at anyway. There was a time when people said things such as poverty and hunger could never be wiped out. But we’ve made great strides in eliminating them, and have even eliminated them completely in parts of the world. It’s widely believed now to be a question of will, not of practical possibility. If we want to eliminate poverty and hunger, we can.

I like to think that the same thing is true of cravenness and cruelty (meaning that they can be wiped out if we have the will to do so), and generally, I do believe it. But sometimes I’m confronted with examples of what seems to be completely gratuitous and inexplicable viciousness from people whose lives, to all outward appearances anyway, would seem to be pretty cushy, people who give no evidence (no other evidence, anyway) of having been abused as children. The mystery of why some people go bad gives me a certain sympathy with John Calvin and others who believe in predestination, or the view that some people are just inherently bad. I don’t really believe that, but in my weaker moments, I wonder if it might not be true.

There are just so many variables. Is it not enough to have loving and attentive parents? Can having been picked last for a team in gym class cause a wound that festers for years leading finally to generalized suspicion and paranoia as an adult? Can one slight on the playground explain a vicious and unprovoked attack on a colleague years later?

My mother once said that in her experience, religion made good people better and bad people worse. (Both her parents were ministers in the Assemblies of God church.) The same thing, sadly, seems to be true of academia. I don’t believe there is a better life than that of a tenured academic. Hardly ever in human experience are the stars aligned so perfectly as they are in the lives of tenured academics. Teaching of any sort is fulfilling, but most teaching doesn’t come with the job security and other material benefits routinely accorded the tenured academic. To be paid to teach, not to mention to read and write, well, it’s like winning the lottery.

I had some wonderful teachers when I was in college. This led me to believe that teachers were, in general, not simply wiser and more learned than the average person, but also kinder, more considerate, more understanding and tolerant. This made sense to me because they had what appeared to be wonderful jobs. How could anyone not be happy with such a life, I asked myself, and how could anyone who was happy fail to be anything but nice?

Since then, however, I have learned that two kinds of people enter academia: (1) well adjusted people, people who are basically kind and decent, sympathetic and empathetic, people who love to read and sometimes (though not always) also to write, people who like people in general and like to think that in their own small way they are doing something to better the human condition, and (2) maladjusted people who like to use their learning as a club with which they can intimidate others, people who suffer from varying degrees of paranoia, people possessed of a messianic zeal to single-handedly save humanity from what in their fevered imaginations they believe to be the ravages inflicted on it by the forces of evil they take to be embodied in the form of despised colleagues, people who spend more time plotting to undermine and even publicly humiliate these colleagues than they spend on teaching.

There is almost no way to check the damage the latter sort of academic can cause once he or she becomes tenured. They sit plotting and poisoning the air in their departments until they retire, and they do not generally retire until very late in life because they thrive on conflict, a kind of conflict that it is hard to find outside a professional context. When, as sometimes happens, I’m confronted with the spectacle of the damage such people can do, the havoc they can wreak in an otherwise harmonious community of scholars, the pain they can cause to colleagues for whom they have conceived a pathological dislike, I have a certain sympathy with the anti-academic element in our vociferously anti-intellectual society. Academics are not really the plague that they are increasingly represented as being, but there is, lamentably, a sizable contingent that gives the rest of us a bad name.

All Over America the Lamps are Going Out

These are bad times. I thought of James Agee’s beautiful and heartrending work Let Us Now Praise Famous Men when I heard the verdict in the Zimmerman case. There’s an account, very near the beginning of the book, of Agee’s and Walker Evans’s encounter with a young black couple that made me think, when I first read it, how far we had come from those dark days. Agee and Evans had found a church they wanted to photograph. The church was in a relatively deserted wooded area and was locked. As the two men were wondering whether to force their way in, a young black couple came walking by. The couple, Agee writes,

[w]ithout appearing to look either longer or less long, or with more or less interest, than a white man might care for, and without altering their pace, … made thorough observation of us, of the car, and of the tripod and camera. We spoke and nodded, smiling as if casually; they spoke and nodded gravely, as they passed, and glanced back once, not secretly, nor long, nor in amusement. (p. 36.)

Agee decides to go after the couple to ask them if they know where to find a minister or someone else who could let them into the church. Agee, being Agee, trails behind them at first, simply observing them, “taking pleasure… in the competence and rhythm of their walking in the sun, … and in the beauty in the sunlight on their clothes.” They are obviously courting, both dressed in their Sunday best: he in “dark trousers, black dress shoes, a new-laundered white shirt with lights of bluing in it, and a light yellow soft straw hat,” she in “a flowered pink cotton dress” and “freshly whited pumps.”

“I was walking more rapidly than they,” explains Agee, “but quietly.” Still, before he had gone far, the couple, as if they could sense his presence, turned back and looked at him “briefly and impersonally, like horses in a field.” Agee waved at them, but they’d already turned away again. He began to walk faster, but was impatient to catch up to them, so he “broke into a trot. At the sound of the twist of my shoe in the gravel,” writes Agee

the young woman’s whole body was jerked down tight as a fist into a crouch from which immediately, the rear foot skidding in the loose stone so that she nearly fell, like a kicked cow scrambling out of a creek, eyes crazy, chin stretched tight, she sprang forward into the first motions of a running not human but that of a suddenly terrified wild animal. In this same instant the young man froze, the emblems of sense in his wild face wide open toward me, his right hand stiff toward the girl who, after a few strides, her consciousness overtaking her reflex, shambled to a stop and stood, not straight but sick, as if hung from a hook in the spine of the will not to fall for weakness, while he hurried to her and put his hand on her flowered shoulder and, inclining his head forward and sidewise as if listening, spoke with her, and they lifted, and watched me while, shaking my head, and raising my hand palm outward, I came up to them (not trotting) and stopped a yard short of where they, closely, not touching now, stood, and said, still shaking my head (No; no; oh, Jesus, no, no, no!) and looking into their eyes; at the man, who was not knowing what to do, and at the girl, whose eyes were lined with tears, and who was trying so hard to subdue the shaking in her breath, and whose heart I could feel, though not hear, blasting as if it were my whole body, and I trying in some fool way to keep it somehow relatively light, because I could not bear that they should receive from me any added reflection of the shattering of their grace and dignity, and of the nakedness and depth and meaning of their fear, … [said] ‘I’m very sorry! I’m very sorry if I scared you! I didn’t mean to scare you at all. I wouldn’t have done any such thing for anything.’ They just kept looking at me. There was no more for them to say than for me. …. After a little the man got back his voice, his eyes grew a little easier, and he said without conviction that that was all right and that I hadn’t scared her. 
She shook her head slowly, her eyes on me; she did not yet trust her voice. Their faces were secret, soft, utterly without trust of me, and utterly without understanding; and they had to stand here now and hear what I was saying, because in that country no negro safely walks away from a white man, or even appears not to listen while he is talking. … I …  asked what I had followed them to ask; they said the thing it is usually safest for negroes to say, that they did not know; I thanked them very much, and … again, … I said I was awfully sorry if I had bothered them; but they only retreated still more profoundly behind their faces, their eyes watching mine as if awaiting any sudden move they must ward, and the young man said again that that was all right, and I nodded, and turned away from them, and walked down the road without looking back. (pp. 37-39.)

I remember when I read this passage the horror that came over me to think that anyone would ever have to live with such constant fear. That couple had been frightened, even if only briefly, for their lives.

I knew what it was like to be pursued. I was one of the very few white children at my school for most of my childhood, and though the black children who knew me were almost always kind to me, the ones who didn’t know me, the ones I might encounter at recess or walking to or from school, were not. I’d been chased before and been called names and had things thrown at me. I once had a glass bottle thrown at me. It shattered just in front of me, so that I could feel the force of the tiny fragments against my shins. I’d learned very early to keep walking, no matter what was going on behind or in front of me. I’d learned somehow, by instinct, I think, not to display fear. Of course I couldn’t ignore people either. I had to acknowledge them, but I couldn’t appear to be afraid. I don’t know why, exactly, that worked, but it did, and I knew somehow, even as a child, that it would.

So I identified with that couple. I knew what it was like to affect nonchalance when you are really very afraid. I knew the intricacies of the subtle etiquette of self-defense and how it kicks in automatically at such times. But still, I had never been afraid for my life.

There are not words to describe what it must be like to live that way, to live with an ever-present fear for one’s very life. I remember when I read that passage I thought to myself, thank God, thank God black people do not have to live like that anymore.

These are bad times.

(This piece was originally published in Counterpunch, 24 July 2013.)

On Death and Dying

One of the most frightening things, I think, about dying is that we do it alone. Of all the natural evils for which one would like to blame the creator, this seems one of the worst. It would have been so much better, wouldn’t it, if we left this life in groups, left perhaps with the people we came in with, with the children we remember from our earliest days in school, and perhaps also with the people we have come to love, if they are suitably close to us in age. If we could go in groups, as if on a field trip, it would be easier.

But we go alone; even those unfortunates who die in accidents that take many lives die effectively alone, because they don’t have time, really, to appreciate their fates as shared. They say the people who remained on the Titanic sang as the ship went down. That’s what I’m talking about. It would be so much better, so much easier to bear, if we were assigned a time along with many others. We could begin to gather a little before that time, all of us who were assigned to leave together; we could begin to gather and prepare ourselves and share with one another the joys and sorrows of our lives. If we did that, I think we would realize that our lives had really all been variations on the same theme, that we were not so different from one another as we had thought.

I’m not certain if I believe in life after death, even though I am very religious. I’m not certain what it would be for. I doubt I will be ready to leave this life when my time comes. I think I’d like to live much longer than I know I will, say three or four hundred years. I think I’d eventually get tired of living though, so the prospect of living forever is not all that appealing.

It seems to me, however, that if there is life after death, then that place where we will all go (and I believe we will all go to the same place, because I am a universalist), wherever it is, is a place at which we will all actually arrive together. Even though each of us dies individually, alone, if we go anywhere, it is to eternity, and since there is no temporal change in eternity, there cannot be any arriving earlier or later. Where we will go will be where everyone will go at the same time, or where everyone, in a sense, already is. There will be no waiting for the loved ones who die after us. They will be there waiting for us, so to speak, when we arrive, even if they are in the bloom of youth when we leave.

When I think about death, which I do more and more as I get older, I wonder if perhaps part of the point of it, of the horrible specter of that trip one must take alone, is precisely to make us understand that we never really are alone. And by that I don’t mean simply that God is always with us, although I do mean that also. I mean that we are all part of the whole of humanity, that we are connected to everyone and, indeed, to every living thing.

There is a poem I love by Molly Holden that conveys this sense of connectedness very well. It’s called “Photograph of Haymaker, 1890.” It goes like this:

It is not so much the image of the man
that’s moving — he pausing from his work
to whet his scythe, trousers tied
below the knee, white shirt lit by
another summer’s sun, another century’s —

as the sight of the grasses beyond
his last laid swathe, so living yet
upon the moment previous to death;
for as the man stooping straightened up
and bent again they died before his blade.

Sweet hay and gone some seventy years ago
and yet they stand before me in the sun,

That’s not the whole of the poem. I left out the last couple of lines for fear of violating copyright. You can read the whole of it, though, if you go to Poetry magazine. Of course the poem is about the haymaker in that it’s about mortality, which is inseparable, I think, from temporality. Time passes, people pass, as they say. The haymaker will pass, just as will the grasses he’s cutting down in the vigor of his manhood. And he is gone now, of course, the man who was young and vigorous in that photo taken so long ago.

I love to read philosophy and learn that others who lived and died long before me had precisely the same thoughts that I have had. I feel suddenly linked to those people in a mystical way. I feel as if they are with me in a strange sense, that we are together on this journey we call life, even though they completed it long ago.

Kierkegaard speaks often about the idea of death and how one must keep it ever present in his thoughts. I did not understand this when I first read it, but I believe I do now. To think about death, really to think about it, to think it through, will bring you right back around again to life and what a miracle it is, and by that I don’t mean your own small individual life, but all of it, life as a whole, and you will be filled with reverence for it. You will be kinder to every creature.

And you will feel less alone.

This piece is for Otis Anderson, February 6, 1959 – July 14, 2013.

Dawkins’ Delusions


I’d put off reading any of the “new atheists” until recently. What I knew of their criticisms of religion had not impressed me as particularly sophisticated or even as new, so there seemed no urgency to read them. I’m teaching philosophy of religion this term, though, and my students expressed a desire to look at the new atheists, so I reluctantly purchased a copy of Richard Dawkins’ The God Delusion and began reading it in preparation for class.

I was afraid I wouldn’t like it. I was wrong. It’s hilarious! Not only has it caused me to laugh out loud, but it has brought home with particular force what an egalitarian industry publishing is. Anyone can publish a book, even a blithering idiot making claims that are demonstrably false and pontificating on things he knows nothing about and on works he has not read.

To be fair to Dawkins, I should point out that he’s clearly not a run-of-the-mill blithering idiot or he’d never have risen to his current position of prominence in science. He’d have been wise, however, to have restricted his public pronouncements to that field. His foray into the fields of religion and philosophy has made it clear that he’s closer to an idiot savant on the order of the infamously racist Nobel Prize winner James D. Watson, than to a genuine intellectual such as Stephen Jay Gould.

The preface to the paperback edition of The God Delusion includes Dawkins’ responses to some of the criticisms that were advanced against the book when it first appeared. In response to the charge that he attacked “the worst of religion and ignored the best,” Dawkins writes:

If only such subtle, nuanced religion predominated, the world would surely be a better place, and I would have written a different book. The melancholy truth is that this kind of understated, decent, revisionist religion is numerically negligible. To the vast majority of believers around the world, religion all too closely resembles what you hear from the likes of Robertson, Falwell or Haggard, Osama bin Laden or the Ayatollah Khomeini. These are not straw men, they are all too influential, and everybody in the modern world has to deal with them (p. 15).

From where does Dawkins get his statistics concerning the proportion of religious believers who subscribe to “understated, decent, revisionist” views of religion? How does he know their numbers are negligible? Evidence suggests otherwise. Most people in the economically developed world appear to accept modern science, so if surveys concerning the proportion of the population in this part of the world who are religious are correct, then the numbers of the “decent” religious are not negligible; in fact, these people are vastly in the majority.

Of course, to give Dawkins credit, he does refer to believers “around the world,” and not just in the economically developed part. It’s possible that Dawkins intends his book to enlighten the followers of Ayatollah Khomeini and other Muslim fundamentalist leaders, as well as the few fundamentalists in the economically developed world who reject science. It does not appear to have been aimed at such an audience, however, and I’ve not heard anything about Dawkins’ underwriting the translation of the book into Farsi or Arabic.

Also, how come science gets to “develop,” while religion that has changed over time is referred to pejoratively as “revisionist”? Germ theory was not always part of natural science, but I wouldn’t call contemporary science “revisionist” because it now includes belief in the reality of microorganisms.

“I suspect,” writes Dawkins, “that for many people the main reason they cling to religion is not that it is consoling, but that they have been let down by our educational system and don’t realize that non-belief is even an option” (p. 22).

Dawkins is either being disingenuous in the extreme or he is, in fact, feeble minded. Notice he says “our” educational system, so here he is clearly not talking about Iran or the Middle East. The whole reason that it is occasionally controversial to teach evolution in school in the U.S. is that religious extremists have become offended by the ubiquity of evolutionary theory in the science curriculum.

Far from education “letting people down” in failing to make clear to them that non-belief is an option, it more often lets people down in failing to make clear to them that belief is an option. It tends to caricature religious belief in precisely the way Dawkins’ conflation of religion with religious fundamentalism does, with the result that young people are literally indoctrinated with the view that religion itself, not one particular instantiation of it (i.e., fundamentalism), but religion itself is simply a particular form of superstition that is essentially in conflict with the modern world view. Dawkins would appear to be a victim of such indoctrination himself in that he repeatedly conflates religion with religious fundamentalism. He acknowledges occasionally that not all religious people hold the views he attributes to them, but he can’t seem to remember this consistently.

The reader of The God Delusion is faced with a dichotomy unflattering to the book’s author: either a rigorous systematic distinction between religion in general and religious fundamentalism in particular taxes Dawkins’ cognitive abilities beyond what they can bear, or his repeated conflation of these two distinct phenomena is cynically calculated to raise a false alarm concerning the purported threat that religion in general presents to the advancement of civilization, in the hope that this alarm will cause people to storm their local Barnes and Noble in an effort to secure, through the purchase of his book, ammunition they can use to defend themselves against the encroaching hordes of barbarian believers.

In the preface to the original hardcover edition Dawkins writes:

I suspect— well, I am sure— that there are lots of people out there who have been brought up in some religion or other, are unhappy in it, don’t believe it, or are worried about the evils that are done in its name; people who feel vague yearnings to leave their parents’ religion and wish they could, but just don’t realize that leaving is an option (p. 23).

Really, he writes that, I’m not kidding. I cut and pasted that text from the ebook. Yes, Dawkins is seriously asserting that there are people “out there” who do not realize that it’s possible, even in principle, to reject the faith they were born into. Obviously, these are not church-going folks. If they were, they would surely notice the children who cease at some point (usually in late adolescence or early adulthood) to attend church with their parents, or overhear the laments of parents whose children have “left the faith” during the coffee and cookies that often follow services on Sundays. These people who “just don’t realize that leaving is an option” must be a rare non-church-going species of fundamentalist. Even the Amish, after all, know that “leaving is an option.”

It’s admirable that Dawkins is so concerned about this infinitesimally small portion of humanity that he would write a whole book for their benefit. The view, however, that they represent a significant threat to Western civilization is hardly credible.

A charitable reading of Dawkins might incline one to think that what he meant was that leaving is not an emotional option, that they fear it would wreak more havoc in their lives than they could bear. (This, presumably, is why more Amish don’t leave the faith.) But if that were truly Dawkins’ concern, he’d have written a very different type of book, because that problem has nothing to do with science or the failure of religious people to understand it.

Atheists, according to Dawkins, are under siege. “Unlike evangelical Christians,” he bemoans, “who wield even greater political power [than Jews], atheists and agnostics are not organized and therefore exert almost zero influence” (p. 27). Oh yeah, atheists exert “zero influence.” That’s why we’re all taught the Bible in school, right? And why my university, like so many universities in the U.S., has such a huge religion department relative to, say, the biology department.

Wait, we’re not taught the Bible in school, that’s part of what fundamentalists are so up in arms about. We don’t teach creation, we teach evolution. We don’t have a religion department at Drexel. We don’t even lump religion in with philosophy, as is increasingly common at institutions that appear to be gradually phasing out religion altogether. We don’t teach religion period, not even as an object of scholarly study, let alone in an attempt to indoctrinate impressionable young people with its purportedly questionable “truths.”

“The Penguin English Dictionary,” observes Dawkins, “defines a delusion as ‘a false belief or impression’” (p. 27). Is the belief that religion represents a serious threat to the advance of civilization not obviously false? “The dictionary supplied with Microsoft Word,” continues Dawkins, “defines a delusion as ‘a persistent false belief held in the face of strong contradictory evidence’” (p. 28). Is there not “strong contradictory evidence” to the claim that atheists are under siege?

Is it possible that the survival of modern science really is threatened in Britain, in contrast to the clear cultural hegemony it enjoys in the U.S.? Maybe. Eating baked beans on toast has always seemed pretty backward to me. My guess, however, is that Dawkins suffers from the delusion that we in the U.S. are more backward than the folks on the other side of the Atlantic.

I’ll give Dawkins one thing. He’s right about how our educational system has failed us. That’s the only explanation I can think of for the popularity of Dawkins’ alarmist claptrap. It ought to be obvious to anyone with even a modicum of formal education that Dawkins is talking sheer nonsense. But then Dawkins is a scientist, not a philosopher or theologian. He simply doesn’t seem to understand Stephen Jay Gould’s lovely, straightforward presentation of the nonoverlapping magisteria view of the relation between science and religion.

But then it’s hard to say whether Dawkins’ failure to understand NOMA, as it is now called, is an expression of his cognitive limits or of his intellectual irresponsibility, in that it appears he hasn’t actually read Gould’s paper. What makes me think this, you ask? Well, because Gould goes on at length in this paper about how creationism (Dawkins’ apparent primary concern) is “a local and parochial movement, powerful only in the United States among Western nations, and prevalent only among the few sectors of American Protestantism that choose to read the Bible as an inerrant document, literally true in every jot and tittle” (emphasis added), and one could add here that it “has made no inroads whatever into the system of public education.”

Perhaps Dawkins thought it was unnecessary to read Gould, that anyone who would defend religion must not be worth reading. We all have our blind spots. I, for example, though I am devoutly religious, refuse to believe that prayer effects any change other than in the one who prays. It’s not because of some paranoid fear I have of inadvertently falling into superstition. It’s because the idea of a God whose mind could be changed by a particularly passionate entreaty, that is, of a God who is capricious and vain, is not at all edifying to me. I refuse to believe God is like that, quite independently of anything that might be presented to me as evidence for or against such a view.

Fortunately, my understanding of the relation between science and religion is a little more sophisticated than Dawkins’, so I can rest easily in my convictions, unperturbed by the phantom of their possible overthrow in the indeterminate future by some hitherto unknown type of empirical evidence. There is no such thing as empirical evidence either for or against the truth of religious convictions of the sort I hold. Fundamentalists may have to live with their heads in the sand, but people with a proper understanding of the relation between the phenomenal and noumenal realms do not.

That’s where our educational system has failed us. Too many people, even well educated people, have been taught that science conflicts with religion, not with a specific instantiation of religion, that is, not with fundamentalism, but with religion period. Education has failed us in a manner precisely opposite to the one in which Dawkins claims it has. The problem is not that the educational system has led people to the position where they feel that non-belief is not an option. The problem is precisely that the pretentious misrepresentation of the explanatory powers of empirical science and the reduction to caricature of anything and everything that goes under the heading of “religion” has led people to the position where they feel that belief is not an option.

I have enormous respect for honest agnostics, despite William James’ point in his essay “The Will to Believe,” that agnosticism is formally indistinguishable from atheism in that it fails just as much as the latter to secure for itself the good that is promised by religion. Agnosticism is at least intellectually honest. The question whether there’s a God, or as James puts it, some kind of higher, or transcendent purpose to existence, cannot be formally answered. Even Dawkins acknowledges that it’s not actually possible to demonstrate that there’s no God (though he asserts, bizarrely, that God’s improbability can be demonstrated). But if God’s existence cannot be disproved, then disbelief stands on no firmer ground than belief, so why trumpet it as somehow superior?

The fact is that we’re all of us out over what Kierkegaard refers to as the 70,000 fathoms. I’m comfortable with my belief. I’m not offended by agnostics. I’m not even offended by atheists. I’m not offended by the fact that there are people who don’t believe in God. I would never try to argue to them that they ought to believe. That to me is a profoundly personal matter, something between each individual and the deity. What’s strange to me is that there are many people, people such as Dawkins, who are apparently so uncomfortable with their atheism that the mere existence of anyone who disagrees with them on this issue is offensive to them. It’s as if they perceive the very existence of religious belief as some kind of threat. What kind of threat, one wonders, might that be?

Religious belief, at this stage of human history anyway, certainly does not represent a threat to scientific progress. Dawkins blames religion for 9/11. Experience has shown, however, that terrorism, of pretty much every stripe, is effectively eliminated with the elimination of social and economic inequities, just as is religious fundamentalism. So why isn’t Dawkins railing against social and economic inequities? That would appear to be a far more effective way to free the world of the scourge of religious fundamentalism than simply railing against fundamentalism directly. Direct attacks on fundamentalism are analogous to temperance lectures to people whose lives are so miserable that drinking is the only thing that brings them any kind of joy.

“[A] universe with a creative superintendent,” asserts Dawkins, “would be a very different kind of universe from one without one” (p. 78). But what is the difference for people such as NIH director Francis Collins, and myself, who believe that the description of the universe that is provided by science is precisely a description of the nature of God’s material creation? Dawkins is right in that there’s a difference between those two universes. He’s wrong though in believing that difference to be material.

Suppose that one morning you found on your doorstep an apple. Suppose you love apples. Suppose as well that though you could not preclude the possibility that this apple had simply fallen from an overly-full grocery bag of some passerby, for some reason that you cannot explain, you were infused with the conviction, as soon as you laid eyes on the apple, that someone had placed it there for you. What a lovely thought! The whole experience changes your morning, even your day, in a positive way.

In a material sense, of course, it makes no difference whether the apple came there by chance, or by design. It is the same apple, after all, whatever the explanation for its presence. It is not at all the same experience, however, to believe that one has found an apple by chance and to believe one has found it by design.

Now suppose a well-meaning friend points out the superfluity of your assumption that the apple had been placed there by someone. Suppose this person pointed out that nothing in the mere presence of the apple compelled such an assumption and that you should thus content yourself with a “natural explanation” of how it came to be there. Ought you to abandon your belief in your invisible benefactor? What would you gain by abandoning it? If your friend had been ridiculing you for your “foolishness,” then presumably that would cease. You would regain his respect. But at what cost? It’s none of his business what you choose to believe in such an instance. That he would make fun of you for believing something the truth of which he cannot disprove but which makes you happy paints a very unflattering picture of him. So you would regain the respect of someone whose respect many would rightly disdain, even while you would lose something that had made you happy. And why is the explanation you have supplied for the presence of the apple less “natural” than his? You didn’t assume the apple had spontaneously sprung into existence. The real difference between your view of how the apple came to be there and his is that yours is nicer, that it makes you feel better.

Or to take a more apposite example in my case: Say that for as long as you can remember, you’ve wanted one of those fancy, expensive home cappuccino makers. You know the ones I’m talking about. Not the little cheapie things that can be had for under a hundred dollars, but the really expensive ones that resemble the real thing that they use in fancy cafes and coffee houses. Say that you have always wanted one of these fancy cappuccino makers but because you had chosen the life of an academic and the modest salary that went along with it, you felt a fancy cappuccino maker was an extravagance you simply couldn’t allow yourself. Lawyers can afford such things, you reasoned, but then they also needed them because they are generally very unhappy in their work. If you had gone to law school, you could have had a fancy cappuccino maker. You knew this, of course, but chose to go to graduate school in philosophy instead because you believed a career in philosophy would be more fulfilling than a career in law. You made your choice and so must content yourself with a fulfilling career and a more modest coffee-making setup.

This seems to you a reasonable trade off, so you do not waste away large portions of your life lusting after a fancy home cappuccino maker. Still, you do think wistfully of such machines sometimes, particularly when you see them in the homes of your lawyer friends, or in one of those fancy kitchen stores that always have so many of them. You have accustomed yourself, over time, to this occasional quiet longing.

But then one Saturday, when you are on your way back to your apartment, after having done your morning shopping, you spy a large bag on the sidewalk in front of one of the houses on your block. People often put things out on the sidewalk that they no longer want, so you stop to see if there is anything there you might be able to use. As you approach the bag, your heart begins to beat more quickly. Peeping out of the edge of the bag is what looks for all the world like the top of one of those fancy, expensive cappuccino makers that you have always wanted. You peer disbelievingly into the bag and discover that not only does it indeed contain such a machine, but all of the accoutrements that generally go with them, a little stainless steel milk frothing jug, metal inserts in both the single and double espresso size (as well as one to hold those Illy pods that you would never buy because they are too expensive), and a coffee scoop with a flat end for tamping down the coffee. As you are peering into the bag, your neighbor emerges from the front door of her house with more bags of stuff to put out on the sidewalk.

“Are you giving this away?” you ask tentatively.

“Yes,” she replies.

“Does it work?” you ask.

“Yes,” she replies.

“Why are you giving it away?” you ask incredulously, convinced that any minute she will change her mind.

“Well,” she says nonchalantly, “I’ve had it for four years and never used it. I figure that if you have something for four years and never use it, you should get rid of it.”

You nod and laugh, affecting a nonchalance to match your neighbor’s. As soon as she has disappeared into the house, though, you snatch up the bag that contains the machine and all the accoutrements and stagger under its weight the short distance to your door. You download the manual for the machine (a Cuisinart EM-100, which you discover retails for $325), set it up and give it a trial run. It works like a dream!

Your innermost wish for a fancy, expensive cappuccino maker has been fulfilled! One was deposited practically on your doorstep. Of course it came there in a perfectly natural, explicable way, but still, your heart overflows with gratitude toward God, who you believe has arranged the universe, in his wisdom and benevolence, in such a way that this fancy, expensive cappuccino maker should come into your possession now. God has favored you with the rare and coveted have-your-cake-and-eat-it-too status in that you have been allowed to pursue your life’s calling of being a philosophy professor and have a fancy, expensive cappuccino maker!

You do not need to attribute this turn of events to any supernatural agency in order to see “the hand of God” in it. It does not trouble you to think that your neighbor had very likely been considering putting that machine out on the street for quite some time, or that the whole event came about very naturally. But still, it is deeply significant to you and fills you with a sense of awe and wonder. Why should that bother Richard Dawkins?

It is fair, of course, to point out that you might just as well be annoyed that God had not arranged for you to receive this fancy, expensive cappuccino maker earlier. But you do not think that way. Why, you do not know. You attribute this wonderfully positive psychological dynamic to God’s Grace, but of course you could be wrong, perhaps it’s genetic. Earlier it seemed to you that the sacrifice of a fancy, expensive cappuccino maker in order to pursue your life’s calling was really not so very much to ask, and you accepted it stoically. Now, you are overcome with gratitude toward God for so arranging things that your wish for such a machine has been fulfilled. Earlier you were happy, now you are happier still. What’s wrong with that? That seems to me to be a very enviable situation.

Experience may incline us to expect certain emotional reactions to various kinds of events, but reason does not require such reactions. Many religious people are effectively deists in that they accept what scientists call the “laws of nature” and do not believe that God arbitrarily suspends those laws in answer to particularly passionate entreaties. Such people accept that God must thus be responsible in some way for the things they don’t like just as much as for the things they like, but consider that perhaps there is some reason for those things that human reason simply cannot fathom, and look to God for emotional support when the bad things in life seem to overwhelm the good and thank God when the reverse seems to be the case.

To be able to find strength in God when times are bad and to thank him (her or it) when times are good is an enviable gift. Who wouldn’t want to be like that? Of course it is possible to rail against God for not ensuring that times are always good, but it isn’t necessary. The failure to condemn or to become angry is not a failure of logic. Objectively, everything simply is; nothing necessitates a particular emotional reaction. The dynamic of faith is just as rational as the dynamic of skepticism. In fact, it could be construed as even more rational. That is, happiness is something that, it is generally acknowledged, human beings almost universally pursue, and the dynamic just described is clearly a particularly good way of achieving it in that it effectively ensures a generally positive emotional state. Maybe believers are wrong, but even Dawkins acknowledges that no one will ever be able to prove that. Even if they are wrong, however, it seems there is little, if any, harm in their beliefs and a great deal of good.

Why does religion so offend atheists such as Dawkins? No one is forcing them to sign up. Dawkins is not alone in his outrage. It’s pervasive among atheists. The invectives they hurl at believers always put me in mind of those hurled by a child at the participants in an invisible tea party to which he has not been invited.

“There isn’t really any TEA there, you know!” he yells.

But is the outrage over the fictitious nature of the tea, that anyone should pretend to drink something that isn’t really there, or is it at not having been invited to the party? Perhaps the problem with atheists such as Dawkins is the feeling of being left out. Perhaps they are angry that other people get to enjoy something from which they have been excluded, something they have been led to believe is “not an option” for them.

(For a really excellent piece on The God Delusion see Terry Eagleton’s “Lunging, Flailing, Mispunching” in the London Review of Books.)

Education and Democracy

I’m reading Richard Hofstadter’s Anti-intellectualism in American Life in preparation for doing a review of Carlin Romano’s new book America the Philosophical. Romano mentions Hofstadter in his introduction, but only in his introduction. He never returns to him. I suspected that was going to turn out to be a weakness in Romano’s book, so I decided I should read Hofstadter before reviewing Romano. That was no great chore. Hofstadter is one of my favorite authors. His book Social Darwinism in American Thought is a real eye-opener. That book, together with Max Weber’s The Protestant Ethic and the Spirit of Capitalism, is a kind of Rosetta Stone of American culture.

The penultimate chapter of Hofstadter’s book looks at the educational theory of John Dewey. “The new education,” Hofstadter observes, that grew out of Dewey’s thought “would have social responsibilities more demanding and more freighted with social significance than the education of the past. Its goal would be nothing less than the fullest realization of the principles of democracy. In setting this aspiration, Dewey stood firmly within the American tradition, for the great educational reformers who had established the common-school system had also been concerned with its potential value to democracy” (Hofstadter, p. 378). That is, in Dewey’s theory, “the ends of democratic education are to be served by the socialization of the child, who is to be made into a co-operative rather than a competitive being and ‘saturated’ with the spirit of service” (Hofstadter, p. 379).

Leaving aside the issue of the mounting evidence that people are inherently more inclined to cooperation than to competition, it seems to me that something essential is omitted here. The traditional conception of the significance of education to democracy is that it is important that citizens in a democracy be well informed, that they should be able to read as a means to being well informed, as well as that they should be able to think critically and analytically so as to be better able to sort their way through the information with which they are presented and to properly understand its significance.

I believe, however, that the significance of education to democracy is much greater than that. It is not simply that citizens in a democracy must be rational and well informed, they must also be happy. Unhappy people are too prone to using their vote punitively, that is, in ways that actually decrease rather than increase the happiness of their fellow citizens. But policies that improve the quality of life of the average citizen are the engine of democracy. Without them democracy ultimately breaks down. That is, Dewey’s ideal of socialization as encouraging cooperation can’t be sustained unless the individuals being socialized are relatively happy both throughout the period of socialization and beyond (if the process can be meaningfully said to stop at any point).

What few people understand, I fear, is the importance of education to human happiness. Human beings, as Aristotle famously observed, are rational animals. They have very highly developed and complex brains, brains that have needs of their own for stimulation and challenge. Helen Keller writes movingly, for example, of how perpetually angry, and even violent, she was before she learned language (The Story of My Life). That was partly, of course, because of her difficulty communicating, but it was also, as she clearly details, because of her difficulty in fixing thoughts in her mind. Language, like mathematics and logic, is a cultural achievement. People do not learn it in isolation from other people, and they do not gain an optimal command of it if they do not read. The brain is driven to make sense of its environment. It finds fulfillment in that. People would do science (as indeed they did for millennia) even if it had no obvious utility, just as they have always played cognitively challenging and stimulating games such as chess and crossword puzzles.

The need of human beings to develop their minds is, I believe, so acute that its fulfillment is an ineradicable element of human happiness. That, I would argue, is the real value of education to democracy. We need to educate people in a democracy not merely so that they will better understand what sorts of policies would be best for society as a whole, but so that they will also desire what is best for society as a whole rather than the spread of their private misery onto the larger community.

The War on Fairness

It’s rare when a person does something that is at once so idiotic and so heinous that it brings discredit upon his entire profession. I fear philosopher Stephen T. Asma has done this, however, with his new book from the University of Chicago Press. I’ve bragged for years to friends and relatives that the philosophy curriculum at the graduate level is so rigorous that it weeds out the kinds of morons who all too often are able to make it through other Ph.D. programs. Not everyone with a Ph.D. in philosophy is a transcendent genius, I’ve conceded, but there’s a basement level of analytical acuity below which philosophers simply do not go.

I stand corrected. Stephen T. Asma’s article, “In Defense of Favoritism,” excerpted from his book Against Fairness (I’m not making this up, I swear) is the worst piece of incoherent and morally reprehensible tripe I think I’ve ever read in my life. I endeavor, as a rule, not to read crap, but I was intrigued when I saw the title of Asma’s article in the headlines I receive every day from The Chronicle of Higher Education. Clever hook, I thought! It seemed obvious to me that few people would undertake a genuine defense of favoritism and that the Chronicle would certainly never publish such a thing, so I was curious to find out what the article was actually about.

Well, it’s just what it says it is–it’s a defense, or an attempt at a defense anyway, of favoritism. I say “an attempt” at a defense because favoritism is considered by most people to be indefensible, and with good reason. “Favoritism,” as distinguished from the universally human phenomenon of having favorites, is defined by the Oxford English Dictionary as “[a] disposition to show, or the practice of showing, favour or partiality to an individual or class, to the neglect of others having equal or superior claims; undue preference.” It’s the qualification of the preference as “undue” that’s important here.

There’s nothing wrong with wanting your niece or nephew, for example, to get that new tenure-track position in your department, but there’s a whole lot wrong with giving it to them, or giving them preferential treatment in discussions of who should get it, simply because they are your niece or nephew. Ditto for your favorite grad student. To want someone you care about to succeed because you care about them is perfectly natural. To ENSURE that they succeed over other, and possibly better qualified, people simply because you care about them is wrong. That’s what favoritism is though.

I thought at first that Asma might simply be confused about the meaning of “favoritism,” that what he was actually trying to do was to defend the view that there’s nothing wrong with having favorites, that what philosophers refer to as “preferential affection” is simply part of human nature and not something anyone should ever feel guilty about. The further I got into the article, however, the clearer it became that Asma was indeed trying to defend undue preference.

The piece, as Kierkegaard would say, is something both to laugh at and to weep over in that it’s such an inept piece of argumentation that it’s hilarious while at the same time being profoundly morally offensive. That Asma’s opening is, as one reader observes in the comments following the article, “irrelevant to his point” is the least of his crimes against sound reasoning.

“Fairness,” asserts Asma, “is not the be-all and end-all standard for justice,” thus positioning himself as a sort of imbecilic David over and against the Goliath of John Rawls whose theory of justice as fairness is much admired by philosophers. There’s nothing wrong, of course, with taking aim at intellectual giants. It helps, however, when one does this, to have a good argument.

But Asma does not have a good argument. It’s impossible to give a developmental account of Asma’s argument because it has little that resembles a structure. Instead of starting with premises that he carefully arranges to lead the reader from assumptions he already holds to a conclusion the inevitability of which he is finally compelled, if not actually to accept, then at least to concede as probable, Asma presents a mishmash of irrelevant, incoherent, and equivocal non sequiturs that litter the page like toys strewn about a room by a child rooting impatiently through his toybox for the one cherished toy he cannot find. And what is Asma’s cherished toy? Why it’s favoritism! Asma is determined to prove that favoritism is, in his own words, “not a bad thing.”

The upshot of Asma’s rambling argument is that the tendency toward favoritism is part of human nature. This is regrettably true. It makes us feel good when we promote the interests of those we love. Just because something makes us feel good, though, doesn’t mean that it’s ethical. The conflation of these two things is known in philosophy as “the naturalistic fallacy.” Asma ought to know this because he is a philosopher. How he can make such a fundamental mistake is mystifying.

The article begins with Asma recounting a scene with his son who is complaining because Asma will not allow him to play a game that involves the killing of zombies because he, Asma, feels his son is too young for that sort of game. “That’s sooo not fair!” his son protests. Instead, however, of using this occasion as the inspiration to write a book for children that will help them to better understand the meaning of the word “fair,” Asma takes his toddler’s grasp of the term, equates it erroneously with “egalitarianism” and decides to write a philosophical treatise (for adults) discrediting both.

Asma then turns to an examination of what he asserts is the virtue of generosity. What he actually describes, however, is not what most philosophers would identify as a virtue (which, according to Aristotle, for one, requires cultivation), but a natural inclination, found in varying degrees in various individuals, to share what one has with one’s friends–and only, he is careful to explain, with one’s friends. But the fact that most people enjoy sharing what they have with their friends does not make this inclination into a virtue. To equate a natural inclination, in this way, with a virtue is, once again, an expression of the naturalistic fallacy.

The child in Asma’s example gives all her candy to a few friends over the protestations of classmates to whom she has a less passionate emotional attachment. “But the quality of her generosity,” asserts Asma, “is not compromised by the fact that she gave it all to her five friends.” This flagrantly begs the question, however, because there is a sizable contingent of humanity that would contest such a definition of “generosity.” Sure, if you define sharing with only your friends as “virtuous,” then you won’t have a hard time defending favoritism because sharing with only your friends is the same thing as favoritism and far from seeing it as a virtue, most of humanity would see it as downright nasty.

And that isn’t the only problem with conflating inclinations and virtues. How about sharing with your friend when you have good reason to believe that that friend is going to use what you’ve shared with him to further some nefarious purpose he may have? Is that virtuous? Plato talks about that problem in the Republic. Is it possible that Asma, a philosopher, hasn’t read the Republic?

My heart sort of goes out to Asma at that point, though, because he seems to be contrasting the child who shares with only her friends with a child who refuses to share any of his candy with anyone–ever. But that’s not just greedy; it’s pathological, and anyone who fails to recognize this must have had a very wretched childhood indeed. To Asma’s credit, he acknowledges that his argument is “counterintuitive.” Readers will find themselves wishing, however, that Asma hadn’t been so dismissive of his intuitions.

Asma erroneously asserts that the activities of those in the civil rights and feminist movements, for example, are expressions of favoritism and tribalism. That’s a fair charge to level, I suppose, against black supremacists, and perhaps against radical feminist separatists, but the two examples Asma cites, Rosa Parks and Susan B. Anthony, hardly fall into those categories. It’s not favoritism to demand rights for one’s group equal to those enjoyed by the rest of society. Only fighting for more rights, or for preferential treatment, could be characterized that way.

Perhaps it’s the term “equal” that throws Asma off. He seems to have a particular aversion to it. He refers, for example, to what he claims is “American hostility to elitism,” but the example he gives is not one of anti-elitism, which would be hard to find in our culture, but one of anti-intellectualism. That is, he points out that “politicians work hard to downplay their own intelligence and intellectual accomplishments so they might seem less threatening (less eggheadish) to the public.”

We’re not hostile to elitism in the U.S. though. We’re the most thoroughly elitist society in the economically developed world. Everything from our systems of taxation, education, and health, to our system of criminal justice is set up to favor the wealthy elites.

Asma cites several studies that show that what is called “ingroup bias” appears to be inherent in human nature and uses this fact to support his position that favoritism is therefore “not a bad thing.” That something is inherent in human nature does not, however, entail that it is morally acceptable. There are all kinds of unfortunate tendencies in human nature that parents, societies, and finally civilization itself endeavor to control, tame, and even in some cases eradicate.

Asma’s whole defense of favoritism is not simply an expression of the naturalistic fallacy referred to above. To the extent that he tries to defend favoritism by arguing that it’s innate, he’s also guilty of conflating an “is” with an “ought,” the mistake Hume famously identified as the “is-ought” problem. That is, his is a misguided attempt to draw inferences about the nature of moral obligation (how people ought to behave) from observations about how people actually do behave, when the two things are qualitatively different and need to be kept rigorously distinguished.

Asma returns, at the end of the article, to the example of children. He appears to have hopped on the bandwagon of pseudo-intellectuals who have begun to express concern that we are being too nice to our children. It seems Asma’s son came home one day with a ribbon he’d “won” in a footrace, but Asma’s pride dissipated when his son explained that all the children had “won” the race, that they’d all been given ribbons. “I don’t want my son, and every other kid in his class,” protests Asma, “to be told they’d ‘won’ the footrace at school just because we think their self-esteem can’t handle the truth. Equal rewards for unequal accomplishments foster the dogma of fairness, but they don’t improve my son or the other students.”

Leaving aside the issue that Asma has once again evinced that he has appropriated a toddler’s simplistic and hence erroneous definition of “fairness,” there’s something comically fantastical about Asma’s apparent fear that today’s youth are in danger of living out their lives in blissful ignorance of their own weaknesses and inadequacies. The likelihood, for example, that admissions to elite universities will suddenly become merit-blind, or that we will cease keeping statistics on the accomplishments of professional athletes, seems vanishingly small, and the only professions that seem openly to embrace the conspicuously inept are those in the financial industry.

Sadly, children will learn all too soon that there are winners and losers and that the former are rewarded while the latter are not. Not only does it do no harm to stave off that realization as long as possible, it may actually do a great deal of good if it helps us to teach children that their worth as individuals is not dependent on their bettering their peers in contests. Not everyone can be a winner. Most people have to content themselves with being also-rans. If we can teach children early that the also-rans are to be lauded as an essential part of the race (after all, there is no race without them), then we might actually help to increase the number of people who are able to live happy and fulfilling lives.

Asma’s fears are not restricted, however, to the specter of a utopian future for his progeny. Even while wealth is increasingly transferred to a dwindling minority of the American population, Asma is tortured by feverish nightmares of creeping socialism. “Liberals,” he asserts, “say ‘fairness’ when they mean ‘all things should be equal’”–as if we, in the U.S., stood in imminent danger of sweeping political reforms that would make the social-welfare states of Northern Europe look like Czarist Russia by comparison.

What’s disturbing is not so much Asma’s argument as the fact that it found a reputable (or at least once reputable) academic publisher and that it was actually excerpted in The Chronicle of Higher Education. Noam Chomsky said somewhere that despite all the atrocities he had spent a large part of his life chronicling, he believed humanity was making moral progress. You don’t see moral defenses of slavery anymore, he pointed out, whereas you did see such things in earlier periods of human history. Yes, maybe that’s true. But if we’ve regressed to the point that it’s now socially acceptable to publish moral defenses of favoritism, and attacks on fairness, can defenses of slavery be far behind?

This piece originally appeared in CounterPunch on 11/19/2012.