On Death and Dying

One of the most frightening things, I think, about dying is that we do it alone. Of all the natural evils for which one would like to blame the creator, this seems one of the worst. It would have been so much better, wouldn’t it, if we left this life in groups, left perhaps with the people we came in with, with the children we remember from our earliest days in school, and perhaps also with the people we have come to love, if they are suitably close to us in age. If we could go in groups, as if on a field trip, it would be easier.

But we go alone. Even those unfortunates who die in accidents that take many lives die effectively alone, because they don’t have time, really, to appreciate their fates as shared. They say the people who remained on the Titanic sang as the ship went down. That’s what I’m talking about. It would be so much better, so much easier to bear, if we were assigned a time along with many others. We could begin to gather a little before that time, all of us who were assigned to leave together; we could begin to gather and prepare ourselves and share with one another the joys and sorrows of our lives. If we did that, I think we would realize that our lives had really all been variations on the same theme, that we were not so different from one another as we had thought.

I’m not certain if I believe in life after death, even though I am very religious. I’m not certain what it would be for. I doubt I will be ready to leave this life when my time comes. I think I’d like to live much longer than I know I will, say three or four hundred years. I think I’d eventually get tired of living though, so the prospect of living forever is not all that appealing.

It seems to me, however, that if there is life after death, then that place where we will all go (and I believe we will all go to the same place, because I am a universalist), wherever it is, is a place at which we will all actually arrive together. Even though each of us will die individually, alone, if we go anywhere, it is to eternity, and since there is no temporal change in eternity, there cannot be any arriving earlier or later. Where we will go will be where everyone will go at the same time, or where everyone, in a sense, already is. There will be no waiting for the loved ones who die after us. They will be there waiting for us, so to speak, when we arrive, even if they are in the bloom of youth when we leave.

When I think about death, which I do more and more as I get older, I wonder if perhaps part of the point of it, of the horrible specter of that trip one must take alone, is precisely to make us understand that we never really are alone. And by that I don’t mean simply that God is always with us, although I do mean that also. I mean that we are all part of the whole of humanity, that we are connected to everyone and, indeed, to every living thing.

There is a poem I love by Molly Holden that conveys this sense of connectedness very well. It’s called “Photograph of Haymaker, 1890.” It goes like this:

It is not so much the image of the man
that’s moving — he pausing from his work
to whet his scythe, trousers tied
below the knee, white shirt lit by
another summer’s sun, another century’s —

as the sight of the grasses beyond
his last laid swathe, so living yet
upon the moment previous to death;
for as the man stooping straightened up
and bent again they died before his blade.

Sweet hay and gone some seventy years ago
and yet they stand before me in the sun,

That’s not the whole of the poem. I left out the last couple of lines for fear of violating copyright. You can read the whole of it, though, if you go to Poetry magazine. Of course the poem is about the haymaker in that it’s about mortality, which is inseparable, I think, from temporality. Time passes, people pass, as they say. The haymaker will pass, just as the grasses he’s cutting down in the vigor of his manhood. And he is gone now, of course, the man who was young and vigorous in that photo taken so long ago.

I love to read philosophy and learn that others who lived and died long before me had precisely the same thoughts that I have had. I feel suddenly linked to those people in a mystical way. I feel as if they are with me in a strange sense, that we are together on this journey we call life, even though they completed it long ago.

Kierkegaard speaks often about the idea of death and how one must keep it ever present in his thoughts. I did not understand this when I first read it, but I believe I do now. To think about death, really to think about it, to think it through, will bring you right back around again to life and what a miracle it is, and by that I don’t mean your own small individual life, but all of it, life as a whole, and you will be filled with reverence for it. You will be kinder to every creature.

And you will feel less alone.

This piece is for Otis Anderson, February 6, 1959 – July 14, 2013.

Dawkins’ Delusions

I’d put off reading any of the “new atheists” until recently. What I knew of their criticisms of religion had not impressed me as particularly sophisticated or even as new, so there seemed no urgency to read them. I’m teaching philosophy of religion this term, though, and my students expressed a desire to look at the new atheists, so I reluctantly purchased a copy of Richard Dawkins’ The God Delusion and began reading it in preparation for class.

I was afraid I wouldn’t like it. I was wrong. It’s hilarious! Not only has it caused me to laugh out loud, but it has brought home with particular force what an egalitarian industry publishing is. Anyone can publish a book, even a blithering idiot making claims that are demonstrably false and pontificating on things he knows nothing about and on works he has not read.

To be fair to Dawkins, I should point out that he’s clearly not a run-of-the-mill blithering idiot or he’d never have risen to his current position of prominence in science. He’d have been wise, however, to have restricted his public pronouncements to that field. His foray into the fields of religion and philosophy has made it clear that he’s closer to an idiot savant, on the order of the infamously racist Nobel Prize winner James D. Watson, than to a genuine intellectual such as Stephen Jay Gould.

The preface to the paperback edition of The God Delusion includes Dawkins’ responses to some of the criticisms that were advanced against the book when it first appeared. In response to the charge that he attacked “the worst of religion and ignored the best,” Dawkins writes:

If only such subtle, nuanced religion predominated, the world would surely be a better place, and I would have written a different book. The melancholy truth is that this kind of understated, decent, revisionist religion is numerically negligible. To the vast majority of believers around the world, religion all too closely resembles what you hear from the likes of Robertson, Falwell or Haggard, Osama bin Laden or the Ayatollah Khomeini. These are not straw men, they are all too influential, and everybody in the modern world has to deal with them (p. 15).

From where does Dawkins get his statistics concerning the proportion of religious believers who subscribe to “understated, decent, revisionist” views of religion? How does he know their numbers are negligible? Evidence suggests otherwise. That is, most people in the economically developed world appear to accept modern science, so if surveys concerning the proportion of the population in this part of the world who are religious are correct, then the numbers of the “decent” religious people are not negligible; in fact, these people are vastly in the majority.

Of course, to give Dawkins credit, he does refer to believers “around the world,” and not just in the economically developed part. It’s possible that Dawkins intends his book to enlighten the followers of Ayatollah Khomeini and other Muslim fundamentalist leaders, as well as the few fundamentalists in the economically developed world who reject science. It does not appear to have been aimed at such an audience, however, and I’ve not heard anything about Dawkins’ underwriting the translation of the book into Farsi or Arabic.

Also, how come science gets to “develop,” while religion that has changed over time is referred to pejoratively as “revisionist”? Germ theory was not always part of natural science, but I wouldn’t call contemporary science “revisionist” because it now includes belief in the reality of microorganisms.

“I suspect,” writes Dawkins, “that for many people the main reason they cling to religion is not that it is consoling, but that they have been let down by our educational system and don’t realize that non-belief is even an option” (p. 22).

Dawkins is either being disingenuous in the extreme or he is, in fact, feeble-minded. Notice he says “our” educational system, so here he is clearly not talking about Iran or the Middle East. The whole reason that it is occasionally controversial to teach evolution in school in the U.S. is that religious extremists have become offended by the ubiquity of evolutionary theory in the science curriculum.

Far from education “letting people down” in failing to make clear to them that non-belief is an option, it more often lets people down in failing to make clear to them that belief is an option. It tends to caricature religious belief in precisely the way Dawkins’ conflation of religion with religious fundamentalism does, with the result that young people are literally indoctrinated with the view that religion itself, not one particular instantiation of it (i.e., fundamentalism), but religion itself is simply a particular form of superstition that is essentially in conflict with the modern world view. Dawkins would appear to be a victim of such indoctrination himself in that he repeatedly conflates religion with religious fundamentalism. He acknowledges occasionally that not all religious people hold the views he attributes to them, but he can’t seem to remember this consistently.

The reader of The God Delusion is faced with a dichotomy unflattering to the book’s author: either a rigorous systematic distinction between religion in general and religious fundamentalism in particular taxes Dawkins’ cognitive abilities beyond what they can bear, or his repeated conflation of these two distinct phenomena is cynically calculated to raise a false alarm concerning the purported threat that religion in general presents to the advancement of civilization, in the hope that this alarm will cause people to storm their local Barnes and Noble in an effort to secure, through the purchase of his book, ammunition they can use to defend themselves against the encroaching hordes of barbarian believers.

In the preface to the original hardcover edition Dawkins writes:

I suspect— well, I am sure— that there are lots of people out there who have been brought up in some religion or other, are unhappy in it, don’t believe it, or are worried about the evils that are done in its name; people who feel vague yearnings to leave their parents’ religion and wish they could, but just don’t realize that leaving is an option (p. 23).

Really, he writes that. I’m not kidding; I cut and pasted that text from the ebook. Yes, Dawkins is seriously asserting that there are people “out there” who do not realize that it’s possible, even in principle, to reject the faith they were born into. Obviously, these are not church-going folks. If they were, they would surely notice the children who cease at some point (usually in late adolescence or early adulthood) to attend church with their parents, or overhear the laments of parents whose children have “left the faith” during the coffee and cookies that often follow services on Sundays. These people who “just don’t realize that leaving is an option” must be a rare non-church-going species of fundamentalist. Even the Amish, after all, know that “leaving is an option.”

It’s admirable that Dawkins is so concerned about this infinitesimally small portion of humanity that he would write a whole book for their benefit. The view, however, that they represent a significant threat to Western civilization is hardly credible.

A charitable reading of Dawkins might incline one to think that what he meant was that leaving was not an emotional option, that it would wreak more havoc in such people’s lives than they fear they could bear. (This, presumably, is why more Amish don’t leave the faith.) But if that were truly Dawkins’ concern, he’d have written a very different type of book, because that problem has nothing to do with science or the failure of religious people to understand it.

Atheists, according to Dawkins, are under siege. “Unlike evangelical Christians,” he bemoans, “who wield even greater political power [than Jews], atheists and agnostics are not organized and therefore exert almost zero influence” (p. 27). Oh yeah, atheists exert “zero influence.” That’s why we’re all taught the Bible in school, right? And why my university, like so many universities in the U.S., has such a huge religion department relative to, say, the biology department.

Wait, we’re not taught the Bible in school; that’s part of what fundamentalists are so up in arms about. We don’t teach creation, we teach evolution. We don’t have a religion department at Drexel. We don’t even lump religion in with philosophy, as is increasingly common at institutions that appear to be gradually phasing out religion altogether. We don’t teach religion, period, not even as an object of scholarly study, let alone in an attempt to indoctrinate impressionable young people with its purportedly questionable “truths.”

“The Penguin English Dictionary,” observes Dawkins, “defines a delusion as ‘a false belief or impression’” (p. 27). Is the belief that religion represents a serious threat to the advance of civilization not obviously false? “The dictionary supplied with Microsoft Word,” continues Dawkins, “defines a delusion as ‘a persistent false belief held in the face of strong contradictory evidence’” (p. 28). Is there not “strong contradictory evidence” to the claim that atheists are under siege?

Is it possible that the survival of modern science really is threatened in Britain, in contrast to the clear cultural hegemony it enjoys in the U.S.? Maybe. Eating baked beans on toast has always seemed pretty backward to me. My guess, however, is that Dawkins suffers from the delusion that we in the U.S. are more backward than the folks on the other side of the Atlantic.

I’ll give Dawkins one thing. He’s right about how our educational system has failed us. That’s the only explanation I can think of for the popularity of Dawkins’ alarmist claptrap. It ought to be obvious to anyone with even a modicum of formal education that Dawkins is talking sheer nonsense. But then Dawkins is a scientist, not a philosopher or theologian. He simply doesn’t seem to understand Stephen Jay Gould’s lovely, straightforward presentation of the nonoverlapping magisteria view of the relation between science and religion.

But then it’s hard to say whether Dawkins’ failure to understand NOMA, as it is now called, is an expression of his cognitive limits or of his intellectual irresponsibility, in that it appears he hasn’t actually read Gould’s paper. What makes me think this, you ask? Well, because Gould goes on at length in this paper about how creationism (Dawkins’ apparent primary concern) is “a local and parochial movement, powerful only in the United States among Western nations, and prevalent only among the few sectors of American Protestantism that choose to read the Bible as an inerrant document, literally true in every jot and tittle” (emphasis added), and one could add here “has made no inroads whatever into the system of public education.”

Perhaps Dawkins thought it was unnecessary to read Gould, that anyone who would defend religion must not be worth reading. We all have our blind spots. I, for example, though I am devoutly religious, refuse to believe that prayer effects any change other than in the one who prays. It’s not because of some paranoid fear I have of inadvertently falling into superstition. It’s because the idea of a God whose mind could be changed by a particularly passionate entreaty, that is, of a God who is capricious and vain, is not at all edifying to me. I refuse to believe God is like that, quite independently of anything that might be presented to me as evidence for or against such a view.

Fortunately, my understanding of the relation between science and religion is a little more sophisticated than Dawkins’, so I can rest easily in my convictions, unperturbed by the phantom of their possible overthrow in the indeterminate future by some hitherto unknown type of empirical evidence. There is no such thing as empirical evidence either for or against the truth of religious convictions of the sort I hold. Fundamentalists may have to live with their heads in the sand, but people with a proper understanding of the relation between the phenomenal and noumenal realms do not.

That’s where our educational system has failed us. Too many people, even well-educated people, have been taught that science conflicts with religion, not with a specific instantiation of religion, that is, not with fundamentalism, but with religion period. Education has failed us in a manner precisely opposite to the one in which Dawkins claims it has. The problem is not that the educational system has led people to the position where they feel that non-belief is not an option. The problem is precisely that the pretentious misrepresentation of the explanatory powers of empirical science, and the reduction to caricature of anything and everything that goes under the heading of “religion,” have led people to the position where they feel that belief is not an option.

I have enormous respect for honest agnostics, despite William James’ point in his essay “The Will to Believe,” that agnosticism is formally indistinguishable from atheism in that it fails just as much as the latter to secure for itself the good that is promised by religion. Agnosticism is at least intellectually honest. The question whether there’s a God, or as James puts it, some kind of higher, or transcendent purpose to existence, cannot be formally answered. Even Dawkins acknowledges that it’s not actually possible to demonstrate that there’s no God (though he asserts, bizarrely, that God’s improbability can be demonstrated). But if God’s existence cannot be disproved, then disbelief stands on no firmer ground than belief, so why trumpet it as somehow superior?

The fact is that we’re all of us out over what Kierkegaard refers to as the 70,000 fathoms. I’m comfortable with my belief. I’m not offended by agnostics. I’m not even offended by atheists. I’m not offended by the fact that there are people who don’t believe in God. I would never try to argue to them that they ought to believe. That to me is a profoundly personal matter, something between each individual and the deity. What’s strange to me is that there are many people, people such as Dawkins, who are apparently so uncomfortable with their atheism that the mere existence of anyone who disagrees with them on this issue is offensive to them. It’s as if they perceive the very existence of religious belief as some kind of threat. What kind of threat, one wonders, might that be?

Religious belief, at this stage of human history anyway, certainly does not represent a threat to scientific progress. Dawkins blames religion for 9/11. Experience has shown, however, that terrorism, of pretty much every stripe, is effectively eliminated with the elimination of social and economic inequities, just as is religious fundamentalism. So why isn’t Dawkins railing against social and economic inequities? That would appear to be a far more effective way to free the world of the scourge of religious fundamentalism than simply railing against fundamentalism directly. Direct attacks on fundamentalism are analogous to temperance lectures to people whose lives are so miserable that drinking is the only thing that brings them any kind of joy.

“[A] universe with a creative superintendent,” asserts Dawkins, “would be a very different kind of universe from one without one” (p. 78). But what is the difference for people, such as NIH director Francis Collins and me, who believe that the description of the universe that is provided by science is precisely a description of the nature of God’s material creation? Dawkins is right in that there’s a difference between those two universes. He’s wrong, though, in believing that difference to be material.

Suppose that one morning you found on your doorstep an apple. Suppose you love apples. Suppose as well that though you could not preclude the possibility that this apple had simply fallen from the overly full grocery bag of some passerby, for some reason that you cannot explain, you were infused with the conviction, as soon as you laid eyes on the apple, that someone had placed it there for you. What a lovely thought! The whole experience changes your morning, even your day, in a positive way.

In a material sense, of course, it makes no difference whether the apple came there by chance, or by design. It is the same apple, after all, whatever the explanation for its presence. It is not at all the same experience, however, to believe that one has found an apple by chance and to believe one has found it by design.

Now suppose a well-meaning friend points out the superfluity of your assumption that the apple had been placed there by someone. Suppose this person points out that nothing in the mere presence of the apple compelled such an assumption and that you should thus content yourself with a “natural explanation” of how it came to be there. Ought you to abandon your belief in your invisible benefactor? What would you gain by abandoning it? If your friend had been ridiculing you for your “foolishness,” then presumably that would cease. You would regain his respect. But at what cost? It’s none of his business what you choose to believe in such an instance. That he would make fun of you for believing something the truth of which he cannot disprove, but which makes you happy, paints a very unflattering picture of him. So you would regain the respect of someone whose respect many would rightly disdain, even while you would lose something that had made you happy. And why is the explanation you have supplied for the presence of the apple less “natural” than his? You didn’t assume the apple had spontaneously sprung into existence. The real difference between your view of how the apple came to be there and his is that yours is nicer, that it makes you feel better.

Or to take a more apposite example in my case: Say that for as long as you can remember, you’ve wanted one of those fancy, expensive home cappuccino makers. You know the ones I’m talking about. Not the little cheapie things that can be had for under a hundred dollars, but the really expensive ones that resemble the real thing that they use in fancy cafes and coffee houses. Say that you have always wanted one of these fancy cappuccino makers, but because you had chosen the life of an academic and the modest salary that went along with it, you felt a fancy cappuccino maker was an extravagance you simply couldn’t allow yourself. Lawyers can afford such things, you reasoned, but then they also need them because they are generally very unhappy in their work. If you had gone to law school, you could have had a fancy cappuccino maker. You knew this, of course, but chose to go to graduate school in philosophy instead because you believed a career in philosophy would be more fulfilling than a career in law. You made your choice and so must content yourself with a fulfilling career and a more modest coffee-making setup.

This seems to you a reasonable trade-off, so you do not waste large portions of your life lusting after a fancy home cappuccino maker. Still, you do think wistfully of such machines sometimes, particularly when you see them in the homes of your lawyer friends, or in one of those fancy kitchen stores that always have so many of them. You have accustomed yourself, over time, to this occasional quiet longing.

But then one Saturday, when you are on your way back to your apartment after having done your morning shopping, you spy a large bag on the sidewalk in front of one of the houses on your block. People often put things out on the sidewalk that they no longer want, so you stop to see if there is anything there you might be able to use. As you approach the bag, your heart begins to beat more quickly. Peeping out of the edge of the bag is what looks for all the world like the top of one of those fancy, expensive cappuccino makers that you have always wanted. You peer disbelievingly into the bag and discover that it contains not only such a machine but all of the accoutrements that generally go with them: a little stainless-steel milk-frothing jug, metal inserts in both the single and double espresso size (as well as one to hold those Illy pods that you would never buy because they are too expensive), and a coffee scoop with a flat end for tamping down the coffee. As you are peering into the bag, your neighbor emerges from the front door of her house with more bags of stuff to put out on the sidewalk.

“Are you giving this away?” you ask tentatively.

“Yes,” she replies.

“Does it work?” you ask.

“Yes,” she replies.

“Why are you giving it away?” you ask incredulously, convinced that any minute she will change her mind.

“Well,” she says nonchalantly, “I’ve had it for four years and never used it. I figure that if you have something for four years and never use it, you should get rid of it.”

You nod and laugh, affecting a nonchalance to match your neighbor’s. As soon as she has disappeared into the house, though, you snatch up the bag that contains the machine and all the accoutrements and stagger under its weight the short distance to your door. You download the manual for the machine (a Cuisinart EM-100, which you discover retails for $325), set it up and give it a trial run. It works like a dream!

Your innermost wish for a fancy, expensive cappuccino maker has been fulfilled! One was deposited practically on your doorstep. Of course it came there in a perfectly natural, explicable way, but still, your heart overflows with gratitude toward God, who you believe has arranged the universe, in his wisdom and benevolence, in such a way that this fancy, expensive cappuccino maker should come into your possession now. God has favored you with the rare and coveted have-your-cake-and-eat-it-too status in that you have been allowed to pursue your life’s calling of being a philosophy professor and have a fancy, expensive cappuccino maker!

You do not need to attribute this turn of events to any supernatural agency in order to see “the hand of God” in it. It does not trouble you to think that your neighbor had very likely been considering putting that machine out on the street for quite some time, that the whole event came about very naturally. But still, it is deeply significant to you and fills you with a sense of awe and wonder. Why should that bother Richard Dawkins?

It is fair, of course, to point out that you might just as well be annoyed that God had not arranged for you to receive this fancy, expensive cappuccino maker earlier. But you do not think that way. Why, you do not know. You attribute this wonderfully positive psychological dynamic to God’s Grace, but of course you could be wrong; perhaps it’s genetic. Earlier it seemed to you that the sacrifice of a fancy, expensive cappuccino maker in order to pursue your life’s calling was really not so very much to ask, and you accepted it stoically. Now, you are overcome with gratitude toward God for so arranging things that your wish for such a machine has been fulfilled. Earlier you were happy; now you are happier still. What’s wrong with that? That seems to me to be a very enviable situation.

Experience may incline us to expect certain emotional reactions to various kinds of events, but reason does not require such reactions. Many religious people are effectively deists in that they accept what scientists call the “laws of nature” and do not believe that God arbitrarily suspends those laws in answer to particularly passionate entreaties. Such people accept that God must thus be responsible in some way for the things they don’t like just as much as for the things they like, but consider that perhaps there is some reason for those things that human reason simply cannot fathom, and look to God for emotional support when the bad things in life seem to overwhelm the good and thank God when the reverse seems to be the case.

To be able to find strength in God when times are bad and to thank him (her or it) when times are good is an enviable gift. Who wouldn’t want to be like that? Of course it is possible to rail against God for not ensuring that times are always good, but it isn’t necessary. The failure to condemn or to become angry is not a failure of logic. Objectively, everything simply is; nothing necessitates a particular emotional reaction. The dynamic of faith is just as rational as the dynamic of skepticism. In fact, it could be construed as even more rational. That is, happiness is something that, it is generally acknowledged, human beings almost universally pursue, and the dynamic just described is clearly a particularly good way of achieving it in that it effectively ensures a generally positive emotional state. Maybe believers are wrong, but even Dawkins acknowledges that no one will ever be able to prove that. Even if they are wrong, however, it seems there is little harm, if any, in their beliefs, and a great deal of good.

Why does religion so offend atheists such as Dawkins? No one is forcing them to sign up. Dawkins is not alone in his outrage. It’s pervasive among atheists. The invectives they hurl at believers always put me in mind of those hurled by a child at the participants in an invisible tea party to which he has not been invited.

“There isn’t really any TEA there, you know!” he yells.

But is the outrage over the fictitious nature of the tea, that anyone should pretend to drink something that isn’t really there, or is it at not having been invited to the party? Perhaps the problem for such atheists is the feeling of being left out. Perhaps they are angry that other people get to enjoy something from which they have been excluded, something they have been led to believe is “not an option” for them.

(For a really excellent piece on The God Delusion see Terry Eagleton’s “Lunging, Flailing, Mispunching” in the London Review of Books.)

Education and Democracy

I’m reading Richard Hofstadter’s Anti-intellectualism in American Life in preparation for doing a review of Carlin Romano’s new book America the Philosophical. Romano mentions Hofstadter in his introduction, but only in his introduction. He never returns to him. I suspected that was going to turn out to be a weakness in Romano’s book, so I decided I should read Hofstadter before reviewing Romano. That was no great chore. Hofstadter is one of my favorite authors. His book Social Darwinism in American Thought is a real eye-opener. That book, together with Max Weber’s The Protestant Ethic and the Spirit of Capitalism, is a kind of Rosetta Stone of American culture.

The penultimate chapter of Hofstadter’s book looks at the educational theory of John Dewey. “The new education” that grew out of Dewey’s thought, Hofstadter observes, “would have social responsibilities more demanding and more freighted with social significance than the education of the past. Its goal would be nothing less than the fullest realization of the principles of democracy. In setting this aspiration, Dewey stood firmly within the American tradition, for the great educational reformers who had established the common-school system had also been concerned with its potential value to democracy” (Hofstadter, p. 378). That is, in Dewey’s theory, “the ends of democratic education are to be served by the socialization of the child, who is to be made into a co-operative rather than a competitive being and ‘saturated’ with the spirit of service” (Hofstadter, p. 379).

Leaving aside the issue of the mounting evidence that people are inherently more inclined to cooperation than to competition, it seems to me that something essential is omitted here. The traditional conception of the significance of education to democracy is that it is important that citizens in a democracy be well informed, that they should be able to read as a means to being well informed, as well as that they should be able to think critically and analytically so as to be better able to sort their way through the information with which they are presented and to properly understand its significance.

I believe, however, that the significance of education to democracy is much greater than that. It is not simply that citizens in a democracy must be rational and well informed, they must also be happy. Unhappy people are too prone to using their vote punitively, that is, in ways that actually decrease rather than increase the happiness of their fellow citizens. But policies that improve the quality of life of the average citizen are the engine of democracy. Without them democracy ultimately breaks down. That is, Dewey’s ideal of socialization as encouraging cooperation can’t be sustained unless the individuals being socialized are relatively happy both throughout the period of socialization and beyond (if the process can be meaningfully said to stop at any point).

What few people understand, I fear, is the importance of education to human happiness. Human beings, as Aristotle famously observed, are rational animals. They have very highly developed and complex brains, brains that have needs of their own for stimulation and challenge. Helen Keller writes movingly, for example, of how perpetually angry, and even violent, she was before she learned language (The Story of My Life). That was partly, of course, because of her difficulty communicating, but it was also, as she clearly details, because of her difficulty in fixing thoughts in her mind. Language, like mathematics and logic, is a cultural achievement. People do not learn it in isolation from other people, and they do not gain an optimal command of it if they do not read. The brain is driven to make sense of its environment. It finds fulfillment in that. People would do science (as indeed they did for millennia) even if it had no obvious utility, just as they have always played cognitively challenging and stimulating games such as chess and crossword puzzles.

The need of human beings to develop their minds is, I believe, so acute that its fulfillment is an ineradicable element of human happiness. That, I would argue, is the real value of education to democracy. We need to educate people in a democracy not merely so that they will better understand what sorts of policies would be best for society as a whole, but so that they will also desire what is best for society as a whole rather than the spread of their private misery onto the larger community.

The War on Fairness

It’s rare when a person does something that is at once so idiotic and so heinous that it brings discredit upon his entire profession. I fear philosopher Stephen T. Asma has done this, however, with his new book from the University of Chicago Press. I’ve bragged for years to friends and relatives that the philosophy curriculum at the graduate level is so rigorous that it weeds out the kinds of morons who all too often are able to make it through other Ph.D. programs. Not everyone with a Ph.D. in philosophy is a transcendent genius, I’ve conceded, but there’s a basement level of analytical acuity below which philosophers simply do not go.

I stand corrected. Stephen T. Asma’s article, “In Defense of Favoritism,” excerpted from his book Against Fairness (I’m not making this up, I swear) is the worst piece of incoherent and morally reprehensible tripe I think I’ve ever read in my life. I endeavor, as a rule, not to read crap, but I was intrigued when I saw the title of Asma’s article in the headlines I receive every day from The Chronicle of Higher Education. Clever hook, I thought! It seemed obvious to me that few people would undertake a genuine defense of favoritism and that the Chronicle would certainly never publish such a thing, so I was curious to find out what the article was actually about.

Well, it’s just what it says it is–it’s a defense, or an attempt at a defense anyway, of favoritism. I say “an attempt” at a defense because favoritism is considered by most people to be indefensible, and with good reason.  “Favoritism,” as distinguished from the universally human phenomenon of having favorites, is defined by the Oxford English Dictionary as “[a] disposition to show, or the practice of showing, favour or partiality to an individual or class, to the neglect of others having equal or superior claims; undue preference.” It’s the qualification of the preference as “undue” that’s important here.

There’s nothing wrong with wanting your niece or nephew, for example, to get that new tenure-track position in your department, but there’s a whole lot wrong with giving it to them, or giving them preferential treatment in discussions of who should get it, simply because they are your niece or nephew. Ditto for your favorite grad student. To want someone you care about to succeed because you care about them is perfectly natural. To ENSURE that they succeed over other, and possibly better qualified, people simply because you care about them is wrong. That’s what favoritism is though.

I thought at first that Asma might simply be confused about the meaning of “favoritism,” that what he was actually trying to do was to defend the view that there’s nothing wrong with having favorites, that what philosophers refer to as “preferential affection” is simply part of human nature and not something anyone should ever feel guilty about. The further I got into the article, however, the clearer it became that Asma was indeed trying to defend undue preference.

The piece, as Kierkegaard would say, is something both to laugh at and to weep over in that it’s such an inept piece of argumentation that it’s hilarious while at the same time being profoundly morally offensive. That Asma’s opening is, as one reader observes in the comments following the article, “irrelevant to his point” is the least of his crimes against sound reasoning.

“Fairness,” asserts Asma, “is not the be-all and end-all standard for justice,” thus positioning himself as a sort of imbecilic David over and against the Goliath of John Rawls, whose theory of justice as fairness is much admired by philosophers. There’s nothing wrong, of course, with taking aim at intellectual giants. It helps, however, when one does this, to have a good argument.

But Asma does not have a good argument. It’s impossible to give a developmental account of Asma’s argument because it has little that resembles a structure. Instead of starting with premises that he carefully arranges to lead the reader from assumptions he already holds to a conclusion the inevitability of which he is finally compelled, if not actually to accept, then at least to concede as probable, Asma presents a mishmash of irrelevant, incoherent, and equivocal non sequiturs that litter the page like toys strewn about a room by a child rooting impatiently through his toybox for the one cherished toy he cannot find. And what is Asma’s cherished toy? Why it’s favoritism! Asma is determined to prove that favoritism is, in his own words, “not a bad thing.”

The upshot of Asma’s rambling argument is that the tendency toward favoritism is part of human nature. This is regrettably true. It makes us feel good when we promote the interests of those we love. Just because something makes us feel good, though, doesn’t mean that it’s ethical. The conflation of these two things is known in philosophy as “the naturalistic fallacy.” Asma ought to know this because he is a philosopher. How he can make such a fundamental mistake is mystifying.

The article begins with Asma recounting a scene with his son, who is complaining because Asma will not allow him to play a game that involves the killing of zombies, a sort of game that he, Asma, feels his son is too young for. “That’s sooo not fair!” his son protests. Instead, however, of using this occasion as the inspiration to write a book for children that will help them to better understand the meaning of the word “fair,” Asma takes his toddler’s grasp of the term, equates it erroneously with “egalitarianism,” and decides to write a philosophical treatise (for adults) discrediting both.

Asma then turns to an examination of what he asserts is the virtue of generosity. What he actually describes, however, is not what most philosophers would identify as a virtue (which, according to Aristotle, for one, requires cultivation), but a natural inclination, found in varying degrees in various individuals, to share what one has with one’s friends–and only, he is careful to explain, with one’s friends. But the fact that most people enjoy sharing what they have with their friends does not make this inclination into a virtue. To equate a natural inclination, in this way, with a virtue is, once again, an expression of the naturalistic fallacy.

The child in Asma’s example gives all her candy to a few friends over the protestations of classmates to whom she has a less passionate emotional attachment. “But the quality of her generosity,” asserts Asma, “is not compromised by the fact that she gave it all to her five friends.” This flagrantly begs the question, however, because there is a sizable contingent of humanity that would contest such a definition of “generosity.” Sure, if you define sharing with only your friends as “virtuous,” then you won’t have a hard time defending favoritism because sharing with only your friends is the same thing as favoritism and far from seeing it as a virtue, most of humanity would see it as downright nasty.

And that isn’t the only problem with conflating inclinations and virtues. How about sharing with your friend when you have good reason to believe that that friend is going to use what you’ve shared with him to further some nefarious purpose he may have? Is that virtuous? Plato talks about that problem in the Republic. Is it possible that Asma, a philosopher, hasn’t read the Republic?

My heart sort of goes out to Asma at that point, though, because he seems to be contrasting the child who shares with only her friends with a child who refuses to share any of his candy with anyone–ever. But that’s not just greedy, it’s pathological and anyone who fails to recognize this must have had a very wretched childhood indeed. To Asma’s credit, he acknowledges that his argument is “counterintuitive.” Readers will find themselves wishing, however, that Asma hadn’t been so dismissive of his intuitions.

Asma erroneously asserts that the activities of those in the civil rights and feminist movements, for example, are expressions of favoritism and tribalism. That’s a fair charge to level, I suppose, against black supremacists, and perhaps against radical feminist separatists, but the two examples Asma cites, Rosa Parks and Susan B. Anthony, hardly fall into those categories. It’s not favoritism to demand rights for one’s group that are equal to the rest of society. Only fighting for more rights, or for preferential treatment, could be characterized that way.

Perhaps it’s the term “equal” that throws Asma off. He seems to have a particular aversion to it. He refers, for example, to what he claims is “American hostility to elitism,” but the example he gives is not one of anti-elitism, which would be hard to find in our culture, but one of anti-intellectualism. That is, he points out that “politicians work hard to downplay their own intelligence and intellectual accomplishments so they might seem less threatening (less eggheadish) to the public.”

We’re not hostile to elitism in the U.S. though. We’re the most thoroughly elitist society in the economically developed world. Everything from our systems of taxation, education, and health, to our system of criminal justice is set up to favor the wealthy elites.

Asma cites several studies that show that what is called “ingroup bias” appears to be inherent in human nature and uses this fact to support his position that favoritism is therefore “not a bad thing.” That something is inherent in human nature does not, however, entail that it is morally acceptable. There are all kinds of unfortunate tendencies in human nature that parents, societies, and finally civilization itself endeavor to control, tame, and even in some cases eradicate.

Asma’s whole defense of favoritism is not simply an expression of “the naturalistic fallacy,” referred to above. To the extent that he tries to defend favoritism by arguing that it’s innate, he’s also guilty of conflating an “ought” with an “is.” Hume referred to this mistake as the “is-ought” problem. That is, it is a misguided attempt to draw inferences about the nature of moral obligation (i.e., how people ought to behave) from observations about how people tend to behave (i.e., how they do behave) when the two things are qualitatively different and need to be kept rigorously distinguished.

Asma returns, at the end of the article, to the example of children. He appears to have hopped on the bandwagon of pseudo-intellectuals who have begun to express concern that we are being too nice to our children. It seems Asma’s son came home one day with a ribbon he’d “won” in a footrace, but Asma’s pride dissipated when his son explained that all the children had “won” the race, that they’d all been given ribbons. “I don’t want my son, and every other kid in his class,” protests Asma, “to be told they’d ‘won’ the footrace at school just because we think their self-esteem can’t handle the truth. Equal rewards for unequal accomplishments foster the dogma of fairness, but they don’t improve my son or the other students.”

Leaving aside the issue that Asma has once again evinced that he has appropriated a toddler’s simplistic and hence erroneous definition of “fairness,” there’s something comically fantastical about Asma’s apparent fear that today’s youth are in danger of living out their lives in blissful ignorance of their own weaknesses and inadequacies. The likelihood, for example, that admissions to elite universities are suddenly going to become merit blind, or that we will cease keeping statistics on the accomplishments of professional athletes seems vanishingly small, and the only professions that seem openly to embrace the conspicuously inept are those in the financial industry.

Sadly, children will learn all too soon that there are winners and losers and that the former are rewarded while the latter are not. Not only does it do no harm to stave off that realization as long as possible, it may actually do a great deal of good if it helps us to teach children that their worth as individuals is not dependent on their bettering their peers in contests. Not everyone can be a winner. Most people have to content themselves with being also-rans. If we can teach children early that the also-rans are to be lauded as an essential part of the race (after all, there is no race without them), then we might actually help to increase the number of people who are able to live happy and fulfilling lives.

Asma’s fears are not restricted, however, to the specter of a utopian future for his progeny. Even while wealth is increasingly transferred to a dwindling minority of the American population, Asma is tortured by feverish nightmares of creeping socialism. “Liberals,” he asserts, “say ‘fairness’ when they mean ‘all things should be equal’”–as if we, in the U.S., stood in imminent danger of sweeping political reforms that would make the social-welfare states of Northern Europe look like Czarist Russia by comparison.

What’s disturbing is not so much Asma’s argument as the fact that it found a reputable (or at least once reputable) academic publisher and that it was actually excerpted in The Chronicle of Higher Education. Noam Chomsky said somewhere that despite all the atrocities he had spent a large part of his life chronicling, he believed humanity was making moral progress. You don’t see moral defenses of slavery anymore, he pointed out, whereas you did see such things in earlier periods of human history. Yes, maybe that’s true. But if we’ve regressed to the point that it’s now socially acceptable to publish moral defenses of favoritism, and attacks on fairness, can defenses of slavery be far behind?

This piece originally appeared in CounterPunch on 11/19/2012.

Hedonic Adaptation

My reflections here were prompted by an article in today’s New York Times entitled “New Love: A Short Shelf Life.” The article, by Sonja Lyubomirsky, a professor of psychology at the University of California, Riverside, is about how the euphoria associated with the first phase of romantic relationships tends to wear off relatively quickly. I’d initially planned to write a piece on relationships, but the more I thought about it, the more it seemed to me that the problem Lyubomirsky describes isn’t restricted to relationships. Lyubomirsky charges that romantic relationships are subject to the same dynamic of what psychologists call “hedonic adaptation” as are other thrilling experiences. That is, euphoria, she observes, tends to be short-lived, whether it is associated with “a new job, a new home, a new coat,” or a new love.

The first thing that annoyed me about the article was its purely speculative character, or more correctly, the fact that it was mere speculation paraded in front of the reader as scientific fact. “[A]lthough we may not realize it,” asserts Lyubomirsky, “we are biologically hard-wired to crave variety.” Says who? Where is the scientific evidence to support such a claim? We like variety in some things, to be sure, but we like uniformity in others. We appear, in fact, to crave uniformity at least as much as we crave variety. We need, for example, to be able to assume that the future will resemble the past in crucial respects if we are going to be able to function at all, and we are notorious for being unable to appreciate variety when the variety in question would tend to discredit the worldview to which we have become comfortably wedded.

My point is not that Lyubomirsky is mistaken, or that she has no right to indulge in such speculations. My point is that they are speculations and should not be presented as if they were facts. One reader, Joseph Badler, made the point beautifully. “The ‘we are biologically hardwired,’” he wrote, “is just too cheap. It can be used to justify anything. Evo psych post-hoc explanations are making everybody intellectually lazy.”

“Evo psych” refers to evolutionary psychology, which, if you ask me, is a completely bogus discipline that purports to provide evolutionary explanations for traits of human psychology. Why do people appear to crave variety in their sexual partners? Well, the evolutionary psychologist responds (and here I am paraphrasing Lyubomirsky), because it guarantees a more robust gene pool. That makes sense, of course, but so does the observation that infidelity can be corrosive of social bonds and that promiscuity could thus threaten both the immediate family and the long-term survival of the entire community.

So which is it? Are people hard-wired to crave variety to ensure a more robust gene pool, or are they hard-wired to crave uniformity to be better able to survive to the age of reproduction? Or could they be hard-wired, as seems the most likely, to crave both things relative to particular environments and situations? But if this is the case, then evolutionary “explanations” for psychological traits are obviously speculative because of the seemingly limitless variables one would have to take into account when calculating in what sense natural selection might lie behind a particular psychological tendency.

The situation of the evolutionary psychologist becomes almost unmanageably complex even if we assume that all human beings exhibit the same psychological tendencies. Once we acknowledge that all human beings do not exhibit the same psychological tendencies, then evolutionary psychology becomes, I would argue, a mere parody of an academic discipline. That is, I’d go further even than Badler. I don’t think it’s simply making people intellectually lazy. I think it’s making them stupid. That it continues to be respected as an academic discipline suggests that the academy is egalitarian to a fault, in that even the criterion that one ought to be able to think clearly in order to be admitted to it appears to have been judged unfairly discriminatory.

A number of readers took exception to the comparison of a new love with new material possessions such as a “home” or “coat.” (I almost always enjoy reading the comments readers post to articles such as this one. They confirm my faith that the average person is neither so simple-minded nor so superficial as the authors of the articles appear to assume.) “Didn’t know love was material,” observes Anna from Ontario wryly.

It’s true that our relationships with people are importantly different from our relationships with things. They may not be so different, though, as some of the opponents of Lyubomirsky’s apparent materialism assume. Another reader points out that Lyubomirsky and those who agree with her “do not consider how the disposable and planned obsolescent qualities of consumer capitalism also ‘program’ us to always desire the new.”

I wouldn’t put all the fault, though, on consumer capitalism. Consumer capitalism, after all, is an expression of something in human nature. Unfortunately, it is the expression, I would argue, of one of the less appealing tendencies in human nature–impatience.

The thrill of the new is something with respect to which we are largely, if not entirely, passive. It wears off though. To continue to be thrilled by the same thing requires diligent effort. The problem with consumer capitalism is that what it parades for our approval is primarily the cheap and tawdry, things that glitter but which are not gold. Such things thrill us before we are fully aware of what they are. Once we learn what they are, they cease to thrill because there is nothing inherently thrilling about them.

Of course even things that are inherently valuable and which thus ought to be inherently thrilling are subject to the same dialectic. We are thrilled with the initial acquisition of them, but that thrill eventually wears off, or at least quiets down. It doesn’t take a great deal of intellectual effort, however, to appreciate that the thrill that dies down in this way is the thrill of acquisition rather than of possession. We are thrilled to have acquired a thing, but then we get used to having it. If it is truly something worth having, though, and we are capable of appreciating it as such, then the initial euphoria of acquisition should be replaced by the more enduring thrill of possession, or more correctly, of appreciation. The problem is, such appreciation requires effort. It requires that we look at the thing again, look at it long and carefully, that we actively search for what is good and valuable in it, rather than simply surrender ourselves to a passive thrill.

Years ago, when I first became engaged, my sister caught me admiring my engagement ring. “You’ll stop doing that after a while,” she said. I found that remark disturbing. I didn’t want to cease to see my ring as beautiful any more than I wanted to cease to love the man who had given it to me. It will happen to you though, her words suggested, independently of what you want. It will happen to you. Kierkegaard talks about that dynamic in the first volume of his two-volume work Either/Or. Everything disappoints, he, or at least one of his pseudonyms, says there.

But does everything have to disappoint? I have never ceased to see my engagement ring as beautiful, just as I have never ceased to love the man who gave it to me. I’m a very materialistic person, in a way. I have lots of things, lots of nice things in which I take enormous pleasure that does not diminish with time. I collect paintings and fountain pens and antiques of various sorts, and each one of these possessions adds immeasurably to the quality of my life.

I love to sit at my table in the morning and sip my coffee (I love coffee!) and look at the two paintings I’ve hung on the wall on the far side of the table. One is a landscape I found in an antique store and the other is a still life I did myself. They are not great masterpieces, but they are very nice and I derive enormous pleasure from looking at them. I look at them in the morning when I am having my coffee and in the evening when I am having dinner. I often work at that table in the afternoon and I’ll glance admiringly up at them periodically even then.

I don’t know what it is exactly that I like so much about them. Each is rough, and yet each is the product of some person’s vision. I like people. They are endlessly fascinating to me. I love handiwork because you can see the humanity in it. I like things because I like creation. I value it as something beautiful and moving. One reader of Lyubomirsky’s article observed that the reason her marriage had been happy until her husband’s death was that they had “had God.” Another reader pointed out, however, that that approach to keeping love alive won’t work for everyone because not everyone is religious. He (she?) went on to point out, however, that “looking beyond oneself and working toward the greater good (of one’s spouse, family, community, world) may be an essential element in the pursuit of lifelong happiness.” I’d agree with that. I’d argue, however, that unless you think that creation, or the universe, or whatever, is good, then even the “greater good” of one’s spouse, family, community, and even the world, will ultimately fall flat.

The challenge, I’d argue, to achieving an enduring happiness is that we’ve programmed ourselves, in a sense, to believe that happiness is inherently fleeting, that the thrill of acquisition is the only thrill there is. Whether that is the fault of consumer capitalism alone or whether it is an expression of something inherent in human nature, I’ll leave it to the reader to decide.

On Parenting

OK, I do not have children, and there are those who would charge that this disqualifies me from saying anything meaningful about parenting. I would respond to such a charge, however, with the observation that not being a parent myself means I occupy a disinterested perspective relative to the issue of parenting and that what I lack in practical experience I perhaps make up for in objectivity. I just finished reading Lori Gottlieb’s article “How to Land Your Kid in Therapy” in The Atlantic, and it prompted a number of reflections I would like to record here in the hope that they may give some peace of mind to the apparently increasing numbers of parents who fear they are doing irreparable damage to their children by, of all things, being too attentive.

I, like Gottlieb, am a fan of Philip Larkin’s “This Be The Verse,” which I will quote at greater length than she does because, well, I am a fan of it.

They fuck you up, your mum and dad.
They may not mean to, but they do.
They fill you with the faults they had
And add some extra, just for you.

….

Man hands on misery to man.
It deepens like a coastal shelf.
Get out as early as you can,
And don’t have any kids yourself.

I don’t actually think that people ought not to have children, but I do believe that man hands on misery to man, and this recent spate of blaming parents for being too attentive to their children seems to me to be a case in point. The problem with parenting, throughout most of human history, has been inattentiveness. That’s no surprise. Life is hard. Parenting is hard. I don’t have children, at least in part, because I find being sufficiently attentive to my cats taxing. I’m not insensitive, at least not if I am to judge from what family and friends and close acquaintances say about me. On the contrary, I am considered to be relatively sensitive. I acquired a stray cat many years ago and was somewhat put out by its habit of walking across the papers I was trying to grade. It would jump up on my desk and walk back and forth in front of me as I was trying to work. As frustrated as I was, it was clear to me that the poor thing needed attention. It was a living being crying out for affection, and that cry was obviously more immediately important than my need to grade another paper just then. So I would stop and pet it and play with it until its need for affection was satisfied and I could get back to work.

Needless to say, this dragged out the process of grading papers. The good part of it was that I learned then and there that I should not have children. A cat, after all, is much more self-sufficient than a human child, which is notorious for having the longest period of dependency of any offspring in the animal kingdom. If I found it difficult to attend to the needs of a cat, how much more difficult, I realized, would I find it to attend to the needs of a child.

That’s the thing. Children need an enormous amount of attention and, thankfully, there are people who seem able to give it to them without resentment. I’m not entirely without qualification to speak on the issue of the state of today’s youth. I teach at a university, so while I don’t have children myself, I have lots of experience with young people. My impression of them is, in fact, very positive. Gottlieb is a psychotherapist, and she’s concerned because she sees increasing numbers of young people who’ve had happy childhoods but who are “just not happy” as adults. But should that be a surprise? It’s not easy to be an adult, particularly a young adult. Life is hard, and young people, no matter how happy or unhappy their childhoods, have relatively little experience navigating the stormy waters of maturity. All of a sudden they are expected to make important decisions on their own, to choose a career, a job at which they will spend the majority of their waking hours for the rest of their lives, answering to someone who, unlike their parents, is not tied to them by bonds of deep affection.

Just writing that sends cold shivers down my spine. Life is hard. It’s full of frustrations and disappointments. No amount of good parenting can change that fact. No amount of good parenting can guarantee that a child will grow up to be a perfectly happy and well-adjusted adult. There is no such thing, and to suggest that there is, and that parents have failed if they do not fashion it from the raw clay of their children, is to add insult to the injury of having, finally, to release those children into the cold, cruel world.

Of course people who’ve had happy childhoods are less happy as young adults. Duh? Do baby birds look happy when their parents push them out of the nest? Have the people who are now blaming parents for having been too attentive to their children ever watched nature shows? College is hard work, and it gets harder every day in that it gets more competitive. And, even more fun, real work is harder than college. Your boss probably won’t give you an extension on an important assignment, or allow you to redo it to improve your “grade.” Kids know this. They know that however hard college is, it is still a picnic compared to what comes after it, and that is what they are looking at as young adults. Happy? Why should they be happy? Gottlieb got one thing right. “The American dream and the pursuit of happiness,” she observes, “have morphed from a quest for general contentment to the idea that you must be happy at all times and in every way.” She doesn’t seem to see the implications of that observation, though. There is nothing necessarily wrong with legions of people who’ve had happy childhoods being less happy as young adults. Being an adult is harder than being a child; most people struggle at it, even the ones, such as myself, who are really, really fortunate to find careers that are personally fulfilling, to say nothing of the multitudes who do not.

Rates of anxiety and depression, Gottlieb reports, have “risen in tandem with self esteem.” I’m willing to accept that rates of self-esteem among young people have risen because I have many friends with beautiful and apparently well-adjusted children, children who seem more even-tempered, sympathetic, and tolerant than I was as a child, or indeed than were any of my childhood friends. I’m optimistic, actually, about the future of humanity because of all the wonderful young people I see, including not just children but also my students.

OK, so much for rates of self-esteem. But have rates of anxiety and depression actually gone up? How does one measure such a thing? Presumably the measurements are made on the basis of the numbers of people seeking treatment for these conditions. But aren’t people with healthy self-esteem more likely to seek treatment than people with low self-esteem? There are many people my age or older who simply will not seek psychotherapeutic treatment for any reason because they see it as shameful. People with higher self-esteem are less concerned about things like that, and hence are more likely to seek treatment, thus skewing the numbers. That more people are seeking treatment for anxiety and depression does not necessarily mean, therefore, that more people are suffering from them. (It is interesting to note in this connection that neither Gottlieb nor anyone else she cites appears to acknowledge how these numbers may also be skewed by the increasingly aggressive marketing of antidepressants and anti-anxiety drugs by the pharmaceutical industry, which appears designed to encourage pretty much everyone to seek treatment for anxiety and depression.)

I don’t mean to suggest that children can’t be spoiled. They can, but there’s a difference between giving a child love and giving in to his or her every whim or desire. You can’t give a child too much love. So I say go ahead and pamper your children. Shelter them, protect them from as many of life’s hard knocks as you can for as long as you can. Reassure them that they are brilliant and beautiful. Comfort them when they fall, console them when they fail, etc., because there is no way in hell that you can be there for them all the time, even when they are children. Gottlieb observes, naively, that “[k]ids who always have problems solved for them believe that they don’t know how to solve problems.” But no parent can solve all a child’s problems, and the example she gives shows this. “I know of one kid,” she observes, “who said that he didn’t like another kid in the carpool, so instead of having their child learn to tolerate the other kid, they offered to drive him to school themselves.” By the time such kids are teenagers, she observes, “they have no experience with hardship.” Yeah, right. So the other kids are not going to make fun of the one kid who can’t be part of the carpool but whose parents have to drive him to school themselves. There is no way, no way any parent can keep a child from experiencing hardships. Kids are going to experience hardships, and they are going to learn, finally, to take care of themselves no matter how much parents may want to take care of them forever. One would think that if anyone understood this, it would be psychotherapists.

Perhaps what people in the psychotherapeutic professions should concentrate on is the hostility of the environment into which we are sending today’s youth. It’s never been easy to be an adult, but we’ve made it unnecessarily harder by creating a nasty, punitive culture based on a negative view of human nature that we now know, from biological and neurological research, is demonstrably false. That is, people are not motivated by nothing but self-interest; they are naturally sympathetic and empathetic. Perhaps the transition to adulthood would be less traumatic if our society were not based on the view that life is “a war of all against all.” That is, perhaps our focus should not be on how this generation of parents, like every generation before it, is once again failing its children, but on how we are failing as a culture to create an environment that will maximize the potential for human happiness on an individual and a collective level.

Two Archetypes

I’m a transvestite, I think. I like to wear pants. When I was in grad school I even used to wear ties. No one else did, not even the professors, let alone the male grad students. Just me. I liked them. They seemed like me. I like spare, streamlined clothing. I like utilitarian things: pants and shirts and serviceable shoes. I’m not entirely lacking style. I’ve been told, actually, that I have good taste. It’s rather masculine taste, though. I don’t like ruffles, don’t like frills, don’t like high heels, don’t paint my nails. I don’t “do” my hair. I can’t be bothered. I just wash it, you know, and let it dry naturally.

My, shall we say, “masculine” aesthetic is not something I’d given much thought to until the last couple of years. Several times in my adult life, I’ve caught a glimpse of myself reflected in a store window and been shocked by the image that confronted me. Why, that’s me, I’ve thought to myself, that small woman is me. This will sound strange, but I was surprised to see that I was a small woman. I realized then that I actually had some kind of mental image of myself as a medium-sized man. I’m not seriously mentally ill or anything. I know I’m a woman. I’m not shocked when I look at myself in the mirror in the morning. I do sometimes wear dresses and I always wear makeup, though not very much because, like my hair, I can’t be bothered to spend too much time on it. Still, I rise in the morning and perform the ablutions appropriate to a person of my sex. But then I forget. I get caught up in the things that must be done in the day, and in thought. I forget what I look like. It’s then, I think, that I must unconsciously slip into the masculine image that I have of myself. It fits my job, I guess. There aren’t too many women in philosophy, and there are few female academics in any discipline who have what my dean once described as my “pit bull” quality.

It’s how I was brought up, I think. My father is actually a terrible sexist. The thing is, he didn’t have any sons. If he’d had even one son, he’d have raised his daughters differently. He didn’t have any sons, though, so he raised us, at least to some extent, the way he’d have raised sons if he’d had them, and me even more than my two sisters, because I am more similar in temperament to my father than they are. Yes, I was sort of the de facto son. My husband is always remarking that I am “the man” and he is “the woman” in our relationship, not in the sense of the Nicolsons, but in the sense of character traits that are usually thought of as gender specific, things such as my not liking to ask for directions or being generally uncommunicative, as opposed to his insisting on asking for directions and talking often about his feelings. He has more friends than I do too. It’s not that I don’t have friends. I’m fortunate to have many good friends. I don’t feel any compulsion to see them all the time, though, unlike my husband, who begins to muse audibly about whether he might have offended a particular friend if a week goes by without his hearing from him or her. That’s another thing: he has more female friends, good friends, than I do. I have some, of course, but according to one, not enough. “You need more women friends,” she said. I hadn’t thought about it until then, but most of my friends are actually men.

As I explained, however, I’m no Vita Sackville-West. I’ve never been sexually attracted to women. I’ve always liked men. Still, I realized recently that from the time I was very young, if I were attracted to a boy, and then later a man, I would fantasize about impressing him with how strong and tough I was. Many of my romantic fantasies involved rescue, which, I suppose, is not that unusual for a woman, except that I was always the one doing the rescuing. Yes, I was always rescuing the man I loved from some deadly peril through my extraordinary courage and cunning. I know that sounds strange, but there it is. I’ve been fortunate too, despite my bizarrely masculine character traits, to have had several deeply satisfying romantic relationships with fairly typically masculine men (in which company I would include my former-football-captain husband, despite his frequent protestations that he is “the woman” in our relationship).

I’m small and delicate looking. I’m sure it never occurred to any of the men I’ve been involved with that I had such a masculine self-conception. My husband, though, thinks I frighten the people I argue with, and he will issue subtle cues if he senses the dinner conversation going in the direction of a confrontation.

I’m not telling you all these things about myself out of some sort of confessional impulse. I’ve something larger in mind. Ever since I figured out why I was always so shocked to be unexpectedly confronted with the fact that I was a woman, which is to say, ever since I realized that I actually had, somewhere in the depths of my psyche, an image of myself as a man, I’ve been intrigued by this fact about myself. I’ve tried to figure out how it came about, whether it was nature or nurture, and marveled that it clearly had no relation whatever to my sexuality. I identify with what Jung called “the animus,” the masculine half of human nature that everyone has. Everyone, according to Jung, has a masculine side and a feminine side (though I doubt he would like the term “side”), which he refers to as the “animus” and the “anima” respectively.

I smoked a pipe briefly in college and my male friends thought that was cool. I had a masculine nickname too. “Max,” they called me. One of my friends had decided the name “Marilyn” didn’t fit me and that I therefore needed a nickname. She hit on “Max” because my last name was “Piety” and Max Carter, the college chaplain, was the most pious man anyone knew. So there I was, a pipe-smoking girl named Max who went about in what was generally androgynous attire. And yet I was popular with the young men at my college.

I doubt very much, though, that a purse-carrying, lipstick-wearing young man with a moniker of, say, “Debbie,” would have enjoyed a similar degree of popularity with the opposite sex. I suppose I’ve been aware of this sort of inequity for a long time without really having very strong feelings about it. I guess it seemed natural to me, somehow, that women were allowed greater latitude in terms of what was considered an appropriate expression of their gender than were men. It’s only recently that I have begun to feel this disparity is tragically unfair.

It started, I think, the evening I told my husband about my strange experience of being surprised when I caught an unexpected glimpse of myself in a window or a mirror and saw that I was a small woman rather than a medium-sized man. “I have a friend,” he said slowly, “who is a transvestite.” This friend, he explained, did not actually go out dressed up as a woman. He just went around his apartment sometimes in women’s clothes. They’d been friends for years, my husband continued, before his friend had actually “confessed” his predilection for women’s clothes. He was ashamed of it, fearful that people would condemn him. There were only a few people who knew this fact about him, people he was very close to, people he knew well enough to feel confident they wouldn’t condemn him. I could tell my husband was sad, that he felt bad for his friend, bad that his friend was ashamed of something so harmless, so innocent, that something a woman, that his wife, could do with impunity was something he lived in constant fear might be discovered.

That’s when I started to think how tragic is the disparity in the flexibility, or whatever you want to call it, of gender roles. Women can play at being men all they want, but men are made to feel ashamed if they even fantasize about playing at being women, let alone, God forbid, actually try it.

Where do our archetypes of the masculine and the feminine come from? Who dictates them? There was a time when men wore laces and velvets, gaudy jewelry, and even makeup. When did that become shameful, and why? I wondered briefly whether we really needed these archetypes. Couldn’t we just speak of “character traits,” I asked myself, without having to assign them a gender? Don’t the categories of “masculine” and “feminine” represent a false dichotomy? Can’t everything just be “human,” I mused?

But the longer I tried to entertain such a possibility, the harder it became to form any firm conception of it. Maybe we need these two most basic of archetypes. We are a classifying species, after all. Gender, it seems, is itself an archetype, and one that I’m beginning to suspect we can’t do without. I’m okay with that. My concern is that there are many men who are perhaps not okay with it, because the archetype of masculinity is so much narrower than that of femininity: a woman can wear pants in public, but a man cannot do the same with a dress. I think that’s wrong. Not only is it horribly unfair, it’s destructive.

Susan Faludi explains in her book Backlash: The Undeclared War Against American Women that while there have been steady gains in women’s rights over the years, studies show that most Americans, men and women, still expect men to be the main breadwinners. Women’s freedoms are increasing, yet men are apparently still expected by nearly everyone to exceed women’s accomplishments both personally and professionally. Faludi postulates very persuasively that this inequity is one of the main causes of continuing sexism. I mean, how fair is that? Men and women are increasingly placed in competition with one another. Men enter this competition, however, in metaphorical straitjackets, and yet they are still expected by nearly everyone to win and are condemned as “un-masculine,” or as “failures” (which in our culture are roughly synonymous), if they don’t. The mind boggles at the amount of resentment that would naturally be created by that kind of inequity.

Maybe we need gender archetypes, but if we do, then I think we also need to allow everyone an equal degree of experimentation with them, and maybe that means the archetypes themselves are due for some adjustments. Maybe it’s time we brought the laces and velvets back into the masculine archetype. Plato argues in Book V of the Republic that the differences between men and women are really what philosophers refer to as “accidental” rather than “essential.” Some women are more “spirited” than some men. Some men are more “appetitive” than some women. The only thing that is important, according to Plato, is that individuals be assigned to positions or tasks that are appropriate to their individual personalities. I like that. It seems just.