On Death and Dying

One of the most frightening things, I think, about dying is that we do it alone. Of all the natural evils for which one would like to blame the creator, this seems one of the worst. It would have been so much better, wouldn’t it, if we left this life in groups, left perhaps with the people we came in with, with the children we remember from our earliest days in school, and perhaps also with the people we have come to love, if they are suitably close to us in age. If we could go in groups, as if on a field trip, it would be easier.

But we go alone. Even those unfortunates who die in accidents that take many lives die effectively alone, because they don’t have time, really, to appreciate their fates as shared. They say the people who remained on the Titanic sang as the ship went down. That’s what I’m talking about. It would be so much better, so much easier to bear, if we were assigned a time along with many others. We could begin to gather a little before that time, all of us who were assigned to leave together, we could begin to gather and prepare ourselves and share with one another the joys and sorrows of our lives. If we did that, I think we would realize that our lives had really all been variations on the same theme, that we were not so different from one another as we had thought.

I’m not certain if I believe in life after death, even though I am very religious. I’m not certain what it would be for. I doubt I will be ready to leave this life when my time comes. I think I’d like to live much longer than I know I will, say three or four hundred years. I think I’d eventually get tired of living though, so the prospect of living forever is not all that appealing.

It seems to me, however, that if there is life after death, then that place where we will all go (and I believe we will all go to the same place, because I am a universalist), wherever it is, is a place at which we will all actually arrive together. Even though each of us will die individually, alone, if we go anywhere, it is to eternity, and since there is no temporal change in eternity, there cannot be any arriving earlier or later. Where we will go will be where everyone will go at the same time, or where everyone, in a sense, already is. There will be no waiting for the loved ones who die after us. They will be there waiting for us, so to speak, when we arrive, even if they are in the bloom of youth when we leave.

When I think about death, which I do more and more as I get older, I wonder if perhaps part of the point of it, of the horrible specter of that trip one must take alone, is precisely to make us understand that we never really are alone. And by that I don’t mean simply that God is always with us, although I do mean that also. I mean that we are all part of the whole of humanity, that we are connected to everyone and, indeed, to every living thing.

There is a poem I love by Molly Holden that conveys this sense of connectedness very well. It’s called “Photograph of Haymaker, 1890.” It goes like this:

It is not so much the image of the man
that’s moving — he pausing from his work
to whet his scythe, trousers tied
below the knee, white shirt lit by
another summer’s sun, another century’s —

as the sight of the grasses beyond
his last laid swathe, so living yet
upon the moment previous to death;
for as the man stooping straightened up
and bent again they died before his blade.

Sweet hay and gone some seventy years ago
and yet they stand before me in the sun,

That’s not the whole of the poem. I left out the last couple of lines for fear of violating copyright. You can read the whole of it, though, if you go to Poetry magazine. Of course the poem is about the haymaker in that it’s about mortality, which is inseparable, I think, from temporality. Time passes, people pass, as they say. The haymaker will pass, just as will the grasses he’s cutting down in the vigor of his manhood. And he is gone now, of course, the man who was young and vigorous in that photo taken so long ago.

I love to read philosophy and learn that others who lived and died long before me had precisely the same thoughts that I have had. I feel suddenly linked to those people in a mystical way. I feel as if they are with me in a strange sense, that we are together on this journey we call life, even though they completed it long ago.

Kierkegaard speaks often about the idea of death and how one must keep it ever present in his thoughts. I did not understand this when I first read it, but I believe I do now. To think about death, really to think about it, to think it through, will bring you right back around again to life and what a miracle it is, and by that I don’t mean your own small individual life, but all of it, life as a whole, and you will be filled with reverence for it. You will be kinder to every creature.

And you will feel less alone.

This piece is for Otis Anderson, February 6, 1959 – July 14, 2013.

On Teaching

I don’t remember ever forming the ambition to be a teacher. When I was very small, I used to play “teacher” with my dolls. I had a little slate that I would position for them as a blackboard and on which I would write my “lessons.” That was a child’s game though, not an ambition. I did it, I suppose, because school was a large part of the world of my experience, so when I was alone with my dolls I naturally imitated the few adults I’d had exposure to. That meant that if I wasn’t playing “mother,” I was playing “teacher.”

I was an art major when I first entered college. It had been drilled relentlessly into me that I would not be able to make a living as an artist, but that since I could draw, I would probably be able to make a living as a medical illustrator. So I enrolled at Ohio State, one of the few schools in the country that had a program in medical illustration. I did not fit well, though, into the community of art students, either at Ohio State or at the Columbus College of Art and Design, where I subsequently enrolled. I remained an art major, however, even after leaving both institutions, more out of a lack of imagination, I suppose, than out of positive commitment.

I studied, God knows what (I don’t remember now) at the University of Illinois at Chicago Circle before, finally, ending up at Earlham, a small Quaker college in Richmond, Indiana, still an art major. I took some art classes, but I also took a philosophy class. I don’t remember what prompted me to take that class; I think some philosophy class must have been required for my major. The subject, I still remember, was rationalism and empiricism. It sounds very unromantic, but I loved it. I changed my major soon after that first class and took almost nothing but philosophy from that point on.

I didn’t particularly like reading philosophy, as I’ve written elsewhere, but I loved talking about it. I loved talking about it so much that I actually tried to talk to my father about Kant’s Prolegomena to Any Future Metaphysics the first summer after I transferred to Earlham.

“All that stuff about ‘a priori synthetic cognition’ may be very interesting to you,” my father observed somewhat patronizingly, “but you’re not going to find many young men interested in it.” I didn’t bother him after that with philosophy. I kept it to myself, or at least kept it to myself until I was back at Earlham again and could drop by the office of my professor and advisor Bob Horn.

“Bob is God” is what students at Earlham used to say about him. He was kind and patient and brilliant. He didn’t talk too much, the way I fear I tend to do with my own students now, but just enough, just enough to get, or keep, the conversation going. He was tolerant and understanding. He wrote on one of my friends’ papers, “Eric, I understand you as if through a cloud.”

Nearly every afternoon of my senior year was spent in his office. I would head there as soon as my classes were done for the day and sit in the slanting afternoon light and talk and talk and talk about the ideas that were always swarming in my head like so many bees. And he would smile patiently and respond occasionally with his vastly superior knowledge and wisdom. I never felt though that he was talking down to me. I felt as if we were kindred spirits, as if we connected at a level that is rarely reached by two human beings.

Even when I decided, in my senior year, to go to graduate school, it had not been because I’d harbored any ambitions of becoming a teacher, but because I couldn’t conceive of any other life than the one I’d come to know, the life of philosophy, the life of talking about philosophy with some kindred spirit. I was afraid of teaching, in the beginning, afraid I would never be more than a kind of fraud, afraid I would never be able to approach the knowledge and wisdom of my own professors, particularly Bob. I cherished a little hope, like a tiny flame in the darkness, that somehow graduate school would transform me into a paragon of philosophical knowledge, but that day never came. The more I learned, the clearer it became to me how very much there was to know, and how little of it I had actually mastered.

They ease you into teaching in graduate school. You start out as a teaching assistant, which means you are really sort of a glorified student so you don’t feel you have to be so knowledgeable as the professors but can luxuriate in the experience of being ever so slightly more knowledgeable than the students. I did that for a few years before finally teaching my first course, so even if I still felt something of a fraud when students referred to me as “professor,” my impostor syndrome was not quite so pronounced as it would have been if I’d been thrown into teaching right after I’d gotten my undergraduate degree.

I like people. I’m an animal lover and people are animals, so I like people as well as other animals. I was raised not to dissemble, so I didn’t pretend to know things I didn’t know, and I learned gradually that in fact, over time, I’d acquired a great deal of knowledge and that even if I still fell pitifully short of the standards of my own undergraduate professors (particularly Bob), I was actually in a position to be of some real, concrete help to my students.

I taught in Denmark for several years before I came to Drexel. I never had a student for more than one course when I taught in Denmark, however, because my students there were nearly always Americans or other non-Danish nationals who were taking their semester abroad. I loved my students, but in a very detached way. I never got to know any of them, really, but that was okay with me. I’ve always been kind of a loner. I liked engaging with them intellectually, but it didn’t bother me that I would have them for only one course and after that would never see them again.

My situation when I came to Drexel was not so different. Drexel didn’t allow students to major in philosophy. The best they could do was a minor. I didn’t mind that; in fact, I rather liked it. I loved teaching, but I also loved writing, and the fact that my exposure to the lives of my students was very limited suited me well. I got to go into the classroom and do what I loved–talk about philosophy–without having to spend any time helping my students navigate the practical difficulties of their lives. I had all the fun of teaching, or so I thought, with none of the inconvenience.

But then someone senior to myself got the idea that we should offer a major in philosophy. Drexel had had one before but had jettisoned it after several professors retired and were not replaced. Philosophy students do inordinately well on the GREs, so it wasn’t too difficult to convince the dean that a philosophy major would be good for the university. I was ambivalent about it myself, though. I knew that if we had a major I would suddenly “have students” in a way I had never “had students” before and that these “students” would cut into my research time.

I couldn’t bring myself to protest the reinstatement of the major, but neither could I champion it. I sat by quietly with a curiosity not unlike that of a disinterested person watching a train wreck. I didn’t think our students were sufficiently prepared for such a rigorous and intellectually challenging major and I feared that I was emotionally incapable of forming the kind of attachment to them that it seemed to me was necessary for a productive mentoring relationship.

I like large chunks of time all to myself, time when I don’t have to see, attend to, or worry about anyone else. I couldn’t picture myself hanging out with my students, couldn’t imagine welcoming them into my office and cheerfully allowing them to monopolize my time the way Bob had allowed me to monopolize his. I liked my students, but more as abstract beings than as concrete ones. I knew that in this respect I fell short of the standard that Bob had set for me, but I had accepted long ago that I would never be able to meet any of Bob’s standards. Bob, after all, was “God.”

But when we got the major back, everything changed. As if out of nowhere, students began to appear who stood out from the students I’d had before. They weren’t just interested in philosophy; they were possessed, possessed as I had been all those years ago when I’d practically lived in Bob’s office. Not only did I have them for more than one class; I had them in more than one class at a time! I teach only two courses per term, so I was surprised to find that I had a couple of students in both my classes, and not just for one term, but for several terms in a row.

Something else happened with the reinstatement of the major: the size of my classes shrank. Where before I’d been teaching critical reasoning to twenty-five students, I suddenly found I was teaching epistemology to ten, and ten students who were, at least in terms of their commitment to the material, a cut above the ones I had become used to.

I suddenly found myself caring about my students very much. I couldn’t help but get to know them. They would talk to me not simply about the material they had read for class, but about their lives and long-term ambitions and I realized that by that point in my life, I’d actually lived long enough to have acquired some wisdom that could be helpful to them with respect to these more general concerns. They would come talk to me, as I had to Bob, and I found to my surprise that I actually enjoyed talking to them, even about things that were not directly related to philosophy.

“Your students are not your friends,” a colleague once remarked when advising new faculty on the perils of socializing too much with students. He’s right, of course. There’s a certain responsibility in a pedagogical relationship. A teacher must never confide in a student, or look to a student for emotional support. It is perfectly appropriate for a student to do these things, however, with a teacher. A teacher stands in loco parentis. Most college students are young people who have not yet made their way in the world but who are going to college as part of their preparation for that. They are more than their student numbers. They are inexperienced adults who occasionally need support and guidance when contemplating life’s larger questions, or simply how to survive a term in which they are taking too many courses in order to minimize their student-loan debt.

A teacher cannot hold himself too emotionally aloof from his students and still be an effective teacher. The point of a liberal arts education is not merely to impart knowledge to students on a variety of subjects. It is not even to introduce them to the joys of the life of the mind. It is to help them to become happy and thriving adults, to help them in their pursuit of “the good life” in the classical sense. But that can be done only by teachers who are willing to engage with their students as human beings and who can draw on their own humanity, and not simply their intellects, in those relationships.

A teacher has to love his students in a manner that is perhaps unique to that relationship, and in that way teach them that it is natural for people to care about one another and that the world into which they are entering, though challenging, is a friendlier place than they may have thought.

The Age of Idiocy

The eighteenth century is known as “the age of reason.” I was going through our magazines, trying to clear them out, when I ran across an article in an old issue of The Economist that made me fear the late twentieth and early twenty-first centuries will be known as the age of idiocy. “A growing body of research,” writes the author of “Hunkier than Thou,” “suggests” that women’s “preference for certain types of male physiognomy may be swayed by things beyond their conscious control.” No, really? Women can’t control whom they are attracted to?

Are you f*#!ing kidding me! People are getting money to do this sort of “research”!

It’s not just one cognitively challenged, or intellectually dishonest, moron who’s conned someone into giving him (or her) money for this inane “research”; there’s apparently a whole “body” of such “research.” Who is doing this research, I want to know. Is it extraterrestrials? Because we earthlings have known for millennia that we can’t control our attractions.

Correct me if I’m wrong, but isn’t the inability of people to control to whom they are attracted one of the most obvious facts about Homo sapiens? Is it not the most popular subject of the world’s great literature, from The Iliad to “The Awakening,” with, I hazard a guess, 75% of everything in between? Who could not know this obvious and sad fact of human nature? If we had conscious control over whom we were attracted to, then people would always marry absolutely the right person, and infidelity and divorce would be unheard of.

Is this not the single most idiotic subject of a scientific study you have ever heard of? The intellectual progress of humanity, of which we have been so proud in the last couple of centuries, has suffered a serious setback if we are suddenly doing “research” to establish the truth of things that have been obvious to most of humanity throughout its long history. Okay, Homer didn’t do an empirical study. Empirical studies should be something of a last resort, however, in our attempts to understand human nature and human society. You don’t do them to tell you stuff you already know. You do them to tell you things you don’t already know. Ah, but there’s the rub: We’ve become so infatuated with empirical studies and the statistics they generate, that we’ve sort of erased the accumulated wisdom of human history and decided that it’s time to reinvent the wheel.

Nowadays, we’re disinclined to believe anything that doesn’t have a study to back it up, apparently forgetting (what most of us knew not so long ago) that you can get a study to support any conclusion you want, so long as you are sufficiently careful in the crafting of your questions. An empirical study is, as a means of gaining information, the crudest sort of imitation of the awesome mechanism of a run-of-the-mill inductive inference. Let me draw an analogy here to make it a little clearer. The empirical study is to a properly functioning human brain as instant coffee is to real coffee: a pale, sad imitation which no person in his right mind would prefer to the original but would accept only if the original were, for some reason, unobtainable.

Let me give you a few examples of what is problematic about empirical studies. I remember hearing a few years ago that studies showed wine was better for you than beer because people who drank wine tended to be healthier than people who drank beer. Later, someone who still remembered how to think pointed out that people who drank wine tended to have more money than people who drank beer, and that they tended to have better health care and exercise more and to have a healthier diet – and all of that might have something to do with why they tended to be healthier.

Something similar happened with a study on breast cancer. The study found that women who had children when they were younger tended to have a lower incidence of breast cancer than women who waited to have children until they were older.  Later, someone who still remembered how to think pointed out that the women who had children when they were younger tended to live in more rural, less developed, parts of the country and that women who waited to have children until they were older tended to live in more developed, urban and hence polluted areas — and that it might be environmental toxins that explained the higher cancer rates.

The mind boggles at the specter of the hodgepodge of incoherent speculations we will be expected to accept as “facts” based on empirical studies alone. A study is nothing before it is interpreted, and the interpretation requires a brain relying on its own devices and not on other studies.

Our infatuation with empirical studies started, I believe, with formal logic as an attempt to understand the amazing reasoning power of the human brain. At some point, however, the whole thing went horribly wrong, and we began to believe that this lumbering, awkward process of arriving at conclusions was, in fact, superior to the real mechanisms of the organ it had originally been trying to model.

To be fair, my suspicion is that part of our infatuation with empirical research is an understandable and justifiable reaction to the elitism of the medieval scholastics, who could not have cared less what the data showed when they had “the philosopher” on their side. Empirical research is, after all, as much a child of the Enlightenment as is democracy. Still, anything, even a good thing, can be taken to a ridiculous extreme. We have free public education because some wise heads years ago were aware of the dangers inherent in letting ignorant and illiterate people vote. Are we not similarly aware of the dangers inherent in having ignorant and, apparently, illiterate, people crafting empirical studies?

Perhaps, after all, history is just a series of pendulum swings, back and forth, between overweening confidence in our inherent reasoning capacities and pathological skepticism concerning the truth or reality of anything we can’t define ostensively. It looks to me like the project of the Enlightenment to wipe out intellectual elitism has turned into a case of throwing out the baby with the bathwater. We meant to discredit a particular, deviant, exercise of “reason,” but we were finally unable to control this new skepticism and ended with the reductio ad absurdum of discrediting reason itself.

David Brooks writes in the New York Times that what he calls “the rising philosophy of the day” is “data-ism.”

He’s got that right. We can’t think anymore. We can only count.

(An earlier version of this article appeared in the online journal CounterPunch.)

Accountability in Higher Education: The Elephant in the Room

There’s been a lot of discussion among academics of Richard Arum and Josipa Roksa’s book Academically Adrift: Limited Learning on College Campuses (Chicago, 2011). Arum and Roksa present strong evidence that students are not learning the reasoning skills that colleges and universities claim to teach. Part of the problem, it appears, is that professors aren’t requiring enough of students. Half the students surveyed for the book, observed Sarah E. Igo in a review in Academe, “reported that they had not had a class in the last semester requiring more than twenty pages of writing in the entire course, and a third had not taken a class requiring more than forty pages of reading a week.”

Why aren’t professors requiring more of students? Is it because, as some have argued, tenured and tenure-track faculty are more concerned about their research than they are about teaching? Or because they’re just lazy and hence don’t want to exert themselves grading lots of assignments? The latter position has lots of proponents. Tenure makes it nearly impossible to fire a professor, so what incentive does he or she have to do any real work?

Leaving aside the issue of whether people are more effectively motivated by the carrot or the stick, there’s one huge reason for the decline in the expectations placed on students in higher education that has yet to be given sufficient attention–the increasing amount of university-level instruction that is being done by what academics refer to as “contingent faculty.” Contingent faculty–primarily adjuncts who are hired by the course–are paid so badly that they are forced to teach more courses per term than can be handled well.

Tenured and tenure-track faculty typically teach two courses per term. There’s no official limit, however, to how many courses an adjunct can teach. Adjunct pay is miserably low. My department at Drexel pays between $2,175 and $3,000 per course. We’re on quarters, so an adjunct who teaches two courses per term for the standard academic year would have an annual salary of between roughly $13,000 and $18,000. Few people, especially people with student loan debt, can afford to live on so little, so most adjuncts teach more than two courses per term. In fact, many teach more than four.

“This class isn’t like the other critical reasoning classes,” one of my students commented recently. “My buddy took critical reasoning last term and he said it was easy. He said he never had to go and he still did well.” This student, my student, I mean, had added the class at the end of the second week of the term. When he went to add it, he’d found that mine was the only section he could get in. “All the others had 25 students,” he said, “but this one had only sixteen.”

“Yeah, I lost a lot of students,” I explained, “after they got their first essay back.” I’d originally had 25 (the official ceiling) in each of my sections, but no more than twenty actually showed up for the first class because I’d emailed them the syllabus, and I think that scared off a few. The syllabus lists the requirements for the course including the fact that there are quizzes every day on the readings and three in-class essays. That’s a lot of work for me, but it makes for a better class because the quizzes mean the students will do the readings and the essays mean they’ll learn to construct a persuasive argument.

I spend almost all my time during the terms when I’m teaching grading quizzes and essays and meeting with students to discuss them. I don’t mind doing the work because I know it’s important. I do mind having almost no free time, but there are breaks between terms and then the summer when I can do some real research. I can’t do much research while I’m teaching. There just isn’t time.

Here’s the kicker though. I’m tenured. I’m one of an increasingly tiny elite of tenured professors who have reasonable teaching loads and rock-solid job security. I teach two courses per term. Sounds pretty cushy, doesn’t it? It’s all I can handle though, if I want to do a good job.

I complained once to another critical reasoning instructor about the amount of time it took to grade essays.

“I don’t give essays,” he said, “I can’t, I’m teaching four other courses.”

He was an adjunct. He had to teach five classes, he explained, just to be able to pay his rent. Some adjuncts teach more than five classes. Not at Drexel. We don’t let them teach more than three for us. They go other places, though. They have to just to be able to eat. Most of the sections of critical reasoning we offer in any given term are taught by contingent faculty. That’s why they’re “easy.” The instructors can’t give as many assignments as tenured or tenure-track faculty because they don’t have time to grade them.

Grading essays is enormously time-consuming. I’ve spent as much as an hour on a single essay. They don’t usually take that long, but they sometimes do. First you have to figure out what someone is trying to say. You can’t give constructive feedback on how they might be more successful unless you know what they’re trying to say, and figuring that out can require reading some essays over and over again. Figuring out what a student is trying to say is only the beginning of the task of grading. Once you’ve done that, you have to determine where they went wrong, precisely where and how they failed. That isn’t easy either. It’s easy enough to say “I can’t make heads or tails of this,” but that doesn’t help them. You’ve got to figure out why you can’t make heads or tails of it. After you’ve done that (“step two,” I call it) you have to figure out what you need to tell them that will be helpful. You can’t point out everything that went wrong. That’s demoralizing. They’ll just give up if you point out every problem. You’ve got to select, from among the myriad things that could be improved, the ones that are absolutely crucial, and then find a way to communicate them that doesn’t sound too harsh.

I’m fortunate because my job is secure. I have time to give my students substantial reading and writing assignments and I don’t have to worry that they will trash me in their evaluations if I’m hard on their papers. I trust them to be fair with me if I am fair with them, and they usually are. Tell an adjunct that, though. They’re hired by the course. If their evaluations aren’t good, they know that they can be easily replaced with some other recent Ph.D. who’ll be more accommodating.

There’s a lot of talk about how the consumer model of higher education is destroying it. I think if it were employed properly, it could save it. Students should be getting more for their money than most adjuncts, through no fault of their own, are able to give them. It’s not that adjuncts are less well qualified than tenured or tenure-track professors. They’re occasionally better qualified. The problem is that they’re overworked. Most aren’t able to give students the kind of attention, or assignments, or feedback on their assignments that a tenured or tenure-track professor could give them. If I were paying what students are paying nowadays to go to school, I’d want more for my money than I could get from an adjunct.

There’s a lot of discussion among academics about the increasing use of adjunct labor, but nearly all of that discussion concerns how exploitative that practice is–of the adjuncts. You almost never hear anyone point out that it is also exploitative of the students, that it exploits their ignorance. Most students are simply relieved to find they’ve got an “easy” class, a class where the instructor requires very little of them. They’re still assuming they’re in school to get that piece of paper that will get them a job, and the easier it is to get that piece of paper the better. Most of them don’t realize yet that that piece of paper is not going to get them a job, that if there is any hope of their ever getting, or at least keeping, a job, it will be because of the stuff they’ve actually learned in college.

Academics complain almost constantly about the preoccupation of students with “that piece of paper,” yet the academy itself encourages this attitude by turning so much instruction over to people who don’t have time to do more than rubber stamp a student’s transcript.

The recent spate of blaming academics for the decline in the quality of higher education is just another symptom of what Richard Hofstadter, among others, identified as the anti-intellectualism of American culture. What is increasingly referred to as the crisis in higher education is sometimes characterized as a battle between two different models of education: the liberal-arts model and the vocational one. “[I]s college,” asks James M. Maslow, “an apprenticeship for informed public participation or a store selling competitive private credentials?” (“Losing Our Faculties,” Academe). That’s a red herring, though, because the sad truth is we are failing miserably even at the task of teaching practical skills. American culture is very anti-intellectual, so you won’t find too many people in the general population clamoring to rescue the liberal-arts model of higher education. People would scream bloody murder, though, if they realized they were paying tens of thousands of dollars to institutions where students weren’t even learning practical skills.

I’m a big proponent of the liberal-arts model of education, but most of the energy I put into teaching is actually directed at helping my students acquire the practical skills of being able to construct and analyze arguments. That’s true even with upper-level courses in epistemology and metaphysics. Most of my students don’t know the difference between an argument and a bunch of unsupported assertions strung together with a lot of non-argumentative rhetoric. Many of them have difficulty even remembering the topics of papers that are assigned in class. I’ll give them the topic and explain the structure the paper should have and still, many will turn in rambling, unstructured musings on unrelated topics. It’s not because they don’t care about doing well. They care very much, but their minds are so completely untrained that even teaching them the most rudimentary of practical skills requires enormous chunks of time, more time than most adjuncts have to give to their students.

People are blaming academics for the crisis in higher education. The decision to turn over increasing amounts of instruction to beleaguered adjuncts is not coming from academics, however; it’s coming from administrators who’ve migrated to academia from the world of business, where cutting costs is pursued as if it were a holy grail.

Academics, even adjuncts, care about teaching, but faculties are being squeezed by bloated administrations that need to cut costs to justify their own existence, and one of the ways they have chosen to cut costs is to replace tenured faculty with adjuncts. Students need feedback on their work. They need more than just a grade on an assignment if they are to have any hope of doing well, and for many of them grades are crucial to their receiving the financial aid they need to be able to remain in school. Most adjuncts don’t have time to give much feedback, though, or to meet with students one-on-one to discuss how they might improve their work. Imagine how frustrated, how desperately frustrated, a student could become who sees his or her grades slipping but can’t get enough feedback from an instructor to halt that downward trend.

Lack of feedback isn’t the only problem associated with the increasing use of adjuncts. I’ve had students who have never been to a single class email me in week eight of a ten-week term with some sob story as to why they’ve never been to class and begging me to make up some special assignments for them so that they can “still pass.” Where, in God’s name, I’ve asked myself, are these kids getting the idea that any instructor would do such a thing? It took me a while to figure that one out. I’ll bet there are a few adjuncts out there who’ll do it. If the student is still officially enrolled in the course, he can still do an evaluation and the instructor may fear he’ll get a bad evaluation if he doesn’t find some way to help the student pass.

Students are being led to believe that they don’t have to do any real work in order to earn an advanced degree. So then, when they run into an instructor who actually requires something from them, they protest the instructor is being unfair. What isn’t fair, however, is blaming tenured and tenure-track faculty for the diminished expectations that are being placed on students when evidence suggests the problem stems from the gradual takeover of instruction by overworked adjuncts who don’t have the time or energy to require much of their students. What isn’t fair is taking money from people and claiming to be educating them when you’re not.

“From the professorial perspective,” writes Benjamin Ginsberg in The Fall of the Faculty: The Rise of the All-Administrative University and Why It Matters (Oxford, 2011), “the university exists to promote teaching by providing faculty members with classrooms, laboratories, libraries, computers, and other instructional resources. From the administrative perspective, however, the purpose of teaching is to bring fee-paying customers (sometimes known as students) into its dormitories and classrooms.”

That’s the elephant in the room, the thing nobody wants to acknowledge because it makes everybody, meaning every institution, look bad. That’s the dirty little secret behind the crisis in higher education. It’s not so much a battle between populist vocational training and old-guard intellectual elitism. It’s a battle between academics who want to give students something for their money and expanding armies of administrators who care less and less about what sort of product they are providing, so long as the money keeps coming in.

This piece originally appeared in CounterPunch, May 29, 2012.

Education and Democracy

I’m reading Richard Hofstadter’s Anti-intellectualism in American Life in preparation for doing a review of Carlin Romano’s new book America the Philosophical. Romano mentions Hofstadter in his introduction, but only in his introduction. He never returns to him. I suspected that was going to turn out to be a weakness in Romano’s book, so I decided I should read Hofstadter before reviewing Romano. That was no great chore. Hofstadter is one of my favorite authors. His book Social Darwinism in American Thought is a real eye-opener. That book, together with Max Weber’s The Protestant Ethic and the Spirit of Capitalism, is a kind of Rosetta Stone of American culture.

The penultimate chapter of Hofstadter’s book looks at the educational theory of John Dewey. “The new education,” Hofstadter observes, that grew out of Dewey’s thought “would have social responsibilities more demanding and more freighted with social significance than the education of the past. Its goal would be nothing less than the fullest realization of the principles of democracy. In setting this aspiration, Dewey stood firmly within the American tradition, for the great educational reformers who had established the common-school system had also been concerned with its potential value to democracy” (Hofstadter, p. 378). That is, in Dewey’s theory, “the ends of democratic education are to be served by the socialization of the child, who is to be made into a co-operative rather than a competitive being and ‘saturated’ with the spirit of service” (Hofstadter, p. 379).

Leaving aside the issue of the mounting evidence that people are inherently more inclined to cooperation than to competition, it seems to me that something essential is omitted here. The traditional conception of the significance of education to democracy is that it is important that citizens in a democracy be well informed, that they should be able to read as a means to being well informed, as well as that they should be able to think critically and analytically so as to be better able to sort their way through the information with which they are presented and to properly understand its significance.

I believe, however, that the significance of education to democracy is much greater than that. It is not simply that citizens in a democracy must be rational and well informed, they must also be happy. Unhappy people are too prone to using their vote punitively, that is, in ways that actually decrease rather than increase the happiness of their fellow citizens. But policies that improve the quality of life of the average citizen are the engine of democracy. Without them democracy ultimately breaks down. That is, Dewey’s ideal of socialization as encouraging cooperation can’t be sustained unless the individuals being socialized are relatively happy both throughout the period of socialization and beyond (if the process can be meaningfully said to stop at any point).

What few people understand, I fear, is the importance of education to human happiness. Human beings, as Aristotle famously observed, are rational animals. They have very highly developed and complex brains, brains that have needs of their own for stimulation and challenge. Helen Keller writes movingly, for example, of how perpetually angry, and even violent, she was before she learned language (The Story of My Life). That was partly, of course, because of her difficulty communicating, but it was also, as she clearly details, because of her difficulty in fixing thoughts in her mind. Language, like mathematics and logic, is a cultural achievement. People do not learn it in isolation from other people and they do not gain an optimal command of it if they do not read. The brain is driven to make sense of its environment. It finds fulfillment in that. People would do science (as indeed they did for millennia) even if it had no obvious utility, just as they have always played cognitively challenging and stimulating games such as chess and crossword puzzles.

The need of human beings to develop their minds is, I believe, so acute that its fulfillment is an ineradicable element of human happiness. That, I would argue, is the real value of education to democracy. We need to educate people in a democracy not merely so that they will better understand what sorts of policies would be best for society as a whole, but so that they will also desire what is best for society as a whole rather than the spread of their private misery onto the larger community.

Time Travel

Hobart Arena 1959

The philosopher Richard Taylor asserts, in his book Metaphysics, that the idea of time travel is incoherent. The incoherence, he claims, “is exposed in saying that . . . at a later time—someone finds himself living at an earlier time. To imagine,” he continues, “‘returning’ to an earlier time is merely to imagine the recurrence of events of that time. More precisely, it is to imagine everything, except oneself just as it was then” (73).

I believe he’s wrong. I believe time travel is possible, not in the sense, however, of imagining the recurrence of past events just as they were, while remaining oneself unchanged. That, after all, is nothing but reminiscence, perhaps extraordinarily vivid, but reminiscence nonetheless. Time travel, real time travel, I believe, is the reverse of Taylor’s description. It is to have everything around one just as it is now, while returning oneself to the way one was at an earlier time. In this sense, it is to be not what one is, as the philosophers say, but what one was.

To the extent that most of us go through some kind of moral development as we mature, this may not seem like a desirable project. Moral development is not the only thing we undergo, however; we tend, as we become older, to lose something of the joy and optimism of youth. It ebbs away with the passage of the years, more or less quickly depending on the events of our lives. I lost much of my own joy and optimism, I think, with my parents’ divorce when I was seventeen. But there were other events, both before and after, that gradually eroded my innocent faith in the benevolence of fate.

One such event was when I gave up my dream of becoming a figure skater. I was forced to confront the fact that my family simply did not have the money to allow me to pursue that dream. I don’t remember ever dreaming of being in the Olympics or anything like that. I did dream, though, of becoming good, really good.

I always loved skating. My sisters and I used to pretend to skate on our driveway in the winter. The driveway was behind and slightly lower than the house and when it was covered with snow it looked a lot like a little pond. We would pack the snow down very hard and then slide around on it in shoes with slick soles pretending we were skating. Sometimes we would dress up. My mother used to take us to the Goodwill store and allow us to pick out cast-off party dresses, or “formals” as we called them, to dress up in. I had a black velvet one with a heavy rolled hem that made it puff out and flare beautifully when I turned. I would wear it and carry a little rabbit fur muff that must also have come from the Goodwill. I felt like a princess as I glided across the packed snow. We often “skated” in the evening when the light over the garage would illuminate the falling snow and if I looked up toward the night sky, it would seem as if the stars were actually falling softly on me or as if the sky were opening up and I were being carried away into it.

We would “skate” like this until our feet were so cold we had lost all feeling in them and then we would ascend the stairs at the edge of our “pond” that led into the kitchen where my father would be waiting with hot chocolate. My feet used to hurt excruciatingly as they warmed up again, but that never kept me from “skating” if there were sufficient snow.

I think I was ten or eleven years old the first time I went skating for real. I went with my Camp Fire Girl troop. I don’t remember much about that first time except that I greatly admired the skates of one of the other girls. Most of us had to rent skates, but she had her own and they were not brown like the rental skates, but blue with fur at the top.

I must have liked skating though because I went back. My sisters and I began to go skating fairly regularly and soon we each had our own pair of beautiful white skates. None of us had had lessons, but we would wear little skating skirts and watch the other better skaters and imitate what they did.

My parents could not really afford to give us lessons, but I pestered them anyway until they finally gave in. My lessons were during the public skating sessions at the local rink on a little portion of the ice that had been sectioned off for that purpose by orange traffic cones. I had one fifteen-minute lesson each week with a second-rate instructor. Eventually, my lessons went to half an hour, not because we could afford it but because, in my mother’s words, I had a talent for getting what I wanted, and I wanted to skate.

I was in a Barnes and Noble a few years ago when I ran across something that brought this all back to me. I wandered aimlessly through the magazine section. My eyes fell on a copy of something called International Figure Skating. I was curious to see what skating was like these days, so I picked it up and began to leaf through it. There was a section at the beginning of photos from some gala or other. I flipped quickly past it, but then went back. Perhaps, I thought, perhaps there will be a photo of someone I used to skate with. Some of the people in the photos weren’t all that young. I’d assumed I’d have to pore carefully over the several pages of photos before I would find anyone, if I did find anyone, I’d known. But there, in the very first frame was Lee Anne Miller. And I wondered whether I’d actually registered the picture unconsciously and that that had been why I’d flipped back to look at the photos again. Or perhaps it had been the name I’d registered and that had called me back to the page.

There she was, staring out at me from the glossy pages of a magazine, the little girl I’d so envied. I recognized her. She seemed barely changed. The same delicate features, the same pale brown hair. I can still see that hair pulled into a small dancer’s bun, held in place with barrettes that matched the color of her leotards and little wrap-around dancer’s skirts. Pink leotard, pink barrettes; blue leotard, blue barrettes. She was like a doll, Lee Anne. Perfectly proportioned, tiny delicate features, dressed like a little ballerina. She looked like one of those dolls that dances in a jewelry box when one opens the lid, but prettier than that really. Lee Anne was the most beautiful thing I had ever seen. Her every movement was like a dancer’s, slow and deliberate and graceful. I used to love to watch her skate. There was something swanlike about her.

I was not part of that crowd, the elite skaters, not the first year anyway. I came to skate in Troy, Ohio, in the huge cavernous old Hobart Arena, simply because it was the only rink that was open in the summer. I loved the place. Most skating rinks look like barns, or warehouses, from the outside, but there was something noble about Hobart Arena. It was built of brick and stone in the grand style of the late 1940s. It had been given to the town by the Hobart Electric Manufacturing Co. in 1950 and had clearly been intended to be a showpiece. It was not only the rink, however, that was beautiful. It was in the middle of a park and just behind it was the municipal swimming pool that had a snack bar the skaters used to frequent between skating sessions. There was something almost magical to me about that grand cathedral of winter sport situated in the middle of a verdant summer paradise.

A bunch of us came up from Dayton that first summer. We were out of our league and that was kind of humiliating, but there was also something incredibly exhilarating about being around all that talent and dedication. I was fascinated by the discipline of it and all the esoteric trappings like the harness that hung from the ceiling and that was fastened around the waist of the female partner when pair skaters practiced overhead lifts. I loved the almost meditative hush of the sessions devoted to school figures, a hush broken only by the soft whir of the scribes, the large aluminum ice compasses, scratching circles on the ice for the skaters to follow, or the occasional muscular, ripping sound of the push of skaters working on backward eights.

We had stroking class for an hour every Thursday evening and that first summer, at least, I spent the entire session in abject fear of being mowed down by the hordes of more powerful skaters. The second year was better though. I switched teachers. I got a better teacher, Dick Rimmer’s wife, Lynn. They ran that place, Dick and Lynn Rimmer. Dick had been the official coach to the 1972 Olympic team (at least I think that is what it said on the brochure I showed to my parents in an effort to convince them that the program would be worth the expense). I was determined not to remain the worst skater there, so I spent almost a year convincing my parents to secure Lynn Rimmer for me as a teacher. I liked her; she was kind. She told me once, when I was working on a split jump, that I was a “smart skater.” That made me happy, though I was never really sure what she had meant.

I did better that second summer. Not only was I not mowed down, I actually kept up, sort of. I got better skates, passed my preliminary figure test and was accepted, finally, into the periphery of the elite group. But then I had to quit skating. I needed a scribe in order to be able to progress to the first figure test. But a scribe cost fifty dollars. That was a lot of money back then and my parents couldn’t afford it.

Few middle class families can afford the cost of training a serious competitive skater. Figure skating, according to an article in The Wall Street Journal a few years ago, is one of the most expensive sports there is. Skating parents must either have so much money that almost any sum can be spent on their children’s hobbies, or they must be willing to sacrifice everything, even their children’s education, for art or in the hope that they will “win the lottery.”

My parents had neither so much money that they could afford the cost of training a competitive skater, nor the values that would have led them to sacrifice everything else to get the money. I didn’t really understand that. All I knew, or thought I knew, at the time was that what I loved most was not important to them.

I didn’t even follow skating after that. “Never look back!” It was not just my motto, but my entire personality. I began to dream though, when I was in graduate school and when I first began teaching, about taking up skating again. I had a bad time in graduate school and that dream, distant as it seemed, was one of the things that sustained me through that difficult period.

I bought the magazine with the picture of Lee Anne Miller and decided that I should begin taking skating lessons.

I had intended to take freestyle lessons but my first teacher steered me gradually toward dance, divining, I suspect, that I would be a much better dancer than I would ever be a freestyle skater. Dance is probably better for most adult skaters anyway because there is less chance of serious injury and a much greater chance of gaining something approaching genuine mastery of the sport. There are quite a few adult skaters who are expert dancers. They have become my role models.

I’m never happier these days than when I am skating. Skating is the only thing I do now for no other reason than the joy of it. It will not make me wiser. It will not help my career. Indeed, for an adult to take up figure skating is viewed by many people, including my husband (who, to his credit, has taken it up himself in order to be able to spend more time with me), as somewhat bizarre. Skating is popularly believed to be an activity for children not for older people, people with brittle bones.

When I’m done skating my session, the “adult session,” and the ice has been freshly resurfaced, I will sometimes stay to watch the beginning of the next session when the competitive skaters, one by one, take to the ice like so many seagulls gathering gradually about an invisible school of fish. They glide easily onto the frozen surface. Flying past me, they swoop, they dip, they dive, each listening to his own inner compulsion. There’s no effort at coordination, and yet they’re a kind of visual symphony, as beautiful as a flock of birds, if not more beautiful, because after all, what birds do is natural to them, whereas what skaters do is natural only to the spirit, not to the body, so to see bodies do it with such effortless grace, well, there’s something miraculous in it.

I am filled sometimes, as I watch them, with a terrible aching melancholy at the realization that I will never be one of them. There’s a tiny window of time in everyone’s life through which he can reach to grasp that sort of dream and mine was closed and locked long ago. Sometimes I can’t bear the ache that accompanies the realization that what I once wanted more than anything, I will never have, that I will have lived and died without ever having realized that dream.

Most of the time though, I am not unhappy. Most of the time I count myself very lucky. Many competitive skaters give up skating entirely after they stop competing, or after they stop performing (if they are so fortunate as to have had a professional career). Some say they simply don’t enjoy skating when they can no longer perform at what was once their peak, others have had all desire to skate extinguished by too many years of too rigorous a training schedule. They accept the diminished vitality that comes with aging as a matter of course. They age, they grow old, they die.

But I am growing younger. I’m a better skater now than I was when I was a child and I have every reason to believe that my skills will continue to improve for many years to come. Oleg and Ludmila Protopopov, the 1964 and ’68 Olympic pair skating champions, still perform and they are in their eighties. Richard Dwyer still performs, and he, too, is in his eighties.

I don’t know that I’ll ever don a little skating skirt again and my dreams, whatever they are, no longer include becoming a competitive skater. When I skate now, though, I feel like a time traveler. Something of the beauty of the slow and paradoxical summers I spent on the ice as a child comes back to me. I sense again the sweet strangeness of crossing the green expanse of park to get ice cream and then returning to the frosty unreality of the rink. When I skate now all the struggles, stresses and disappointments of the years fade away and I am once again the little girl gazing up at the stars falling from the sky.

The War on Fairness

It’s rare when a person does something that is at once so idiotic and so heinous that it brings discredit upon his entire profession. I fear philosopher Stephen T. Asma has done this, however, with his new book from the University of Chicago Press. I’ve bragged for years to friends and relatives that the philosophy curriculum at the graduate level is so rigorous that it weeds out the kinds of morons who all too often are able to make it through other Ph.D. programs. Not everyone with a Ph.D. in philosophy is a transcendent genius, I’ve conceded, but there’s a basement level of analytical acuity below which philosophers simply do not go.

I stand corrected. Stephen T. Asma’s article, “In Defense of Favoritism,” excerpted from his book Against Fairness (I’m not making this up, I swear) is the worst piece of incoherent and morally reprehensible tripe I think I’ve ever read in my life. I endeavor, as a rule, not to read crap, but I was intrigued when I saw the title of Asma’s article in the headlines I receive every day from The Chronicle of Higher Education. Clever hook, I thought! It seemed obvious to me that few people would undertake a genuine defense of favoritism and that the Chronicle would certainly never publish such a thing, so I was curious to find out what the article was actually about.

Well, it’s just what it says it is–it’s a defense, or an attempt at a defense anyway, of favoritism. I say “an attempt” at a defense because favoritism is considered by most people to be indefensible, and with good reason.  “Favoritism,” as distinguished from the universally human phenomenon of having favorites, is defined by the Oxford English Dictionary as “[a] disposition to show, or the practice of showing, favour or partiality to an individual or class, to the neglect of others having equal or superior claims; undue preference.” It’s the qualification of the preference as “undue” that’s important here.

There’s nothing wrong with wanting your niece or nephew, for example, to get that new tenure-track position in your department, but there’s a whole lot wrong with giving it to them, or giving them preferential treatment in discussions of who should get it, simply because they are your niece or nephew. Ditto for your favorite grad student. To want someone you care about to succeed because you care about them is perfectly natural. To ENSURE that they succeed over other, and possibly better qualified, people simply because you care about them is wrong. That’s what favoritism is though.

I thought at first that Asma might simply be confused about the meaning of “favoritism,” that what he was actually trying to do was to defend the view that there’s nothing wrong with having favorites, that what philosophers refer to as “preferential affection” is simply part of human nature and not something anyone should ever feel guilty about. The further I got into the article, however, the clearer it became that Asma was indeed trying to defend undue preference.

The piece, as Kierkegaard would say, is something both to laugh at and to weep over in that it’s such an inept piece of argumentation that it’s hilarious while at the same time being profoundly morally offensive. That Asma’s opening is, as one reader observes in the comments following the article, “irrelevant to his point” is the least of his crimes against sound reasoning.

“Fairness,” asserts Asma, “is not the be-all and end-all standard for justice,” thus positioning himself as a sort of imbecilic David over and against the Goliath of John Rawls whose theory of justice as fairness is much admired by philosophers. There’s nothing wrong, of course, with taking aim at intellectual giants. It helps, however, when one does this, to have a good argument.

But Asma does not have a good argument. It’s impossible to give a developmental account of Asma’s argument because it has little that resembles a structure. Instead of starting with premises that he carefully arranges to lead the reader from assumptions he already holds to a conclusion the inevitability of which he is finally compelled, if not actually to accept, then at least to concede as probable, Asma presents a mishmash of irrelevant, incoherent, and equivocal non sequiturs that litter the page like toys strewn about a room by a child rooting impatiently through his toybox for the one cherished toy he cannot find. And what is Asma’s cherished toy? Why it’s favoritism! Asma is determined to prove that favoritism is, in his own words, “not a bad thing.”

The upshot of Asma’s rambling argument is that the tendency toward favoritism is part of human nature. This is regrettably true. It makes us feel good when we promote the interests of those we love. Just because something makes us feel good, though, doesn’t mean that it’s ethical. The conflation of these two things is known in philosophy as “the naturalistic fallacy.” Asma ought to know this because he is a philosopher. How he can make such a fundamental mistake is mystifying.

The article begins with Asma recounting a scene with his son who is complaining because Asma will not allow him to play a game that involves the killing of zombies because he, Asma, feels his son is too young for that sort of game. “That’s sooo not fair!” his son protests. Instead, however, of using this occasion as the inspiration to write a book for children that will help them to better understand the meaning of the word “fair,” Asma takes his toddler’s grasp of the term, equates it erroneously with “egalitarianism” and decides to write a philosophical treatise (for adults) discrediting both.

Asma then turns to an examination of what he asserts is the virtue of generosity. What he actually describes, however, is not what most philosophers would identify as a virtue (which, according to Aristotle, for one, requires cultivation), but a natural inclination, found in varying degrees in various individuals, to share what one has with one’s friends–and only, he is careful to explain, with one’s friends. But the fact that most people enjoy sharing what they have with their friends does not make this inclination into a virtue. To equate a natural inclination, in this way, with a virtue is, once again, an expression of the naturalistic fallacy.

The child in Asma’s example gives all her candy to a few friends over the protestations of classmates to whom she has a less passionate emotional attachment. “But the quality of her generosity,” asserts Asma, “is not compromised by the fact that she gave it all to her five friends.” This flagrantly begs the question, however, because there is a sizable contingent of humanity that would contest such a definition of “generosity.” Sure, if you define sharing with only your friends as “virtuous,” then you won’t have a hard time defending favoritism because sharing with only your friends is the same thing as favoritism and far from seeing it as a virtue, most of humanity would see it as downright nasty.

And that isn’t the only problem with conflating inclinations and virtues. How about sharing with your friend when you have good reason to believe that that friend is going to use what you’ve shared with him to further some nefarious purpose he may have? Is that virtuous? Plato talks about that problem in the Republic. Is it possible that Asma, a philosopher, hasn’t read the Republic?

My heart sort of goes out to Asma at that point, though, because he seems to be contrasting the child who shares with only her friends with a child who refuses to share any of his candy with anyone–ever. But that’s not just greedy, it’s pathological and anyone who fails to recognize this must have had a very wretched childhood indeed. To Asma’s credit, he acknowledges that his argument is “counterintuitive.” Readers will find themselves wishing, however, that Asma hadn’t been so dismissive of his intuitions.

Asma erroneously asserts that the activities of those in the civil rights and feminist movements, for example, are expressions of favoritism and tribalism. That’s a fair charge to level, I suppose, against black supremacists, and perhaps against radical feminist separatists, but the two examples Asma cites, Rosa Parks and Susan B. Anthony, hardly fall into those categories. It’s not favoritism to demand rights for one’s group that are equal to those of the rest of society. Only fighting for more rights, or for preferential treatment, could be characterized that way.

Perhaps it’s the term “equal” that throws Asma off. He seems to have a particular aversion to it. He refers, for example, to what he claims is “American hostility to elitism,” but the example he gives is not one of anti-elitism, which would be hard to find in our culture, but one of anti-intellectualism. That is, he points out that “politicians work hard to downplay their own intelligence and intellectual accomplishments so they might seem less threatening (less eggheadish) to the public.”

We’re not hostile to elitism in the U.S. though. We’re the most thoroughly elitist society in the economically developed world. Everything from our systems of taxation, education, and health, to our system of criminal justice is set up to favor the wealthy elites.

Asma cites several studies that show that what is called “ingroup bias” appears to be inherent in human nature and uses this fact to support his position that favoritism is therefore “not a bad thing.” That something is inherent in human nature does not, however, entail that it is morally acceptable. There are all kinds of unfortunate tendencies in human nature that parents, societies, and finally civilization itself endeavor to control, tame, and even in some cases eradicate.

Asma’s whole defense of favoritism is not simply an expression of the “naturalistic fallacy” referred to above. To the extent that he tries to defend favoritism by arguing that it’s innate, he’s also guilty of conflating an “ought” with an “is,” the mistake Hume famously identified and which is now known as the “is-ought” problem. That is, it is a misguided attempt to draw inferences about the nature of moral obligation (i.e., how people ought to behave) from observations about how people tend to behave (i.e., how they do behave), when the two are qualitatively different and must be kept rigorously distinct.

Asma returns, at the end of the article, to the example of children. He appears to have hopped on the bandwagon of pseudo-intellectuals who have begun to express concern that we are being too nice to our children. It seems Asma’s son came home one day with a ribbon he’d “won” in a footrace, but Asma’s pride dissipated when his son explained that all the children had “won” the race, that they’d all been given ribbons. “I don’t want my son, and every other kid in his class,” protests Asma, “to be told they’d ‘won’ the footrace at school just because we think their self-esteem can’t handle the truth. Equal rewards for unequal accomplishments foster the dogma of fairness, but they don’t improve my son or the other students.”

Leaving aside the fact that Asma has once again evinced a toddler’s simplistic, and hence erroneous, definition of “fairness,” there’s something comically fantastical about his apparent fear that today’s youth are in danger of living out their lives in blissful ignorance of their own weaknesses and inadequacies. The likelihood that admissions to elite universities will suddenly become merit-blind, or that we will cease keeping statistics on the accomplishments of professional athletes, seems vanishingly small, and the only professions that seem openly to embrace the conspicuously inept are those in the financial industry.

Sadly, children will learn all too soon that there are winners and losers and that the former are rewarded while the latter are not. Not only does it do no harm to stave off that realization as long as possible, it may actually do a great deal of good if it helps us to teach children that their worth as individuals is not dependent on their bettering their peers in contests. Not everyone can be a winner. Most people have to content themselves with being also-rans. If we can teach children early that the also-rans are to be lauded as an essential part of the race (after all, there is no race without them), then we might actually help to increase the number of people who are able to live happy and fulfilling lives.

Asma’s fears are not restricted, however, to the specter of a utopian future for his progeny. Even while wealth is increasingly transferred to a dwindling minority of the American population, Asma is tortured by feverish nightmares of creeping socialism. “Liberals,” he asserts, “say ‘fairness’ when they mean ‘all things should be equal,’” as if we in the U.S. stood in imminent danger of sweeping political reforms that would make the social-welfare states of Northern Europe look like Czarist Russia by comparison.

What’s disturbing is not so much Asma’s argument as the fact that it found a reputable (or at least once reputable) academic publisher and that it was actually excerpted in The Chronicle of Higher Education. Noam Chomsky said somewhere that despite all the atrocities he had spent a large part of his life chronicling, he believed humanity was making moral progress. You don’t see moral defenses of slavery anymore, he pointed out, whereas you did see such things in earlier periods of human history. Yes, maybe that’s true. But if we’ve regressed to the point that it’s now socially acceptable to publish moral defenses of favoritism, and attacks on fairness, can defenses of slavery be far behind?

This piece originally appeared in CounterPunch on 11/19/2012.