On Collective Guilt

We can’t leave the Holocaust alone. That might be a good thing if we had the courage to view it honestly. We don’t though. We insist that it’s a puzzle we continue to try to solve, ostensibly so that we will know where to place blame, and in that way also know how to ensure that it will never happen again. We refuse, however, to place blame where it really belongs and so we keep turning it over and over, searching for something we will never find.

“Why the Germans? Why the Jews?” are questions that Götz Aly takes up in a new book whose title begins with them (Metropolitan Books, 2014). Aly’s theory, not particularly novel, is that the social and economic advances made possible for Jews in Germany as a result of a series of legal reforms in the various German states in the eighteenth and nineteenth centuries made them objects of envy. “Not all Nazi voters,” acknowledges Christopher R. Browning in a review of Aly’s book, “were anti-Semitic, but they at least tolerated Nazi anti-Semitism” (“How Envy of Jews Lay Behind It,” The New York Review of Books, January 8, 2015).

“But how to explain,” Browning continues, “this ‘moral insensibility’ and ‘moral torpor’ of 1933-1944, which underpinned the ‘criminal collaboration’ between the German people and the Nazi regime?” The answer Aly first offered, in Hitler’s Beneficiaries (Metropolitan Books, 2005), was material gain. Aly’s new work supplements the motive of material gain with a “new morality” involving race theory that would justify such collaboration.

Many Germans remained unconvinced, however, by the new race theory. Many Germans were, in fact, untroubled by the legal reforms that had made possible the flowering of the Jewish middle class. Many Germans had even championed these reforms.

What happened to those people?

The journalist Ruth Andreas-Friedrich, who lived in Berlin during the war, gives us some insight into what happened to them in the diary she kept from 1938 to 1945. Initially, at least, they were not helping the Nazis. Her entry for November 10, 1938, the day after the infamous Kristallnacht, gives moving testament to that fact. At half past nine in the morning Andreas-Friedrich took a bus to her office. “The bus conductor looks at me,” she writes,

as if he had something important to say, but then just shakes his head, and looks away guiltily. My fellow passengers don’t look up at all. Everyone’s expression seems somehow to be asking forgiveness. The Kurfürstendamm is a sea of broken glass. At the corner of Fasanenstraße people are gathering–a mute mass looking in dismay at the synagogue, whose dome is hidden in a cloud of smoke.

            ‘A damn shame!’ a man beside me whispers … [W]e all feel that we are brothers as we sit here in the bus ready to die of shame. Brothers in shame; comrades in humiliation (Berlin Underground 1938-1945 [Paragon House, 1989]).

When she gets to the office, her editor, who she observes was “rumored to have a tinge of Nazism,” says “one doesn’t dare look people in the eye anymore” (p. 21).

“They’ve dragged them all away–all the Jewish men they could get hold of,” begins her entry for the next day.

Only those who were warned in time have escaped the raid. Thank Heavens, a good many were warned. Hundreds managed to disappear at the houses of friends; hundreds sought shelter with strangers and found it. One little seamstress took in two Jewish fugitives; she didn’t even know their names or where they came from. Workingmen in the Frankfurter Allee brought back to the Jewish shop-owners the merchandise that was scattered over the street. They didn’t say a word, just tugged sheepishly at their caps. The chief surgeon of a hospital is hiding a wounded rabbi in the back room from the bloodhounds of the Gestapo.

            While the SS was raging, innumerable fellow Germans were ready to die of pity and shame (p. 25).

The next line of the translation reads “Almost all our friends have people quartered on them.” If one goes to the original German edition of the diaries, however, the text continues

Women are dashing about the city today with mysterious bundles under their arms, meeting one another on street corners: Shaving articles for Doctor Weißmann. A clean shirt for Fritz Levy, night things for Jochen Cohn. One tries, as much as possible, to look after those in hiding. It isn’t advisable for them to come out of hiding yet. What happened yesterday could continue today (Der Schattenmann [The Shadow Man], Suhrkamp, 2nd ed. 2012, p. 38).

Then comes the line “Almost all our friends have people quartered on them.” There is no ellipsis to indicate material was omitted. One could argue it doesn’t matter because what makes it into the translation makes clear that the general reaction of Berliners to Kristallnacht was one of horror. Still, the omitted material makes even clearer how widespread among gentiles was sympathy for the plight of the Jews.

Interesting, eh? People running about the city collecting the necessary articles for friends, and in some cases even strangers, whom they’re protecting. Jews being given shelter by countless German gentiles. Workmen returning to Jewish shop-owners merchandise that had been scattered on the street. What happened to those countless Germans who were sympathetic to the plight of the Jews, to those countless “brothers in shame”?

What do you think happened to them? What happens to people who try to help others as it becomes increasingly clear what such assistance might eventually cost them? Some continue despite the danger, some join resistance groups such as “Uncle Emil,” the one with which Andreas-Friedrich became associated, but most do not.

Andreas-Friedrich “looks lovingly” at the man who whispers “A damn shame!” at the sight of the burning synagogue.

“It occurs to me,” she writes, “that this is really the time to call your neighbor ‘brother.’ But I don’t do it. One never does; one just thinks it. And if you really do pluck up the courage for a running start, in the end you just ask, ‘Pardon me, could you tell me the time?’ And then you are instantly ashamed of being such a coward” (p. 19).

Why couldn’t she do it? Why couldn’t she acknowledge to the man that she also condemned what had happened the night before? Why couldn’t any of the people on the bus who were hanging their heads in shame, in silent shame? Why doesn’t one do it?

Years ago I saw a nature program that focused on a litter of wolf cubs. There were three cubs in the den. One emerged, however, days before the other two. He was bold, he was courageous. He was eager to explore the outside world. Ah, I thought to myself, he will be the alpha wolf. He will grow up to be the leader.

One day, though, the brave little cub came home from his explorations with an injured foot. He left again the next day, undaunted by his grisly experience of the day before, but that evening, he did not return. He never returned again. Who knows what had gotten him, but something clearly had.

Several more days passed after the disappearance of the first little cub before the two remaining ones peeked out, trembling, bodies pressed together, from the mouth of the little den. Another day still passed before they had the courage actually to emerge fully from the shelter of their home.

And suddenly I understood why human beings are such a miserable craven lot. Natural selection has ensured that cowardly individuals have a higher survival rate than courageous ones. They live longer, produce more offspring. So it isn’t our fault, really, that we’re such a miserable, craven lot. It’s in our genes.

And yet it is our fault because cowardice isn’t the only thing that’s in our genes. We have somehow also evolved a conscience. We know, as Aristotle expressed it in the Nicomachean Ethics, that there are things we ought rather to “face death” than do (Book III 1). And yet few of us have the courage to face death to do the right thing. Few of us even have the courage to say “brother” to another who affirms the values we purport to hold dear.

Elizabeth Kolbert writes in the February 16th issue of The New Yorker that the Germans “failed miserably” to draw a line between the innocent and the guilty after the war. She writes, in fact, that to say they “failed miserably” would be “generous” (“The Last Trial”). That’s true, of course, though in a different sense, I think, than the one Kolbert meant, because the line, drawn properly, would encircle us all, all except for the few whose willingness to martyr themselves to do the right thing places them not outside the group, but above it.

We are all guilty of the cravenness that paved the way for the Holocaust, the glass through which we keep seeing darkly, which we keep turning over and over in a vain attempt to escape our own reflection. If we had the courage to recognize ourselves in it, then perhaps we could learn from it. But courage, sadly, is precisely what we lack.

(This piece is dedicated to my dear friend and German tutor of many years, Ebba Mørkeberg, 1924-2014. It originally appeared in the Feb. 17, 2015 issue of Counterpunch.)

Material Culture

I like things. I’ve always been like that. I’m acquisitive. I have so much stuff that I routinely have to go through it to get rid of some of it. I used to feel guilty about my acquisitive tendency, but I’ve become reconciled to it in the last few years.


I’m fascinated by things. I don’t remember when I first started haunting thrift stores, junk shops, salvage places, but I know that I was very young. There was something compelling to me about things that were old, things that had been so well made that they’d survived when shoddier things would have had to be thrown away, something fascinating about what time, and much handling, does to things. I liked the sheer variety of utilitarian things because they seemed to me to be a concrete expression of the ephemera of human experience, of the daily grind of, for example, working in an office. I like old office supplies, the heavy old Royal typewriters like the one my father used when I was a child, old staplers, hole punches, mechanical pencils. I wonder, always, whether the offices where these things were once so useful still exist, if people are still working there, or if perhaps they’ve been torn down.

I especially like old fountain pens. This is partly, I think, because I write so much and partly because I come from a family of writers. I’ve used a fountain pen for as long as I can remember and have been collecting them for at least twenty years. When I say “collecting,” I don’t mean that I’m stockpiling rare and expensive pens. Firstly, I don’t have the money to do that. Secondly, I buy pens to write with, not to look at.

I’ll go through long periods where I won’t buy any pens, but then I’ll start buying them again. Not many, just one or two. I couldn’t afford to buy more than that because they’re not cheap. In the beginning, I told myself that I was searching for the perfect pen and that when I found it, I’d stop buying pens. But I could never content myself with a single pen for long, no matter how nice it was. You might think that perhaps my standards are too high. It isn’t that, though. I love all my pens (the ones I have kept, anyway; the others I sell). It’s not that I become disappointed with the pens I have. It’s that I want more. I crave variety.

I’ve bought and sold many pens online. I’d never been to a pen show, though, until last weekend when there was one in Philadelphia, where I live. I didn’t need any new pens, of course, but I did need work done on a couple of pens I already had, and I needed someone to show me how to fill my Parker Vacumatic. The Vacumatic has a unique filling system that is not used anymore and is difficult to figure out. I knew there would be someone at the show who could explain it to me, so I packed up my Parker and headed for the Sheraton on 17th Street where the show was being held.

It seemed everyone in the pen world was there. There were lots of dealers selling both new and vintage pens, along with other writing equipment and arcane sorts of office supplies and pen ephemera. I was in heaven! I caught just a snippet of a conversation as I was wandering from one table to another. One of the dealers was talking to a customer:

“You have to have passion for something,” he said. “Passion is what makes life worth living!”

I suspect everyone there would have agreed. Most of the interest was in the old pens, the ones that had seen lots of use but which had been so magnificently conceived and constructed that they now, almost a century later in some cases, could still be used.

The biggest attractions at most pen shows, though, are not the pens. Anything can be bought online now, even the oldest eyedropper pens from the nineteenth century. No, the biggest attractions are the nibmeisters, the guys who repair old pens and, in particular, custom grind nibs. They can take a medium point and grind it down to a fine, put an angle on it for what is called an “oblique” nib, turn it into a “stub” or an “italic.” You don’t have to go to a show to get a custom-ground nib. You can mail a pen off to be reground. That takes a while, though (sometimes months), and, more importantly, you might not be entirely happy with the results. The nib might be too fine, or it might be scratchy. You won’t know, though, until the pen comes back, and if you aren’t entirely happy with the regrind, you’ll either have to settle for less than what you wanted or pop it back in the mail again and endure another long wait. If you get a nib ground at a show, on the other hand, you can sit there while the work is being done. You can try it out and give it back immediately to the nibmeister for an adjustment if you aren’t entirely happy with it.

I had a nib I wanted worked on. I’d started collecting Pelikan pens. I love Pelikans for several reasons. First, they hold more ink than most other pens, and second, they feel very good in the hand. Third, they have what are called “responsive nibs,” meaning nibs that give somewhat if pressure is applied to them so they feel just a little like how old quill pens must have felt. Their nibs tend to be slightly broader, however, than the nibs on the pens I was used to, so I had brought my latest Pelikan to see if something could be done to it to give it a crisper line. I asked James Baer, of Monomoy Vintage Pen in Newton, MA. He’d shown me how to fill my Parker Vacumatic, so I thought he might also be able to help me with my nib. Unfortunately, he wasn’t doing any grinding at the show. He directed me instead to a young man named Tim Girdler sitting just a few tables away.

I had to put my name on a waiting list and then wander around the show until my turn came.

“What do you want?” he asked like a cook at a lunch counter. I had a Pelikan, I explained.  It had a “fine” point, but I’d started with Japanese pens and their “fines” were much finer so the Pelikan seemed a little “blobby” to me. I told him that I’d like him to put a slight slant on the nib, to make it into an “oblique” because I knew that would give me a crisper line.

He asked me to write something so he could see how I wrote, the angle at which I held the pen. After I did a little writing sample for him, he said he didn’t think that an oblique was really what I wanted.

“Let me try something,” he said. He took the pen and began to rub the nib on a little piece of emery paper.

“Now try it,” he said as he handed it back to me.

It was better, but still not what I wanted, so I gave it back to him and he got out his little motorized grinding stone so that he could work more aggressively on it than the emery paper would allow.

“How’s this?” he said eventually, and then added “you’re going to hate it.” I knew why he’d added that. He’d flattened out the nib like a stub, but he hadn’t bothered to smooth it yet. He wanted to see if I liked the line quality of the new grind before he put in all the effort to make it write smoothly.

He was right, I did hate it. And yet, and yet, it had just the line quality I’d been looking for. The only problem was that it was scratchy, but I knew that was fixable.

“I like it!” I exclaimed, a smile spreading across my face, “except, of course, for the fact that it is very scratchy.” I gave it back to him and he went to work on it again with the emery paper.

When he finally gave it back to me it was perfect. I mean it was really perfect! It had exactly the line quality I wanted, and it wrote very smoothly. He cautioned me that it would never write quite so smoothly as a regular nib. That’s the thing about italic nibs, he explained, they aren’t scratchy if they’re ground properly, but they aren’t so smooth as a standard nib either. There’s a sweet spot on them, he explained. You have to hold the pen at exactly the right angle or you’ll get a little drag. A standard nib is more forgiving, but it’s also less distinctive.

He said I had sixty days, or something like that, during which I could still have adjustments made, so I took his card, just in case. I’d heard him say to the person before me that he’d been a seminary student. I was surprised, though, when I looked at his card, to read “Tim Girdler Pens: Ministering Through the Perfect Pen.” He had made my life better, I realized, through the work he’d done on my pen, through the concern he’d shown for what I wanted.

I’m so happy with my “new” pen, I’m just writing and writing, so happy now to have my pen just the way I want it. Passion is indeed what makes life worth living. Tim Girdler has it. It must have taken him years to become so skilled as he is now, skilled at something that he must always have known would never be in great demand. Tim Girdler has passion and he uses it to make people’s lives better. He isn’t the only pen person who cares for more than his own material wellbeing. Rick Propas, of “The PENguin Fountain Pens,” once offered to send me a whole tray of pens to practice repairing–for nothing, just because he could tell I shared his passion. I’d never met him either, but only corresponded with him via email.

Passion is part of what attracts me to things. You can see it in the design of things, in how they’re finished, in the attention to detail. You can see, in things, the shape of human aspirations. There’s a humanity that pervades things made by human beings for human purposes. This is even more evident, I think, in things that have been used. That’s why I like the patina of use. It shows the humanity of the thing, or the humanity that clings to it. I can’t get enough of that, or enough of the things that speak in that way of other people’s lives.

So I keep buying things, especially old pens. There’s a line I love in Tennessee Williams’ Cat on a Hot Tin Roof. Big Daddy, who has just learned he’s dying of cancer, is talking to his son about mortality. “The human animal,” he explains, “is a beast that dies and if he’s got money he buys and buys and buys and I think the reason he buys everything he can buy is that in the back of his mind he has the crazy hope that one of his purchases will be life everlasting!”

I wonder if it isn’t life everlasting that I’m looking for. I feel sometimes, when I’m trawling through other people’s discarded possessions, or sitting at my desk surrounded by things I know once belonged to people now long dead, that life everlasting is what I have in my things. It’s as if I’ve taken the lives of these other people up into my own, as if, in that way, I’ve created a timeless connection between us, a timelessness that is a little eternity of its own.

(This essay originally appeared in Counterpunch under the title “Living in a Material World.”)

Sport and the Sublime

The following piece originally appeared in the 25-27 January 2013 edition of Counterpunch. I am posting it here in honor of the 2014 Winter Olympics that have just gotten underway in Sochi, Russia.

I watched a lot of TV as a kid. That was before cable, so finding something interesting could be challenging. I was channel surfing one day when I happened on some diving. I didn’t know anything about diving, but even people who don’t know anything about it can appreciate the beauty of it. There was nothing better on, so I decided to watch for a bit.

One diver after another came on the screen and executed what seemed to be perfect dives. But then, suddenly, there was Greg Louganis. There’s a video of Louganis on YouTube that begins: “There are two categories of divers, those who perform with magnificent skill, grace, beauty, and courage–” there’s a pause and the narrator’s voice drops an octave, “then there is Greg Louganis.”

That pretty much sums it up. I was watching all these divers who seemed perfect, and then suddenly there was Greg Louganis. He wasn’t just perfect–he was sublime. I didn’t know anything about diving and yet watching Louganis gave me the feeling Emily Dickinson reportedly said one gets from good poetry–it made me cold to the bone. It gave me that shiver of the numinous that Rudolf Otto talks about in The Idea of the Holy.

That was a defining moment in my life. It was, I believe, when I first realized that there was more to reality than what appears on the surface of experience. Louganis executed the same beautiful movements as all the other divers, and yet there was something more in his movements than in everyone else’s. Something ineffable and yet so powerful; it hit the spectator with the force of a blow, like the shock of electricity. It seemed as if there were more energy in every fiber of his being, more vital life force. It was as if he were more real than the other divers, as if the other divers had been only moving images, whereas Louganis was a man in the flesh. Except that the other divers had been real. So Louganis seemed somehow to have more reality than the others.

*

I saw the same thing a few years ago in person. I’d just started taking figure skating lessons and used to go to competitions to cheer on a little boy whom my teacher was coaching. I stayed, once, to watch the next competition for slightly more advanced boys. One of the skaters caught my eye during the warmup. He was doing a very simple move, one I was trying, in fact, to learn myself at that time. It’s called “edges with three turns” and involves the skater making large arcs across the ice on alternating feet with a turn in the middle from forward to backward so that the tracings left on the ice look like a series of elongated number threes facing in opposite directions. It’s a simple looking move, yet it’s very difficult to perform well because, after the turn, the skater’s shoulders have a tendency to continue to pull him in the direction of the turn. If this motion is not checked, then it will be almost impossible for him to step forward again into the next arc. The shoulders and hips have to turn independently of each other, and the skater has to have a considerable degree of control over his upper body to keep the motion of the shoulders in check.

This boy, the one I was watching, can’t have been more than 14 years old, but he had the serene self-possession of a dancer at the barre. His movements were slow, deliberate, and exquisite. I’d never seen anything like it. Not only did he have perfect form, he had perfect concentration. Other skaters raced past him, but he was so absorbed in what he was doing he seemed not to notice them. It was almost as if he were out there alone, as if the other skaters had been reduced to shadows. I could not take my eyes off him.

*

The idea that there are degrees of reality will seem strange to most people nowadays. It was a familiar one, however, to medieval and early-modern philosophers. For the medievals, things that were dependent on other things for their existence had less reality than did the things on which they were dependent. People, for example, had less reality than God. God had created people, hence people were dependent for their existence on God, whereas God’s existence was absolutely independent of anything else. God was the ultimately real thing, the thing with the greatest degree of reality, the thing that was more real than any other thing.

Kierkegaard also appears to have appropriated this idea of degrees of reality. Human beings, according to Kierkegaard, begin as ideas in the mind of God. The telos of an individual human life is therefore to bring the substance of that life into conformity with the form God conceived it should have. That’s what Kierkegaard means, I would argue, when he asserts that we must become who we are. We must become concretely who we are for God abstractly.

Most people, and that includes most athletes, don’t do that. Rather than striving to instantiate the ideal of their uniqueness, they constantly compare themselves to other people and try, in effect, to be better at being those people than those people are themselves.

There’s nothing wrong with competition. Competition can push athletes to higher levels of performance than they might otherwise achieve. What has not been adequately articulated, however, is precisely how this works. Competition improves performances, I would argue, only when athletes strive to instantiate a transcendent ideal that no particular performance can ever adequately instantiate. An athlete who strives in this way to instantiate an ideal provides a glimpse into the essence of that ideal that can spur on others in their own pursuit of it.

That’s a very different sort of phenomenon, however, from that of one athlete effectively copying another in the belief that he can do what the other has done better than the other did it himself. That kind of competition is inherently frustrating for the athlete in that he is trying to be something he’s not, and boring for the spectator in that he’s being subjected to what are effectively a bunch of imitations. When athletes strive only to win, rather than to be the best that they can be in their chosen sport, the reality of all the participants in a competition is diminished. Each becomes merely a copy of the others, and the ideal, which in a sense is more real than is any particular attempt to instantiate it, is lost sight of.

 

The idea that there are degrees of reality provides us a way to explain something that is otherwise inexplicable–greatness. Philosophers distinguish between quantitative and qualitative differences. A thing can be more or less blue, for example, in a quantitative sense. To be red, on the other hand, is to be something else entirely. Red is qualitatively different from blue.

A performance that is great is not distinguished from other performances in a merely quantitative sense. There’s something more to it that sets it apart. Greatness is qualitatively different from skill, even the most highly refined skill. It’s possible to execute a movement in a manner that many would judge to be technically perfect, and yet to be uninspiring. Conversely, it’s possible to deviate from universally accepted standards of performance and yet move an audience more profoundly than someone who is merely a consummate technician.

Part of this has to do with passion, but it is not reducible to passion. Passion is necessary for greatness, but it’s not sufficient. Passion is a natural attribute. Some people have more, others have less, just as some people have more or less patience than other people. Greatness, on the other hand, is not a natural attribute. A great artist, as every great athlete is, has to be passionate, and yet he also has to be more than that. He has to have a gift. That’s why greatness is edifying. It bursts the confines of the temporal-phenomenal world, provides us with a glimpse of something that is transcendent. There’s a spark of divinity to it.

That’s why the sport/art dichotomy is false. All great athletes are artists. They give us glimpses of the sublime by bringing into their performances something more in a qualitative rather than a quantitative sense. That’s why it’s wrong for athletes to strive merely to win. It’s not simply that striving to win, as Aristotle pointed out, is misguided in that winning is something over which one has no direct control. To strive to win is to aim for the quantitative rather than the qualitative, and that is inherently limiting. Athletes who strive to be the best they can be at their chosen sport rather than simply to win this or that contest are pursuing something transcendent. That’s ennobling, both for the athlete and the spectator.

Why then is winning so important? Because it is more obviously valued than is being sublime. It takes less energy, less effort, less engagement on the part of the spectator to be caught up in a contest than to be caught up in a performance. We can follow a contest with only half, or even less, of our attention. To follow a performance, on the other hand, is energy intensive. Human beings, like every other living creature, like to conserve energy. Contests are a way of doing that. We are told who the winner is rather than having to determine that for ourselves. To follow a performance, in contrast, requires us to be fully present in the moment, to bring all our capacities of attention and discrimination to the fore.

When we do that, when we truly follow the performances of athletes, we sometimes find that the superb performance is not always the one that wins. There are a variety of reasons for that. Sometimes reputations of athletes unduly influence scores. Other times the scoring systems themselves are simply too arbitrary and opaque to ensure that the best performance wins. Finally, scores are sometimes manipulated to ensure that particular athletes win, independently of how well they perform.

All of these reasons are traceable back, however, to a suspicion of the ineffable. It’s ultimately impossible to articulate what makes a performance great, and not everyone is an equally good judge of greatness. So in the service of fairness, we attempt to construct a set of objective criteria for evaluating performances, and the performance that best satisfies these criteria is the one we call “the winner.”

 

*

The name of the skater I saw a few years ago is Alexander Aiken. I tried to follow his career for a while. If there were a competition in the area I would go in the hope of seeing him, and I would look for news of his results in Skating magazine, the official publication of U.S. Figure Skating, the governing body of the sport. I eventually lost track of him, however, as my interest in the sport waned. The new judging system has imposed a level of conformity that is increasingly making skating boring to watch, and the perennial problem of inequities in the judging too often makes the results of competitions an offense to the fair-minded.

I quit following competitive skating. I continued to skate myself, though,  because it is the only real exercise I get. When I arrived in Jacksonville, where my husband teaches and where I spend half the year when I am not teaching in Philadelphia, I was surprised to find that a very advanced skater had recently begun to train there. I noticed him as I entered the rink and stopped to watch him for a few minutes. Something about him looked familiar. And then I realized who it was; it was Alexander Aiken. He was older, of course, than he had been the last time I’d seen him, but his looks had otherwise not changed much. I think it was less his face, though, than his skating that caused the shock of recognition to run through me. His skating is distinctively beautiful.

I could hardly believe the coincidence of his showing up to train in Jacksonville. I’d first seen him in Philadelphia and had learned then that he was from Atlanta. What, I wondered, was he doing in Jacksonville? I went over and introduced myself when he finally got off the ice. I told him how I’d seen him years ago and had been impressed with his skating. He smiled and thanked me politely and continued unlacing his skates. I learned later, from his girlfriend Michelle Pennington, who is a former competitive ice dancer and one of the instructors at the rink, that he’d moved to Jacksonville to live with a sister whose husband was in the military and was stationed there.

We skated together, Aiken and I, the sublime and the ridiculous, through the end of the summer and into the early fall. It was wonderful. Most of the time, we were the only two people on the ice. I was concerned that my presence might interfere with his training, but it was wonderful to be able to observe a great athlete so closely, and he went out of his way to make me feel welcome. Aiken brought a better face to the sport than the one I had seen of late and that helped bring back the joy I had earlier taken in it.

I was excited to have someone to cheer on again in competitions. Aiken was going places. He’s not just supremely graceful; he has enormous athletic ability. He’s able to land triple axels solidly and consistently, the jump widely considered to be the most difficult in the whole sport.  He won the bronze medal at the 2011 national figure skating championships in the Junior Men’s division and had competed at the Senior level for the first time last year. He hadn’t placed terribly well, but that’s how the sport works. Skaters are rarely allowed to place well their first year in “seniors.”

 

The nationals are this week in Omaha. The senior men compete on Friday and Saturday. You won’t see Aiken there though. He’s been plagued over the last few years, as so many skaters are, by the astronomically high costs of training. The stress of that has taken its toll on him. He narrowly missed qualifying for nationals and decided he’d had enough. He’s quit skating, or at least quit competing. He said he can no longer afford the $50,000 he’d had to pay every year to train. He’d gotten some help, of course––most skaters at his level do––just not enough.

It’s hard for me to say, finally, which spectacle is more ennobling: the sublime performance that wins the contest, hence reinforcing our faith in providence, or the one that doesn’t. I think sometimes that it’s the latter. The celebrity of the winner makes him a kind of public figure, someone who belongs, in a sense, to the masses, whereas the triumph of the athlete who achieved greatness but did not win is a more private thing, something that belongs only to himself and that select group of spectators whose intensity of attention has initiated them into the realm of the transcendent.

No skater I’ve ever seen in person has made such a strong impression on me as Alexander Aiken has. He’s a sublime skater, a great athlete, a great man. This piece is for him.

(This article has been excerpted from Sequins and Scandals: Reflections on Figure Skating, Culture, and the Philosophy of Sport. I’m indebted to Michelle Pennington for her help with it.)

Where the Conflict Really Lies

Alvin Plantinga is one of the most prominent figures in a group of philosophers who work on what one could call religious epistemology. Plantinga has decided to take on the “new atheists” in his latest book Where the Conflict Really Lies: Science, Religion, and Naturalism (Oxford, 2011) and while I applaud that project, I’m not optimistic that he is going to succeed in the manner he hopes he will.

Plantinga writes in the preface that “according to Christian belief, God has created us in his image, which includes our being able, like God himself, to have knowledge of ourselves and our world.”

Really? Our knowledge is like God’s? So God is constantly having to discard flawed theories concerning the nature of physical reality in favor of what appear to be better theories? So the discrete bits of God’s “knowledge” are as incompatible as general relativity and quantum mechanics? That’s a disturbing thought. I like to think God is always right, not that he is constantly getting things wrong and having hence to revise and improve his theories.

Plantinga contends that knowledge of physical reality is possible only if one assumes that there’s some kind of pre-established harmony between the way our minds work and the way the world is. That’s actually a very reasonable claim. He’s right in that without some assumption of that sort, we’re stuck in the Kantian realm of the way the world is for us, rather than the way it is in itself. Humanity has been deeply uncomfortable with this insight ever since Kant (or more correctly, the Pyrrhonists) first expressed it. The view that knowledge liberates us from the confines of our subjectivity seems an almost ineradicable intuition, a fact about the way the mind works. And yet, it is not merely difficult to defend; many would argue that it’s demonstrably false.

It’s not that science is a free for all, or that reality is however and whatever we think it is. To say that we cannot escape the confines of our subjectivity, even when we are at our most “objective” (as is the case, hopefully, when we are doing empirical science), is merely to say that the world is always going to look to us the way it does, not because of the kinds of individuals we are in particular, but because of the kinds of creatures we are in general. Kant, as I indicated above, didn’t actually discover this fundamental truth about what you could call our epistemological predicament. This insight goes at least all the way back to the ancient skeptics.

Plantinga is correct in his observation that atheists who claim to base their views in science lack support for their belief in the veridical nature of scientific “knowledge.” He’s incorrect, however, in his claim that believers stand on firmer ground.

Plantinga asserts that God “created us and our world in such a way that there is a match between our cognitive powers and the world. To use the medieval phrase, there is an adaequatio intellectus ad rem (an adequation of the intellect to reality).”

“Medieval” is the operative word here. Plantinga seems stuck in some medieval world view. He appears to have forgotten that science “progresses.” That is, he appears to have forgotten that we are constantly getting things wrong. The history of science makes it glaringly obvious that the purported fit of which Plantinga speaks between our cognitive powers and the world is far from a good one.

The views Plantinga expresses in Where the Conflict Really Lies are not new. He’s been engaging in elaborate machinations for years in an attempt to defend his position concerning the “match between our cognitive powers and the world.” One of the most intractable problems in the history of epistemological theorizing is known as “the Gettier problem.” Edmund Gettier published a little two-page paper entitled “Is Justified True Belief Knowledge?” back in 1963 that has been the bane of epistemologists ever since. Basically, what Gettier showed is that it is possible to have a justified belief that is true by accident, or a belief where the justification is not related to the truth in the way we intuitively believe it ought to be.

The second of Gettier’s two counterexamples to the view that knowledge is simply justified true belief concerns a man, Jones, that another man, Smith, has reason to believe owns a Ford. Why does Smith believe this? “Smith’s evidence,” writes Gettier, “might be that Jones has at all times in the past within Smith’s memory owned a car, and always a Ford, and that Jones has just offered Smith a ride while driving a Ford.”

“Let us imagine, now,” continues Gettier,

that Smith has another friend, Brown, of whose whereabouts he is totally ignorant. Smith selects three place names quite at random and constructs the following three propositions:

  (g) Either Jones owns a Ford, or Brown is in Boston.
  (h) Either Jones owns a Ford, or Brown is in Barcelona.
  (i) Either Jones owns a Ford, or Brown is in Brest-Litovsk.

Each of these propositions is entailed by (f) [the belief that Jones owns a Ford]. Imagine that Smith realizes the entailment of each of these propositions he has constructed by (f), and proceeds to accept (g), (h), and (i) on the basis of (f). Smith has correctly inferred (g), (h), and (i) from a proposition for which he has strong evidence. Smith is therefore completely justified in believing each of these three propositions. Smith, of course, has no idea where Brown is.

But imagine now that two further conditions hold. First Jones does not own a Ford, but is at present driving a rented car. And secondly, by the sheerest coincidence, and entirely unknown to Smith, the place mentioned in proposition (h) happens really to be the place where Brown is. If these two conditions hold, then Smith does not know that (h) is true, even though (i) (h) is true, (ii) Smith does believe that (h) is true, and (iii) Smith is justified in believing that (h) is true.

It’s kind of a contrived example, but it makes a good point. A justified belief can be true by accident. When it is, that is, when the justification does not relate to the truth of the belief in the way we think it ought to, we’re inclined to think that the belief in question does not amount to knowledge, even though it satisfies what have long been taken to be conditions necessary and sufficient for knowledge.
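For readers who like to see the skeleton of the argument, the case can be laid out schematically as follows. This is only a sketch: the labels (f) and (h) are Gettier’s, while the letter b and the operator J, short for “Smith is justified in believing,” are my own shorthand, not anything in Gettier’s paper.

\begin{align*}
f &:= \text{Jones owns a Ford}, \qquad b := \text{Brown is in Barcelona}, \qquad h := f \lor b \\
&J(f) \quad \text{Smith has strong evidence for } f \\
&f \vDash h \text{, and Smith sees the entailment, so } J(h) \\
&\neg f \text{ (Jones is in fact driving a rented car)}, \qquad b \text{ (by sheer coincidence)} \\
&\text{hence } h \text{ is true and } J(h) \text{ holds, yet Smith does not know } h
\end{align*}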

Everyone has been trying to better Gettier and this has generated some very interesting work in epistemology. No one seems able to do it, however, without abandoning some intuition we feel is basic to knowledge. You can get around the Gettier problem, for example, if you just add a proviso onto your account of justification that requires that it relate to the conditions in the world that are responsible for the belief being true. The only problem with such an account of justification is that we are never in a position to determine whether this is the case. It is possible, on such a view, to have knowledge; you just can’t know when you know something. The problem is that we’re inclined to believe that you can’t know something without also knowing that you know it.

So the project to better Gettier continues…

Plantinga believes he’s done it though. He gives a detailed account of how his theory of what he calls “warrant” (which is Plantinga’s version of “justification”) avoids the Gettier problem in his book Warrant and Proper Function (Oxford, 1993). Look closely, however, at what Plantinga says about how “warrant” avoids the Gettier problem. The basic idea, says Plantinga, is simple enough:

[A] true belief is formed in these cases all right, but not as a result of the proper function of all the cognitive modules governed by the relevant parts of the design plan [i.e., God’s plan]. The faculties involved are functioning properly, but there is still no warrant; and the reason has to do with the local cognitive environment in which the belief is formed. Consider the first example, the original Smith owns a Ford or Brown is in Barcelona example. Our design plan leads us to believe what we are told by others; there is what Thomas Reid calls “the Principle of Credulity,” a belief-forming process whereby for the most part, we believe what our fellows tell us … [C]redulity is part of our design plan. But it does not work well when our fellows lie to us or deceive us … as in the case of Smith, who lies about the Ford (pp. 33-34).

What, you ask? Who said anything about lying? Gettier doesn’t say anything about lying. Jones never says he owns a Ford. Smith’s evidence, again, is “that Jones has at all times in the past within Smith’s memory owned a car, and always a Ford, and that Jones has just offered Smith a ride while driving a Ford.”

You can’t avoid the Gettier problem by pointing out that God designed us generally to believe what other people say because no one lies in either of Gettier’s examples. It would appear that Plantinga hasn’t even read Gettier because the example in question is Jones owns a Ford or Brown is in Barcelona, not Smith owns a Ford or Brown is in Barcelona, and it is the second of Gettier’s two examples (or counterexamples), not the first.

If God had a design plan for the operation of the human intellect, I’m inclined to believe that part of that plan was that we should actually read the works on which we argue we’ve improved. Something went wrong with that plan somewhere!

(An earlier version of this post appeared in the Sept. 27-29 Weekend Edition of Counterpunch).

 

 

On Death and Dying

One of the most frightening things, I think, about dying is that we do it alone. Of all the natural evils for which one would like to blame the creator, this seems one of the worst. It would have been so much better, wouldn’t it, if we left this life in groups, left perhaps with the people we came in with, with the children we remember from our earliest days in school, and perhaps also with the people we have come to love, if they are suitably close to us in age. If we could go in groups, as if on a field trip, it would be easier.

But we go alone, even those unfortunates who die in accidents that take many lives die effectively alone because they don’t have time, really, to appreciate their fates as shared. They say the people who remained on the Titanic sang as the ship went down. That’s what I’m talking about. It would be so much better, so much easier to bear if we were assigned a time along with many others. We could begin to gather a little before that time, all of us who were assigned to leave together, we could begin to gather and prepare ourselves and share with one another the joys and sorrows of our lives. If we did that, I think we would realize that our lives had really all been variations on the same theme, that we were not so different from one another as we had thought.

I’m not certain if I believe in life after death, even though I am very religious. I’m not certain what it would be for. I doubt I will be ready to leave this life when my time comes. I think I’d like to live much longer than I know I will, say three or four hundred years. I think I’d eventually get tired of living though, so the prospect of living forever is not all that appealing.

It seems to me, however, that if there is life after death, then wherever that place is where we will all go (and I believe we will all go to the same place because I am a universalist), we will all actually arrive there together. Even though each of us will die individually, alone, if we go anywhere, it is to eternity, and since there is no temporal change in eternity, there cannot be any arriving earlier or later. Where we will go will be where everyone will go at the same time, or where everyone, in a sense, already is. There will be no waiting for the loved ones who die after us. They will be there waiting for us, so to speak, when we arrive, even if they are in the bloom of youth when we leave.

When I think about death, which I do more and more as I get older, I wonder if perhaps part of the point of it, of the horrible specter of that trip one must take alone, is precisely to make us understand that we never really are alone. And by that I don’t mean simply that God is always with us, although I do mean that also. I mean that we are all part of the whole of humanity, that we are connected to everyone and, indeed, to every living thing.

There is a poem I love by Molly Holden that conveys this sense of connectedness very well. It’s called “Photograph of Haymaker, 1890.” It goes like this:

It is not so much the image of the man
that’s moving — he pausing from his work
to whet his scythe, trousers tied
below the knee, white shirt lit by
another summer’s sun, another century’s —

as the sight of the grasses beyond
his last laid swathe, so living yet
upon the moment previous to death;
for as the man stooping straightened up
and bent again they died before his blade.

Sweet hay and gone some seventy years ago
and yet they stand before me in the sun,

That’s not the whole of the poem. I left out the last couple of lines for fear of violating copyright. You can read the whole of it though if you go to Poetry magazine. Of course the poem is about the haymaker in that it’s about mortality, which is inseparable, I think, from temporality. Time passes, people pass, as they say. The haymaker will pass, just as the grasses he’s cutting down in the vigor of his manhood. And he is gone now, of course, the man who was young and vigorous in that photo taken so long ago.

I love to read philosophy and learn that others who lived and died long before me had precisely the same thoughts that I have had. I feel suddenly linked to those people in a mystical way. I feel as if they are with me in a strange sense, that we are together on this journey we call life, even though they completed it long ago.

Kierkegaard speaks often about the idea of death and how one must keep it ever present in his thoughts. I did not understand this when I first read it, but I believe I do now. To think about death, really to think about it, to think it through, will bring you right back around again to life and what a miracle it is, and by that I don’t mean your own small individual life, but all of it, life as a whole, and you will be filled with reverence for it. You will be kinder to every creature.

And you will feel less alone.

This piece is for Otis Anderson, February 6, 1959 – July 14, 2013.

Dawkins’ Delusions


I’d put off reading any of the “new atheists” until recently. What I knew of their criticisms of religion had not impressed me as particularly sophisticated or even as new, so there seemed no urgency to read them. I’m teaching philosophy of religion this term, though, and my students expressed a desire to look at the new atheists, so I reluctantly purchased a copy of Richard Dawkins’ The God Delusion and began reading it in preparation for class.

I was afraid I wouldn’t like it. I was wrong. It’s hilarious! Not only has it caused me to laugh out loud, but it has brought home with particular force what an egalitarian industry publishing is. Anyone can publish a book, even a blithering idiot making claims that are demonstrably false and pontificating on things he knows nothing about and on works he has not read.

To be fair to Dawkins, I should point out that he’s clearly not a run-of-the-mill blithering idiot or he’d never have risen to his current position of prominence in science. He’d have been wise, however, to have restricted his public pronouncements to that field. His foray into the fields of religion and philosophy has made it clear that he’s closer to an idiot savant on the order of the infamously racist Nobel Prize winner James D. Watson, than to a genuine intellectual such as Stephen Jay Gould.

The preface to the paperback edition of The God Delusion includes Dawkins’ responses to some of the criticisms that were advanced against the book when it first appeared. In response to the charge that he always attacks “the worst of religion” and ignores “the best,” Dawkins writes

If only such subtle, nuanced religion predominated, the world would surely be a better place, and I would have written a different book. The melancholy truth is that this kind of understated, decent, revisionist religion is numerically negligible. To the vast majority of believers around the world, religion all too closely resembles what you hear from the likes of Robertson, Falwell or Haggard, Osama bin Laden or the Ayatollah Khomeini. These are not straw men, they are all too influential, and everybody in the modern world has to deal with them (p. 15).

From where does Dawkins get his statistics concerning the proportion of religious believers who subscribe to “understated, decent, revisionist” views of religion? How does he know their numbers are negligible? Evidence suggests otherwise. That is, most people in the economically developed world appear to accept modern science, so if surveys concerning the proportion of the population in this part of the world who are religious are correct, then the numbers of the “decent” religious people are not negligible; in fact, these people are vastly in the majority.

Of course to give Dawkins credit, he does refer to believers “around the world,” and not just in the economically developed part. It’s possible that Dawkins intends his book to enlighten the followers of Ayatollah Khomeini and other Muslim fundamentalist leaders, as well as the few fundamentalists in the economically developed world who reject science. It does not appear to have been aimed, however, at such an audience, and I’ve not heard anything about Dawkins’ underwriting the translation of the book into Farsi or Arabic.

Also, how come science gets to “develop,” but religion that has changed over time is referred to pejoratively as “revisionist”? Germ theory was not always part of natural science, but I wouldn’t call contemporary science “revisionist” because it now includes belief in the reality of microorganisms.

“I suspect,” writes Dawkins, “that for many people the main reason they cling to religion is not that it is consoling, but that they have been let down by our educational system and don’t realize that non-belief is even an option” (p. 22).

Dawkins is either being disingenuous in the extreme or he is, in fact, feeble minded. Notice he says “our” educational system, so here he is clearly not talking about Iran or the Middle East. The whole reason that it is occasionally controversial to teach evolution in school in the U.S. is that religious extremists have become offended by the ubiquity of evolutionary theory in the science curriculum.

Far from education “letting people down” in failing to make clear to them that non-belief is an option, it more often lets people down in failing to make clear to them that belief is an option. It tends to caricature religious belief in precisely the way Dawkins’ conflation of religion with religious fundamentalism does, with the result that young people are literally indoctrinated with the view that religion itself, not one particular instantiation of it (i.e., fundamentalism), but religion itself is simply a particular form of superstition that is essentially in conflict with the modern world view. Dawkins would appear to be a victim of such indoctrination himself in that he repeatedly conflates religion with religious fundamentalism. He acknowledges occasionally that not all religious people hold the views he attributes to them, but he can’t seem to remember this consistently.

The reader of The God Delusion is faced with a dichotomy unflattering to the book’s author: either a rigorous systematic distinction between religion in general and religious fundamentalism in particular taxes Dawkins’ cognitive abilities beyond what they can bear, or his repeated conflation of these two distinct phenomena is cynically calculated to raise a false alarm concerning the purported threat that religion in general presents to the advancement of civilization in the hope that this alarm will cause people to storm their local Barnes and Noble in an effort to secure, through the purchase of his book, ammunition they can use to defend themselves against the encroaching hordes of barbarian believers.

In the preface to the original hard cover edition Dawkins writes:

I suspect— well, I am sure— that there are lots of people out there who have been brought up in some religion or other, are unhappy in it, don’t believe it, or are worried about the evils that are done in its name; people who feel vague yearnings to leave their parents’ religion and wish they could, but just don’t realize that leaving is an option (p. 23).

Really, he writes that, I’m not kidding. I cut and pasted that text from the ebook. Yes, Dawkins is seriously asserting that there are people “out there” who do not realize that it’s possible, even in principle, to reject the faith they were born into. Obviously, these are not church-going folks. If they were, they would surely notice the children who cease at some point (usually in late adolescence or early adulthood) to attend church with their parents, or overhear the laments of parents whose children have “left the faith” during the coffee and cookies that often follows services on Sundays.  These people who “just don’t realize that leaving is an option” must be a rare non-church-going species of fundamentalist. Even the Amish, after all, know that “leaving is an option.”

It’s admirable that Dawkins is so concerned about this infinitesimally small portion of humanity that he would write a whole book for their benefit. The view, however, that they represent a significant threat to Western civilization is hardly credible.

A charitable reading of Dawkins might incline one to think that what he meant was that leaving is not an emotional option, that they fear it would wreak more havoc in their lives than they could bear. (This, presumably, is why more Amish don’t leave the faith.) But if that were truly Dawkins’ concern, he’d have written a very different type of book, because that problem has nothing to do with science or with the failure of religious people to understand it.

Atheists, according to Dawkins, are under siege. “Unlike evangelical Christians,” he bemoans, “who wield even greater political power [than Jews], atheists and agnostics are not organized and therefore exert almost zero influence” (p. 27). Oh yeah, atheists exert “zero influence.” That’s why we’re all taught the Bible in school, right? And why my university, like so many universities in the U.S., has such a huge religion department relative to, say, the biology department.

Wait, we’re not taught the Bible in school; that’s part of what fundamentalists are so up in arms about. We don’t teach creation, we teach evolution. We don’t have a religion department at Drexel. We don’t even lump religion in with philosophy, as is increasingly common at institutions that appear to be gradually phasing out religion altogether. We don’t teach religion, period, not even as an object of scholarly study, let alone in an attempt to indoctrinate impressionable young people with its purportedly questionable “truths.”

“The Penguin English Dictionary,” observes Dawkins, “defines a delusion as ‘a false belief or impression’” (p. 27). Is the belief that religion represents a serious threat to the advance of civilization not obviously false? “The dictionary supplied with Microsoft Word,” continues Dawkins, “defines a delusion as ‘a persistent false belief held in the face of strong contradictory evidence’” (p. 28). Is there not “strong contradictory evidence” to the claim that atheists are under siege?

Is it possible that the survival of modern science really is threatened in Britain, in contrast to the clear cultural hegemony it enjoys in the U.S.? Maybe. Eating baked beans on toast has always seemed pretty backward to me. My guess, however, is that Dawkins suffers from the delusion that we in the U.S. are more backward than the folks on the other side of the Atlantic.

I’ll give Dawkins one thing. He’s right about how our educational system has failed us. That’s the only explanation I can think of for the popularity of Dawkins’ alarmist claptrap. It ought to be obvious to anyone with even a modicum of formal education that Dawkins is talking sheer nonsense. But then Dawkins is a scientist, not a philosopher or theologian. He simply doesn’t seem to understand Stephen Jay Gould’s lovely, straightforward presentation of the nonoverlapping magisteria view of the relation between science and religion.

But then it’s hard to say whether Dawkins’ failure to understand NOMA, as it is now called, is an expression of his cognitive limits or of his intellectual irresponsibility, in that it appears he hasn’t actually read Gould’s paper. What makes me think this, you ask? Well, because Gould goes on at length in this paper about how creationism (Dawkins’ apparent primary concern) is “a local and parochial movement, powerful only in the United States among Western nations, and prevalent only among the few sectors of American Protestantism that choose to read the Bible as an inerrant document, literally true in every jot and tittle” (emphasis added), and one could add here, “has made no inroads whatever into the system of public education.”

Perhaps Dawkins thought it was unnecessary to read Gould, that anyone who would defend religion must not be worth reading. We all have our blind spots. I, for example, though I am devoutly religious, refuse to believe that prayer effects any change other than in the one who prays. It’s not because of some paranoid fear I have of inadvertently falling into superstition. It’s because the idea of a God whose mind could be changed by a particularly passionate entreaty, that is, of a God who is capricious and vain, is not at all edifying to me. I refuse to believe God is like that, quite independently of anything that might be presented to me as evidence for or against such a view.

Fortunately, my understanding of the relation between science and religion is a little more sophisticated than Dawkins’, so I can rest easily in my convictions, unperturbed by the phantom of their possible overthrow in the indeterminate future by some hitherto unknown type of empirical evidence. There is no such thing as empirical evidence either for or against the truth of religious convictions of the sort I hold. Fundamentalists may have to live with their heads in the sand, but people with a proper understanding of the relation between the phenomenal and noumenal realms do not.

That’s where our educational system has failed us. Too many people, even well-educated people, have been taught that science conflicts with religion, not with a specific instantiation of religion, that is, not with fundamentalism, but with religion, period. Education has failed us in a manner precisely opposite to the one in which Dawkins claims it has. The problem is not that the educational system has led people to the position where they feel that non-belief is not an option. The problem is precisely that the pretentious misrepresentation of the explanatory powers of empirical science, and the reduction to caricature of anything and everything that goes under the heading of “religion,” have led people to the position where they feel that belief is not an option.

I have enormous respect for honest agnostics, despite William James’ point in his essay “The Will to Believe” that agnosticism is formally indistinguishable from atheism in that it fails just as much as the latter to secure for itself the good that is promised by religion. Agnosticism is at least intellectually honest. The question whether there’s a God, or, as James puts it, some kind of higher or transcendent purpose to existence, cannot be formally answered. Even Dawkins acknowledges that it’s not actually possible to demonstrate that there’s no God (though he asserts, bizarrely, that God’s improbability can be demonstrated). But if God’s existence cannot be disproved, then disbelief stands on no firmer ground than belief, so why trumpet it as somehow superior?

The fact is that we’re all of us out over what Kierkegaard refers to as the 70,000 fathoms. I’m comfortable with my belief. I’m not offended by agnostics. I’m not even offended by atheists. I’m not offended by the fact that there are people who don’t believe in God. I would never try to argue to them that they ought to believe. That to me is a profoundly personal matter, something between each individual and the deity. What’s strange to me is that there are many people, people such as Dawkins, who are apparently so uncomfortable with their atheism that the mere existence of anyone who disagrees with them on this issue is offensive to them. It’s as if they perceive the very existence of religious belief as some kind of threat. What kind of threat, one wonders, might that be?

Religious belief, at this stage of human history anyway, certainly does not represent a threat to scientific progress. Dawkins blames religion for 9/11. Experience has shown, however, that terrorism of pretty much every stripe is effectively eliminated with the elimination of social and economic inequities, just as religious fundamentalism is. So why isn’t Dawkins railing against social and economic inequities? That would appear to be a far more effective way to free the world of the scourge of religious fundamentalism than simply railing against fundamentalism directly. Direct attacks on fundamentalism are analogous to temperance lectures delivered to people whose lives are so miserable that drinking is the only thing that brings them any kind of joy.

“[A] universe with a creative superintendent,” asserts Dawkins, “would be a very different kind of universe from one without one” (p. 78). But what is the difference for people such as NIH director Francis Collins and myself, who believe that the description of the universe provided by science is precisely a description of the nature of God’s material creation? Dawkins is right that there’s a difference between those two universes. He’s wrong, though, in believing that difference to be material.

Suppose that one morning you found on your doorstep an apple. Suppose you love apples. Suppose as well that, though you could not preclude the possibility that this apple had simply fallen from some passerby’s overly full grocery bag, for some reason you cannot explain you were infused with the conviction, as soon as you laid eyes on the apple, that someone had placed it there for you. What a lovely thought! The whole experience changes your morning, even your day, in a positive way.

In a material sense, of course, it makes no difference whether the apple came there by chance, or by design. It is the same apple, after all, whatever the explanation for its presence. It is not at all the same experience, however, to believe that one has found an apple by chance and to believe one has found it by design.

Now suppose a well-meaning friend points out the superfluity of your assumption that the apple had been placed there by someone. Suppose this person points out that nothing in the mere presence of the apple compels such an assumption and that you should thus content yourself with a “natural explanation” of how it came to be there. Ought you to abandon your belief in your invisible benefactor? What would you gain by abandoning it? If your friend had been ridiculing you for your “foolishness,” then presumably that would cease. You would regain his respect. But at what cost? It’s none of his business what you choose to believe in such an instance. That he would make fun of you for believing something the truth of which he cannot disprove but which makes you happy paints a very unflattering picture of him. So you would regain the respect of someone whose respect many would rightly disdain, even while losing something that had made you happy. And why is the explanation you have supplied for the presence of the apple less “natural” than his? You didn’t assume the apple had spontaneously sprung into existence. The real difference between your view of how the apple came to be there and his is that yours is nicer, that it makes you feel better.

Or to take a more apposite example in my case: Say that for as long as you can remember, you’ve wanted one of those fancy, expensive home cappuccino makers. You know the ones I’m talking about. Not the little cheapie things that can be had for under a hundred dollars, but the really expensive ones that resemble the machines they use in fancy cafes and coffee houses. Say that you have always wanted one of these fancy cappuccino makers, but because you had chosen the life of an academic and the modest salary that went along with it, you felt such a machine was an extravagance you simply couldn’t allow yourself. Lawyers can afford such things, you reasoned, but then they also need them because they are generally very unhappy in their work. If you had gone to law school, you could have had a fancy cappuccino maker. You knew this, of course, but chose to go to graduate school in philosophy instead because you believed a career in philosophy would be more fulfilling than a career in law. You made your choice and so must content yourself with a fulfilling career and a more modest coffee-making setup.

This seems to you a reasonable trade-off, so you do not waste large portions of your life lusting after a fancy home cappuccino maker. Still, you do think wistfully of such machines sometimes, particularly when you see them in the homes of your lawyer friends, or in one of those fancy kitchen stores that always have so many of them. You have accustomed yourself, over time, to this occasional quiet longing.

But then one Saturday, when you are on your way back to your apartment after having done your morning shopping, you spy a large bag on the sidewalk in front of one of the houses on your block. People often put things out on the sidewalk that they no longer want, so you stop to see if there is anything there you might be able to use. As you approach the bag, your heart begins to beat more quickly. Peeping out of the edge of the bag is what looks for all the world like the top of one of those fancy, expensive cappuccino makers that you have always wanted. You peer disbelievingly into the bag and discover that it contains not only such a machine but also all of the accoutrements that generally go with them: a little stainless steel milk-frothing jug, metal inserts in both the single and double espresso sizes (as well as one to hold those Illy pods that you would never buy because they are too expensive), and a coffee scoop with a flat end for tamping down the coffee. As you are peering into the bag, your neighbor emerges from the front door of her house with more bags of stuff to put out on the sidewalk.

“Are you giving this away?” you ask tentatively.

“Yes,” she replies.

“Does it work?” you ask.

“Yes,” she replies.

“Why are you giving it away?” you ask incredulously, convinced that any minute she will change her mind.

“Well,” she says nonchalantly, “I’ve had it for four years and never used it. I figure that if you have something for four years and never use it, you should get rid of it.”

You nod and laugh, affecting a nonchalance to match your neighbor’s. As soon as she has disappeared into the house, though, you snatch up the bag that contains the machine and all the accoutrements and stagger under its weight the short distance to your door. You download the manual for the machine (a Cuisinart EM-100, which you discover retails for $325), set it up and give it a trial run. It works like a dream!

Your innermost wish for a fancy, expensive cappuccino maker has been fulfilled! One was deposited practically on your doorstep. Of course it came there in a perfectly natural, explicable way, but still, your heart overflows with gratitude toward God, who you believe has arranged the universe, in his wisdom and benevolence, in such a way that this fancy, expensive cappuccino maker should come into your possession now. God has favored you with the rare and coveted have-your-cake-and-eat-it-too status in that you have been allowed to pursue your life’s calling of being a philosophy professor and have a fancy, expensive cappuccino maker!

You do not need to attribute this turn of events to any supernatural agency in order to see “the hand of God” in it. It does not trouble you to think that your neighbor had very likely been considering putting that machine out on the street for quite some time, or that the whole event came about very naturally. But still, it is deeply significant to you and fills you with a sense of awe and wonder. Why should that bother Richard Dawkins?

It is fair, of course, to point out that you might just as well be annoyed that God had not arranged for you to receive this fancy, expensive cappuccino maker earlier. But you do not think that way. Why, you do not know. You attribute this wonderfully positive psychological dynamic to God’s Grace, but of course you could be wrong; perhaps it’s genetic. Earlier it seemed to you that the sacrifice of a fancy, expensive cappuccino maker in order to pursue your life’s calling was really not so very much to ask, and you accepted it stoically. Now you are overcome with gratitude toward God for so arranging things that your wish for such a machine has been fulfilled. Earlier you were happy; now you are happier still. What’s wrong with that? That seems to me to be a very enviable situation.

Experience may incline us to expect certain emotional reactions to various kinds of events, but reason does not require such reactions. Many religious people are effectively deists in that they accept what scientists call the “laws of nature” and do not believe that God arbitrarily suspends those laws in answer to particularly passionate entreaties. Such people accept that God must thus be responsible in some way for the things they don’t like just as much as for the things they like, but consider that perhaps there is some reason for those things that human reason simply cannot fathom, and look to God for emotional support when the bad things in life seem to overwhelm the good and thank God when the reverse seems to be the case.

To be able to find strength in God when times are bad and to thank him (her or it) when times are good is an enviable gift. Who wouldn’t want to be like that? Of course it is possible to rail against God for not ensuring that times are always good, but it isn’t necessary. The failure to condemn or to become angry is not a failure of logic. Objectively, everything simply is; nothing necessitates a particular emotional reaction. The dynamic of faith is just as rational as the dynamic of skepticism. In fact, it could be construed as even more rational. That is, it is generally acknowledged that human beings almost universally pursue happiness, and the dynamic just described is a particularly good way of achieving it, in that it effectively ensures a generally positive emotional state. Maybe believers are wrong, but even Dawkins acknowledges that no one will ever be able to prove that. Even if they are wrong, however, it seems there is little, if any, harm in their beliefs and a great deal of good.

Why does religion so offend atheists such as Dawkins? No one is forcing them to sign up. Dawkins is not alone in his outrage. It’s pervasive among atheists. The invectives they hurl at believers always put me in mind of those hurled by a child at the participants in an invisible tea party to which he has not been invited.

“There isn’t really any TEA there, you know!” he yells.

But is the outrage over the fictitious nature of the tea, that anyone should pretend to drink something that isn’t really there, or is it over not having been invited to the party? Perhaps the problem for atheists is the feeling of being left out. Perhaps they are angry that other people get to enjoy something from which they have been excluded, something they have been led to believe is “not an option” for them.

(For a really excellent piece on The God Delusion see Terry Eagleton’s “Lunging, Flailing, Mispunching” in the London Review of Books.)

Hedonic Adaptation

My reflections here were prompted by an article in today’s New York Times entitled “New Love: A Short Shelf Life.” The article, by Sonja Lyubomirsky, a professor of psychology at the University of California, Riverside, is about how the euphoria associated with the first phase of romantic relationships tends to wear off relatively quickly. I’d initially planned to write a piece on relationships, but the more I thought about it, the more it seemed to me that the problem Lyubomirsky describes isn’t restricted to relationships. Lyubomirsky argues that romantic relationships are subject to the same dynamic of what psychologists call “hedonic adaptation” as are other thrilling experiences. That is, euphoria, she observes, tends to be short-lived, whether it is associated with “a new job, a new home, a new coat,” or a new love.

The first thing that annoyed me about the article was its purely speculative character, or more correctly, the fact that it was mere speculation paraded in front of the reader as scientific fact. “[A]lthough we may not realize it,” asserts Lyubomirsky, “we are biologically hard-wired to crave variety.” Says who? Where is the scientific evidence to support such a claim? We like variety in some things, to be sure, but we like uniformity in others. We appear, in fact, to crave uniformity at least as much as we crave variety. We need, for example, to be able to assume that the future will resemble the past in crucial respects if we are going to be able to function at all, and we are notorious for being unable to appreciate variety when the variety in question would tend to discredit the worldview to which we have become comfortably wedded.

My point is not that Lyubomirsky is mistaken, or that she has no right to indulge in such speculations. My point is that they are speculations and should not be presented as if they were facts. One reader, Joseph Badler, made the point beautifully. “The ‘we are biologically hardwired,’” he wrote, “is just too cheap. It can be used to justify anything. Evo psych post-hoc explanations are making everybody intellectually lazy.”

“Evo psych” refers to evolutionary psychology, which, if you ask me, is a completely bogus discipline that purports to provide evolutionary explanations for traits of human psychology. Why do people appear to crave variety in their sexual partners? Well, the evolutionary psychologist responds (and here I am paraphrasing Lyubomirsky), because it guarantees a more robust gene pool. That makes sense, of course, but so does the observation that infidelity can be corrosive of social bonds and that promiscuity could thus threaten both the immediate family and the long-term survival of the entire community.

So which is it? Are people hard-wired to crave variety to ensure a more robust gene pool, or are they hard-wired to crave uniformity to be better able to survive to the age of reproduction? Or could they be hard-wired, as seems most likely, to crave both things relative to particular environments and situations? But if this is the case, then evolutionary “explanations” for psychological traits are obviously speculative, because of the seemingly limitless variables one would have to take into account when calculating in what sense natural selection might lie behind a particular psychological tendency.

The situation of the evolutionary psychologist becomes almost unmanageably complex even if we assume that all human beings exhibit the same psychological tendencies. Once we acknowledge that all human beings do not exhibit the same psychological tendencies, then evolutionary psychology becomes, I would argue, a mere parody of an academic discipline. That is, I’d go further even than Badler. I don’t think it’s simply making people intellectually lazy. I think it’s making them stupid. That it continues to be respected as an academic discipline suggests that the academy is egalitarian to a fault, in that even the criterion that one ought to be able to think clearly in order to be admitted to it appears to have been judged unfairly discriminatory.

A number of readers took exception to the comparison of a new love with new material possessions such as a “home” or “coat.” (I almost always enjoy reading the comments readers post to articles such as this one. They confirm my faith that the average person is neither so simple-minded nor so superficial as the authors of the articles appear to assume.) “Didn’t know love was material,” observes Anna from Ontario wryly.

It’s true that our relationships with people are importantly different from our relationships with things. They may not be so different, though, as some of the opponents of Lyubomirsky’s apparent materialism assume. Another reader points out that Lyubomirsky and those who agree with her “do not consider how the disposable and planned obsolescent qualities of consumer capitalism also ‘program’ us to always desire the new.”

I wouldn’t put all the fault, though, on consumer capitalism. Consumer capitalism, after all, is an expression of something in human nature. Unfortunately, it is the expression, I would argue, of one of the less appealing tendencies in human nature: impatience.

The thrill of the new is something with respect to which we are largely, if not entirely, passive. It wears off though. To continue to be thrilled by the same thing requires diligent effort. The problem with consumer capitalism is that what it parades for our approval is primarily the cheap and tawdry, things that glitter but which are not gold. Such things thrill us before we are fully aware of what they are. Once we learn what they are, they cease to thrill because there is nothing inherently thrilling about them.

Of course even things that are inherently valuable and which thus ought to be inherently thrilling are subject to the same dialectic. We are thrilled with the initial acquisition of them, but that thrill eventually wears off, or at least quiets down. It doesn’t take a great deal of intellectual effort, however, to appreciate that the thrill that dies down in this way is the thrill of acquisition rather than of possession. We are thrilled to have acquired a thing, but then we get used to having it. If it is truly something worth having, though, and we are capable of appreciating it as such, then the initial euphoria of acquisition should be replaced by the more enduring thrill of possession, or more correctly, of appreciation. The problem is, such appreciation requires effort. It requires that we look at the thing again, look at it long and carefully, that we actively search for what is good and valuable in it, rather than simply surrender ourselves to a passive thrill.

Years ago, when I first became engaged, my sister caught me admiring my engagement ring. “You’ll stop doing that after a while,” she said. I found that remark disturbing. I didn’t want to cease to see my ring as beautiful any more than I wanted to cease to love the man who had given it to me. It will happen to you though, her words suggested, independently of what you want. It will happen to you. Kierkegaard talks about that dynamic in the first volume of his two-volume work Either/Or. Everything disappoints, he (or at least one of his pseudonyms) says there.

But does everything have to disappoint? I have never ceased to see my engagement ring as beautiful, just as I have never ceased to love the man who gave it to me. I’m a very materialistic person, in a way. I have lots of things, lots of nice things in which I take enormous pleasure that does not diminish with time. I collect paintings and fountain pens and antiques of various sorts, and each one of these possessions adds immeasurably to the quality of my life.

I love to sit at my table in the morning and sip my coffee (I love coffee!) and look at the two paintings I’ve hung on the wall on the far side of the table. One is a landscape I found in an antique store and the other is a still life I did myself. They are not great masterpieces, but they are very nice and I derive enormous pleasure from looking at them. I look at them in the morning when I am having my coffee and in the evening when I am having dinner. I often work at that table in the afternoon and I’ll glance admiringly up at them periodically even then.

I don’t know what it is exactly that I like so much about them. Each is rough, and yet each is the product of some person’s vision. I like people. They are endlessly fascinating to me. I love handiwork because you can see the humanity in it. I like things because I like creation. I value it as something beautiful and moving. One reader of Lyubomirsky’s article observed that the reason her marriage had been happy until her husband’s death was that they had “had God.” Another reader pointed out, however, that that approach to keeping love alive won’t work for everyone because not everyone is religious. He (she?) went on to add, though, that “looking beyond oneself and working toward the greater good (of one’s spouse, family, community, world) may be an essential element in the pursuit of lifelong happiness.” I’d agree with that. I’d argue, however, that unless you think that creation, or the universe, or whatever, is good, then even the “greater good” of one’s spouse, family, community, and even the world will ultimately fall flat.

The challenge to achieving an enduring happiness, I’d argue, is that we’ve programmed ourselves, in a sense, to believe that happiness is inherently fleeting, that the thrill of acquisition is the only thrill there is. Whether that is the fault of consumer capitalism alone or whether it is an expression of something inherent in human nature, I’ll leave to the reader to decide.