Something to be Thankful For

I didn’t know that Trump had won the election until I woke up on the following Wednesday morning. I had neither the heart nor the stomach to watch the election returns Tuesday night. This was the worst election in my memory, in my lifetime, possibly in this country’s history. I knew watching the returns would be depressing. I wanted to watch something uplifting, something edifying, so I watched the movie 42 about Jackie Robinson. 42 may not achieve the same level of cinematic greatness as To Kill a Mockingbird, or In the Heat of the Night, but the story of Robinson’s integration of baseball is a great story and it makes the film deeply moving despite its shortcomings. Watching it reaffirmed my faith in the average American, the average human being.

There have been a lot of apocalyptic predictions about what would happen if Donald Trump were elected president. There’s no question that he will be able to do a lot of damage, but he will not, as so many people seem to think, be able to turn back the clock to the bad old days of a virulently racist, sexist, and generally intolerant past. Don’t get me wrong. I don’t mean to suggest that we aren’t racist, sexist, or intolerant anymore. We are. We are not so bad as we used to be, though, not by a long shot, and nobody is going to be able to turn the clock back on that, not even Donald Trump.

An election-night guest on Democracy Now said that if Trump were elected, all bets would be off. "These young black people," she said, "who have been lying down in the streets as part of the many Black Lives Matter protests have been able to count on motorists not running them over." Well, if Trump got elected, she asserted, they wouldn't be able to count on that anymore. I'm paraphrasing her, of course, because my memory is not so good that I am able to repeat verbatim what she said. That's pretty close, though.

The thing is, I believe she’s wrong. Motorists are not going to start running over protestors. It’s not like they’ve had to be forcibly restrained from doing this by liberal law-enforcement officers. As I explain to the students in my applied ethics classes, fear of arrest is not the reason most people obey the law. You couldn’t have enough law-enforcement officials on the street if fear of arrest were the only thing ensuring order in society. Respect for law, for social order, is the reason most people obey the law. People understand its importance for ensuring social order. They want to live in an orderly society and most people, in my experience, feel their fellow citizens, their fellow human beings, similarly deserve to live in an orderly society.

Motorists are not going to start running over protestors because human beings generally abhor homicide. Most people, the overwhelming majority of people, wouldn’t run over their worst enemy, even if they felt confident that they could do this without any negative repercussions to themselves. People are not the monsters that those who try to shape public opinion would often have us believe.

One of the things I love about teaching is that it keeps me in touch with basic truths about human nature. The overwhelming majority of my students are conspicuously good, decent people. Even the ones who occasionally cheat clearly do so out of fear. Intercultural, and even interracial, couples are a common sight on campus. No one seems disturbed by their presence. I'll never forget an experience I had a few years ago when the conversation in one of my classes somehow turned to the subject of romantic relationships and one of my male students, in discussing his current relationship, casually referred to his love interest as "he." I hadn't realized that this student was gay. Everyone else seemed to know this, however. At least they exhibited no surprise whatever at what was to me the revelation of this student's sexual orientation. There was not the slightest pause in the conversation, no raised eyebrow, no suppressed giggles –– nothing!

Racism, sexism, and homophobia among college students consistently make headlines in The Chronicle of Higher Education and Inside Higher Ed. I don't mean to suggest that these things don't exist among college students. They make headlines, however, because it is increasingly clear that they are the exception among college students rather than the rule.

Something analogous explains the rise of the Black Lives Matter movement. Police have always been killing young black men. Black Lives Matter is not a response to a recent spate of such killings. It is an expression of a growing intolerance of this perennial problem, especially in the face of video proof.

Television shows such as The Cosby Show and Will and Grace, not to mention decades of civil-rights activism, have humanized groups that were earlier demonized. Trump's presidency wasn't the only significant political change to come out of the recent election; more states legalized marijuana. The growth of the internet, and the increasing ease of global communication more generally, means many, if not most, Americans now know that a living minimum wage, universal healthcare, and free higher education are not impossible dreams but tangible realities in countries far less wealthy than the U.S. If some Americans think Obamacare went too far, polls suggest many, if not most, Americans think it didn't go far enough.

We’re not perfect yet and likely never will be. Americans are getting progressively better, though, and we are going to continue to get better even if Trump’s election means the next few years will be ones of fits and starts.

This country has changed. It has changed irrevocably since the days of Bull Connor and death threats to those who would integrate baseball. We are a different country now than we were in our more ignorant and intolerant past. That’s something to be thankful for this Thanksgiving.

(This piece originally appeared under the title “Waking Up to Change” in the 10 November 2016 issue of Counterpunch.)

 

Election 2016

This election, Clinton supporters argued, was about stopping Trump. In fact, it is now clear that it was about stopping the growing movement in this country in the direction of genuine populism. Speaker after speaker who took the stage on the first night of the Democratic National Convention had to fight to be heard over chants of “Bernie, Bernie.” There was little applause for most of the speakers, but Sanders’ reception, when he finally got to speak, made it clear that he was the real popular choice for the Democratic nomination.

What the party apparently didn’t realize, however, was that Sanders’ popularity was not a product of his extraordinary charisma (almost anyone would seem charismatic compared to Clinton). It was a product of his populism. No one in the mainstream media got that that was what this election was really about. That’s what Trump and Sanders had in common. Independently of whether Trump’s populist rhetoric is sincere, it was the source of his appeal.

Liberals are considered to have won the culture war. Gay marriage is finally legal, state after state is legalizing marijuana, and for the last eight years, we have had what not so long ago was actually unthinkable –– a black president!

Some of Trump’s rhetoric may be racist, but his racism is not why he’s popular. There’s always some racist or other vying for the Republican nomination. Yes, racism still exists in this country, but it’s on the wane. Yes, police are murdering innocent black people, but they have always been doing that. The existence of the Black Lives Matter Movement shows that increasing numbers of Americans will no longer tolerate it.

What’s important, Sanders asserted when he conceded the Democratic nomination to Clinton, is keeping the revolution he started alive. Hillary Clinton, he announced, must be the next president of the United States! Did Sanders receive death threats from the DNC, or was he just not very smart? Sanders didn’t start the “revolution.” He simply rode a wave of populism that had been building long before he announced his candidacy for the Democratic nomination for president, and nothing was more antithetical to that movement than the Clinton campaign.

An anthropologist from Mars, to use a phrase of the late Oliver Sacks, would have a hard time making sense of the DNC’s support of Clinton in the face of Sanders’ clear majority of popular support. Both Sanders and Trump tapped a vein in this country. The party that won the election was the party whose candidate did that most effectively. Clinton clearly did not do that. Polls suggested that if she were nominated, she would lose.

So why did the party push her candidacy so relentlessly? Because her nomination would halt the progress in the direction of genuine populism. Halting that progress was more important to the party than was winning the election. Big business controls politics in this country and it is not about to surrender that control to a population that has had enough of it. Trump’s populist rhetoric is likely empty, so the possibility of his election is not so threatening to the forces that control this country as is the specter of Sanders’ election.

“Trump must be stopped!” Democrats chanted over and over. But this anti-Trump rhetoric was simply smoke and mirrors designed to conceal the real agenda of the party, which was to stave off the revolution in the direction of genuine populism. Democrats, the party bigwigs, that is, would rather lose with Clinton than win with Sanders. They are the people who benefit from the status quo. They are not about to see that change.

It is changing, though, whether they like it or not, and no amount of smoke and mirrors will stop it.

(This piece originally appeared under the title “Smoke and Mirrors in Philadelphia,” in the 27 July 2016 issue of Counterpunch. Yes, that’s right, I called this election before it happened, so not everyone in the media got it wrong.)

The Lily of the Field and the Snake in the Grass

Arts and Letters is a great website that publishes blurbs about interesting articles that are available online and posts links to those articles at the end of the blurb. I have made it the homepage of my browser so that I can stay up to date concerning what is being published in the humanities. I haven’t been keeping up with it recently, however, because I’ve had so much work to do. I’m home sick today, though, and when I opened my browser to get to Blackboard (the online learning platform Drexel uses) to email my students that I was cancelling class, I was surprised to see a blurb about an article on Kierkegaard.

As it turns out, the article is a review in the Times Literary Supplement of two new books on Kierkegaard and a new translation of some of his religious discourses. The books are Mark Bernier's The Task of Hope in Kierkegaard (Oxford, 2015) and Sheridan Hough's Kierkegaard's Dancing Tax Collector (Oxford, 2015). The translation is of the discourses published under the title The Lily of the Field and the Bird of the Air (Princeton, 2016). It isn't a particularly good review. The titles of the books are intriguing, but there is little indication of their content in the review. In fact, the "review" is basically a very short summary of Kierkegaard's life and works that isn't always even correct. Will Rees, the author of the review, identifies Either-Or (1843) as Kierkegaard's "first book." Either-Or was preceded, however, first by Af en endu Levendes Papirer (From the Papers of One Still Living) (1838) and then by Om Begrebet Ironi (On the Concept of Irony) (1841).

As a child, observes Rees,

Kierkegaard was sensitive, sulky, ironical and precocious. In other words, he had precisely that youthful temperament which, while not a sufficient condition, is nonetheless a necessary condition for the later burgeoning of genius.

Really, I’m not kidding you. He actually says that. He says that all geniuses are necessarily “sensitive, sulky, ironical, and precocious” as children. It may well be that such traits are more pervasive among people who later prove to be “geniuses” (whatever it is, exactly, that that means). It strains credulity, however, to assert without qualification that all geniuses have such traits as children.

Rees also repeats the trope that Kierkegaard renounced the joys of "earthly life" in order to pursue his vocation as an author. Kierkegaard does occasionally speak this way himself. It is clear, however, that what Kierkegaard actually renounced was not the joys of "earthly" life, but those of a conventional life. That is, he renounced the joys of marriage and a family for those of a literary life. Kierkegaard was no ascetic. He ate well and dressed well. He relied on the services of a personal secretary and lived in relative luxury. In fact, he occasionally justified the expenditures associated with this lifestyle as necessary to sustain his creative productivity.

Rees explains that Kierkegaard’s assertion that “truth is subjectivity” is often misunderstood, yet his own explanation of the meaning of this assertion is confusing. It doesn’t mean, he explains, that “something becomes true by virtue of my saying or believing it to be true.” What it means, he continues, is that “beliefs acquire truth only in relation to the individual’s lived orientation toward them.” What’s the difference? Isn’t my believing something to be true more or less equivalent to my having a “lived orientation” toward it? I suppose that depends, at least in part, on what one means by “believing” and “lived orientation.” What is missing from Rees’ explanation is the very thing the omission of which has led to the pervasive erroneous understanding of this statement. Only what Kierkegaard refers to as “subjective truth” requires an individual’s lived orientation toward it. There’s a whole host of objective truths, according to Kierkegaard, as I explain in my book Ways of Knowing: Kierkegaard’s Pluralist Epistemology, that require no such orientation.

Rees fails to comment on the quality of the new translation of Kierkegaard's The Lily of the Field and the Bird of the Air. The strangest part of Rees' review, however, is that it fails to indicate the translator. Rees mentions the translation is "new," but not who did it. This is a clear departure from the normal editorial practice of the TLS (see, for example, "They do the war in different voices," "Storm and stress," and "Orphaned solemnity," September 30, 2016). That departure was less puzzling to me after I looked up the book on the website of Princeton University Press (PUP). The translator is none other than Bruce H. Kirmmse.

Princeton’s website describes Kirmmse as “one of the world’s leading Kierkegaard translators and scholars.” If that’s true, it’s an odd fact to omit in a review of a translation by him. Could it be that the TLS actually wanted to avoid calling attention to the identity of the translator? Readers of my blog on Kierkegaard are likely aware that there would be a good reason for this. Kirmmse effectively bought the title of “one of the world’s leading Kierkegaard translators and scholars” with the surrender of his ethics.

As I explained in an article in Counterpunch back in 2005, there is reason to believe that Kirmmse deliberately tried to obscure, in his translation of a controversial Danish biography of Kierkegaard, the fact that the author of that biography had plagiarized some of the book from earlier biographies. If he didn't do this, then the anomalies in Kirmmse's translation described in the Counterpunch piece suggest he's not a particularly good translator.

Let's assume, for the sake of argument, that Kirmmse didn't try to cover up the plagiarism in the biography. Let's assume he just isn't all that good a translator. Being a mediocre translator isn't a crime. Even on that assumption, however, he's still guilty of failing to support the scholar who exposed the plagiarisms in the Danish media.

Of course failing to act in a way one ought to have done is not so bad as actually doing something one ought not to do. Unfortunately, Kirmmse is guilty of the latter as well as the former crime. He defamed me in an article entitled "M.G. Pietys skam" (M.G. Piety's shame) in the Danish newspaper Weekendavisen when I discovered that the plagiarized passages remained in his English translation of the Kierkegaard biography and began to write about this. The article is a straightforward piece of character assassination designed to divert the attention of Danish readers from the problems with the biography and the author's promise to fix those problems before the work was translated. The piece appeared only in Danish, for reasons that will be apparent to anyone who reads my English translation of the article in an earlier post to Piety on Kierkegaard, "Bruce Kirmmse's Shame."

I don't know whether Princeton knew of the controversy surrounding the book in Denmark when it agreed to publish an English translation. It should have, of course, but that doesn't mean it did. The press had learned of the problems with the book by 2006, however, because Peter Dougherty, the head of PUP, sent me a letter in which he explained that the then forthcoming paperback included "some 58 pages of corrections." That's a lot of "corrections." You will search in vain, however, for any indication that the paperback is actually a new, or "corrected," edition.

So there you have it. There’s good reason why the TLS might prefer that the name of the translator not be mentioned in the review of the translation. Perhaps Kirmmse ought to take a leaf from Kierkegaard’s book and start using a pseudonym.

Racism and Terminology

I do not like the expression "African-American." It's patronizing, condescending, and racist. It was coined, rumor has it, to help counteract the corrosive effect of racism on the self-esteem of black Americans. But how is that supposed to work? In practice, I would argue, the effect is unavoidably the reverse. White Americans are never referred to as "European-Americans," so to identify black Americans as "African-American" is to suggest that they are only half American, and that constitutes what is now fashionably referred to as a "microaggression."

Do a Google search on the term "African-American" if you want to see how many black Americans feel about it. Check out the Facebook page "Don't Call Me African-American," or Charles Mosley's guest column in the February 12, 2013 edition of the Cleveland Plain Dealer. "By using the term 'African-American' to refer to black people," Mosley writes, "columnists, readers, TV hosts and commentators perpetuate and embrace Jim Crow racial stereotypes, segregation and historical distortions. … Africa is not a racial or ethnic identity. Africa is a geographical identity."

In fact, you almost never hear black Americans refer to themselves as “African-American” unless it is to please a white audience, and there is a good reason for that: Most do not think of themselves as African-American. They do not identify with Africa, at least not until we remind them, by referring to them as “African-American,” that they are supposed to.

By referring to black people as “African-American,” we are effectively suggesting that they should not feel too at home here because, really, they are only half American. Hyphenated designations may be fine to apply to people who strongly identify with another culture, but they are offensive and insulting when applied to people who do not and who actually have greater claim to being fully American than do most white Americans.

Most black Americans do not identify with Africans and most genuine African-Americans (i.e., people who recently emigrated from Africa to the U.S. or who divide their time between two continents) do not identify with black Americans. The Nigerian novelist Chimamanda Ngozi Adichie made this point very movingly in a talk she gave at the Free Library in Philadelphia as part of a tour she is on to promote her new book Americanah.

“American,” Adichie explained in response to a question about what race she had in mind when someone was referred to simply as “American,” “is a mark that culture leaves, never a physical description.” She said that when she came to the U.S. she did not want to be identified with black Americans and even “recoiled” when a man in Brooklyn referred to her as “sister.” I’m not your sister, she thought to herself. I have three brothers and I know where they are, and you’re not one of them!

She said she did not, at least originally, identify with black Americans, that she did not understand their experiences. Her friends, she explained, when she first came to the U.S. as a university student, were other foreign students. She felt she had more in common with them than she had with black Americans and suspected this feeling was shared by most Africans on first coming to the U.S.

Adichie explained that she had come to have enormous respect for American blacks, for the "resilience and grace of a people who had weathered a terrible history." She said that now, if she went back to Brooklyn and someone there called her "sister," she would be pleased, that she would think YES! It took "a journey," she explained, though. "Race in America," she said, "is something you have to learn."

White Americans could learn something important about black Americans, or more correctly, about American culture, by listening to Adichie. Adichie said she thought James Baldwin was the best American writer of the last two hundred years. Not the best African-American writer, she emphasized, but the best American writer.

She has a point. Go Tell it on the Mountain is not simply, as Wikipedia states, a novel “that examines the role of the Christian Church in the lives of African-Americans, both as a source of repression and moral hypocrisy and as a source of inspiration and community.” It is a novel that examines the role of the Church in the lives of Americans more generally in that the Church has had those dual roles in the lives of Americans of all races.

Yes, Go Tell it on the Mountain is a novel about a black family, but it is also a novel about an American family, not a Nigerian family, or a Kenyan family, or a Somali family. Until we acknowledge that, we will continue to live a lie, a lie that diminishes not merely black Americans but all of American culture, a culture of which black Americans are an inextricable part and to which they have made an immeasurably positive contribution.

(An earlier version of this article appeared in the 20 May 2013 issue of Counterpunch.)

 

Some Reflections on an Auspicious Occasion

I’ve been promoted to full “Professor.” I am no longer “Associate Professor M.G. Piety.” I am now, or will be as of 1 September, “Professor M.G. Piety.” According to my colleague Jacques Catudal, I am the first person to make full “Professor” in philosophy at Drexel in more than 18 years.

It has been a long journey, as they say. I decided to study philosophy when I was an undergraduate at Earlham College, a small Quaker college in Richmond, Indiana. I became hooked on philosophy as a result of taking a course on rationalism and empiricism with Len Clark. I didn’t particularly enjoy reading philosophy, and I hated writing philosophy papers. I loved talking about it, though. Talking about it was endlessly fascinating to me, so I switched my major from English to philosophy. I became hooked on Kierkegaard after taking a Kierkegaard seminar with Bob Horn. “Bob,” my friends at Earlham explained, “was God.” He was preternaturally wise and kind and a brilliant teacher who could draw the best out of his students while hardly seeming to do anything himself. I don’t actually remember Bob ever talking in any of the seminars I took with him, and yet he must have talked, of course.

I spent nearly every afternoon of my senior year at Earlham in Bob’s office talking to him about ideas. I worried sometimes that perhaps I should not be taking up so much of his time. He always seemed glad to see me, though, and never became impatient, even as the light began to fade and late afternoon gave way to early evening. I don’t remember him encouraging me to go to graduate school in philosophy (my guess is that he would have considered that unethical, given the state of the job market in philosophy). I do remember, however, that he was pleased when I announced that I had decided to do that.

Graduate school was enormously stimulating, but also exhausting and, for a woman, occasionally demoralizing. There has been much in the news in the last few years about how sexist the academic discipline of philosophy is. Well, it was at least as bad back when I entered graduate school as it is now, and possibly even worse. Still, I persevered. I began publishing while still a student and was very fortunate to gain the support and mentorship of some important people in the area of Kierkegaard scholarship, including C. Stephen Evans, Robert Perkins and Sylvia Walsh Perkins, and Bruce H. Kirmmse, who was one of my references for a Fulbright Fellowship I was awarded in 1990 to complete the work on my dissertation on Kierkegaard's epistemology.

I lived in Denmark from 1990 until 1998. I received my Ph.D. from McGill University in 1995 but remained in Denmark to teach in Denmark’s International Study Program, then a division of the University of Copenhagen. I wasn’t even able to go back for my graduation, so I learned only a couple of years ago, when my husband bought me my own regalia as a gift, how gorgeous the McGill regalia are (see the photo).

I came to Drexel from Denmark in 1998 as a visiting professor. I liked Drexel. It was overshadowed by its neighbor, the University of Pennsylvania, but that seemed to me almost an advantage back then. That is, Drexel had carved out a unique niche for itself as a technical university, somewhat like MIT but smaller, that provided a first-class education in a somewhat smaller range of degree programs than were offered by larger, more traditional institutions. The College of Arts and Sciences seemed to me then, and to a certain extent still seems today, a real jewel, Drexel's "secret weapon," so to speak, because while most large universities had class sizes ranging anywhere from 40 to several hundred students, most of the courses in the humanities at Drexel were capped at 25 students. Drexel also boasted some first-class scholars who were as committed to teaching as to scholarship. Drexel was providing its students with what was effectively the same quality of education in the humanities as is provided at small liberal-arts colleges, while at the same time giving them invaluable hands-on work experience (through its co-op programs) that few liberal-arts colleges could provide.

Drexel asked me to stay on for a second and then a third year, despite the fact that my beginning was less than auspicious: at the end of that first fall term, I mistakenly conflated the times of what should have been two separate exams and hence left my students sitting in a room, waiting patiently for almost an hour for me to materialize and administer the exam. It was too late, of course, to do anything by the time I learned of the mistake via a phone call from one of the secretaries in the department. I was relieved that not only was the then chair of the department, Ray Brebach, not angry with me, but he was actually eager to see if I would be willing to stay on for another year. Ray has been one of my favorite colleagues ever since.

I received my tenure-track appointment in the spring of 2001. I liked my department. It was originally the Department of Humanities and Communications and included the disciplines of English, philosophy and communications. It was enormously stimulating to be in such a cross-disciplinary department. There were poets and novelists, as well as traditional literary scholars. I particularly liked being around the communications people, however, because many were engaged in politically significant media studies and that sort of work was reminiscent of the dinner-table discussions I remembered from childhood when my father was an editorial writer for one of the two newspapers in the town where I grew up. My association with the communications people led to the publication of an article I wrote together with my husband on the behavior of the mainstream media in the U.S. leading up to the second Iraq war.

Eventually, however, the communications people left our department and formed a new department together with the anthropologists and sociologists called the Department of Culture and Communications. So then we became the Department of English and Philosophy. I was sad to see the communications people go, but there were still plenty of creative writing people in the department who helped to make it a more stimulating environment than it would have been had it been comprised exclusively of traditional scholars. These people, including Miriam Kotzin and Don Riggs, both brilliantly talented poets, are some of my closest friends. Miriam has encouraged me to write for her outstanding online journal Per Contra, and Don, a talented caricaturist as well as poet, drew the picture of me that I occasionally use for this blog.

It was an ordeal, however, to go up for tenure. Our department has a tradition of requiring monstrously comprehensive tenure and promotion binders into which must go almost everything one has done on the road to tenure or promotion. I think each one of my tenure binders was around 500 pages in length (people going up for tenure or promotion must produce three separate binders: one for teaching, one for service, and one for scholarship). It took me the entire summer of 2006 to put those binders together, a summer when I would much rather have been writing material for publication. To add possible injury to the insult of having to devote so much time to the compilation of these binders was my fear that some of the reports of my "external reviewers" might not be so positive as they would have been had I not become involved in a scandal in Denmark surrounding a controversial Danish biography of Kierkegaard. I lost several friends, including the aforementioned Bruce Kirmmse, as a result of my role in that controversy, friends whom I feared might well have been recruited to serve as external reviewers.

To this day I don't know who all the reviewers were. Two were selected from a list I had provided my tenure committee, but the rest were selected by the committee itself. Whatever the reviewers said, however, it was not so negative as to override what subsequently became apparent was the high esteem in which my colleagues held me and my work. I was granted tenure in the spring of 2007, and I have fond memories to this day of the little reception provided by the dean for all faculty who were granted tenure that year. There was champagne and there were toasts and I was very, very happy.

I’d always been happy at Drexel, so I was surprised by the change that took place in me upon my becoming tenured. I felt, suddenly, that I had a home. I felt that people both liked and respected me. More even than that, however, I felt that I had found a community of high-minded people. People committed to principles of justice and fairness. I felt I had found a small community of the sort that Alasdair MacIntyre argues in After Virtue we must find if we are to live happy and fulfilling lives, the kind of community that is increasingly rare in contemporary society.

That all seems long ago now. Drexel has grown and changed. I am still fortunate, however, to have many brilliant, talented, and fair-minded colleagues. Thanks must go to my colleague Abioseh Porter, who chaired the department for most of the time I have been at Drexel and who was a staunch supporter of my development as “public intellectual” long before “public philosophy” enjoyed the vogue it does today. Thanks must also go to the members of my promotion committee, but especially to my colleague Richard Astro, who chaired the committee. I know from merely serving on tenure-review committees that no matter how uncontroversial the final decision is anticipated to be, there is an enormous amount of work demanded of the committee members, simply because of the level of detail required in the final report.

Thanks must also go to everyone who has supported me throughout my career. I set out, actually, to list each person individually, but then I realized that there are many, many more people than I would ever be able to list. I have been very fortunate.

Thank you everyone. Thank you for everything.

Nonconsensual Democracy

We don’t have democracy in the U.S. People often respond to this observation with the somewhat patronizing explanation that no, of course we do not. What we have in the U.S., they explain, is a “republic,” or a “democratic republic.” That isn’t what I mean, however, when I say we don’t have democracy in the U.S. Qualifying the U.S. as a democratic republic does not avoid the issue of the necessity of the representation of the popular will in political decision making because a “republic,” according to Merriam-Webster, is a state, or form of government, “in which power rests with the people or their representatives.” That is, our “representatives” are supposed to “represent” our combined political will. And yet, they rarely do that.

Poll after poll has shown that the majority of Americans want universal healthcare. Bill Clinton promised to provide such healthcare, but failed to deliver it. Obama promised this as well, but Obamacare, while an improvement on the system that preceded it, still falls short of what Americans really want.

Universal healthcare isn’t the only thing Americans want that they do not currently have. There are lots of other things they want, things such as a better system of public education, free higher education, a better infrastructure, a living minimum wage, a crackdown on the abuses of the financial industry, etc., etc. People aren’t going to get any of these things, though, because they have no influence over the political process.

The problem is twofold. First, there is the sheer stupidity of a large portion of the American electorate that allows itself to be brainwashed by political propaganda the relentless message of which is that the things they want (and which are available in other countries) are not possible. Second, there is the willingness of an equally large, if not larger, portion of the electorate to be bullied into voting for the “lesser evil” of two candidates, neither of whom represents what they want.

The simple truth is that democracy cannot work if people allow their votes to be determined by ignorance and fear. The foundation of democracy is the Enlightenment ideal of rational self-determination. Human beings, argued Enlightenment thinkers such as Immanuel Kant, are inherently rational. This means not only that they are capable of making intelligent decisions concerning how they want to live their lives, but also that they cannot achieve full humanity if they are not allowed the freedom to make these decisions. That human beings are rational requires that we respect them as such and endeavor to organize society in a way that will allow them to fulfill their distinctively human potential for self-governance. Political democracy is an outgrowth of this insight. A person’s vote is the means by which he expresses his political will, his consent to certain ideals of social governance.

There has been a lot written lately about consent because of what appears to be an epidemic of sexual assault on college and university campuses. Can a young woman “consent” to sexual intercourse if the person “requesting” it has drugged her, or has threatened violence against her? Is she “consenting” if she “puts out” because she is too weak and addled to resist, or if she is simply afraid of having her head bashed in? Most people easily see that when “consent” is coerced in such ways, it is not really consent.

Few people seem to understand these otherwise straightforward aspects of consent when consent is placed in a political context. Voters are bombarded with propaganda to the effect that what they want is not possible. Many are so swayed by this propaganda that they can no longer think clearly about the issues to which it relates. The purpose of propaganda is precisely to circumvent rational thought. It works like a drug, depriving those it influences of autonomous judgment. Of course, people conclude, if these things were possible, then we would all do our best to see that they became actual, but, alas, they are not possible, so to work for them is a waste of time.

Is that sort of resignation the expression of an autonomous will? The answer is obviously no. Such people are acting from ignorance, not knowledge. If they knew that what they wanted was possible, they would take steps to achieve it. But they have been misinformed. They have been told it would be counterproductive to pursue such things and, since no one wants to waste energy, they refashion their political hopes to fit what they are told is more reasonable. A person whose judgment is clouded by a fog of propaganda cannot give informed consent to a political platform any more than a person who has been drugged can give such consent to sex.

But wait, there’s more. Not everyone is taken in by political propaganda. Some people know that not only are the political changes they want possible, they are genuine realities in other parts of the world. A special indignity is reserved for people who dare to keep their political wits about them despite the fact that they are bombarded with propaganda designed to undermine them. These people get to be fully conscious participants in their own degradation. Okay, respond the powers that be, you go ahead and vote your conscience, vote for someone who promises to bring about the kinds of changes you want. You know what will happen? You will get someone far, FAR worse than the “moderate” candidate you deem not good enough for you. The rest of the electorate, the sonorous voice continues, is not so forward thinking as you are. You will be “wasting your vote” on a candidate who doesn’t have a chance, and in that way, you will ensure that your worst political nightmares will come true.

It’s as if your date, upon realizing that the drug he’d given you hadn’t worked, threatens to beat the hell out of you if you refuse to have sex with him and then has sex with you against your will anyway. You can “consent” to something horrific, or you can refuse to consent and endure something even worse.

If you give in to such threats, have you consented to having sex with the person who made them? The answer is pretty clearly no. You’ve been violated every bit as much as if you had simply been taken against your will. In fact, one could argue that there is more dignity in resistance than in giving in because if you give in, not only have you been violated by someone else, you have, in a sense, betrayed yourself as well.

People who vote for the lesser of two evils aren’t expressing their political will in any kind of meaningful sense. They are acting out of fear. They’re not approving of the platform of the candidate they “choose.” They are merely expressing disapproval of the platform of the bogeyman with which they have been threatened. They have surrendered their autonomy to fear. The weak-minded have their autonomy stolen from them by the insidious drug of propaganda. Those who are more temperate and level-headed have it wrested from them at knifepoint, so to speak.

Why do I insist on voting my conscience in the face of imminent political disaster, I’ve been asked again and again. My answer is always the same: Because it is the only way democracy can actually work. If people allow their political vision to be clouded by propaganda, or surrender their autonomy to their fear of an unthinkable future, then it doesn’t matter how many people turn out to vote because the votes themselves are not an expression of the popular will, but merely of ignorance and fear. And the result of those votes is a foregone conclusion they have had no positive part in determining.

(This piece originally appeared in the 7 March 2016 issue of Counterpunch under the title “Nonconsensual Democracy and the Degradation of the American Electorate.”)

On Political Bullying and the Hell of Knee-Jerk Feminism

I understand Bernie Sanders has a huge flock of male chauvinist supporters. That seems implausible, doesn’t it? I’m not disputing that someone is posting offensive sexist responses to comments by Clinton supporters on various websites. What I’m skeptical of is the claim that such comments are coming from Sanders’ supporters. I’m not saying there is no such thing as a genuine leftist who is also sexist. They exist. The British are particularly prone to this personality disorder. I doubt, however, that there are many British who are all that involved in online debates among the supporters of various candidates for the Democratic presidential nomination in the U.S.

The purported “Bernie Bros” movement is about as plausible as a group called “Vegans for Trump.” In fact, “Bernie bros” sounds very much like an invention of some public relations firm hired by the Clinton campaign. You remember the public relations industry, the people who invented equally implausible fake “grassroots” groups such as the “National Smokers Alliance,” “a supposedly independent organization of individual smokers which claimed that bans on smoking in public places infringed on basic American freedoms” (Trust Us, We’re Experts, p. 239), and the “Wise Use” movement, a fake grassroots group opposed to environmentalism (Trust Us, We’re Experts, p. 20).

The Bernie Bros have been charged with “mansplaining” political issues to Clinton supporters. It wasn’t clear to me, at first, what “mansplaining” was, so I looked it up. It’s apparently a type of explanation that is condescending or patronizing, typically made by a man to a woman whom he assumes may have difficulty understanding what he is trying to say because she is, well, a woman. Now that, of course, is bad. From what I have been able to gather, however, the “mansplaining” of Sanders’ supporters is characterized not by condescension or contempt, but by factual references and valid inferences. That is, Bernie Bro “mansplainers” use sound arguments as rhetorical clubs to beat down the specious arguments of people who claim that the facts, and the valid inferences that can be drawn from them, are not relevant to the issue of Clinton’s fitness to hold the highest office in the land.

I have to tell you that, as a woman, I take offense at the implication that sound arguments are somehow inherently masculine and that using them to defend one’s political position constitutes a type of bullying. It can indeed be humiliating to have one’s errors in reasoning publicly exposed, and I have a certain sympathy for the plight of Clinton supporters for whom this ordeal must seem unrelenting. No one is forcing them to go to the barricades, however, for someone whose record makes her effectively indefensible.

Polls suggest that Clinton’s main supporters are older women. That makes me wonder whether the teaching of critical reasoning is a relatively recent pedagogical development. Learning to recognize fallacious arguments and non-argumentative rhetoric takes some training (see philosopher Stephen Stich’s “Could Man Be an Irrational Animal?”, Synthese 64 [1985]: 115-135). Perhaps many older women failed to receive that training.

Madeleine Albright appears, in any case, never to have taken a first-year critical reasoning course. Albright rebuked female Sanders supporters at a rally for Clinton in New Hampshire. She reminded everyone that the battle for gender equality had not yet been won, that there was still much work to be done before it would be, and that part of that work involved supporting Hillary Clinton for the Democratic presidential nomination. “Just remember,” she concluded, “there’s a special place in hell for women who don’t help each other.”

Really, Madeleine? Do you really think women should support other women simply because they are women? Where would you draw the line? Should women always support other women who seek political office, no matter what their views? Should all the women in the U.K. have supported Margaret Thatcher, simply because she was a woman, even if they disagreed with her conservative views? So women don’t get the same freedom of choice as men do? They don’t get to vote their consciences? And if they dare to do that, they’re bad people?

That sort of effort at persuasion is, in fact, a very specific form of informal fallacy known as “peer pressure,” which is itself one of a family of informal fallacies referred to as “appeals to emotion.” When you can’t get people to agree with your position on its merits, just try making them feel really bad about disagreeing with you. So instead of Clinton supporters attempting to use sound reasoning to persuade women that Clinton is the better Democratic candidate, they hurl invectives at them such as “You’re betraying women!” or better yet: “You’re going to hell!”

Really, Madeleine? Do you really think this generation of educated young women is going to be taken in by such transparently underhanded rhetorical tactics as that? Really, Hillary? You’re not going to denounce that kind of tactic?

If you want an example of bullying, there it is.

There was a time, way back in the early days of feminism, when some cognitively challenged feminist scholars argued that logic was inherently masculine, that while men made decisions based on reasoning and logic, women made them based on intuitions and emotions, and that this was an equally valid way of making decisions (see, for example, Carol Gilligan’s In A Different Voice). Fortunately, this view has few followers nowadays. Years of increased access for women to high-quality education have made it glaringly obvious that men do not have a monopoly on rationality and that the whole logic-versus-emotions view of reasoning was itself a false dichotomy based on an inadequate understanding of the complexity of rational thought.

Albright is right, of course, in her observation that women’s fight to “climb the ladder” of equality with men is not done. Bullying them to vote for a candidate against their own better judgment is hardly going to advance that cause, however. The Clinton campaign’s knee-jerk “feminism” is creating a hell of its own, and not just for women who refuse to jump on the Clinton bandwagon, but for all women, because it will only confirm in the minds of horrified onlookers that women are not actually so rational as they claim, and hence will set the whole feminist movement back decades.

(This piece originally appeared in the 26 February 2016 issue of Counterpunch.)