“Fake News” and the Responsibility of Philosophers

“Fake news” is not actually a new phenomenon. Sheldon Rampton and John Stauber document in their book Trust Us, We’re Experts that it is an invention of the public relations industry as old as the industry itself. The First Amendment makes it pretty hard to prevent such efforts to manipulate public opinion. That’s the price it appears we have to pay to live in a free society. It wouldn’t be so serious a problem as it is if people in the field of higher education didn’t fall down on their responsibility to alert the public to it.

A recent case in point is an article entitled “Study: For-Profits Match Similar Nonprofits in Learning Results,” which ran in the January 11th issue of Inside Higher Education. The third paragraph cites the study as claiming that “[i]n all six comparisons, students at proprietary institutions actually outperformed the students at the nonproprietary comparison institutions.”

Who would have thought that? I mean, really, aren’t the for-profits infamous for having poor learning outcomes? One doesn’t even have to look at the original study, however, to realize that something about it is fishy. The first red flag is the fact that the study uses the euphemism “proprietary” institutions rather than the straightforwardly descriptive “for-profits.”

The study is described as measuring “learning outcomes in six areas for 624 students from four for-profit higher education systems, which the study does not name, and then compar[ing] the scores with those of a matched group of students from 20 unnamed public and private institutions that were selected because they were similar to the for-profits on key measures related to academic performance” (emphasis added).

The second red flag is the “matched group of students.” Matched in what sense? That isn’t explained.

The third red flag is that neither the traditional nonprofit institutions nor the for-profit ones are named.

The fourth red flag is that the nonprofit institutions were selected because they were “similar to the for-profits on key measures related to academic performance.” Really? Since for-profits are reputed to have abysmal results in terms of academic performance, they must have searched long and hard to find nonprofits that had similarly abysmal results, if indeed they really did find such institutions, which cannot be verified since they are “unnamed.”

The whole thing reminds me of an old television commercial for Rolaids. Someone dumps a white powder into a beaker of what appears to be water with red food coloring in it, then stirs it in; the liquid gradually becomes clear again, while a voiceover announces, “In this test with Rolaids’ active ingredient, laboratory acid changes color to PROVE Rolaids consumes 47 times its weight in excess stomach acid.”

There was no way, however, to prove that the beaker had actually contained acid, or that what had been dumped into it was really Rolaids’ “active ingredient,” or indeed even that the change in color represented Rolaids’ “consuming” anything, let alone acid, not to mention how much acid. I credit that commercial with starting me on the road to becoming a philosophy professor because even as a child I found it outrageous that someone should expect I would take it as proving anything.

One of the chief duties of philosophers, I believe, is to expose errors in reasoning, and man, were there errors of reasoning in that commercial. I learned very early that commercials were not to be trusted. Most people know that, I think. Most people know to be skeptical when, for example, a commercial claims that some detergent removes stains better than any other detergent ever invented and presents what purports to be proof.

Most people know to be skeptical about claims made in commercials. Unfortunately, most people do not know to be skeptical about claims made in what is presented to them as “news.” That’s why I use Rampton and Stauber’s book when I teach critical reasoning. I feel it is part of my responsibility as a philosopher to alert my students to the pervasiveness of the practice of dressing up propaganda as news.

Back to the education “study.” Even if the study were genuine, the results are pretty much useless because the whole study is circular. That is, the study admittedly sought out “matched” students at “similar” institutions. It thus isn’t surprising that the for-profits come out looking better than one would expect if the selection of students and institutions had been random.

The study was conducted by a group called the Council for Aid to Education, or CAE. The “Executive Summary” (p. 2) of the study makes it very clear where the CAE stands on the for-profits. “The proprietary education sector stands at a crossroads,” it begins.

Proprietary colleges and universities are key providers of postsecondary education in the United States, enrolling over 1.7 million students. However, the sector has seen its enrollment decline since its peak in 2010 due to the growing employment opportunities following the Great Recession, the heavy regulatory burdens imposed during the last six years, and the perception that education at proprietary institutions is not on par with that offered by their non-proprietary peers.

The Council for Aid to Education (CAE) believes this junction presents a critical time to explore the efficacy of proprietary institutions and to document the student learning they support.

If there were any doubt in anyone’s mind concerning the study’s lack of objectivity, the opening of the “Executive Summary” should remove it. The CAE set out to show that the for-profits were doing as good a job of educating students as are traditional nonprofit institutions of higher education.

Of course the CAE is within its rights to do this. The problem is not so much the CAE’s clear bias in favor of the “proprietary education sector” as Inside Higher Education’s failure to expose that bias. Inside Higher Education purports to be “an independent journalism organization.” This “journalistic independence is critical,” IHE asserts in its “Ownership Statement,” “in ensuring fairness and thoroughness” of its “higher education coverage.”

The thing is, Quad Partners, “a private equity firm that invests in the education space,” purchased a controlling share of IHE in 2014. That is, Inside Higher Education is now an arm of the “proprietary education sector.” So the purported “independence,” “fairness,” and “thoroughness” of its reporting on issues in higher education appears now to be only so much more propaganda in the service of the for-profits.

Doug Lederman, the editor of Inside Higher Education, protested to me in an email, after he saw an earlier version of this article that appeared in Counterpunch, that he and the people over at IHE had had their own suspicions about that piece and that that was why they had given it only a “barebones Quick Take.”

“What confuses me,” he said,

is why you viewed our minimalist and questioning treatment of the CAE research as evidence that we are in the tank for the for-profits because our lead investor has also invested in for-profit higher education––rather than as proof that our ownership situation has changed us not at all.

I fear Lederman may be right in protesting that IHE had not been willingly shilling for the for-profits. It apparently didn’t even occur to him that the suspicions he and others had had about the study should have led them to do a full-scale investigation of it (an investigation that would have involved actually reading at least the “Executive Summary” of the study, to which they included a link in their article) and to publish an exposé on the study as a piece of propaganda for the for-profits rather than a “barebones” article that presented it as “news.”

What concerns me is not so much that the for-profits are trying to manipulate public opinion to make it more favorable toward them. What concerns me is that the editors of a leading publication that reports on issues in higher education don’t have the critical acumen to identify what ought to have been readily identifiable as a piece of “fake news,” or the journalistic experience and expertise to know what to do with it once they have identified it as such.

That’s disturbing.

(An earlier version of this piece appeared in the 12 January 2017 issue of Counterpunch.)

On the Demise of the Professoriate

Michael Schwalbe’s recent article in Counterpunch, “The Twilight of the Professors,” paints a rather darker picture of the future of the professoriate than I believe is warranted. Or perhaps it would be more correct to say that it paints a somewhat misleading picture of the dynamics behind the demise of the professoriate as a positive force for social and political progress.

Schwalbe is correct that the “tightening of the academic job market has intensified competition for the tenure-track jobs that remain.” He’s also correct that it is prudent for graduate students to focus their efforts on publishing in academic journals rather than in media more accessible to a general readership. Hasn’t that always been the case, though? The problem with academic journals, I submit, is not so much that their intended audience is academics as that most of these journals just aren’t very good. The pressure on academics is not merely to publish in academic journals but also to edit them, with the result that there are now too many of them and too many of questionable quality. Almost anyone can get published in an academic journal nowadays, but much of the material that is published in them, as Alan Sokal demonstrated to devastating effect back in 1996, is gibberish.

The situation is not much better with academic presses than with scholarly journals. Even some of the top presses are publishing material that would never have seen the light of day in earlier periods when there was greater quality control. Nearly all the emphasis in academia these days, as in the larger society, is on quantity rather than quality. Academic presses, such as Lexington Books, send out mass emails to academics, effectively trawling for book proposals. I spoke about this problem recently with a representative from the more selective German publisher Springer. “These guys are just publishing too much,” he said, smiling in a conspiratorial way.

No one can keep up with which among the proliferating academic journals and presses are actually any good, so emphasis tends to be placed on the quantity of publications a scholar can amass rather than on their quality. This means, of course, that the savvy self-promoter with little of any real value to contribute to the life of the mind can more easily carve out an academic career now than can a genuine intellectual who would have actual scruples about dressing up old insights as new ones or about publishing what is effectively the same article over and over again.

The problem is not that academic journals are in principle of no popular value so much as it is that most academic journals these days are in fact of no popular value because there are just too damn many of them and most of them are no damn good. Hardly anyone actually reads them, even among academics.

It may be true, as Schwalbe observes, that graduate students are advised to craft Facebook pages and Tweets “with the concerns of prospective employers in mind,” but what does that mean? The prospective employers in question are other scholars, not university administrators. There are too many demands on the time of most university administrators for them to scrutinize the Facebook pages and Tweets of all the scholars who earn the department hiring committee’s seal of approval. The problem, I believe, is less that hiring committees are on the lookout for political radicals than that they’re too often on the lookout for people who are going to show them up. Few people are possessed of such high self-esteem that they are comfortable in the company of someone they suspect might actually be smarter than they are, and academics are no exception.

The growing reliance on “contingently employed” academics “is further conservatizing,” charges Schwalbe. The argument that such faculty will censor their writing in order not to offend their employers sounds good in the abstract, but as is so often the case with arguments that are internally coherent, it doesn’t correspond to the facts. Some particularly fearful and feeble-minded underemployed academics may do this, but it doesn’t take long for contingent faculty to realize that most of the tenured faculty in their own departments, to say nothing of university administrators, don’t even know who they are, let alone what they are writing.

Contingently employed academics represent a growing army of educated, literate, yet grossly underpaid workers. Such a population is the ideal breeding ground for political radicalism and, indeed, some are beginning to unionize.

Demands for grant-getting, as Schwalbe observes, undeniably slant research in the sciences in the corporate direction. But most leftist public intellectuals have traditionally come from the humanities rather than the sciences.

The real threat, I believe, to the professoriate as a force for positive social and political change comes not so much from the factors Schwalbe mentions as from things more deeply rooted in American culture, such as egoism and anti-intellectualism. The egoism that is fostered by so much in American culture keeps many academics from making what appear on a superficial level to be personal sacrifices even for the good of their students, let alone for the good of society more generally (I say “on a superficial level” because faculty who make such “sacrifices” are rewarded many times over by the satisfaction of actually bettering the lives of their students and, in that way, of humanity more generally). Tenured faculty have a responsibility to help their students develop the critical, analytical, and communicative skills that are essential to actualizing the distinctively human potential for self-determination, but too many abdicate this responsibility because of the time and effort required to live up to it.

The professoriate is almost universally opposed to assessment. I have never been an opponent of it, however. I’m well aware, of course, that it can be abused, but it has become increasingly clear to me that at least one reason so many academics are opposed to it is that it would reveal that they are not, in fact, teaching their students much.

Some effort at assessment of student learning in the humanities could be a vehicle of revolutionary change in that it would put pressure on tenured faculty actually to teach students something, and would expose that the working conditions of many contingent faculty are such that requiring this of them is like asking them to make bricks without straw.

Assessment could be a force for radical social and political change in that, implemented properly, it would make all too clear both how decades of the dismantling of the K-12 system of public education and the analogous onslaught on the funding of higher education have not simply resulted in a generation of less-than-excellent sheep, but also, as Ray Marshall and Marc Tucker argue in Thinking for a Living: Education and the Wealth of Nations (Basic Books, 1993), threaten the social and economic future of this country. In fact, assessment in higher education could have such a profoundly progressive effect that if I didn’t know better, I’d think the movement against it was a conservative plot.

It isn’t a conservative plot, though, unless conservatives are far more devious than most of us imagine and their whole sustained attack on education in general was originally designed to produce an academic job market that was so neurotically competitive it would gradually weed out academics committed to anything other than the advancement of their own, individual careers.

It’s counterproductive to demonize university administrators. There are some bad ones, of course, and their salaries, like the salaries of their corporate equivalents, need to be brought back into line with those of the individuals they supervise. It’s not university administrators, however, as Schwalbe claims, who are responsible for the purported decline in leftist intellectuals, but scarcity conditions in the academic job market that are ultimately traceable back to American egoism and anti-intellectualism. And American egoism and anti-intellectualism are problems that are far less easily solved than the largely phantom “conservatizing trends” in higher education that Schwalbe discusses in his article.

(This piece originally appeared in the 8 June 2015 edition of Counterpunch under the title “The Real Threat to the American Professoriate.”)

On the Importance of Learning a Second Language

There is an article in today’s New York Times entitled “The Benefits of Failing at French” that reminds me of a debate in the Times back in 2011 entitled “Why Didn’t the U.S. Foresee the Arab Revolts?” Six scholars, academics, political appointees and think tankers debate the issue in The Times online. They all appear to believe it is very complicated.

Jennifer E. Sims, a professor and director of intelligence studies at Georgetown University’s School of Foreign Service and a senior fellow at the Chicago Council on Global Affairs, thinks the problem is our over reliance on foreign assistance.

Reuel Marc Gerecht, a former CIA officer, thinks it’s that we were captured by “group think.”

Vicki Divoll, a professor of U.S. government and constitutional development at the United States Naval Academy, the former general counsel to the Senate Select Committee on Intelligence and assistant general counsel to the C.I.A., thinks the president is at fault for failing to allocate sufficient resources to the CIA. On the other hand, she says, “no amount of resources can predict the unknowable. Sometimes no one is to blame.”

Richard K. Betts, the Arnold A. Saltzman Professor of War and Peace Studies, director of the International Security Policy program at Columbia University and the author of Enemies of Intelligence: Knowledge and Power in American National Security, thinks the problem is that “it is impossible to know exactly what will catalyze a chain of events producing change.”

Celeste Ward Gventer, associate director of the Robert S. Strauss Center for International Security and Law at the University of Texas at Austin and a former deputy assistant secretary of defense, thinks the problem is that we’re too preoccupied with “foreign policy minutiae.”

Peter Bergen, the director of the national security studies program at the New America Foundation and the author of The Longest War: The Enduring Conflict between America and Al-Qaeda, thinks the explanation is as simple as that revolutions are unpredictable.

There is probably some small grain of truth in each of these rationalizations. I’m only a professor of philosophy, not a professor of political science, let alone a former governmental bureaucrat, political appointee, or think-tank fat cat. It seems pretty clear to me, however, that despite all the theories offered above, the real reason we didn’t see the revolts coming was good old-fashioned stupidity. That’s our strong suit in the U.S.–stupidity. We’re the most fiercely anti-intellectual of all the economically developed nations, and proud of it! We go on gut feelings. Oh yes, our elected officials even proudly proclaim this. We don’t think too much, and on those few occasions when we do, we’re really bad at it for lack of practice.

One of the great things about Americans is that they are probably the least nationalistic people in the world. Oh yeah, they trot out the flag on the Fourth of July and for the Super Bowl, but that’s about it. A few crazy fascists brandish it throughout the year, but most people, except for a brief period after September 11th, pay no attention to them. Danes, in contrast, about whom I know a little because I lived in Denmark for eight years, plaster Danish flags all over everything. Stores put them in their windows when they have sales, they are standard decorations for almost every holiday and a must, in their small toothpick versions, for birthday cakes. This isn’t because they suffer from some sort of aesthetic deficiency that compels them to turn to this national symbol for want of any better idea of how to create a festive atmosphere. No, Danes throw Danish flags all over everything because they are incredibly nationalistic, as is just about every other European and almost everyone else in the rest of the world who’s had to fight off the encroachment of foreign powers onto their national sovereignty. We’ve seldom, OK, really never, had to do that. Still, if we, you know, seriously studied European history, we would have something of an appreciation for how basic nationalism is to the psyches of most people in the world, and we could use this as our point of departure for understanding the dynamics of international relations, as well as for appreciating the obstacles to our understanding of the internal dynamics of other countries.

Years ago, when I had just returned to the U.S. after having spent the previous eight years living in Denmark, I accompanied one of my former professors to a Phi Beta Kappa dinner in Philadelphia (he was the member, not I). The speaker that evening was the former editor of the one-time illustrious Philadelphia Inquirer. His talk, apart from one offhand comment, was eminently forgettable. That one comment, however, left an indelible impression on me. This editor, who I think was Robert Rosenthal, mentioned, at one point, that he did not think it was important for foreign correspondents to know the language of the country from which they were reporting because, as he explained, “you can always find someone who speaks English.”

How do you begin to challenge a statement of such colossal stupidity? It’s true, of course, that you can always, or at least nearly always, find a person who speaks English. I don’t mean to suggest that that’s not true. The problem is, if you don’t know the indigenous language, to use an expression from anthropology, then you really have no idea whether you are being told the whole story. And the thing is, if you ever do become fluent in a second language, and more or less assimilated into a culture into which you were not born, you will know that foreigners are never given the whole story. This was clear to me as a result of my having lived in Denmark, a country with which we are on friendly terms, a country that in many ways is strikingly similar to the U.S. How much clearer ought it to be with respect to countries with which we are not on friendly terms, countries we know are either deeply ambivalent about us or outright hate us?

You will always get a story in English, certainly, from a native about what is going on in some other country, but if you don’t know the language of the people, then you aren’t really in a position to assess whether the story might be biased. You might have some idea of the social class of the person who is your source, but how are you going to know what the people as a whole think of this class, or of this individual? How are you going to know whether this person has some sort of personal or political agenda, or whether he is simply attempting to whitewash what is going on out of national pride, or a fear of being perceived by foreigners as powerless, or provincial, or intolerant?

This seems a fairly straightforward point, yet it is one that nearly all Americans miss. We generalize from our own experience. We assume everyone is just like we are, or just like we are taught to be, which usually means that we assume pretty much everyone in the world is motivated primarily by the objective of personal, material enrichment. We don’t really understand things such as cultural pride or what is, for so much of the rest of the world, the fierce desire for self-determination, so we are pretty much always taken by surprise when such things seem to motivate people. That’s the real meaning of “American exceptionalism,” an expression that is used in an increasing number of disciplines, from law, to political science, to history, with varying shades of meaning in each. That is, the real meaning is that our difference from the rest of the world is that we are dumber. Yes, that’s right, we are the dumbest f#*@!ing people on the face of the earth, and just now, when we need so desperately to understand what is going on in other parts of the world, we are reducing, and in some instances even completely eliminating, the foreign language programs in our schools and universities.

It’s no great mystery why we didn’t foresee the Arab revolts. The mystery is why we seem incapable of learning from either history or our own experience. It doesn’t help for the writing to be on the wall if you can’t read the language.

(This piece originally appeared under the title “The Writing on the Wall” in the February 28, 2011 edition of Counterpunch)

On Violating the First Amendment

A friend, Dave Nelson, who is a standup comedian and academic, jokes that the thing about being an academic is that it gives one a kind of sixth sense, like the kid in the M. Night Shyamalan film, except that instead of seeing dead people you see stupid ones. He’s right about that. Sometimes this “gift” seems more like a curse in that one can feel overwhelmed by the pervasiveness of sheer idiocy. When I saw the New York Times’ piece from April 2, “Supreme Court Strikes Down Overall Political Donation Cap,” I wanted to crawl right back into bed and stay there for the rest of my life. Even the dissenting opinions were idiotic. Limits on the size of contributions to individual candidates are still intact. It’s just the overall caps that have been removed, so now while you can’t give more than $2,600 to a single candidate, you can give to as many candidates as you like. It seems the dissenters are worried, however, that the absence of an overall cap raises the possibility that the basic limits may be “circumvented.”

That sounds to me just a little bit too much like arguing over how many angels can dance on the head of a pin. “There is no right in our democracy more basic,” intoned Chief Justice Roberts, “than the right to participate in electing our political leaders.” Oh yeah? Well, if a financial contribution to a political campaign counts as “participating in electing our political leaders,” then a whole slew of Americans’ First Amendment rights are being violated all the time in that many Americans don’t have enough money to pay for the basic necessities of life, let alone have any left over to contribute to political campaigns. The rules of the political game have been written in such a way that the “participation” of the poor is limited before the process even gets started. Sure, they can attend protests, write letters, etc., etc. Or can they? What if their penury forces them to work around the clock? What if they are effectively illiterate? Even if they could do these things, however, the extent to which they could affect the political process is limited by the rules of the process itself. They have less money, so they have less say.

Philosophers are fond of invoking the ceteris paribus clause. All other things being equal, they say, this or that would be the case. The point, however, of invoking the ceteris paribus clause is to expose that all other things are not in fact equal. Ceteris paribus, limits on campaign contributions would infringe on people’s First Amendment rights. So if we think such limits do not infringe on people’s First Amendment rights, the next step is to ask why we think this. The obvious answer is that all other things are not equal. That is, people do not all have the same amount of money. Even in a country such as Denmark, which has effectively wiped out poverty and hence where everyone in principle is able to contribute money to political campaigns, some people are able to contribute much more money than other people and thus to have a much greater influence on the political process. Danes, being the intelligent people they are, understand that such an inequity is antithetical to democracy, so they legislate that political campaigns will be financed with precisely the same amount of money and that this money will come directly from the government rather than from individuals.

This is, in fact, how pretty much every country in the economically developed world finances political campaigns and presumably for the same reason. Everyone who is not irredeemably stupid understands that to tether the ability of an individual to participate in the political process to the amount of money he can spend on such “participation” is a clear violation of the basic principles of democracy. If writing a check is a form of political expression, then the economic situation of millions of Americans excludes them a priori from such expression, which is to say that their rights are unjustifiably curtailed in a way that the rights of the wealthy are not. (I say “unjustifiably” on the assumption that few people would openly defend the explicitly Calvinist view that the poor are poor through their own fault.)

So the issue here is not really one of defending the First Amendment. It’s “pay to play” in the U.S. You have no money, you have no, or almost no, political voice. Pretty much everyone who is not, again, irredeemably stupid understands that. The haggling is not about protecting people’s First Amendment rights. It’s a power struggle between what in the eyes of most of the world would be considered the wealthy and the super wealthy.

But then one should not underestimate the number of the irredeemably stupid. “The government may no more restrict how many candidates or causes a donor may support,” pontificated Roberts, “than it may tell a newspaper how many candidates it may endorse.” Anyone who’s had an introductory course in critical reasoning, or informal logic, will immediately see that the analogy Roberts has drawn here is false. Roberts is using the terms “support” and “endorse” as if they were synonyms. They’re not synonyms, though, at least not in Roberts’ analogy. The “support” a “donor” gives to a political candidate is financial, whereas the “endorse[ment]” a newspaper gives to a candidate is editorial. To suggest that such a distinction is unimportant is to beg the question. God help me, we must be the only country on the face of the earth where someone can make it not only all the way through law school without understanding such errors in reasoning, but all the way to the Supreme Court.

But fallacious reasoning isn’t the worst of Roberts’ crimes. Many on the left have long suspected people on the right are more or less closeted fascists. Well, Roberts has come out of the closet. Yes, that’s right. Roberts explicitly compared the removal of overall limits on campaign contributions to Nazi parades. If the First Amendment protects the latter, he asserted, then it protects the former. The analogy here is just as bad as the earlier one, given that a person doesn’t have to pay to march in a parade. It’s a little more revealing, however, to those who have eyes to see.

(This piece originally appeared in Counterpunch, 4-6 April 2014)

Lies, Damned Lies, and Public Discourse on Higher Education

Two staggeringly inane points are being made ad nauseam in public discourse about higher education. The first is that tenure is an institution that has far outlived its usefulness (if it ever was useful). The second is that universities today need to focus on providing students with the technical skills they will need in order to effectively tackle the demands of the contemporary, technologically advanced workplace.

Kevin Carey, director of the education policy program at the New America Foundation, wrote last summer in The Chronicle of Higher Education that tenure was “one of the worst deals in all of labor. The best scholars don’t need tenure, because they attract the money and prestige that universities crave. A few worthy souls use tenure to speak truth to administrative power, but for every one of those, 100 stay quiet. For the rest, tenure is a ball and chain. Professors give up hard cash for job security that ties them to a particular institution—and thus leaves them subject to administrative caprice—for life.”

Carey seems to have confused tenure with indentured servitude. Tenure does not tie professors to particular institutions. A tenured professor is just as free to move to a new institution as a non-tenured one. Few will leave a tenured position for an untenured one, but that doesn’t make them less mobile than they would be if tenure were abolished. Academic stars seldom have difficulty moving from one tenured position to another, and professors who are not stars seldom have the opportunity to move.

I’m uncertain what Carey means by “administrative caprice.” In my experience, the faculties most subject to administrative caprice are those at for-profit institutions. Traditional colleges and universities more often than not share the governance of the university with the tenured faculty through the agency of a faculty senate, as well as through the judicious promotion of faculty to administrative positions.

Sure, academic stars don’t need tenure. One doesn’t become an academic star, though, by excelling as a teacher. One becomes an academic star by excelling as a scholar. Excellent scholars, however, are not always excellent teachers. A good university needs both. Of course, if human beings were fully rational, then university administrators would realize that the long-term health of an institution depends on its good teachers as much as, if not more than, on the reputation of its scholars. No one gives money to his alma mater because of his fond memories of studying at the same institution where Dr. Famous Scholar taught. I give money every month to my alma mater even though not one of my professors was famous. They may not have been famous, but they were fantastic teachers who cared about their students and instilled in them a love of learning. Quaint, eh? That doesn’t change the fact, though, that I give money to the least famous of the institutions of higher education with which I have been affiliated and that I give it for the simple reason of the quality of instruction I received there–and I am not alone.

Carey would likely counter that he is all for good teaching. He believes making professors “at-will employees” would require them to do “a great job teaching.” But who would be the judge of this “great teaching”? What would the standards be? If it were student evaluations, that could be problematic because students are not always the best judges of good teaching. Too many tend to give their most positive evaluations to instructors who give the fewest assignments and the highest numbers of As. Many come around eventually, of course. I had a student write me last winter to thank me for giving her the skills she needed to make it through law school. She had not written that letter upon her graduation from Drexel (let alone at the end of my course), however, but upon her graduation from law school! Unfortunately, we don’t solicit teaching evaluations from alumni for courses they took years earlier. Fortunately for me, I was tenured, so I could be demanding of my students without fearing that angry evaluations might cause me to lose my job. “At-will” professors are not so fortunate.

These are dark times in higher education. The intellectual backbone of a culture is the mass of university-level teachers who slave away in almost complete obscurity, not because they don’t have the intellectual stuff to make it in the highly competitive atmosphere of “world-class scholarship,” but very often because they do not have the stomach for the nauseating degrees of self-promotion that are sometimes required to break into that world, and because they have too much conscience to abandon their students to their own, literally untutored, devices. Teaching is extraordinarily time consuming. It takes time away from research, the kind of research that brings fame and fortune. Teaching brings its own rewards, and thank heavens there are many who still value those rewards. Unfortunately, few such individuals are found among the ranks of university administrators.

As I said, however, this is not the only inanity that is being bandied about by talking empty-heads. The suggestion that universities should concentrate on providing students with technical skills is even more conspicuously ludicrous. The most obvious objection to this point is that the provision of technical skills is the purview of vo-tech (i.e., vocational-technical) schools and institutes, not universities. For the latter suddenly to begin to focus on imparting technical skills would effectively mean that we would no longer have universities. (That this may be the hidden agenda of the peculiarly American phenomenon of the anti-intellectual intellectual is understandable, given that the days of their media hegemony would be threatened by even the slightest rise in the number of Americans who did not need to count on their fingers.)

There is a more profound objection, however, to the assertion that universities ought to focus on teaching technical skills: the shelf-life of those skills has become so short that any technical training a university could provide its students would be obsolete by the time of their graduation if not before. Dealing effectively and adaptively with technology is a skill acquired now in childhood. Many kids entering college are more tech savvy than their professors. Almost everything I know about computers I’ve learned from my students, not from the tech-support staffs of the various institutions with which I’ve been affiliated. One of my students just posted a comment to a class discussion in which he mentioned that one of his engineering professors had explained that what he learned in class might, or might not, apply once he was out in the workforce.

Technology is simply developing too rapidly for universities to be able to teach students the sorts of technical skills that old farts are blustering they need. Kids don’t need to be taught how to deal with technology. They know that already. They need to be taught how to think. They need to be taught how to concentrate (something that it is increasingly evident they are not learning in their ubiquitous interactions with technology). They need to be taught how to focus for extended periods of time on very complex tasks. They need to be taught how to follow extended arguments, to analyze them, to see if they are sound, to see if the premises on which they are based are plausible, to recognize whether any of the myriad inferences involved are fallacious. They need to be taught that they are not entitled to believe whatever they want, that there are certain epistemic responsibilities that go along with having the highly developed brain that is specific to the human species, that beliefs must be based on evidence, evidence assiduously, painstakingly, and impartially collected.

Finally, students need to be taught to trust their own educated judgment, not constantly to second-guess themselves or to defer to a superior simply because that person is their superior and hence in a position to fire them. They need to be taught to believe in themselves and their right to be heard, particularly when they are convinced, after much careful thought, that they are correct and that their superiors are not.

Unfortunately, young people are not being taught these things. We are preparing them to be cogs in a new kind of machine that no longer includes cogs. No wonder our economy, not to mention our culture more generally, is on the skids.

(This piece originally appeared in the 3 February 2014 issue of Counterpunch)

Education and Philosophy

One of the things I love about philosophy is how egalitarian it is. There’s no “beginning” philosophy and no “advanced” philosophy. You can’t do philosophy at all without jumping right into the deep end of the very same questions all philosophers have wrestled with since the time of Plato, questions such as what it means to be just, or whether people really have free will.

This distinguishes philosophy from disciplines such as math or biology, where there’s a great deal of technical information that has to be memorized and mastered before students can progress to the point where they can engage with the kinds of issues that preoccupy contemporary mathematicians or biologists. There is thus a trend in higher education to create introductory courses in such disciplines for non-majors, courses that can introduce students to the discipline without requiring that they master the basics the way they would have to if they intended to continue their study in that discipline.

Philosophy programs are increasingly coming under pressure to do the same kind of thing with philosophy courses. That is, they are essentially being asked to create dumbed-down versions of standard philosophy classes to accommodate students from other majors. Business majors, for example, are often required to take an ethics course, but business majors, philosophers are told, really do not need to read Aristotle and Kant, so it is unreasonable to ask them to do so.

Yeah, that’s right, after all, they’re not paying approximately 50K a year to get an education. They’re paying for a DEGREE, and the easier we can make that for them, the better!

But I digress. I had meant to talk about how egalitarian philosophy is. Anyone can do it, even today’s purportedly cognitively challenged generation. Just to prove my point, I’ll give you an example from a class I taught yesterday.

We’re reading John Searle’s Mind: A Brief Introduction (Oxford, 2004) in my philosophy of mind class this term. We’re up to the chapter on free will. “The first thing to notice,” Searle asserts, when examining such concepts as “psychological determinism” and “voluntary action,” “is that our understanding of these concepts rests on an awareness of a contrast between the cases in which we are genuinely subject to psychological compulsion and those in which we are not” (156).

“What do you think of that statement?” I asked my students. “Is there anything wrong with it?”

“It’s begging the question,” responded Raub Dakwale, a political science major.

“Yes, that’s right,” I said, smiling. “Searle is BEGGING THE QUESTION!” Mr. Big-Deal Famous Philosopher John Searle, whose book was published by Oxford University Press, commits a fallacy that is easily identified by an undergraduate student who is not even a philosophy major. That is, the issue Searle examines in that chapter is whether we have free will. He even acknowledges that we sometimes think our actions are free when they clearly are not (the example he gives is of someone acting on a post-hypnotic suggestion, but other examples would be easy enough to produce).

But if we can be mistaken about whether a given action is free, how do we know that any of our actions are free? We assume that at least some of them are free because it sometimes seems to us that our actions are free and other times that they are compelled. But to say that it sometimes seems to us that our actions are free is a very different sort of observation from Searle’s that we are sometimes aware that we are not, in fact, subject to psychological compulsion.

To be fair to Searle, I should acknowledge that he appears to associate “psychological compulsion” with the conscious experience of compulsion, as opposed to what he calls “neurobiological determinism,” which compels action just as effectively as the former, but which is never “experienced” consciously at all. So a charitable reading of the passage above might incline one to the view that Searle was not actually begging the question, in that an awareness of an absence of psychological compulsion does not constitute an awareness of freedom.

But alas, Searle restates his position on the very next page in a manner that is even more conspicuously question begging. “We understand all of these cases [i.e., various cases of unfree action],” he asserts, “by contrasting them with the standard cases in which we do have free voluntary action” (158, emphasis added).

You can’t get more question begging than that. The whole point is whether any human action is ever really free or voluntary. This move is in the same family as the purported refutation of skepticism that was making the rounds of professional philosophers when I was in graduate school, but which I hope has since been exposed for the shoddy piece of reasoning that it was.

Back then, philosophers would claim that the classical argument in favor of skepticism rested on cases of perceptual illusion (e.g., Descartes’ stick that appears broken when half of it is submerged under water but which appears unbroken when removed from the water), but that perceptual illusions could be appreciated as such only when compared with what philosophers refer to as “veridical cases” of sense perception. That is, you know the stick is not really broken because removing it from the water reveals that it is not really broken. But if sense experience can reveal the truth about the stick, then the skeptics are mistaken.

But, of course, you don’t need to assume that the latter impression of the stick is veridical in order to doubt that sense experience could ever be veridical. All you need is two conflicting impressions of the same object and the assumption that the same stick cannot be both broken and straight. That is, all you need is two conflicting impressions of the same object and the law of non-contradiction to support skepticism. That seemed glaringly obvious to me when I was still a student, and yet scads of professional philosophers failed to grasp it.
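
For readers who like the logic spelled out, here is a minimal formalization of that argument (the schematic and the labels A1, A2, and B are mine, not anything offered by the skeptics or their would-be refuters):

\begin{align*}
&A_1:\ \text{the stick appears broken (half submerged in water)}\\
&A_2:\ \text{the stick appears straight (removed from the water)}\\
&\neg(B \wedge \neg B)\ \text{(non-contradiction, where } B = \text{“the stick is broken”)}\\
&\text{If } A_1 \text{ and } A_2 \text{ were both veridical, we would have } B \wedge \neg B.\\
&\text{Therefore at least one of } A_1, A_2 \text{ is non-veridical, and nothing in the}\\
&\text{argument tells us which, so no appeal to a “veridical case” is ever needed.}
\end{align*}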

Professional philosophers can be incredibly obtuse, and ordinary undergraduates, even today, with the right sort of help and encouragement, can expose that obtuseness. It’s a real thrill for a student to do that, to stand right up there with the big guys/gals and actually get the better of them in an argument, so to speak. It’s a thrill that is reserved, I believe, for philosophy. That is, it seems unlikely that anything comparable happens in the average calculus or organic chemistry class.

My point here is not to argue that philosophers in general are stupid, or even that Searle, in particular, is stupid. They aren’t, and he isn’t. Despite Searle’s occasional errors in reasoning, he’s one of the most original philosophers writing today. My point is that philosophy, as one of my colleagues put it recently, “is freakin’ hard.” It’s hard even after one has been rigorously schooled in it.

There’s no way to dumb down philosophy and have it still be philosophy. Philosophy is training in thinking clearly. There’s no way to make that easier for people, so why would anyone suggest that there was?

Perhaps it’s because philosophy is the discipline most threatening to the status quo, even more threatening than sociology. Sociology can educate people concerning the injustices that pervade contemporary society, but only training in critical and analytical thinking can arm people against the rhetoric that buttresses those injustices. This country, and indeed the world, would look very different today, if the average citizen back in 2001 had been able to recognize that “You’re either with us, or against us” was a false dichotomy.

(This piece originally appeared in the Nov. 22-24, 2013 Weekend Edition of CounterPunch)

America the Philosophical?

Carlin Romano’s book America the Philosophical (Knopf, 2012) opens with an acknowledgement that American culture is not widely perceived, even by Americans, to be very philosophical. He quotes Alexis de Tocqueville’s observation that “in no country in the civilized world is less attention paid to philosophy than in the United States” (p. 5), as well as Richard Hofstadter’s observation in Anti-Intellectualism in American Life (Knopf, 1963) that “[i]n the United States the play of the mind is perhaps the only form of play that is not looked upon with the most tender indulgence” (p. 3). Romano observes that while in England philosophers “write regularly for the newspapers” and in France philosophers appear regularly on television, “[i]n the world of broader American publishing, literature, art and culture, serious references to philosophy, in either highbrow or mass-market material barely register” (p. 11). Yet despite these facts he boldly asserts that the U.S. “plainly outstrips any rival as the paramount philosophical culture” (p. 15).

I know Romano. I’m on the board of the Greater Philadelphia Philosophy Consortium and Romano has attended some of our meetings. He’s an affable guy, so I was predisposed to like his book despite its wildly implausible thesis. Maybe there is a sense, I thought to myself, in which Americans are more philosophical than people in other parts of the world. We tend to be less authoritarian, I realized hopefully, and authoritarianism is certainly antithetical to genuine philosophical inquiry. Unfortunately, I didn’t have to reflect long to realize that we tend to be less authoritarian than other peoples because we have little respect for learnin’, especially book learnin’. We don’t believe there really are such things as authorities.

How is it possible that the U.S., despite all the evidence to the contrary that Romano marshals, can be “the paramount philosophical culture”? Romano’s answer is that the evidence that suggests we are not philosophical consists of nothing more than “clichés” of what philosophy is. He asserts that if we throw out these “clichés” and reduce philosophy to “what philosophers ideally do” (p. 15), then it will become obvious that America is the “paramount philosophical culture.” That is, Romano makes his case for America the Philosophical by simply redefining what it means to be philosophical, which is to say that he simply begs the question.

According to Romano, what philosophers ideally do is “subject preconceptions to ongoing analysis.” But do most Americans do this? It’s not clear to whom he’s referring when he asserts that Americans are supremely analytical. Some Americans are very analytical, but the evidence is overwhelming that most are not. Public discourse in the U.S. is littered with informal fallacies such as ad hominem, straw man, and post hoc, ergo propter hoc arguments that are almost never exposed as such. Americans like to “think from the gut”–which is to say that they tend not to care much for reasoned analysis.

Even if most Americans were analytical in this sense, however, that alone, would not make them philosophical. Subjecting preconceptions to ongoing analysis is certainly part of what philosophers do, but it isn’t all they do. Philosophers have traditionally pursued the truth. That, in fact, is the classical distinction between the genuine philosophers of ancient Greece, figures such as Socrates and Plato, and the sophists. Socrates and Plato were trying to get at the truth. The sophists, on the other hand, were teachers of rhetoric whose primary concern was making money (not unlike for-profit educators today). They were characterized, in fact, as advertising that they could teach their pupils how to make the weaker argument appear the stronger. That is, they taught persuasion with relative, if not complete, indifference to the actual merits of the arguments in question. That’s why they were reviled by genuine seekers after truth.

Romano is unapologetic in presenting his heroes as the sophist Isocrates and the “philosopher” Richard Rorty. He devotes a whole chapter of the book to Isocrates, attempting to defend him against the characterization of sophists presented above. He does a good job of this, but at the end of the chapter, the fact remains that Isocrates was far more practical in his orientation than was Socrates (or any of his followers). “Socrates,” observes Romano, “in the predominant picture of him drawn by Plato, favors discourse that presumes there’s a right answer, an eternally valid truth, at the end of the discursive road. Isocrates favors discourse, but thinks, like Rorty and Habermas, that right answers emerge from appropriate public deliberation, from what persuades people at the end of the road” (p. 558).

But people are often persuaded by very bad arguments. In fact, one of the reasons for the enduring popularity of the informal fallacies mentioned above is how effective they are at persuading people. Truth has to be more than what people happen to agree it is. If that were not the case, then people would never have come to consider that slavery was wrong, and slavery would never have been abolished. It won’t work to point out that slavery was abolished precisely when the majority of humanity was persuaded that it was wrong, and not simply because masses of humanity had to be dragged kicking and screaming to that insight, but primarily because someone had to do the dragging. That is, someone, or some small group of individuals had to be committed to the truth of a view the truth of which evaded the majority of humanity and they had to labor tirelessly to persuade this majority that it was wrong.

Right answers have to be more than “what persuades people at the end of the road” (unless “end of the road” is defined in such a way as to beg the question). The sophists were the first PR men, presenting to young Athenian aristocrats the intoxicating vistas of what can be achieved through self-promotion when it is divorced from any commitment to a higher truth. In that sense, Romano is correct: Isocrates, to the extent that he elevates what actually persuades people over what should persuade them, is more representative of American culture than is Socrates.

But is it fair to say that most Americans are followers of this school of thought, in that, like Isocrates and Rorty, they have carefully “analyzed” traditional absolutist and foundationalist accounts of truth and found them wanting, that they have self-consciously abandoned the Enlightenment orientation toward the idea of the truth in favor of a postmodern relativism or Rortyan pragmatism? There’s a small portion of American society that has done this, a small subset of academics and intellectuals who’ve fallen under the Rortyan spell. Most Americans have never even heard of Richard Rorty, let alone self-consciously adopted his version of pragmatism.

That’s not to say we Americans are stupid, though. Hofstadter distinguishes, early in Anti-Intellectualism in American Life, between “intelligence” and “intellect.” Intelligence, he observes,

is an excellence of mind that is employed within a fairly narrow, immediate, and predictable range; it is a manipulative, adjustive, unfailingly practical quality—one of the most eminent and endearing of the animal virtues. …. Intellect, on the other hand, is the critical, creative, and contemplative side of mind. Whereas intelligence seeks to grasp, manipulate, re-order, adjust, intellect examines, ponders, wonders, theorizes, criticizes, imagines. Intelligence will seize the immediate meaning in a situation and evaluate it. Intellect evaluates evaluations, and looks for the meanings of situations as a whole. Intelligence can be praised as a quality in animals; intellect, being a unique manifestation of human dignity, is both praised and assailed as a quality in men (p. 25).

These characterizations of intelligence and intellect seem fairly uncontroversial, and according to them, philosophy would appear to be an expression of intellect rather than intelligence. That is, it’s possible to be intelligent, indeed to be very intelligent, without being at all intellectual. Hofstadter asserts that while Americans have unqualified respect for intelligence, they are deeply ambivalent about intellect. “The man of intelligence,” he observes, “is always praised; the man of intellect is sometimes also praised, especially when it is believed that intellect involves intelligence, but he is also often looked upon with resentment or suspicion. It is he, and not the intelligent man, who may be called unreliable, superfluous, immoral, or subversive” (p. 24).

What, you may wonder, does Romano think of this argument? That’s hard to say because the only references to Hofstadter in the book are on pages 3 and 8. His name is never mentioned again, at least not so far as I could tell, and not according to the index. Conspicuously absent from the index as well are both “intelligence” and “intellect.” Romano has written an entire book of over 600 pages that purports (at least according to the intro) to refute Hofstadter’s argument that Americans are generally anti-intellectual without ever actually addressing the argument.

Now that is clever! It’s much easier to come off looking victorious if you simply proclaim yourself the winner without stooping to actually engage your opponent in battle. It’s kind of disingenuous, though, and in that sense is a strategy more suited to a sophist than to a genuine philosopher.

(This piece originally appeared in the Nov. 8-10, 2013 Weekend edition of Counterpunch)