
      Sunday, February 28, 2010

      Sunset for Sundown Signs?

      File this away in the E-for-Effort folder of "Things The New York Times Gets Wrong About The South."

There's an interesting article in the New York Times today titled "Race in the South in the Age of Obama" about James Fields, a black State Representative from predominantly white Cullman, Alabama. As the author Nicholas Dawidoff notes, Cullman is notorious for its history of being a "racist white bigot county." (Dawidoff adds: "In Alabama, this is, of course, saying something.") Though Rep. Fields neither confirms nor denies this, one of the things that helped Cullman County earn its reputation was its display of "sundown signs" up through the 1970's. Sundown signs were not unusual in the pre-Civil Rights South, nor were they ambiguous. Cullman's sign was typically straightforward in its warning, and read "Niggers Don't Let The Sun Go Down On You In This Town." The aim of the NYT article, it seems, is to show how much has changed "in the Age of Obama" in towns like Cullman, and Representative Fields' success story is just about all the evidence we need.

      Or is it?

Dawidoff's article is one of those strange pieces in which the argument and the evidence seem to be at odds with each other. Recounting Rep. Fields' unlikely rise to political office in the midst of a historically (and abidingly) racist Alabama county is meant to demonstrate that the era of sundown signs has decidedly passed. And, of course, to a certain extent, that's right. Towns do not (read: cannot) hang racist sundown signs anymore, and despite the fact that Obama was soundly defeated in Alabama, it is probably true that his candidacy and eventual election cleared a space for other black politicians like Fields. But most of Dawidoff's article, and Fields' own first-person account, is a testament to what Fields has accomplished in spite of Obama, not because of him... that is, what Fields has accomplished in spite of, even in the midst of, abiding Southern racism. The title of Dawidoff's article indicates that we are to learn something about "Race in the South in the Age of Obama," but there is little in the article to learn. It is, for the most part, a very familiar picture of Southern racism presented in the context of an instance of its exception. Dawidoff's tone suggests that we might understand the Fields Exception as signaling the sunset of "sundown sign" racism. That sort of racism, this story allegedly demonstrates, can be and is overcome by white racists... as long as they can find a "respectable" black (not like the rest of "those people," not like Obama) whom they know and can relate to as a "friend and neighbor."

      Interestingly, Dawidoff includes a quote by historian Wayne Flynt that illuminates much of the tension Dawidoff seems to miss in his own article. Explaining how Fields' white constituents might be able to look past their generally-held racist assumptions and vote for a black man, Flynt says:

      Incongruous and incomprehensible to people is that Southerners generally don't personalize race. The more abstract race becomes, the more racist it becomes... In my lifetime, the major change is that personalism can trump racism.

What Flynt's observation illuminates, I think, is that white racists' sundown signs are still around (even if invisible) and the warnings still generally apply to blacks (even if allowing exceptions for the blacks they know "personally"). So, I suspect that what Dawidoff's article really shows is that "race in the South in the age of Obama" ends up looking quite a bit like "race everywhere else in the age of Obama," that is, too self-congratulatory and too in thrall to the image of a "post-racial" America. There's not a whole lot that's unique to the South in this story, save the fact that the South still suffers from regionalist stereotypes of it as a place where sundown signs still hang. (Even though they don't still hang and, when they did, they also hung everywhere else in America.) It's just yet another example of the terribly naive belief that racism is a personal, psychological anomaly that can be combated incrementally.

      Flynt had it right: the more abstract and impersonal race becomes, the more racist it becomes. There's nothing about statements like "Some of my closest friends are black" or "I voted for a black candidate" that cancels out the old sundown sign's generic "Niggers Don't Let The Sun Go Down On You In This Town." Exceptions do not invalidate the rule. Rather, exceptions-- qua "exceptions"-- reinforce the Rule of the rule.

      Saturday, February 27, 2010

      Trials for Terrorists

Earlier this month, Senator Lindsey Graham (R-South Carolina) introduced a bill that would prohibit the use of Justice Department funds for prosecuting alleged 9/11 plotters in federal courts. The aim of the bill (and a similar one introduced in the House by Representative Frank Wolf from Virginia) is to tie the Obama administration's hands and force them to try terrorism suspects in military tribunals. Defense Secretary Robert Gates and Attorney General Eric Holder wrote a letter to Senate leaders opposing Graham's bill, explaining that it would be "unwise, and would set a dangerous precedent." And the Department of Justice recently issued a statement to the same effect, claiming "There is no precedent in the history of our Nation in which Congress has intervened in such a manner to prohibit the prosecution of particular persons or crimes." Given that there have been over 300 successful prosecutions of terrorist suspects in federal criminal courts, compared with only 3 successful prosecutions by military commission, it seems fair to ask: what is there to fear about fair trials?

The issue here is clearly not about which prosecutorial form (federal court trials or military commissions) is more "effective," as the NYU School of Law's "Terrorist Trial Report Card" shows. The federal court system has been eminently effective both in intelligence collection and in incapacitating terrorists, while the legality of military tribunals has been challenged all the way to the Supreme Court three times already. That is to say, if the only concern here were simply whether or not "justice" would be done-- in the sense of punishment meted out to demonstrably guilty parties-- we would have to side with the federal court system. Unfortunately, I fear that the accomplishment of "justice" is not what is primarily at stake in these debates.

Last month, I attended a lecture by Karl Rove (which I wrote about in my post "Rove at Rhodes") in which he presented what I consider a representative argument of the opponents of fair trials for terrorism suspects. Rove claimed that the Obama administration's error is their tendency to "treat terrorism as a crime and not a war." I'm not sure I realized it at the time, but I have come to believe since that Rove's characterization is far more telling than he (or I) may have realized. What is really at issue in this debate over whether we should turn over terrorist suspects to the Department of Justice or to the Department of Defense is precisely a question about how we intend to frame what sort of infraction "terrorism" is. Is a terrorist act a "crime"? Or is it a "declaration of war"? Is it a violation of a law (or, more broadly, the rule of law) or is it a violation of sovereignty?

My philosophical inclination is to say that it is something between the two. Clearly, all terrorist acts violate some law and, inasmuch as they have the superadded political element of being characterized as "terrorist," they do so in a way that is intended to disrupt and disavow the very rule of law that proscribes them. But it is precisely that superadded characterization that makes terrorist acts also a violation of State sovereignty, even a violation of the idea of State sovereignty, of the principle of a State governed by the rule of law. So, technically speaking, terrorist acts are "crimes," but they are also a peculiar sort of "declaration of war"... something like a "declaration of war against the rule of law." My guess is that it is this latter offense that so troubles people like Rove, who see terrorism as a threat to the integrity and security of a recognized State and, consequently, the proper business of the Department of Defense. The problem is, of course, that the State the Defense Department is charged with defending is a State governed by the rule of law. If the mechanisms and strategies of its defense are themselves abrogations of the rule of law, as they arguably are in military tribunals, then I fear we run the risk of robbing Peter to pay Paul.

I would argue that there is both a principled and a strategic reason to insist that "terrorism" be treated as a crime and not a war, despite the fact that it may be both. On principle, affording terrorist suspects all of the rights and privileges of the judicial process reaffirms our commitment to the very rule of law that their acts terrorize and which put us at such grave risk. Inasmuch as we are able to execute fair and impartial trials for these suspects, we can make good-faith judgments about the impermissibility of terrorism as a form of political action without worrying that we may have terrorized our own legal system in the course of those judgments. Strategically, advocating trials over military commissions is a way of delegitimizing the claims of those who too quickly subordinate the principle of the rule of law to national security... forgetting that, without the rule of law, the Nation they are claiming to secure disappears. If we accept Rove's claim that terrorism should be treated as a war and not a crime, we will inevitably find ourselves at war with ourselves, and we will have abandoned any appeal to impartial principles that might adjudicate our internal disputes.

      In short, we need terrorism to be a "crime," even more than alleged terrorist suspects need it to be, because we need this issue to be about justice and not war. War does not decide matters of justice. I am reminded of La Fontaine's fable "The Wolf and the Lamb," a kind of cautionary tale about the dangers of conflating might and right. In the fable, the wolf engages the lamb in a charade of a trial, in which every defense the lamb presents for its existence is futile. It is clear that the proceedings are a fait accompli, and that the lamb will be devoured by its predator "without any other why or wherefore." But it is in the "why or wherefore" that we determine matters of justice, and it is there that we place our confidence that, in the end, our judgments are not merely brute exercises of power.

      Friday, February 26, 2010

      Guess Things Happen That Way

      I've often said that if God speaks to humanity, he does so through Johnny Cash. Unfortunately for the rest of us, Cash passed away in 2003, officially of diabetes-related complications, but more likely because he was unable to stand this world any longer after the death of his wife, June Carter Cash, a few months earlier. Today would have been The Man in Black's 78th birthday.

Having grown up in Tennessee, I've heard Johnny Cash's music my whole life. Yet, it wasn't until I picked up a guitar for the first time, at the age of 19, that I really fell in love. I think "Ring of Fire" was one of the first songs I learned to play. Like most of Cash's songs, it's just three basic chords, one simple story, and a well-worn metaphor in the service of one powerful, intensely complicated truth: Love is a burning thing. "Ring of Fire" wasn't even written by Johnny Cash (June Carter Cash wrote it, about Johnny), but he was the one who heard it in his head with those now-famous mariachi horns and popularized it, and it's probably the song most people associate with him. That's probably an appropriate association, as love was just one of many things that burned, burned, burned Johnny Cash throughout his life-- there were also the pills, and the booze, and the temper, and the money, and the fame. But that rode-hard-and-hung-up-wet life made him the irresistible, gravel-voiced siren that we fell in love with, and it's what we miss now that he's gone.

In a fitting and serendipitous turn of events earlier this week, iTunes announced that it had sold its 10 billionth song, which was Johnny Cash's "Guess Things Happen That Way." This is one of my favorite songs of all time. It's a kind of stoic ballad, a concession to the vicissitudes of life and love, with a tempered and mild expression of dissatisfaction about it all. The refrain-- "I don't like it, but I guess things happen that way"-- is delivered with about as much conviction as a pro forma complaint, and the imposition of that generic form somehow strangely makes it all the more universal and, consequently, all the more potent. "Guess Things Happen That Way" is a sad song, but it's sad in the way that Bob Dylan's "Don't Think Twice, It's All Right" is sad-- that is, as a demonstration of the utter failure of the principle that sadness can be shrugged off. (Btw, you can read more from me about sad songs here.) Weeping and wailing is sad too, of course, but not as sad as: "You ask me if I'll get along? I guess I will, some way." Anyway, here's the man himself:



      Happy birthday, Johnny Cash. You're gone, and I don't like it... but I guess things happen that way.

      Thursday, February 25, 2010

      The Things They Take With Them From Class

I'm inclined to just post this image without any explanation at all. It's a snapshot of a status update I saw on Facebook yesterday in reference to my colleague Dr. Grady and me. Grady is teaching Kant's first Critique right now, and I just began teaching Marx in my 19th C. Philosophy course.

      (You can click on the image for a clearer picture.)

In my defense, the communists-and-rainbows remark came over two hours into a three-hour lecture, so perhaps I had let the rigor of my philosophical vocabulary slip a bit by that point. And it was intended as a tongue-in-cheek parody of the way people think Marx envisions the post-revolutionary future. No matter how hard I try to emphasize to students that we need to work our way through Marx's critique of capitalism before we can take an informed look at what he offers in its stead, everyone still wants to skip ahead to the end and condemn the "communist future" as utopian first. My remark was just a way of temporarily indulging that misconception so that we could table it and get back to the work we needed to do.

      Or, at least, that's my story... and I'm sticking to it.

      Monday, February 22, 2010

      Outbursts I've Considered, Then Thought Better Of...

I don't think I've said so officially on this blog yet, but the NBC sitcom "Community" is one of the funniest things on television in a long, long time. It's about a group of ne'er-do-wells attending community college who accidentally fall into a Spanish "study group" together. Much hilarity ensues. The group's Spanish professor, Señor Chang (Ken Jeong), is something like the wickedly uninhibited alter-ego of just about every teacher I know (including me). Just check out his self-introduction to his class on their first day. There hasn't been an episode yet where I haven't guffawed, sighed, and secretly wished I had Señor Chang's cojones.

Every job has its own special set of wearisome burdens, its own unique species of albatross, that can weigh one down and wear one out. Academia is no different. And every job has its own special set of imaginary scenarios that exasperated workers conjure up from time to time and in which they envisage themselves as the heroic protagonists. Restaurant servers imagine spitting in their customers' food. Taxi drivers imagine running over their passengers' feet. Customer service reps imagine telling their complainants where they can stick it. Professors, I suspect, imagine being Señor Chang.

Most of us never really indulge those diabolical fancies, either because we're just too professionally invested in our own integrity (and decorum)... or because we know that it's only on television that people actually get away with those kinds of outbursts. And so, we dutifully trudge along, answering endless questions (whose answers are clearly stated on the syllabus), responding to emails (that assume more familiarity than we grant to our own families), scribbling constructive comments on pages and pages (that will be neither read nor heeded), and generally maintaining all of the pomp and circumstance of the Life of the Mind. We don't ever, EVER, say things like what Señor Chang says-- "My knowledge will BITE YOUR FACE OFF!"-- because we know it would seem, well, unseemly. Even if it is true.

[Incidentally, my other favorite Chang quote comes from his impromptu self-aggrandizing rap: "All up in yo' cabeza/ without a chasah/ there's no otha TEACHA with this much FLAVA..."]

Apparently this default reserve is not inhibiting ALL of us, though. New York University Business School Professor Scott Galloway doesn't suffer from the same make and model of superego as the rest of us. According to a recent report, Professor Galloway has decided to throw caution to the wind when it comes to inane emails from students. He recently received the following email from one of his students:

      Sent: Tuesday, February 9, 2010 7:15:11 PM GMT -08:00 US/Canada Pacific

      Subject: Brand Strategy Feedback

      Prof. Galloway,

      I would like to discuss a matter with you that bothered me. Yesterday evening I entered your 6pm Brand Strategy class approximately 1 hour late. As I entered the room, you quickly dismissed me, saying that I would need to leave and come back to the next class. After speaking with several students who are taking your class, they explained that you have a policy stating that students who arrive more than 15 minutes late will not be admitted to class.

      As of yesterday evening, I was interested in three different Monday night classes that all occurred simultaneously. In order to decide which class to select, my plan for the evening was to sample all three and see which one I like most. Since I had never taken your class, I was unaware of your class policy. I was disappointed that you dismissed me from class considering (1) there is no way I could have been aware of your policy and (2) considering that it was the first day of evening classes and I arrived 1 hour late (not a few minutes), it was more probable that my tardiness was due to my desire to sample different classes rather than sheer complacency.

      I have already registered for another class but I just wanted to be open and provide my opinion on the matter.

      Regards,
      xxxx


      xxxx
      MBA 2010 Candidate
      NYU Stern School of Business
      xxxx.nyu.edu
      xxx-xxx-xxxx


      To which, the Professor responded:

From: xxxx
      To: "xxxx"
      Sent: Tuesday, February 9, 2010 9:34:02 PM GMT -08:00 US/Canada Pacific
      Subject: Re: Brand Strategy Feedback

      xxxx:

      Thanks for the feedback. I, too, would like to offer some feedback.

      Just so I've got this straight...you started in one class, left 15-20 minutes into it (stood up, walked out mid-lecture), went to another class (walked in 20 minutes late), left that class (again, presumably, in the middle of the lecture), and then came to my class. At that point (walking in an hour late) I asked you to come to the next class which "bothered" you.

      Correct?

      You state that, having not taken my class, it would be impossible to know our policy of not allowing people to walk in an hour late. Most risk analysis offers that in the face of substantial uncertainty, you opt for the more conservative path or hedge your bet (e.g., do not show up an hour late until you know the professor has an explicit policy for tolerating disrespectful behavior, check with the TA before class, etc.). I hope the lottery winner that is your recently crowned Monday evening Professor is teaching Judgement and Decision Making or Critical Thinking.

      In addition, your logic effectively means you cannot be held accountable for any code of conduct before taking a class. For the record, we also have no stated policy against bursting into show tunes in the middle of class, urinating on desks or taking that revolutionary hair removal system for a spin. However, xxxx, there is a baseline level of decorum (i.e., manners) that we expect of grown men and women who the admissions department have deemed tomorrow's business leaders.

      xxxx, let me be more serious for a moment. I do not know you, will not know you and have no real affinity or animosity for you. You are an anonymous student who is now regretting the send button on his laptop. It's with this context I hope you register pause...REAL pause xxxx and take to heart what I am about to tell you:

      xxxx, get your shit together.

      Getting a good job, working long hours, keeping your skills relevant, navigating the politics of an organization, finding a live/work balance...these are all really hard, xxxx. In contrast, respecting institutions, having manners, demonstrating a level of humility...these are all (relatively) easy. Get the easy stuff right xxxx. In and of themselves they will not make you successful. However, not possessing them will hold you back and you will not achieve your potential which, by virtue of you being admitted to Stern, you must have in spades. It's not too late xxxx...

      Again, thanks for the feedback.

      Professor Galloway


      Of course, I would never do this. Just like I would never actually indulge in the many Señor Chang-ish outbursts that I find so satisfying by proxy. But there's something that makes me very happy to know that there are people out there who would, and do, do this. Even if only because maybe, just maybe, the very real possibility of this kind of response from professors might motivate students to self-monitor just a tad more.

      More On Tenure

The Chronicle of Higher Education has a follow-up piece on the Amy Bishop story called "Reactions: Is Tenure a Matter of Life and Death?", in which they ask several academics (at varying levels of seniority) to respond to the questions: What are the psychological effects of academic culture, particularly on rising scholars? Can or should something be done to change that culture?

I was particularly interested in the answer from John C. Cavanaugh (Chancellor of the Pennsylvania State System of Higher Education). Cavanaugh writes: "If rising scholars need to give up any semblance of a normal life to obtain a doctorate or tenure, then that program's values are out of alignment. I, for one, do not want institutions full of people who sold their souls for a degree or for tenure. I want balanced, well-rounded scholars. Funny thing about that—isn't that exactly what we say in our marketing materials: that we want to produce in our undergraduate programs well-rounded, educated graduates?"

      Now, compare that to the remarks by Robert J. Sternberg (Dean of the School of Arts and Sciences at Tufts University), who writes: "Academe is a calling: If you do not feel called to it, find something else to do. The pay isn't great; the hours are typically long; and you never quite have a vacation. If you enter the game, you should do so accepting the rules and knowing that you may not get the outcome you desire."

      I sort of half-agree with both Cavanaugh and Sternberg. Academe is a calling, I think, but it ought not be a soulless one.

      Sunday, February 21, 2010

      Unscrambling Marx

      I'm about to begin teaching Karl Marx in my 19th C. philosophy class this week. Although students usually get some (very elementary) introduction to Marx in most of my other classes as well, this is the course in which they get the most extensive and systematic exposure to his writings. I always anticipate the Marx section with an admixture of joy and apprehension, excited about delving into Marx's ideas again, but dreading the resistance with which they are always met.

I have often asked myself: Why is Marx so difficult to teach to students? Marx certainly isn't the "hardest" thinker I cover in my courses-- his texts aren't as dense as Hegel's, or as complicated as Kant's, or as polyvalent as Plato's, or as idiosyncratic as Derrida's. But unlike those other thinkers (with the possible exception of Derrida), students come to class already with a host of prejudices and presuppositions about Marx that are very hard to overcome. A few years ago, I experimented with the practice of beginning the Marx section of my course by simply asking students what they "knew" of Marx already, in the hopes that getting the "scrambled" Marx out there and visible right on the front end would be helpful. (It wasn't helpful.) NYU Professor Bertell Ollman once noted, in a book called Social and Sexual Revolution, that "the major hurdle in presenting Marxism to American students is their bourgeois ideology, the systematic biases and blind spots, which even the most radical bring with them." There is nothing in bourgeois ideology, Ollman notes, that doesn't have a "scrambling effect" on students' reception of Marxian ideas. He describes this bourgeois ideology as operating on two levels, the first harder to overcome than the second. Ollman writes:

      In my experience, the most troublesome notions have been students' egotistical and ahistorical conception of human nature; their conception of society as the sum of separate individuals, and with this the tendency to reduce social problems to problems of individual psychology; their identification of Marxism with Soviet and Chinese practice; and of course the ultimate rationale that radical change is impossible in any case. Much less destructive and also easier to dislodge are the intrinsically feeble notions that we are all middle class, that there is a harmony of interests under capitalism, that the government belongs to and represents everybody equally, and that history is the product of the interaction of great people and ideas.

      Check. I've encountered all of these in my classes. (The most consistently frustrating to me being the "ultimate rationale that radical change is impossible in any case.") But if there's one thing that we ought to have learned from Marx, it is that bourgeois ideology tends to be totalizing and, hence, none of us are entirely free of its distorting effects. And so, even as I attempt to chip away at and unscramble some of the bourgeois misconceptions above, I must also remain attentive to my own bourgeois blind spots. Yet, taking this kind of piecemeal approach can be frustrating and time-consuming, and it evidences its own kind of misunderstanding of how ideological frameworks work. If only there were a way to do the unscrambling work at the meta-level instead of at the level of details.

Here, Ollman is helpful, I think. After describing the elements of students' bourgeois resistance to Marx above, he writes:

      Underpinning and providing a framework for all these views—whether in the form of conclusions or assumptions, and whether held consciously or unconsciously—is an undialectical, factorial mode of thinking that separates events from their conditions, people from their real alternatives and human potential, social problems from one another, and the present from the past and the future. The organizing and predisposing power of this mode of thought is such that any attempt to teach Marxism, or indeed to present a Marxist analysis of any event, is doomed to distortion and failure unless accompanied by an equally strenuous effort to impart the dialectical mode of reasoning.

We've just completed 5 weeks on Hegel's Phenomenology in my class, so I hope that my students are well-prepped for an extra emphasis on the dialectical mode of reasoning. I plan to try this strategy this time around. If nothing else, I hope it can help assuage my frustration with the fatalistic, "radical change is impossible" mindset that so often impedes our study of Marx. One of the things that I hope my students learned from Hegel's Phenomenology is that no matter how frustrated consciousness got at the apparent irrationality of its world, and no matter how convinced it was of its despairing claims that nothing could be done, it eventually learned that the tools it needed to reconcile itself with the world were already immanent to it.

      Thursday, February 18, 2010

      The Deadly Serious Business of Tenure

Last week, University of Alabama-Huntsville Professor of Biology Amy Bishop opened fire in a faculty meeting, killing three of her colleagues and wounding three others. Despite our hopeful image of the Ivory Tower as a place far removed from the ugliness of "real world" violence, stories like these remind us that, regrettably, wishing doesn't make it so. (As an aside, I have argued before on this blog-- in a post titled "The Problem With Packin'"-- that all campuses should be weapons-free zones.) The reports of witnesses at the faculty meeting seem to indicate that Bishop gave no warning of her plans, sitting through a full hour of the meeting before she began shooting. And, subsequently, Bishop's husband has reported that he didn't see any signs of trouble, either, though he did note that she had borrowed a gun and visited a shooting range in the last couple of weeks. In the immediate aftermath of the event, all reports seemed to point to Dr. Bishop's tenure denial as the cause of her psychological "break."

      More news of Dr. Bishop's shady past has come to light in the last several days, casting some doubt on the theory that her tenure-denial was the cause of the shooting. In 1986, Bishop shot and killed her brother in an incident that at the time was ruled an "accident," but since has been revisited with considerable suspicion by the Braintree Police. Seven years later, in 1993, she was also questioned as a suspect in the attempted mail bombing of a Harvard Medical School professor. These new findings seem to point to a long history of psychological distress and a proclivity for violence on Bishop's part. Perhaps her tenure denial was the "trigger" for this particular incident, perhaps not. It's likely that we will never really know.

Whether or not Bishop's contentious tenure bid was the culprit here, though, I still think it's worth looking at the havoc that the tenure process can wreak. Tenure is a phenomenon that is largely unfamiliar to (or misunderstood by) those outside of academia. But inside of academia, it's everything. In most colleges and universities, tenure is awarded on the basis of the candidate's demonstration of excellence in three areas: scholarship, teaching, and service (which is largely interpreted to mean administrative service, like serving on committees and advising students, but also can include more nebulous categories of judgment, like "collegiality"). The reward of tenure is job security; one can't be fired after receiving tenure without demonstrable "cause," and the standards for that demonstration are high. The aim of tenure is to protect academic freedom, to ensure that scholars do not feel pressure to edit or amend their work for fear of being dismissed.

Since the 1970's, there has been a steady decline in the number of academics who hold tenured positions. In 2005, the proportion of college professors who are either tenured or tenure-track was put at just over 30%, which means that more than 2 out of 3 academics are neither tenured nor eligible for tenure. Given the considerable reward that accompanies tenure, one might think that those numbers are appropriate. But, let's look at the typical career trajectory of an academic:

On average, it takes between 4 and 9 years to complete a Ph.D., which is a prerequisite for teaching at most postsecondary-level institutions. For many, this is at least a half-decade of "unemployment"-- if not also significant debt accumulation-- occasionally offset by graduate student teaching (at truly exploitative remuneration) or small stipends. IF one is lucky enough to land a tenure-track job right out of grad school, which is a considerable achievement in this day and age when most academic job candidates spend at least 2-5 years in fixed-term appointments, then one begins a "probationary" period of employment. The "probationary" period before tenure usually takes another 6 years, at the end of which one is evaluated on one's achievement in the three areas mentioned above. So, in the 7th year (plus the 4-9 years of grad school and possibly a few more in fixed-term appointments), one FINALLY gets to make a one-time bid for tenure that, according to statistics, fewer than 3 in 10 candidates will actually receive. If a candidate's tenure-bid is denied, he or she usually gets a one-year terminal appointment to look for a new job and, possibly, start all over again... only this time with the added disadvantage of being "damaged" goods.
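For those keeping score, here is the rough tally implied by the paragraph above. These are just the ballpark averages already cited (with the fixed-term years at 0 for the lucky few who skip them), so treat the totals as illustrative, not hard data:

$$\underbrace{4\text{--}9\ \text{yrs}}_{\text{Ph.D.}} \;+\; \underbrace{0\text{--}5\ \text{yrs}}_{\text{fixed-term}} \;+\; \underbrace{6\ \text{yrs}}_{\text{probation}} \;+\; \underbrace{1\ \text{yr}}_{\text{tenure decision}} \;\approx\; 11\text{--}21\ \text{years}$$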

So, imagine that: almost two decades of one's life and work, not to mention tens of thousands of dollars of debt, all-in on one make-it-or-break-it chance for job security. You don't have to be a Vegas bookie to raise an eyebrow at those odds. Of course, none of that is justification for what Dr. Bishop did-- nothing could justify that-- but I hope it might give some pause to those who think that the academic life is a cushy and stress-free existence where professors don't have to worry about being fired like the rest of the working stiffs in the "real" world. Office politics are office politics no matter what kind of "office" one works in, and many tenure decisions are inflected with all of the human-all-too-human prejudices of personality and ideology that contaminate every office. Of course, the stakes are high in any case where job termination is a possibility, but I fear that the general public's romantic idealizations of academia sometimes distort the realities of this life. Tenure is deadly serious business. It is probably unlikely that tenure was the chief cause of this case being actually deadly, but one can see how it easily could have been a contributing factor.

      Sunday, February 14, 2010

      Battle of the Sexists?

      First, my apologies to regular readers of this blog for my extended absence of late. As some of you know, my department is hiring for a new tenure-track line this semester, a process which has the tendency to eat up every last moment of "spare" time for everyone involved.

      Second, I hope that you don't take the coincidence of this post and Valentine's Day to be an indication of some cynical, brooding ressentiment on my part. Not all single people hate Valentine's Day... least of all me. So, let me take a moment here to wave my Romantic flag in celebration, and to wish all of you in love the happiest of holidays.

Now, with that out of the way, back to business. As you all know, the Super Bowl last weekend was quite a shocker, bringing us the long-overdue and much-ballyhooed victory of the underdog New Orleans Saints over their highly favored opponent, the Indianapolis Colts. I'm a Colts fan-- well, really, I'm a Peyton Manning fan-- but even I will admit that it was hard for anyone with a human heartbeat not to take some joy in the Saints' come-from-behind victory. (As a historic and ironic sidenote: the last time that I remember experts being so positive-- and so positively wrong-- about the outcome of a Super Bowl was two years ago when the New York Giants beat the New England Patriots, ending the Patriots' perfect season. And who was the Leader of the Underdog Pack then? None other than Peyton's brother, Eli Manning.) My friend, fellow-blogger and bona fide football fan, JLotz, has an excellent post up on her blog recounting her Reflections in the Wake of Victory, which I highly recommend. I won't say much more on the game itself here other than to add my voice to the chorus of congratulations.

      For non-football fans-- such strange creatures they, so mysterious, a species that endlessly stymies my understanding-- the Super Bowl is less about the game than the party, the food, the friends and, of course, the commercials. In advance of game day, all the talk was about Tim Tebow and his mother's "anti-abortion" commercial. Turns out, the actual Focus on the Family 30-second spot was pretty tame... and not very funny. But lest all of that feminist outrage go un-discharged, Dodge stepped up to the plate with this commercial, called "Man's Last Stand."




There are lots of things to criticize about the privileged-cum-victimized false consciousness of this ad, and one wouldn't have to dig very deep into feminist literature to find resources for such a critique. Yet, it seems fair to ask: how likely is it that a guy-- who, after all, just wants to drive a freakin' Dodge Charger and not get an earful about it, for goodness' sake-- is going to be receptive to all of the nuances of gender construction, heteronormativity, privilege-and-power confluence, and the manner in which each works to buttress and conceal the others? If only there were a way to meet this poor, beleaguered man-victim where he is...

There are times when I worry that our Jon-Stewart-generation has forgotten what real political engagement looks like, preferring instead to be witty and ironic, to point our fingers at the naked Emperors and mock them for being ever-so-unhip(ster). There are times when I wonder whether or not that's engagement at all, or whether it's really something more like an exasperated, Bartleby-esque "I would prefer not to." Nevertheless, when I saw this response-ad, I was encouraged:





      What I like about this ad is that it demonstrates, with deadpan precision, just how absolutely elementary the asymmetry of sexual power and privilege is. I'm teaching my Feminist Philosophy course again this semester, and as a result I spend a lot of time trying to equip students with the conceptual infrastructure for understanding that asymmetry. Yet, even in class I sometimes wonder whether or not all that philosophical sophistication is wasted if they do not already admit that the asymmetry is real. Young men and women still need to be reminded, I fear, that for all of our progress, women still aren't paid equally for equal work, they aren't proportionally represented in lawmaking or judicial bodies, they still do the lion's share of child-rearing and housekeeping, and they still suffer sexual violence at atrocious rates. Those are just the facts, ma'am. If those facts are more likely to be received as true when presented in poker-faced irony (or parody), so be it. That's political engagement I can believe in.

Of course, the danger here is that they won't be received as true, but rather as further evidence of the kind of suffering against which our Dodge Man is trying to make his "last stand." Dodge Man will protest, I am sure, that his feminist counterpart is engaging in the same manner of sexist stereotyping that she purports to protest. So, let me say again that "reverse sexism" is a myth that fundamentally misunderstands the systemic and structural nature of power. Dodge Man is no victim.

      Don't believe that sexism and so-called "reverse sexism" are different? Think they're just two sides of the same coin? Well, here's another response-ad to the Dodge commercial that might help disabuse you of that opinion.



      Now, if the complaints of our man and woman were really just two sides of the same coin, it ought to be fairly easy for each just to substitute him- or herself in the narrative of the other, right? But, just for kicks, go ahead and try to imagine Dodge Man voicing the woman's complaints.

      Q.E.D.

      Tuesday, February 02, 2010

      Understanding Health Care Reform In 16 Easy Steps

      I've been too busy to post here since the State of the Union last week, but I've got a healthy backlog of posts forthcoming. In the meantime, I want to direct readers of this blog to an excellent run-down of the need for health care reform over on Slacktivist. (I would re-print it in its entirety here, but I'd hate to steal Slacktivist's traffic.)

      In other news, Defense Secretary Robert Gates is expected to talk about the plan to repeal DADT today. That's "Don't Ask, Don't Tell," the federal law prohibiting openly gay, lesbian and bisexual individuals from serving in the military. It was later changed to "Don't Ask, Don't Tell, Don't Pursue, Don't Harass" (DADTDPDH) in an attempt to curb investigations initiated by the military without prior evidence of disallowed behaviors. My guess is that the repeal will go through, but mostly because the acronyms are just getting too unwieldy.

      Stay tuned...