Tuesday, July 28, 2009

Same Old, Same Old?

Mark Bauerlein asks some interesting questions in his recent article in The Chronicle of Higher Education ("Diminishing Returns in Humanities Research"). Questions like: with 4,230 new academic publications on Hamlet appearing in just the last fifty years, is there really anything else to say about it? He raises legitimate concerns about the "publish or perish" ethic in academia now, which has resulted in exponentially more publications read by exponentially fewer people, and he wonders whether anything worthwhile can come of squeezing some of those turnips any longer. Although I would probably want to stop short of his exasperation (he writes: "Whoa! Slow down! Hamlet can't give you any more!"), I think he's onto something in his call for us to reconsider the merits of valuing quantity over quality in tenure and promotion decisions. I like his two suggested remedies: (1) for departments to "limit the amount of material they examine at promotion time," and (2) for subsidizers of humanities research to "shift their support away from saturated areas and toward unsaturated areas." I suspect his will be an uphill battle on both counts, as are all battles with the Publish or Perish Regime, which at this point has sedimented into some bloated, baroque, immovable object that serves as the center of gravity in the academic world. For all the complaining that people do about it-- and a lot of people do a lot of complaining-- all junior faculty know that you can't go over it, you can't go under it, and you can't go around it. Gotta go through it.

Still, I do have a couple of issues with Bauerlein's piece, which reads a little bit like the Book of Ecclesiastes at times. Remember this?
Vanity of vanities! All is vanity!
The thing that hath been, it is that which shall be;
and that which is done is that which shall be done:
and there is no new thing under the sun.
Those words from the Teacher in Ecclesiastes seem to be consonant with Bauerlein's tone when he's talking about what he calls the "saturated" areas of humanities research. I'm less inclined to agree with Bauerlein that there's nothing else to be said about Hamlet, or that filtering old texts through new theories (feminism, queer theory, deconstruction) doesn't actually produce new ideas, but that may be a prejudice that inheres in my particular discipline... where we've been knocking around the same questions and many of the same texts for over two millennia. Of course, that doesn't mean that everything new that comes out about these texts is genuinely "new" (or even interesting), but I'm not sure that his approach is the best one. There's a difference between saying that we should cut off some of the fat in academic publications and saying that many of the age-old areas of humanities research are now saturated, or "all" fat.

Especially given Bauerlein's concern with fostering good pedagogy as much as good research-- which constitutes some of the best parts of his article-- I wonder how productive it is to dismiss the importance of considering and reconsidering the humanities "canon." If he really does want students and young professors to spend more time talking about books and ideas, then the "there's-nothing-new-under-the-sun-to-say-about-Hamlet" approach probably isn't the best way to make a space for those conversations. I get the sense that he would probably be sympathetic with this criticism, that he really wants to say "there's nothing new to publish about Hamlet" (or "there's not that much new to publish about Hamlet"), so that young academics could feel more free to spend their time saying all the things there are to say about it to their students instead of to an ever-diminishing audience of professional readers.

On the other hand, to Bauerlein's credit, I haven't heard anything new about Hamlet in a long while, either.

Saturday, July 25, 2009


Listening to the Digital Dialogue conversation about Identity the other day, coupled with reading way-too-many of the "comments" sections on the Skip Gates' arrest story, has gotten me thinking a lot about the merits and demerits of online anonymity. Anyone who spends more than a second on the Internet surely knows the drawbacks-- "flame" wars, misinformation, lack of accountability, misattributions, and the cultivation of a kind of Hobbesian state of nature-- so I won't rehearse them all again here. But there are some advantages, some of them necessary, to the anonymity that digital media provides as well. For better or worse, there are views and criticisms that likely would never see the light of day if the subjects of those opinions didn't feel shielded from retribution. And, as anotherpanacea noted in the "Identity" interview, online anonymity also provides an avenue for "trying out" ideas that for whatever reasons (personal or professional) one might not want to wed oneself to just yet. So, at first glance anyway, it seems like it's six one way, half-dozen the other... hard to make a call.

When I started this blog, I intentionally decided not to use (or, for the most part, even associate) my "real" name with it. That was a few years ago now, and I can't rightly remember all of my reasons for doing so, but I suspect it had a lot to do with the fact that I was nearing the end of graduate school and the beginning of the job market, and I worried that somehow, someway, this blog might come back to bite me in the rear. On the other hand, it wouldn't exactly take a crack detective to figure out who I am, and "Doctor J" is a far cry from an undecipherable code name, especially given all of the other identifying clues included in my posts over the years. So, I think it's fair to say that I am riding the fence of online anonymity here-- I most certainly do still enjoy some of the advantages of remaining behind the veil of "Doctor J", but it's a very thin veil, and the general outlines of my real features can be seen easily through it.

The conversation between Chris and Joshua in the Digital Dialogues piece tended to focus, not surprisingly, on the lack of responsibility that seems to attend anonymity. This is something with which we are all familiar-- the "I-can-say-anything-I-want-because-nobody-knows-who-I-am" sort of recklessness that dominates online fora. But what I found particularly interesting was the way that Joshua resisted that criticism. Of "flame wars," which tend to be grossly racist or misogynist, Joshua says:

"We say, 'the problem here is that none of these messages are being attributed to the speakers, so people just say whatever comes into their head.' I actually think that's wrong. I think that the problem is NOT that we're anonymous. The problem is that the medium we're communicating in is the written word. When we communicate in that medium, and we don't 'know' our interlocutor, we don't have any way of reading what they say except for in whatever voice we make up for them in our heads."

I like this approach, in particular, because it focuses attention on the fact that writing always already separates author and text, unsettles the authority that thinkers allegedly lend to their thoughts and, to a very real extent, always already "anonymizes" (clunky formulation, I know) the putative interlocutors. And, of course, "writing" is NOT a "new" medium. So, the superaddition of genuinely new forms of technology, on Joshua's account anyway, doesn't really change what is an age-old problem with ALL forms of written communication. As we all remember (or, more accurately, are "reminded of") from Plato's Phaedrus, the problem with writing is that the "speaker" cannot correct, amend, or explain him- or herself. And as we all remember from Derrida's "Plato's Pharmacy", this means that all written words are vulnerable to being de-contextualized and re-contextualized, such that the distinction between the meaning that the "author" intends and the meaning that the "reader" intends becomes thoroughly contaminated.

So, to return to the question that Chris wanted to put to Joshua: who's responsible when things seem to go terribly wrong? It's hard to say... because a bit of distance has already been put between each one and his or her thoughts just by virtue of those thoughts being written down. There isn't an isotropic relation between author and text, nor is there between reader and text, because the written medium introduces an extra, totally anonymous, space of interpretation. Does the absence of a "real" signature absolve one from responsibility any more than claiming authorship? Or, alternatively, should we credit those who "own" their thoughts by attaching their real signature with some extra moral responsibility? That, unfortunately, remains undecidable.

I am reminded of a passage from T.S. Eliot's "The Love Song of J. Alfred Prufrock" (one of my favorite poems of all time). The poem's anonymous speaker asks several times in the early stanzas exactly the sort of question that questions of anonymity provoke: how should I presume? should I then presume? Later in the poem, in a moment of muted frustration, the speaker realizes, and bemoans, that "it is impossible to say just what I mean!" This impossibility seems exactly the one that attends all written communication, all anonymous interaction. What are the stakes of that impossible saying-just-what-I-mean being actually impossible? Again, from Eliot:

And would it have been worth it, after all,
After the cups, the marmalade, the tea,
Among the porcelain, among some talk of you and me,
Would it have been worth while,
To have bitten off the matter with a smile,
To have squeezed the universe into a ball
To roll it toward some overwhelming question,
To say: “I am Lazarus, come from the dead,
Come back to tell you all, I shall tell you all”—
If one, settling a pillow by her head,
Should say: “That is not what I meant at all.
That is not it, at all.”

This is what I think we want to avoid with online anonymity, this terrible reckoning with the "that is not what I meant at all." And yet, I fear, this is what we get nonetheless, even with a signature.

ADDENDUM (7/28/09): Anotherpanacea responds over on his blog here: "Qui Parle: Ethics and Anonymity"

Friday, July 24, 2009

Digital Dialogues

Friend and fellow philosopher-blogger Chris Long (Pennsylvania State University) has started a really interesting project that he's calling "Socratic Politics in Digital Dialogue," which is a series of philosophical conversations/interviews that Chris is making available as podcasts. (You can subscribe on iTunes by searching for "Digital Dialogues" under the podcast section.) This is a summer faculty fellowship project that Chris has undertaken with a few of his colleagues at PSU, and I really hope they decide to continue it beyond the summer. Followers of Chris' blog no doubt are aware that he is really the model of a sensitive and reflective citizen, epitomizing the "think globally, act locally" adage that so many others champion, but fail to enact. Here is Chris' description of his project:

In Plato’s dialogue Gorgias, Socrates claims to be one of the only Athenians who attempts the true art of politics. As is well known, Socrates haunted the public places in Athens looking for young people with whom he could converse. During these discussions, Socrates was intent on turning the attention of those he encountered toward the question of the good and the just. It is difficult to understate the lasting political power these dialogues have had over the course of time. Yet the emergence of social Web 2.0 technologies opens new possibilities for this ancient practice of politics, which Socrates fittingly called in the Gorgias, a “techne,” or art. "Socratic Politics in Digital Dialogue" is designed to explore the opportunities digital expression offers to enhance, deepen, expand and promote my academic scholarship in philosophy by focusing on issues related to the Socratic practice of politics.

Chris Long is really on the cutting-edge of pedagogical technology and, despite my endorsement of teaching naked a few days ago, I have admired Chris' innovation and imagination from afar for a while now. This most recent addition to his bag of tricks is a winner. There are already several episodes available, including one on "Social Practice" with my friend, Michael Brownstein (Philosophy, New Jersey Institute of Technology) and one on "Identity and Anonymity" with my friend, Joshua Miller (aka Anotherpanacea). Both Michael and Josh were in grad school with me, and it's great to hear about their exciting projects. I'm really looking forward to more from Chris' Digital Dialogues series and highly recommend it to readers of this blog.

I love the fact that he has framed this undertaking as an exercise in "Socratic politics," which is an entirely fitting description, in my view. Socrates, as we all know, never wrote anything down, so it is not hard to imagine him taking advantage of the forum that Chris is providing. And Chris, if you're reading, how about this for an alternative title: "Skyping Socratic"? Togas optional, of course.

Tuesday, July 21, 2009

Teaching Naked

In an article from The Chronicle of Higher Education entitled "When Computers Leave Classrooms, So Does Boredom", Jeffrey Young reports that the Dean of Meadows School of the Arts at Southern Methodist University has recently banned all "machines" from classrooms and challenged his faculty to "teach naked" ... by which he means, to teach without the crutches that modern technology provides. I'm actually surprised to hear that anyone could get away with such a suggestion, what with the millions of dollars that colleges and universities have spent to build and maintain "smart" classrooms, but I really like the idea. SMU's dean, Jose Bowen, worries that professors are using in-class technologies like PowerPoint (which he calls "the absolute worst form of technology for the classroom") as a substitute, instead of a supplement, for good pedagogy. Now, it's important to note that Bowen is no Luddite-- he actually uses a lot of the technologies that he has now banned, only he makes those resources available to students outside of actual class-time, in order to make more space for engagement and participation during those precious 50 minutes professors have with students.

Bowen's main argument for teaching naked is that we in the professorate need to find ways to demonstrate to students that there is a unique value to being in the classroom. So, if what we're giving them in the classroom is merely stuff that they could download to their laptops (like PowerPoint presentations) or their iPods (like straight lectures), then there's really no reason for them to pay a lot of money to attend a "real" college or university. They can get all of that material from online courses. The sorts of things that you can only get in the classroom are profound conversations and debates with classmates, or real and dynamic engagement with a real expert in the field, or exposure to questions (and answers) that might be unexpected or unusual. Those are the sorts of things that make interesting classes interesting, and they're the sorts of things that transcend the simple information-transmission model that bores the pants off so many students.

Young's article does note that some students are initially put off by these changes, preferring instead the non-participatory model to which they've become accustomed. But those same students also admit to being bored in class, and bored by PowerPoint in particular. I never use PowerPoint in the classroom (for many of the same reasons that I hardly ever use small groups), so it is some consolation to me to hear that students actually find those presentations boring. I heard a story a while back about Gayatri Spivak beginning one of her lectures, at a conference where most of the presenters had used PowerPoint, by saying: "Today, my lecture will be both pointless and powerless." I still think that's hilarious.

So, although I can't really endorse being pointless and powerless in the classroom, I can and will enthusiastically endorse getting naked. Leave the stuff that can be done outside of the classroom where it belongs... OUTSIDE of the classroom!

Monday, July 20, 2009


As you have probably heard by now, Harvard professor and academic superstar Henry Louis Gates, Jr. was arrested at his home in Cambridge today in a completely bizarre story. It's still not clear what exactly transpired--though you can read the police report here-- but Gates got cuffed-n-stuffed for "exhibiting loud and tumultuous behavior" in the presence of Cambridge's finest. (Is that a real offense? I mean, is it on the books that you can't act "loud and tumultuous"? Because, if so, they're going to have to close down a lot of the Boston bars I used to hang out in!) Gates claimed that he was racially profiled, which is probably true, and the arresting officer claims that Gates was running off at the mouth a bit too much, which is also probably true, but it's still unclear exactly how everything went down in this story. On the one hand, Gates was arrested in his own home after showing identification, so whatever he may or may not have been mouthing off about seems pretty insignificant in comparison. But, on the other hand, the police report reads as if Gates was on the attack from the get-go, which inclines one to be a bit more sympathetic than usual with the arresting officer. (Believe me, Dr. J is no fan at all of the po-po, but even I winced a little reading the officer's report.) At any rate, this looks to be a bona fide drama-in-the-making, so stay tuned...

[ADDENDUM: Theory Teacher has an excellent write-up about this whole incident and the complexities of racial profiling and racism over on his blog. Read it here.]

Sunday, July 19, 2009

You Are Not Going To Be Famous

Take a look at this short lecture (only about 10 minutes) that Jim Hanas delivered as a part of the "useless lecture series" that he helps curate.

According to Hanas, the point of his address was to debunk what he calls "America's Big Lie," the one perhaps best epitomized by Andy Warhol's famous remark about each of us getting our 15 minutes of fame. The stats that Hanas uses are surprisingly effective for disabusing us Un-Famous Masses of our illusions about someday rising above the hoi polloi. Particularly shocking was the 51%, that's FIFTY-ONE PERCENT, of 18-25 year olds who claim that to be famous is one of their generation's greatest goals. Hanas attributes this phenomenon to "the explosion of fame itself" and, although he doesn't mention it specifically, I'm sure that at least one big fuel source for that explosion is reality TV, which makes fame seem so very achievable. (I mean, just look at VH1's Tiffany Pollard, a.k.a. "New York." How hard can it be?) But as it turns out, there are only 4,536 famous people today-- and that's even using a generous definition of "fame." So the odds that any one of us might join their ranks are, as Hanas points out, less likely than the odds that we will be put to death by a legal form of execution.

Obviously, reality TV is not the only culprit here. MySpace, Facebook, and Twitter surely share part of the blame, allowing us to broadcast the most mundane of our activities and thoughts, reinforcing the myth that we really are that interesting. And, alas, if I'm being completely forthright, I would have to include blogs as well. Hanas' lecture forced me to ask myself, regrettably and uncomfortably: why DO I feel the need to have a "counter" on this page? Is it perhaps because, somewhere deep in the recesses of my consciousness, I may think that each new click (almost 30 grand now!) somehow validates that I am not alone in the world? that people know me? that they may even care what I think?

[I pause to reflect on the profound irony of my publicly airing these reservations about my own fame-drive. And then I continue.]

As someone who has an insatiable fascination with and genuine appreciation for many of these fame-craze culprits (Facebook, reality TV, blogs), I'm ambivalent about how much they are to be condemned. There are a lot of benefits to the increasingly large and increasingly accessible "social network," not the least of which is the expansion of our community-building skills and vocabulary. But, of course, participation in and access to multiple communities cannot be an end-in-itself. Perhaps this is a problem of confusing quantity and quality. Whatever "fame" is, and I'm not sure it's all that easy to define, at least one characteristic must be that more people know you than you know. And as we all know, there's a very fine line between famous and infamous.

When I was young and our family would go on road trips, I remember regularly passing water-towers in various small towns, which almost always had been marked up by some kid who had climbed up late one night with a spray-paint can and made him- or herself "famous." If I read the graffiti aloud from the backseat, my mother would always remark: Fools' names, like fools' faces, are always seen in public places. (Isn't it weird how those rhyme-y adages stick with you after so many years?) Anyway, I suppose the moral there was that fame for fame's sake isn't a goal worth pursuing... and I suppose that Hanas' lecture has prompted me to be a bit more vigilant about making sure that things like this blog don't end up being spray-paint on some Hicktown water-tower.

Wednesday, July 15, 2009

Gitmo Soldiers On

Just a reminder that the U.S. detention center at Guantanamo Bay is still open and operating. President Obama, as one of his first acts in office, vowed to close the facility within the year... but there are still over 200 detainees waiting there. It's still unclear what awaits them, though, when they are released.

The ACLU's Close Gitmo site is an excellent resource for information on the progress (or lack thereof) towards deciding what to do with "enemy combatants." I strongly urge readers of this blog to help keep everyone's attention on this matter.

Tuesday, July 14, 2009

Frankly, my dear...

Last year, in my section of our College's great books program (which is called "The Search for Values"), I taught Michel Foucault's Fearless Speech for the first time. The book is an edited volume comprised of six lectures that Foucault delivered at the University of California-Berkeley in the fall of 1983, all centered around the Greek rhetorical device of parrhesia, which Foucault loosely translates as "fearless speech." The person who employs this kind of speech is a parrhesiastes, and I paired Foucault's text with Martin Luther King, Jr.'s "I Have A Dream" speech and Malcolm X's "The Ballot or the Bullet" speech, challenging my students to choose which of the speakers was a proper parrhesiastes. (You can read one exchange in this debate by my students on their course blog here.) As pedagogical experiments go, I think this one was a particular success, so I've decided to include it again this year and I thought I'd share it with those of you who might be struggling with whether or how to teach Foucault to undergraduates.

The first advantage of Fearless Speech, I think, is that it is both short and accessible, making it easy to pair with the speeches of King and X as a kind of exercise in "applied" philosophy. Also, my experience was that students felt really engaged by and invested in the idea of parrhesia, especially last Fall when we were in the midst of a Presidential campaign. Although it's hard to anticipate whether or not they will have the same investments outside of an election year, my guess is that they will, since many students believe themselves to have a complicated and very roughly-defined commitment to "truth-telling," though it is often hard for them to put a finger on exactly what they mean by that. To that end, Foucault's definition helps, especially the following passage, on which we focused a good amount of our time:

[Parrhesia] is a kind of verbal activity where the speaker has a specific relation to truth through frankness, a certain relationship to his own life through danger, a certain type of relation to himself or other people through criticism (self-criticism or criticism of other people), and a specific relation to moral law through freedom and duty. More precisely, parrhesia is a verbal activity in which a speaker expresses his personal relationship to truth, and risks his life because he recognizes truth-telling as a duty to improve or help other people (as well as himself). In parrhesia, the speaker uses his freedom and chooses frankness instead of persuasion, truth instead of falsehood or silence, the risk of death instead of life and security, criticism instead of flattery, and moral duty instead of self-interest and moral apathy.

That definition is, of course, the very picture of what we mean when we talk about "speaking truth to power," and it is what I think many students admire and imagine their best selves to be engaging in. The challenge for them, as for us, is to get clear on exactly what the "truth" is and the nature of the "power" to which that truth is being fearlessly and frankly spoken. The contrast provided by King's and X's speeches is an excellent test-case. I highly recommend giving it a try in your courses.

Wednesday, July 08, 2009

The Ever-Elusive "Hook"

Ever since Michael Jackson died last week, I've been listening to his music almost non-stop. I had forgotten just how many mega-hits he had in his arsenal, and I've been shocked and awed all over again by their timelessness. The music of the 80's and early-90's had a very distinct sound that, in my humble opinion, "aged" very quickly. (Lots of synthesizer, fake-drum tracks, guitars plugged into too many pedals played by guys who probably still lived with their moms, and overly-complicated anthems that went on forever.) Most of those songs didn't age well-- by which I don't mean that they're not still enjoyable when you hear them on the radio, but only that as soon as they start playing, you can't help but imaginatively transport yourself back to the roller-skating rink. (Couples only. All skate slowly, please.) On the other hand, much of MJ's music still sounds like it could have just been written, and if it had been, it would still be a chart-topper. Why is that? Because he mastered the ever-elusive pop "hook"...

As all songwriters know, the perfect "hook" is the most wily and most sought-after of artistic prey. Wikipedia defines it as "a musical or lyrical phrase that stands out and is easily remembered"-- though that sort of vague, pointing-in-its-general-direction definition is exactly what has frustrated many a songwriter over the years. Pop music is full of great hooks (that might be what makes pop music "pop"), and the true demigods of pop music are all its masters. You can find these masters of the hook in every genre, too: disco (ABBA), country (Johnny Cash, Willie Nelson), pop-rock (Paul McCartney, Neil Diamond), hip-hop (Kanye West, Black-Eyed Peas), R&B (Smokey Robinson, Alicia Keyes), alternative (Pearl Jam, R.E.M.), alt-country (Ryan Adams, Steve Earle), heavy metal (Metallica, Led Zeppelin), reggae (well, pretty much ALL reggae is a "hook"), etc., etc., etc.. If you really want to get the sound of a hook, you need to listen to oldies, like 1950s- and '60s-ish music-- the true beginnings of rock-n'-roll, where the alchemy of blues, country, gospel and folk music worked its magic. There's just no denying the irresistibility of hooks like the one in The Crystals' classic hit. Da doo ron ron ron, da doo ron ron. So very simple, and yet so maddeningly difficult to reproduce.

The thing about a great hook is that it must be simple and familiar enough to be memorable, but not so simple and familiar that it's forgettable. They say that Beethoven's music had this effect on his contemporaries, that people would leave his concerts whistling the key bars of his compositions after having only heard them for the first time. And this is why, even if you truly hate it, you can't get a well-written pop song out of your head. Will you still need me, will you still feed me, when I'm sixty-four? It's like some kind of weird stimulant, like aural crack. Anyway, Michael Jackson's music had a lot of these hooks, and it's a large part of what makes his music so timeless.

I've always said the following of the "hook": Harder to define than Being. Harder to achieve than virtue. As someone who has bloodied her fingers many, many times trying to capture one, I think that's the best effort I can make at explaining it. Anyway, I'm opening the comments section to your examples of great hooks. (Please provide song title and artist, since we may not all know the same music.) Maybe if we collect enough samples, we can nail down something more definitive about it.

[Apologies in advance if you now have The Beatles "When I'm Sixty-Four" repeating in your noggin.]

Tuesday, July 07, 2009

Human, All Too Human (In Memoriam: Michael Jackson)

The memorial service for Michael Jackson is being broadcast on television here in the United States today and, not surprisingly, there is mixed reaction from talking heads and the public. There is no denying that Michael Jackson will go down in history as the archetype of a "pop icon," nor that he was an almost-unrivaled musical genius... but, as we all painfully know, his life after leaving the Jackson 5 became more and more strange, more and more tragic, more and more sad over the years. For a lot of us, myself included, MJ's music was the soundtrack of our childhood and adolescence (and beyond). Thriller was the first album I ever bought, and the poster of MJ that came with it was (to my memory, at least) the only poster of a musician I ever had on my wall as a young girl. Still, like most everyone today, I'm having a hard time classifying and deciding where to file away my memory of Michael Jackson, as there are so many contradictory images and personae competing for the place.

Michael Jackson was a transgressor in all the good and all the bad ways, often at the same time. His music and his dancing were groundbreaking, idiosyncratic, revolutionary. There are sounds ("G'on girl!") and moves (moonwalk, crotch-grab) that are and will always be unmistakably "Michael." But like his music and his dance, much else about his life exceeded the categories of our understanding. He was soft-spoken, fey, curiously-dressed, troubling everything that we think we know about gender and sexuality. He upset our racial categories without meaning to-- or maybe he did mean to-- making himself the butt of many uncomfortable jokes. He never belonged to a proper family, or never belonged to one properly. The Brits dubbed him "Wacko Jacko," the Americans called him a freak, and one can't help but think that Nature herself often looked upon poor Michael with frustration and confusion. And, of course, he was an actual "transgressor" too, perhaps even a criminal, though the complicated machinations of out-of-court settlements will never allow us to know for sure.

Michael, you were a mystery.

If the very worst stories of Michael Jackson are true, no one can be his apologist for those transgressions. But even if they are all true, his is still a story that ought to inspire compassion. Blessed with remarkable talent, MJ was dragged into the spotlight before he could have possibly known better, before he could have possibly known that the "normal" world of Gary, Indiana was a world he was leaving forever. Even in the most circus-like and abnormal periods of his life, I often wondered how fair it was for us to criticize him for not abiding by the conventions of the "normal" world to which he never belonged because his family, his handlers, his fans, his haters, and Michael himself kept him out of it. Social mores are conditioned into us over time, reinforced and corrected by the world that seeks to maintain them, and Michael had precious little interaction with that collective sense of normality. He was an imperfect human being in an imperfect world, like all of us... but not with all of us.

So, I'm posting the video below because this is how I want to remember Michael Jackson. He's on stage, alone, rehearsing his song "Human Nature." This is where he is at home, where he is normal. He is clearly imagining the rest of us there, but we're not there. There's a moment during the rehearsal when Michael gets to the chorus and yells "Everybody!", stretching his arms out to the absent audience, inviting them to sing along... but, in our absence, there is silence. He's just there, dancing and singing by himself to a great song, probably like many of us did in our bedrooms, alone, imagining him there with us. Here is an enormous stage, simultaneously too big and too small for this one man. Just Michael and his music... and, of course, this story:

And they say, "Why? Why?"
Tell them that it's human nature.
"Why? Why does he do me that way?"
They say, "Why? Why?"
Tell them that it's human nature
"Why? Why does he do me this way?"

Monday, July 06, 2009


Carlin Romano (Univ. of Pennsylvania) recently wrote an excellent piece for The Chronicle of Higher Education heralding President Obama as our first "Philosopher-in-Chief," an honorific given to him largely as a result of the nuanced cosmopolitanism that characterized his Cairo speech entitled "A New Beginning" (full text and video of that address here). I was actually at a meeting of scholars and activists for Israeli/Palestinian peace the night after that speech, and remember many of the Arabs and Muslims who were present talking about the significance of the Cairo address, how absolutely unprecedented it was, and how Obama seemed to signal for the first time America's willingness to not only acknowledge, but also endorse, our obligations to the global polity. Romano traces the extended philosophical roots of Obama's cosmopolitanism-- through Cicero, St. Paul, Kant, Arendt, Patocka, Derrida, Rorty and Appiah-- while at the same time noting the unusual symmetry that seems to exist between his personality and that philosophy. Obama is, according to Romano, "cool, polite, generous, cosmo." He is a uniter, not a divider. He is a master rhetorician, with the rare ability to inject a palpable sense of capital-H History into his addresses. As Simon Schama noted, he is "our new American Pericles."

Obama's cosmopolitanism is a particularly admirable virtue, I think, given the absolute mess we're in at home. People are out of work, many are out of money, out of their cars and their homes. The "quagmire" of our domestic medical care system is rivaling the one in Iraq and Afghanistan. Each of the fifty states is operating on I.O.U.'s, if it's operating at all. And yet still, Obama looks onward, upward, and most importantly, outward. Somehow, he manages to navigate the tricky pitfalls inherent to philosophical cosmopolitanism, even and especially the "bad" cosmopolitanism in which America might view itself as a City on a Hill and the rest of the cosmos as putative wards of a "global America." Unlike his predecessor, he is a proselytizer of the best sort, an evangelist for statesmanship, conversation, deliberation, and measured, incremental, progressive change.

In 430 B.C.E., during the Golden Age of ancient Athens, Pericles said: "As we are a democracy, we can never fail." He would be proven wrong within a generation, as his once-great democracy crumbled and disappeared. Thankfully, our new American Pericles does not seem to suffer from the same hubris. Obama, I think, knows that we can fail-- yes, we can-- and his cosmopolitanism is grounded in this realization. That's a cosmopolitanism I can believe in.