Sunday, October 31, 2010

200,000-Strong Bartlebys Unite To Say: "Meh"

The much-ballyhooed "Rally to Restore Sanity and/or Fear"-- brainchild of America's Ironists-in-Chief Jon Stewart and Stephen Colbert-- came and went this past Saturday in Washington, DC. Although the crowd-count estimates vary (as they always do), most put the number at somewhere between 200,000 and a quarter-million attendees. (TRANSLATION: For Midwesterners, that's somewhere between the total populations of Des Moines, Iowa and Lincoln, Nebraska. For Southerners, it's somewhere between Montgomery, Alabama and Greenville, North Carolina. For East Coasters, between Yonkers and Buffalo, New York. And for West Coasters/Southwesterners, it's between Modesto, California and Glendale, Arizona. Apologies in advance to Pacific-Northwesterners, but the range between Spokane and Seattle is just too wide to be of any use to you in this case.) For those of you keeping score at home, that's basically a SH*TLOAD of people, in layman's terms, who made it out to a political rally for.... uhhhh, err, umm... what was it all about, anyway? A rally for ironic engagement? For no more "politics"? For indecision? For sanity "and/or" fear? Or maybe just for Comedy Central?

As Bob Dylan once sang: something is happening here, but you don't know what it is, do you, Dr. J? (Well, okay, he was technically addressing "Mr. Jones," but same sentiment anyway.)

It would be hard to glean an overarching theme from the political signage at the rally, which displayed a kind of self-parodic and intentionally acerbic ambiguity that was itself part of the motivating principle behind many people's attendance. As Jon Stewart performatively reminds us each evening on The Daily Show, what passes for "political discourse" in this country is too often regrettably vacant of meaningful substance, relying as it does on sound bites, misinformation, "spin" and "truthiness"-- all of which reduce complex and nuanced (and important) issues to sloganeering and, in so doing, eliminate any possibility for rational deliberation about them. The truth is, as we all would admit when we're using our inside voices: politics is messy. It's neither easy nor wise to pretend otherwise, to try to strategically over-simplify it, to act as if the interests of over 300 million American citizens can be aggregated and addressed in a way that each of them will find satisfactory. The "Rally to Restore Sanity"-- and its coincidental, uncanny doppelgänger, the "Rally To Keep Fear Alive"-- were intended to be, in their novel 21st-century styling, an iteration of the age-old political strategy of consciousness-raising. (Gasp! Combo Marxist-and-feminist reference!) In that sense, both rallies were what philosophers like to call "meta-"exercises. Everyone tried to step back from the mudslinging madness, the he-said/he-said character assassinations, the Hitler-analogue hurling, the pork-barrelling, the filibustering and gerrymandering and I'm-gonna-take-my-toys-and-go-home obstructionism to just, for a moment, dial it down a notch. The whole point of the rally, to the extent that there was one, seemed to be to give moderate, reasonable and informed discourse a moment in the spotlight. Or, at the very least, a minor speaking role.

As an admittedly left-leaning (oh, who am I kidding? I'm solidly Left), socially progressive, communist-sympathizing, democracy-loving believer in the powers of rational deliberation between and among informed citizens, I'm ALL FOR "dialing it down a notch." I've had more than my share of choking back throwup-in-my-mouth during this season of political advertisements here in Tennessee. (Most of the ad campaigns here can be summed up thus: on the Right we have "I'm a good ol' country boy, a Christian, an NRA member and a protector of unborn babies who drives a truck, not a 'Washington politician,' and I will oppose every single thing that Nancy Pelosi does" OR, on the Left, "C'mon, give me a break, I'm from Tennessee, too! I'm a country boy, too! I have a truck! And I am not Nancy Pelosi!") Even during Bob Corker's anti-Harold Ford, Jr. campaign in 2006, which was incontrovertibly backwards and racist, I didn't find myself so consistently itching and crawling with the political-creeps as I have during the current (MIDTERM!) election campaign. I'd like to believe that this isn't peculiar to Tennessee, an intuition that I think can be confirmed by watching (debatably-rational) talking heads prattle on every day on every channel about Obamacare and the so-called Tea Party Takeover. Although, as a rule, I'm generally more inclined toward proactive political engagement-- by which I mean the kind of engagement that forces our differences to the fore, and into real confrontation-- I have also found myself in recent months exhausted enough by pro forma discursive hyperbole to have considered attending the DC rally, if only to show my unwavering support for that most regrettable of our collective political casualties: SANITY.

So, why didn't I go? To be honest, I just didn't have the money or the free time to spare for a weekend jaunt to DC... reasons that no doubt mark me as an inadvertent collaborator and in absentia participant in the rallies, even despite my protests that follow. But I do want to register a few (minor and not-so-minor) complaints to justify my non-attendance at what may turn out to be the most significant-- or at least, most generationally characteristic-- political event of my life. (The election of Obama notwithstanding, of course.) Here's the thing: I'm all for the so-good-it-hurts kind of morally and politically instructive irony that Stewart and Colbert have not only mastered but also managed to funnel into something like an identity, even when it's bitingly sardonic (even sometimes mean) irony. And, yes, I'm one of the many 18- to 40-yr-olds in this country who really does think that Jon Stewart and Stephen Colbert are more informed and more politically astute than most of the people I find on my ballot when I step into the voting booth. I get both a deep, moral satisfaction and a healthy, hearty laugh out of their eviscerating send-ups of contemporary Tartuffery. I happen to be (what I'm sure most people who know me would describe as) a "loud" person-- especially when it comes to politics (and philosophy, music, pop culture and deconstruction)-- but, at the same time, I know that the best way to neutralize loudness like mine is not to shout back. It's a fundamental life-lesson we all should have learned from the first Rocky movie, or from Muhammad Ali's famous rope-a-dope boxing strategy. Blowhards and bullies will eventually wear themselves out if you don't engage them on their own terms. So, I genuinely appreciate the effort of Stewart/Colbert to passive-aggressively combat the deafening flame-war din that passes as political conversation today. But I worry, and oh how I worry, that what they may have inadvertently inspired is....

Shhhhhhh... Quietism.

As much as I absolutely loathe the anti-intellectualism with which so much of the Right (and too much of the Left) is enamored these days, I really can see in this weekend's rallies a lot of what those anti-intellectualists find so objectionable. Let me offer what is, I think, an illustrative anecdotal aside: When I was an undergraduate (Philosophy major), I remember my father coming to visit me once at college and going out to dinner with me and my friends. As an 18- or 19-yr-old at the time, I of course thought that all of my friends were super-smart and politically astute and witty and hilarious. But when we left dinner, I remember my father saying to me, with one of those truly disappointed faces that you only recognize in the visages of parents: "Your friends are all so proud and so cynical. It's really kind of sad." At the time, I thought-- as young, hubris-marinated smart alecks are inclined to think-- that he just didn't "get it." But now I'm older (not quite the age my father was then, but getting closer by the day) and I can appreciate his exasperation with his dinner-mates' affective disposition, which pretended, ever so deliberately, to be above-it-all. He was right, I now see, that we were "proud" and "cynical" and, in our own way, "sad." We were (or thought we were) so smart, so removed, so disgusted by the petty little goings-on of the hoi polloi, so ironic, and ever so proud of ourselves for being so superlatively critical. We were "against" everything, but none of us, for the life of us, could have formulated something that we were "for," not even if it meant a free dinner. In retrospect, I can see now that we were embedded in just another mundane phase of post-adolescence, in which it was very important to us (for very important reasons) to mark ourselves off as autonomous and reflective and educated.
But our disposition was, at its heart, little more than the flailing, grasping, ungrounded and ever so common affect of fundamentally unprincipled young people.

Without articulable governing principles, without a cause or a rule that one can be for, without some motivating first premise from which one can derive the host of arguments and positions that come to form an informed citizen, we were just adolescents-- by which I mostly mean angry, rebellious, directionless, fully equipped with a range of affects and yet unable to generate any effects. I worry that too many in the crowd at this weekend's rallies were the same. They're mad at "politics as usual"-- for good reason-- and they're disaffected by a discourse that echoes vacantly in their ears and their lives. But what does their attendance at the Stewart/Colbert rally signify, other than that exasperation? What does it positively signify? What does it demand? What does it refuse? I get that calling for a "restoration of sanity" has a palpably ironic and shaming force to it, like that of the child who pointed out the Emperor's nudity in front of His Majesty's passive and adoring subjects. I like that kid as much as the next person, for many of the same reasons that I like Socrates-- I think we need more of both!-- but I wouldn't vote for either. I'm not even sure that I would count on either of them to effect any real political change. I doubt I'd even want them as neighbors.

I'm worried that there might be too much self-congratulating going on among the throngs of people who identify too-reductively with Stewart and Colbert, with the Emperor's demystifying child, with Socrates, with the so-called "educated elite," or with the hundreds of thousands of the rest of us who are sick and tired of being sick and tired. Those figures and the sentiments that they inspire are only the first step toward thinking more critically, toward tearing away the veil of truthiness and spin, toward understanding what IS well enough to formulate some reasonable idea of what OUGHT TO BE. But if one only takes that first step and then proceeds no further, if one only sits at home and laughs/cries ironically at the insanity (and inanity) of it all with Colbert and Stewart, if one is unable to formulate anything more substantively principled than the claim "I don't like what is happening now," then I'm not convinced that one has developed a civic posture any more mature than that of Bartleby the Scrivener, who goes to work each day, more and more disaffected and less and less really effective, responding to each demand for his performance with the (in)famous retort: I would prefer not to. I want, desperately, to believe that this weekend's rallies were something more than a quarter-million Bartlebys shrugging their shoulders in exasperation, and not because I think widespread exasperation is an insignificant political phenomenon. Rather, I want to believe that that exasperation is only the first step to something more transformative, even revolutionary, than exasperation all by itself can ever be.

Thank the heavens, or whatever overarching-goodness analogue you prefer to substitute, for Jon Stewart himself. It turns out that The Voice of irony is also the voice of Reason, as it should be in every good democracy. Stewart's closing remarks to the Rally (below) were measured, empathetic, inspirational, sober and, most importantly, eminently sane. Just like he promised. If I'm wrong in my criticisms above, if there were in fact a quarter-million reflective and caring and sober and reasonable people there who think like this, then I have hope. Hope for a change I can believe in. I know we're going to lose the House in the upcoming midterm elections, but let me go on the record as saying I would prefer not to.

The Uncanny Valley 6: Unreal and Unreal-er, or, Why a "Fake" Fake Isn't Uncanny

I made a brief mention in my last uncanny valley post about the difference between "real" music, by which I mean music played on actual (i.e. "real," material or physical) musical instruments by musicians (i.e. human beings with some skill on those instruments, availing themselves of said instruments without superadded technological assistance) on the one hand and, on the other hand, synthetically computer-generated and auto-tuned "music" (scare quotes emphasized). It was mostly a throwaway remark at the time, but then one of my friends posted the picture to the left (of the living, breathing Form of the Fake, Taylor Swift, and her wax-double) on my Facebook page asking: "Which one is real?" I responded, of course, "NEITHER is real." And here's why:

Like the godforsaken "music" with which she tortures us all ad nauseam, Taylor Swift is an entirely manufactured product. She is, at best, a "copy of a copy" or a simulacrum, a phenomenon that has been elaborated at length by theorists like Fredric Jameson and Jean Baudrillard. Jameson used the painting technique known as photorealism (or hyperrealism)-- in which artists attempt to perfectly replicate a photograph in paint-- as the archetype of the simulacrum, an example which I think is helpful in understanding Taylor Swift's especially nauseating kind of τέχνη. (Techne is usually translated as "skill" or "craftsmanship," though here I want to emphasize another corresponding element of techne, namely, its definition as "that which has its origin in another.") Photorealist painters do not artistically represent "the real," but rather they artistically re-present artistic representations of the real. That is, they "copy" a copy, and in so doing produce an image of an image that (Jameson et al. speculate) no longer possesses-- or possesses at an exponentially inferior value-- the substance or qualities of the original being replicated or represented. (Against my inherent deconstructionist impulses, I'll leave aside for the moment the many problems with unreflectively privileging "the real" over "the image.") Taylor Swift, in my view, is like a musical version of a photorealist-- she copies the copy of "the pop icon," the copy of "the musician," the copy of the "pretty girl," even the copy of the sound and story of "the song." Consequently, the products of her technique always come across as thoroughly, irredeemably false.

Swift's products are so false, in fact, that they even fail to achieve the kind of repulsion/attraction affective response that we experience in the uncanny. I don't know whether I'm ready to go on the record with this claim-- though I am posting it here, which is a kind of record, so whatevs-- but I think that what makes pop music "popular" involves a little bit of the uncanny. As I've said on this blog before, pop music is first and foremost defined by the "hook," that familiar/unfamiliar sonic phenomenon that simultaneously draws us to a song (because it sounds familiar) and yet at the same time marks it as something novel ("new" or unfamiliar). The problem with Taylor Swift's music, and her persona, is that it is too familiar. It's not a slightly-intriguing variation on an original, it's merely a copy of a copy-- a still more inferior iteration of an already inferior representation. That is to say, it aims at copying successful iterations of "the pop (or country) song," which are themselves re-presentations of the substance and qualities of some "original" human emotion or experience, without aiming to represent the original itself. In my mind, anyway, these sorts of hyperreal simulations subvert the fundamentally provocative psychic operations of the uncanny double, which always retains enough of a reference to the substance and qualities of the original to warrant an (affective, emotional, responsive) association of the reproduction with the original it aims to represent.

An aside: Although I am a diehard fan of country music, I am NOT a fan of the early-to-mid-'90s variant that came to be known as "new country" or "pop country" or "crossover country," of which Taylor Swift's so-called music is representative. What happened in the corporatization of Nashville (aka NashVegas) during that period was a mass production and proliferation of the "form" of country music without any of its accompanying "substance." So, "new country" involves lots and lots of over-produced harmonies and pedal-steel guitars and fiddle solos, lots and lots of phone-it-in sad stories, lots and lots of orchestral key-changes intended to swell the sum of affective response, without any of the "real" talent or grit that made "real" country music really great. No poor people, no real heartbreaks, no rode-hard-and-hung-up-wet characters, no genuinely impressive musical or vocal talent... just an always-inferior, always-disappointing, always-too-familiar copying of the same. Hence, the reactive musical phenomenon known as alt-country-- which is not only an "alternative" to old-style country, but also an alternative to new country's nauseatingly false falsifying-- gained a following, in part, because it attempted to recapture something that new/pop/crossover country seemed to have carelessly disregarded: authenticity.

And it's not only the sub-standard "music" that Taylor Swift produces that I object to, but also the fact that she is packaged in/as a sub-standard, photorealist, bad copy of the sort of "person" who might really produce it. To the extent that I can extend any sympathy whatsoever to that fake-itude, I feel sorry for Taylor Swift for allowing herself to be trapped in the trappings of a production that is not, and will never be, her own. (Now THAT'S a sad story about which Swift should write a real song.) I'll be the first to admit that there are a lot of problems with overly-romantic, overly-nostalgic celebrations of some reductive sense of "authenticity," but I suppose I would say in my own defense here that if there IS an appropriate context in which to talk about "authenticity" anywhere outside of existentialism, it's country music.

PSA aside, my point here is that one shouldn't take the pretensions of pop-culture products like Taylor Swift-- unlike the truly interesting, doubling pretensions of pop culture that we see in much of reality television-- as examples of the ultimately (morally, politically, technologically and psychologically) instructive nature of the uncanny. The fake-out of the uncanny is instructive only insofar as it "really" fakes us out, that is, only insofar as it touches close enough upon the familiar to make one worry about the confusion of the familiar and the unfamiliar. Taylor Swift is not "strange" in that interesting, unheimliche way. She's just fake.

Maybe as fake as the wax-reproduction of her persona... which might actually be something interesting to talk about.

Tuesday, October 26, 2010

The Uncanny Valley 5: Double, Double, Toil and Trouble

Over the course of the last year or so, I've written several posts about the "uncanny valley" on this blog. The theory of the uncanny valley is loosely based on Freud's account of Das Unheimliche (the "uncanny"), a major trope of psychoanalytic theory and a favorite play-thing of literary, film and cultural theorists who borrow heavily from psychoanalysis. If you don't know what the uncanny is and don't feel like doing the homework, take a look at the picture on the left of Dexter (played by Michael C. Hall), the fictional serial-killer star of the Showtime series of the same name. Dexter himself is not uncanny in the image, but that arm is. The picture looks familiar and normal at first glance, but upon closer inspection you'll see that it's not Dexter's hand upon which he's resting his chin. It's a corpse's hand. The image gives us the creeps, or so it is speculated, not simply because the idea of touching a dead body is creepy, but more so because the image is composed in such a way as to make the arm look almost right, almost normal, almost alive. Uncannily so. As Freud explains at some length in his essay, our experience of the uncanny involves a coincidence of the familiar and the unfamiliar. That is, we find ourselves in the rather unique, cognitively-dissonant situation of being both attracted to and repulsed by the same object.

The uncanny "valley," as I've explained here before, is a theory of robotics that attempts to map the precise places in our experience where the psychic discomfort of Freud's "uncanny" is activated. I won't rehearse the whole story here again, but the basic premise is that we find robots "uncanny" when they too-closely approximate "real" human beings. For example, in the video below, the theory of the uncanny valley would explain that the coincidental "creepiness" and "fascination" that we feel toward the robot-performer in the front is a product of our apprehension of her/it as both familiar (very much like a human being) and unfamiliar (not a human being):

Let's just put aside for the moment that the trippy-techno-pop song itself is also a little bit uncanny, confusing as it does our familiar sensibility about "real" music as opposed to computer-generated and synthetically auto-tuned "music." (I'm really not trying to be a music snob here-- I like catchy, hooky tunes as much as the next person-- but I do think that there is something different about our experience of these sorts of tunes that sound almost "real.") The point is that robots are the perfect test-cases for delving into the psychic operations of Das Unheimliche because they operate as almost-perfect "doubles" for the very thing that is the most familiar to us, namely, ourselves.

The German word for the uncanny, Das Unheimliche, contains within it adjectives for what we find familiar and homely (heimlich), what feels cozy or belongs to the house (heimelig), even what we perceive as being autochthonous or native (heimisch). So, the uncanny or unheimliche often gets described as the "unfamiliar," but it's important to note that the uncanny involves a very special kind of unfamiliarity, because the "familiar" that is being negated is what is most familiar to us. Like the things in our home, which are "familiar" in a very special sense-- a private, intimate, personal, almost secret kind of familiarity. If I were to encounter something radically unfamiliar-- say, an object or species entirely unknown to me-- I might be frightened by it, but my fright would be qualitatively different than if I were to encounter a slightly strange variant of something intimately familiar to me-- say, a robotic double of myself, or an almost-perfect replication of my living room, or my best friend's doppelgänger.

The opening sequence of the television series Dexter provides an excellent rendering of this phenomenon. In it, the everyday routines of the literal "home"-- getting dressed, shaving, making breakfast, tying one's shoes-- are depicted in such a way as to make them seem somehow sinister and foreboding. There's nothing "really" sinister or foreboding depicted in these images, of course; a palpable malevolence is instead conveyed entirely by exploiting the uncanny. Here it is:

Not only in its opening sequence but in the content of the show, Dexter makes extensive use of uncanny, doubling juxtapositions of the familiar and the unfamiliar. Theory Teacher recently posted an excellent essay on just this topic, titled "Dexter, Psychoanalysis, or Something", in which he does a fine job of explaining how our invention of, confrontation with, and sometimes violent dispensation of (real and symbolic) doubles is a fundamental operation of both the individual psyche and the communal one. In the Dexter series, the complicated machinations of uncanny doubles are hyperbolized and hypostasized, as they are in much literature and film as well. What is fiction, after all, if not the sine qua non human activity of fabricating doubles? But what has really piqued my interest of late is the widespread depiction of non-fictional doubles, "real" doubles, and the collective fascination/repulsion we feel toward them.

Yes, I'm talking about reality television. I've always thought that explaining the reality television boom by directing a "J'accuse! Voyeur!" at the unsophisticated masses was a bit too simple. Part of our fascination with reality television is a voyeuristic fascination, to be sure, but we're repulsed (even frightened) as much and as often by what is depicted there as we are fascinated. That repulsion can sometimes be transformed into pleasure in its own schadenfreude kind of way, but quite often it's not pleasurable. Quite often, it's uncanny. Not only are the people and places and conflicts and dramas of reality television uncanny, inasmuch as they seem very much like "real" people and places and conflicts and drama, but the very world in which all of this takes place is uncanny. It's a "real world" that is not real. It's simultaneously familiar and unfamiliar. It's populated by doubles of ourselves and people we really know, animated by doubles of the competitions and romances and conflicts and resolutions of conflicts that we really engage in, governed by doubles of the laws and moral codes and superegos that order our real world. Reality television is, at its base, a variant on the age-old philosopher's meta-doubling thought experiment: the compossible world. And, like literature and film, like all of our fictional creations, it can also be instructive.

In his Poetics, Aristotle speculated that in order for tragedy to inspire pity and fear, in order for the peripeteia and hamartia and anagnorisis depicted therein to be believable, in order for it to accomplish catharsis, it must proceed by mimesis. That is to say, it must present to its audience a double of their world, with characters that serve as recognizably familiar-- but not exactly identical-- doubles of themselves, so that the moral education accomplished in the drama can be doubled in the minds and souls of its witnesses. In order to be instructive, Aristotle seems to suggest (though he could not have availed himself of this vocabulary), it must be uncanny.

It's not just Dexter and robots and Jean Valjean that upset us, that trouble our understandings of ourselves and our world by doubling us and it (with an uncanny difference). Reality television does the same. It presents the real to us as if it were fictional, or the fictional as if it were real. Every time we find ourselves captivated by those doubles, we are simultaneously and unconsciously repulsed by their fabrication, their pretension as "doubles." But every time we reject them for being contrived, we are secretly forced to concede the ultimately contrived nature of our "real" world and our "real" selves. Dispensing with the double keeps our world in order, makes it sensible, but it is a complicated psychological operation that insists upon and accomplishes that dispensation.

Friday, October 15, 2010

One World (Stand By Me)

This is one of the greatest things I've seen on the old interwebs in a long, long time. It's a collaborative recording of "Stand By Me," the familiar Ben E. King standard, with musicians and vocalists from all over the globe contributing.

Here it is, for those of you who may still doubt that some sentiments are universal:

Tuesday, October 12, 2010

Morehouse Mean Girls

I'll admit that I hesitated, more than once-- more than a dozen times, to be honest-- before posting this entry. So, allow me a few caveats here at the start. First, I'm not a Morehouse grad, not even a Spelman grad-- two of the most prestigious HBCUs in the country-- and I can appreciate that the real alumni of those institutions might not take kindly to my weighing in on their business. Second, I am well aware that the vicissitudes of gender-construction often operate differently in different racial communities... though, for all that, I am still convinced that injustice and intolerance is injustice and intolerance, regardless of the context in which it is made real. Third, the fact that I can voice my criticisms below is, without a doubt, a privilege of which I am aware and which I know not everyone enjoys. Finally, and most importantly, some of my closest friends are Morehouse men, ergo there can't possibly be anything misguided or nefarious or uninformed in my remarks below.

(That last one was a joke. C'mon, people.)

You may remember that about this time last year there was a massive brouhaha at Morehouse over the administration's decision to institute a new dress code. As I think most people would admit, most of the details of that dress code were largely non-objectionable-- for example, "no wearing pajamas to class"-- but there were a couple of guidelines that came under immediate criticism. In particular, the proscription of "grillz," "saggy pants" and "do-rags" caused many to question whether or not Morehouse was imposing a kind of generation-specific image on its undergraduates at the expense of their freedom to express themselves in many of the styles and fashions that define young, black men in the United States. I won't comment on these specific elements of Morehouse's dress code, partly because I don't feel like it's my place, but more so because I can see both sides. I don't find grillz or saggy pants or do-rags offensive, nor do I assume that any young man sporting those fashions is necessarily disrespecting himself or the institutions he represents. But I can appreciate that Morehouse wants to cultivate a particular kind of image among its charges, and it is correct in assuming that most people in our culture would associate that look with the uneducated, or worse, the criminal. I don't have a dog in that fight, really.

There was another proscription in the Morehouse dress code, however, that does cause me real concern. It's Rule #9 and it reads as follows:
9. No wearing of clothing associated with women’s garb (dresses, tops, tunics, purses, pumps, etc.) on the Morehouse campus or at College-sponsored events.
There are many problems with the wording of this rule ("clothing associated with women's garb"?) as well as the sentiment behind it. I understand, of course, that Morehouse is the "College of choice for Black men" (emphasis added) and that, as such, it has an implicit investment in shaping the image of African-American men in this country. But, as has become painfully obvious, some of those Black men have different ideas about how that image should be fashioned.

The floodgates were opened recently when Aliya King published her exposé "The Mean Girls of Morehouse" in Vibe magazine. King's article recounted the story of gender-bending, cross-dressing, former-Morehouse-"man" Diamond Martin Poulin and, according to Morehouse administrators, "five students like him," who have been driven out of the college by its dress code. Maybe a salacious and provocative story on its face, but enough to warrant a response from the President of Morehouse College himself, Robert M. Franklin. I won't reproduce King's article or Franklin's response here in their entirety (because I've provided links), but let me sum up:

King's article is, as accused, salacious and provocative... but, presumably, it's at least true, based as it is in the real life of a Morehouse "man." Franklin's response, on the other hand, is an utter disgrace. It neither addresses the issue of Morehouse's institutional patriarchy and heteronormativity head-on, nor does it evidence any sensitivity whatsoever to (or any real understanding of) what is at stake. It is, in the very worst sense, lip-service. Franklin's letter effectively says: "Because I believe in the freedom of the press, Ms. King has the right to publish whatever she wants. But I will summarily dismiss it. And you, Morehouse men, should too."

Let me be clear: I am sympathetic to Franklin's appeal to the many and varied assaults on young black men's identity and well-being in this day and age. But the presence of one trouble (or many) is no excuse for ignoring another. It reminds me of the manner in which Congress decides to apportion funding for disease research every year: I know that, as Senators or Representatives, they must decide to which causes they will lend their support. Maybe it's cancer, maybe it's HIV, maybe it's Alzheimer's. Because there is only a limited amount of resources, they have to pick one over the others... but that doesn't make the others any less deadly. Franklin's "moral" resources, however, are not limited. Franklin does not have to pick one over the other. One ought not pit the dying against the dying.

And so, in this one case, I say: Shame on you, Morehouse. I hope that Morehouse men, whatever they choose to wear, will remember that they have been trained, according to their alma mater, to become "the epicenter of ethical leadership."

It's time to man up. Pun intended.

What Is Philosophy?

A strange confluence of events recently has led me to consider more seriously the question above: What is philosophy?

One might think that this is a perennial question within the discipline of Philosophy, but one would be wrong. Truth is, most of us (professional philosophers) are too busy teaching/researching/thinking about our own projects, the history of philosophy and/or its core subfields (metaphysics, epistemology, ethics, logic, etc.) to step back and survey the whole. That's not to say that the question "what is philosophy?" is not implicitly present in most of the more-specialized work that we do on a day-to-day basis, only that it is seldom made explicit or taken up "as such" in that work. So, I have to admit, I was more than a bit embarrassed myself, upon reviewing the foci that generally occupy my attention, to realize just how little I ponder THE question.

Just a little backstory may be in order here. In no particular order, these are the serendipitously (or ominously, depending on your disposition, I suppose) converging events that have pressed this question to the fore in my mind of late:

~First, the most recent issue of Jobs for Philosophers was just released. (If you're not a member of the American Philosophical Association, you won't be able to read the JFP. So, you'll just have to take my word for it that it's a very depressing document for those who need it not to be.) I'm not "on the market," which means I'm not looking for a job in Philosophy, but I'm not far enough away from that yet to not still feel the "what's it all about?" anxiety that the JFP produces.

~Second, there's been a lot of buzz about the status of women in philosophy (as evidenced in my previous post, which was itself prompted by the new blog What Is It Like To Be A Woman In Philosophy?). Anytime these sorts of questions about who belongs (or doesn't belong) in Philosophy arise, people inevitably find themselves forced to "go meta" and at least try to define what they mean by "Philosophy" in the first place. Hilarity occasionally ensues. More often tragedy, though.

~Third, I recently had to turn in the text-adoptions for my department's Senior Seminar, which I will be teaching in the Spring semester. I've chosen Deleuze and Guattari's What is Philosophy? and the two "Right to Philosophy" collections of Derrida's essays, Eyes of the University and Who's Afraid of Philosophy?. I haven't worked this all out yet, but my idea is to center a part of the seminar around the question "What is philosophy?," attending to the many different valences of that question: historical, professional, cultural, political and ideological.

~Fourth, I had out-of-town family visiting last weekend and, because I'm on sabbatical this semester, I found myself (more than once) needing to answer the question about what exactly it is that I do with my time every day. Not to mention how it is that I get paid for whatever it is that I do.

~Finally, I just listened to the interview with Brian Leiter on the (very excellent) philosophy podcast program Why?, produced by the University of North Dakota's Institute for Philosophy in Public Life. Leiter was called upon to discuss the "profession" of philosophy, about which he had many interesting (and sober) things to say. Of particular interest to me was a question put to him by interviewer Jack Russell Weinstein, which went something like this (not an exact quote): "how would you describe Philosophy if you were on Ellen or Oprah?"

I like this question a lot, in part because it implicitly requires that the answer be straightforward, clear, jargon-free, sufficiently explanatory and accessible, if not to everyone, at least to the understanding of reasonably intelligent people. I've tried to formulate my own Ellen/Oprah answer, and it's more difficult than you might think. So, readers, I'd like to hear yours. You needn't be a professional philosopher to weigh in on this one, though some familiarity with philosophy (the discipline and the profession) would be helpful.

Let's hear it, then: What IS Philosophy?

Here are a few arbitrarily-imposed limitations on your answers, just to keep things interesting.
1. Answers should be NO MORE than 200 words or, alternatively, 4 (non-Teutonic) sentences.
2. Answers should NOT include vocabulary with which a reasonably intelligent (not necessarily college-educated) adult would be unfamiliar, nor should they employ familiar vocabulary in a specialized, discipline-specific or idiosyncratic manner (without explanation).
3. Answers should attempt to address both the discipline and the profession of "philosophy," assuming that Ellen or Oprah (or whoever else is asking) may not know the difference.

NOTE: If you got here through this blog's Facebook page, please be sure to post your answer in the comments section below, even if you have already posted it on Facebook.

Sunday, October 10, 2010

REPOST: Picking A Fight... Like A Girl

[NOTE: This is a post that was originally published on this blog a year ago (10/08/09), which I am re-posting now because of recent interest in the newly-developed and eminently revealing What Is It Like To Be A Woman In Philosophy? blog. It is interesting to me to see this issue resurface with such force a year after I originally addressed it here. Plus ça change, plus c'est la même chose, I suppose.]

The interwebs are all a-buzz right now about women in philosophy. Wait, correction: they're all a-buzz about the LACK OF women in philosophy.

An article by Brooke Lewis in The Philosopher's Magazine entitled "Where are all the women?" confirms what just about anybody could have guessed: Philosophy departments in the U.S. and U.K. trail FAR behind the other humanities in female faculty. Brian Leiter picked up the story on his philosophy blog (here and here), and the SWIP (Society for Women in Philosophy) list-serv has been on fire with the topic. One suggestion, present in the original article and repeated endlessly in the commentaries on it, is that the discipline of Philosophy has an intrinsically "masculine"-- i.e., aggressive and argumentative-- culture, which is ill-suited and off-putting to many women. This is the explanation for why, despite the fact that almost equal numbers of men and women graduate with B.A.'s in Philosophy, the number of women drops off dramatically at the M.A. level, and even more dramatically at the Ph.D. level. At present, only about 1 in 5 full-time professors of Philosophy are women, meaning that it is not only possible, but very likely, that if you are an employed female philosopher, you could be the only one in your department. (That's the situation in my department, for example.) The knee-jerk explanation showing up all over the place goes something like this: Philosophy is rigorous and demanding, not soft and womanish, so it's not surprising that the ladies can't hack it.


A part of me feels like this is not even worth entertaining, but since I've somehow managed to make it through the professional-training-in-verbal-sparring gauntlet and thus proven that I ain't skeered of an argument, here are a few retorts:

(1) Philosophy, as a discipline and as an intellectual practice, is not "intrinsically" argumentative and aggressive. That's just one way of doing philosophy-- a way that has its virtues and its vices. It's not the only way of doing philosophy and it's not always even the best way of doing philosophy.

(2) The argument that women are less inclined to engage in argumentative and aggressive scholarship than men, that they are turned-off by rigorous and demanding intellectual exercise, and that they don't possess the "natural" aptitude for philosophy depends, of course, on an essentialist account of gender-determined affects and abilities that has absolutely no reasonable or scientific basis. Women flourish in plenty of other disciplines that could be characterized in the same way as Philosophy-- law, the "hard" sciences, and almost all of the other humanities. Surely, we don't want to say that those are all "soft" disciplines. Seriously.

(3) The discipline of Philosophy DOES, however, have a protracted and sedimented institutional culture. That culture includes-- along with actual and explicit sexist prejudices-- a kind of default devaluation of women's thought and abilities and a gross underrepresentation of women who might correct that devaluation. If you're color-blind, you can't complain that the world isn't popping and sparkling with more color.

(4) The characterization of Philosophy that we see in these apologetics is more indicative of how (particularly male, "analytic") philosophers WANT to see themselves and their work than it is of women's aptitude or inclinations. So, the more felicitous question to ask would be: why are we so invested in seeing "Philosophy" this way?

I feel very fortunate to work in a department with enlightened and progressive-thinking male colleagues, but I know that many of our conversations would be VERY different if I weren't the sole representative of my gender-group. I also know, though, that my own disposition and personality tend toward the kind of Type-A characterization of Philosophy that many men want to preserve. (I can be, admittedly, "aggressive and argumentative," to put it mildly.) But I would hope, and I think my colleagues would also hope, that philosophers would be attuned enough to the complex operations of social constructions to realize that what we see in the recent spate of articles on gender disparity in Philosophy is not only a red herring, but a terribly unreasonable and uncritical account of a relatively easily-explainable phenomenon.

[Unfortunately, I am not able to repost readers' comments to the first iteration of this post. If you're interested to see them, go here.]

The Many Faces of You, America

I've tried to refrain from commenting here upon the complete train-wreck that is ("surprise" Senate primary winner from Delaware) Christine O'Donnell, but a self-respecting political blogger can only be reasonably expected to hold out for so long. We're all aware of the catalog of crazy things that have somehow made their way out of Christine O'Donnell's mouth, without any apparent vetting by political handlers, close friends, rational principles or cultural sensitivity. For the most part, I would characterize O'Donnell as a bit like a kinder, gentler version of Sarah Palin... but she's giving the Grand Dame of Illiterati a real run for her money in the crucial right-wing categories of Uninformed, Self-Righteous, and Cuh-Razy. I'll leave aside for the moment her Victorian-cum-medieval views on masturbation, her idiosyncratic-cum-ignorant understandings of evolution and socialism, her personal campaign communiques from God, her insider knowledge of the dark arts, and I won't even touch that remark about how SCARYSCARYSCIENCE has managed to produce, miraculously, genetically-engineered mice with "fully-functioning human brains." (Fully functioning, she says, for real.) Any one of these bon mots would provide a lifetime supply of grist for my mill. But instead, I want to focus on O'Donnell at her best, by which I mean, you know, when she's being like me.

What?!, you may be saying to yourself, Christine O'Donnell is like Doctor J?! I know, I know, I was surprised to learn so myself. But then I saw O'Donnell's campaign ad (below) where, lo and behold, our commensurability was confirmed, straight from the witch's mouth:

As it turns out, O'Donnell isn't just like me, she IS me. And she is you. Or, rather, "you"-- the "you" serving as a vacant placeholder into which we can all insert our precious individual personhood. Because O'Donnell is me, she promises to go to Washington and do what I would do. Nevermind that a few of the crucial things that I would do involve "believing in science" and "not bothering with other people's auto-affection rituals" and "having at least a grade-schooler's understanding of civics." None of us are perfect, after all.

All snarkiness aside, though, here's the thing that really chaps my hide about this ad: it seems that the whole "I'm you" pretense has come to serve the same function for female political candidates that the "I'm someone you can drink a beer with" pretense did (and does) for male candidates. The Beer Quotient was a crucial variable in the "likability" algorithm during the last couple of Presidential elections, an American political oddity that continued on even past the otherwise expectedly insane campaign seasons. (Remember the Obama/Biden/Gates/Crowley Beer Summit?) Being able to drink a beer with a male politician means that he's not only a real person, but a real man, that he won't embarrass you with some fey or professorial affect if you take him down to your corner bar. In sum, "you can drink a beer with me," if delivered in a sufficiently baritone voice, is nothing other than a perfect translation of the otherwise girly sentiment "I'm you."

Of course, the ladies can't say "I'm someone you can drink a beer with." And not only because voters (if they could get the Freedom Fries out of their mouths long enough to pronounce the French) would find it très gauche, but also because the woman's "I'm you" actually carries more meaning than the man's "I'm you." When O'Donnell (or Palin) says "I'm you," they don't simply mean that we are equally interchangeable, rational, conscientious but in the end anonymous co-signatories to an abstract social contract. They mean something much more personalized, more caring, more Oprah-esque, more mommy-ish. They mean that you are a precious little snowflake, absolutely unique and valuable and important in the grand scheme of things, unlike any other... AND, bless your heart, they are precious and unique and important (and underappreciated as such), too. So, despite the logical impossibility of two perfect, unrepeatable idioms being exactly equivalent, they are you.


God bless YouTube: the parodies of O'Donnell's ad have been rolling forth like a river, with righteousness like a never-failing stream, in recent weeks. Here are some of my favorites:

For what it's worth, readers, this blog is written by a woman. And you should keep reading, because I'm you. Now, be sure to brush your teeth before you go to bed, honey.

UPDATE 10/12/10: The hits just keep on coming! Here's a couple more O'Donnell parodies:

Friday, October 08, 2010

Bon Mots: Deleuze on Philosophy

In the interview "On Philosophy" in Negotiations (Columbia University Press, 1995, p.136), Deleuze remarks:

Philosophy is always a matter of inventing concepts. I've never been worried about going beyond metaphysics or any death of philosophy. The function of philosophy, still thoroughly relevant, is to create concepts. Nobody else can take over that function. Philosophy has of course always had its rivals, from Plato's "rivals" through Zarathustra's clown. These days, information technology, communications, and advertising are taking over the words "concept" and "creative," and these "conceptualists" constitute an arrogant breed that reveals the activity of selling to be capitalism's supreme thought, the cogito of the marketplace. Philosophy feels small and lonely confronting such forces, but the only way it's going to die is by choking with laughter.

Thursday, October 07, 2010

The King, The Clown, The Colonel: Axis of Evil?

There are debates about a few really important issues that have a tendency after a while to fade into a kind of white noise for me. I generally find this to be true about debates over capital punishment, abortion, the existence or nonexistence of God, and the legalization of drugs. It's not that I think those issues have become less significant or worthy of attention, only that it's increasingly hard to train my attention on them when I feel like I've heard every argument ad nauseam already. And so, regrettably, they become like cultural "elevator music"-- I don't feel any compulsion to listen closely anymore, because all there is to hear are better or worse iterations of a song I already know.

I tend to think the same way about the exhausted (but inexhaustible) debate over meat-eating. [FULL DISCLOSURE: I do eat meat, though I don't have any particularly principled reasons for doing so. I am not in any way unsympathetic to those who are opposed to eating meat for whatever reasons (or no reason at all). In fact, I find many of the arguments for vegetarianism and veganism quite compelling.] Like many people, I imagine, I tend to think of my decision to eat meat-- more an unreflective default position than anything that might properly be called a "decision" most of the time-- as rather settled. I've yet to hear an argument persuasive enough to convince me to make the rather dramatic change to my living and eating practices that vegetarianism/veganism would require, and so I find myself engaging in conversations about it, when I do, in a mostly pro forma manner.

So, I was actually quite excited to hear the recent debate between Jonathan Safran Foer (one of my favorite authors) and Anthony Bourdain (my favorite celebrity chef and a really fine writer himself) on CBC Radio. Their conversation, "Should We Eat Meat?," pitted Foer (vegetarian and author of Eating Animals) against Bourdain (perhaps the most evangelical carnivore on the planet, as evidenced by the title of his memoir: Medium Raw: A Bloody Valentine to the World of Food and the People Who Cook) and the resulting encounter was-- as is befitting the conviction, intelligence and imagination of its participants-- a remarkably novel contribution to an otherwise well-worn debate. They covered the usual ground, of course, including the standard arguments about the health benefits of meat-eating or not-eating, the moral status (or lack thereof) of non-human animals, the restrictions that poverty imposes on the diets of the impoverished, and the truly deplorable conditions that factory-farming creates for both human and non-human animals. What was most interesting to me, though, were the sometimes very subtle nuances that divided Foer and Bourdain in their convictions.

For example: both Foer and Bourdain (and I) find the practices of what Bourdain calls "The King, The Clown and The Colonel"-- in reference to the commercial identities of fast-food triumvirate Burger King, McDonalds, and Kentucky Fried Chicken-- objectionable in every possible way. For Foer, these franchises' participation in factory-farming is a fundamentally ethical offense, creating as it does horrific conditions for the animals that are eventually slaughtered there. For Bourdain, the offense is more aesthetic-- factory farming produces food that tastes bad, that doesn't even taste like "food," and consequently serves to diminish our collective culinary palate-- though Bourdain clearly agrees with Foer that the conditions under which this bad food is produced create all kinds of other (political, moral, social) problems. The interesting difference, as far as I can surmise from their conversation, is how dramatically Foer and Bourdain differ, on principle, in prioritizing the moral and political objectionableness of The King, The Clown and The Colonel relative to their aesthetic objectionableness.

Bourdain's position, in summary, is that there is no moral, political or even aesthetic reason that could supersede, as a matter of principle, the fundamentally social good of "eating [whatever one eats] together." Bourdain flatly says as much in his conversation with Foer (my emphasis added):
To me, the human experience, human communication and curiosity, trump any ethical concerns one might have with killing and eating animals... I can understand entirely why one would choose to live a certain way in the privacy of one's own home, making personal choices about what you eat. But once you go out the door, the notion that one would shut oneself off from communication with others and essentially reject a form of communication-- meaning food or bonding-- offends me.
Foer rightly objects that there are few, if any, situations in which the bond created by (humans') eating meat together could not be accomplished by other means. But the devil is in the details here-- "few" occasions is a long way from "none"-- a perhaps nit-picky cultural and economic point that I think Bourdain is justified in exploiting. Because the truth is, as Bourdain belabors, the privilege to NOT eat meat remains, as regrettable as it might be, a privilege. Bourdain is not morally defending the "Axis of Evil" (The King, The Clown and The Colonel), but rather questioning the privilege of placing that particular moral objection above other, arguably equivalent, social and cultural goods.

I'd like to say that I don't really have a dog in this fight (the moral permissibility of eating dogs, which Bourdain and Foer also debate, notwithstanding), but I recognize that continuing to eat meat, while at the same time conceding the many and severe ethico-political problems I see with doing so, is more than a bit of a bad-faith gesture. Interestingly for me, I find myself fundamentally disagreeing with Bourdain's argument on principle, even as I find myself agreeing with it in practice. That is to say, I don't think it is the case that cultural or aesthetic or social goods always trump "any ethical concerns one might have." But I do believe enough, I suppose, in the cultural or aesthetic or social goods that Bourdain advocates to find it very, very difficult to prioritize my moral objections over them. Oy vey.

At any rate, I recommend listening to the exchange between Bourdain and Foer if only because it offers a rare, and much needed, novel interruption to the tired carnivore v. vegetarian white noise.

Tuesday, October 05, 2010

Deconstructing Sasha Fierce

I'm guessing that many of us have those fleeting fantasies from time to time in which we conjure up what we imagine would be the AWESOMEST. COURSE. EVER. For example, my fantasy courses: "I'm Not Here To Make Friends: Ethics and Reality TV" (sort of a cross between ethical theory, applied ethics, and existentialism), or "This Land Is My Land: Classical Liberalism and Its Critics" (a history of social-contract political philosophy from the Enlightenment to now, as narrated through American popular music), or "Leave The Gun, Take the Cannoli: Community, Individuality and Authority" (deontology vs. utilitarianism via The Godfather trilogy), or "The Anatomy of An Illusion: Philosophy as Prestige" (already laid that one out on this blog before). Anyway, my most recent addition to my Fantasy Course List is this: "Deconstructing Sasha Fierce: Feminism, Complicated."

Just in case you've been living in a culture-deficient cave for the last few years, I'll remind you that "Sasha Fierce" is the sobriquet of Beyoncé Knowles, pop icon and reigning Diva Extraordinaire. Beyoncé is the total package: drop-dead gorgeous, fashion forward, business-savvy, (sufficiently) talented as a singer/dancer/actress, all the while maintaining the public personality of an approachable and eminently likable Girl Next Door. (Not exactly your door, of course.) She's the former member of chart-topping R&B supergroup Destiny's Child, happily married to rap and hip-hop mogul Jay-Z. And also, by the way, she just so happens to be a feminist. Sort of. I mean, maybe. Accidentally.

In a recent interview with UK-based mag You, Beyoncé was quoted as saying:
"I think I am a feminist in a way. It’s not something I consciously decided I was going to be..."

Well I'm here to tell you, Sasha Fierce, you are a feminist. Sure, you're not exactly the poster-girl for feminism, and it's likely that your unrivalled good looks and paramount popularity will cause many a raised eyebrow among the guardians of Feminism's ancien regime... but, even if you've only "unconsciously" happened upon feminism "in a way," you've definitely situated yourself quite comfortably in it. I offer up for the jury's consideration the following evidence:

Exhibit A: "If I Were A Boy"
Anyone who has ever taught a class on feminist theory knows that the very first lesson one must impart to the skeptical involves some kind of a reasonable rejoinder to the protest: But why do we even need feminism? It's just so passé! The answer, for better or worse, is as clear as it is complex, as obvious as it is invisible, as simple as it is torturously convoluted and intricate. We need feminism because the deepest struts and girders of our social ontology are governed by patriarchy. You may be 18 yrs old and fully confident that you needn't marry, that you needn't stay at home barefoot and pregnant in the kitchen, that you can educate yourself and get a good-paying job and be a success without ever having to encounter the sexism that so regrettably shackled your mother (or, er, grandmother?) in any really existentially-felt way. But, Beyoncé and I are here to say that if you think you're anything like an equal to the men in your world, you are sadly, sorely, mistaken. Enter Sasha Fierce, whose song imagines-- as sweetly and sentimentally as a "girl" is able-- what it might actually look like "If I Were A Boy." Oh, snap. That's patriarchy, sistahs.

Exhibit B: "Irreplaceable"
Oh, Sasha, now that we can see clearly that it JUST. AIN'T. FAIR!... whatever is a girl to do? Surely there must be some way to re-value what it means to be a woman. Surely there are options other than to just keep-on-keeping-on in the same old unhealthy, pathological, self-sabotaging, asymmetrical bu(llsh*t)siness-as-usual manner. Surely we girls can up the ante somehow, make it more expensive to treat us like discardable, recyclable objects. Isn't it THE fundamentally unearned privilege of patriarchy for men to act like they're, well, irreplaceable? What would happen if we acted as if we were, too? Maybe, just maybe, we'd pack all his sh*t up, everything he owns, put it in a box to the left, and say to him and the whole system that makes him what he is: "Keep talkin' that mess, that's fine. But could you walk and talk at the same time?" Indeed, you must not know 'bout me.

Exhibit C: "Telephone"
Now that we've decided that we women are, in fact, irreplaceable-- and now that we've kicked that nuisance patriarch to the curb-- maybe it's time to bring in some backup. Who should I call? Oh, I know, Lady Gaga! (For you instructors, this is the glorious point in the semester when the light-bulbs flash over all of the students' heads and, even if only for one brief moment, you've got a classroom full of bona fide militant feminists.) All bets are off, gentlemen. Everything is permitted. Hollaaaaahhhh.

Exhibit D: "Single Ladies"
Whooooaaaa, Beyoncé, hold up a sec! Have we just inadvertently committed ourselves to man-hating? To bra-burning and folk-singing and rally-marching? To irrecoverable singlehood? Or worse-- gasp!-- to being a LESBIAN? (Though, okay, we'll admit that prison stuff with you and Lady Gaga was AICH-OH-TEE-TEE, even if I did leave my heart and my head on the dance floor.) Anyway, no thank you very much, Sasha Fierce. HELLZtotheNO! I can be all "down with patriarchy" only so far. So what if I want to get married? So what if I want the white wedding and the sparkly ring and the poofy dress with a crinoline slip? Is that so much to ask? I'm a single lady in total solidarity with my other single ladies, whatever it is that they want to do (even if that is play softball). At the end of the day, I'm just asking for something really, really simple: I don't want to be treated like a streetwalker and tossed aside. I mean, if you liked it then you shoulda put a ring on it, right?

Well, here's where the deconstruction comes in. (DANGER, WILL ROBINSON, DANGER!) Feminism isn't just a one-size-fits-all phenomenon. (Would that it were!) So, sure, you can still be a feminist, even if you want to be a wife (and/or a mother), and even if you want to look pretty or dress sexy, and even if you actually do care more about the white wedding and crinoline slip than you do about equal pay for equal work. The truth is, we HAVE come a long way, baby. And that means, among other things, that it might actually be (strangely, ironically) MORE difficult for women who find themselves independently, autonomously, inclined toward the "traditional" to claim the title "feminist." There are problems with the whole put-a-ring-on-it institution to be sure, but that doesn't mean it doesn't still have some value. So, to each her own, Sasha says:

There's more than a semester's worth of feminist material to cover in just these selections. And, what's more, I'm convinced that Beyoncé offers a lot more opportunity for complicating the traditional feminist narrative than the current scholarly literature does. But, whatever, we've all got our fantasy courses, I suppose. I'm not holding my breath that "Deconstructing Sasha Fierce" gets adopted.