Monday, February 3, 2014

Classical Music is Undead

Okay, so that thing about classical music dying? It's not true. Classical music is not dying. For classical music to be dying, it must have been alive first, and here's the dirty little secret: classical music was never alive! If classical music were alive, it wouldn't have that adjective in front, "classical." That's the whole point of something being classical; it's dead. If it were alive, we would just call it music.

That's precisely why the Scene That Must Not Be Labelled--you know, those kids in NYC with their fancy training and willful disavowal of generic boundaries--resists giving their music a name with such evangelical fervor. It's like the opposite of a farmer naming his pet pig; if you name something, it's going to die. (Although I hate to break it to them, the farmer's pig dies either way.)

Historical interlude: where does classical music come from? From the middle of the nineteenth century. In a nutshell, you had all these protomodernists arguing for a radical new vision of music making, a music of the future that would be transformative and break with the past in various interesting ways. In response, conservative critics and a great many audience members more or less said, "wait, we don't like this crazy new stuff! Whatever happened to Beethoven?" And so audiences retreated from new music into the old music they still liked, largely by dead composers. The "classics," if you will. More structurally, it was about the emergence of the bourgeoisie, but I'll let you read some real musicology for that story.

So if we're going to argue that classical music is not dead, then we aren't actually saying that classical music is alive; we're saying that it's undead. The actual question is, what species of undead?

Most smart critics, and I as a teacher, participate in the Frankenstein model. We staple together a bunch of dead body parts, give it a zap of electricity, and call it alive. "Look," we say, "it's walking, it's talking, it's accidentally killing people it loves, it must be alive!" I think it's the spirit behind Andy Doe's infographic, and it's what I more or less do in the classroom. You teach, say, Notre Dame polyphony*, or Monteverdi, or whatever, and you point out how its self-conscious avant-gardism made people very alarmed. Silly old historical people! And then you get up to the present day, and you talk about how thriving and wonderful new music is. That's what I teach, at least. And this gets magnified in the other wings of the music department, where the singers and the violinists and everyone else all get taught this weird version of history in which choice cuts of meat from the past thousand years of European music-making get presented in little sanitized bits, and a Downton Abbey-like glow of comity surrounds all.

And when done right, when the staples are securely fastened, it's a pretty awesome sight. But I do feel a little bad, at least for all these students I'm sending out into the world to try for orchestra jobs or to teach it themselves to another generation. Frankenstein means really well, but Frankenstein is not big enough for them all to get jobs. They will get trained to perform at this amazingly high level, with just phenomenal technique, and they will bring incredible passion to what they do. But let's be clear: it might be by accident, and with the best intentions, but Frankenstein will still kill you. You will be chewed up and spit out, and you will make a living doing something very different from what you trained to do. You will tell yourself that you love music just for music's sake, and that you are whole-heartedly committed to doing what you love, but you will then not make rent. That's why I married a veterinarian.

Zombie is another option. @violetinbloom pointed out that zombies might better describe classic rock. I fully agree. My local "adult alternative" radio station is doing a weeklong tribute to the Beatles' arrival in the US, and I definitely feel as though there is a teeming mass of human carnage swirling around trying to devour me. No, classical music is no zombie. Zombies are for mass culture, and undead classical music is not mass culture. Every devotee of undead classical music who wants to save it from charges of elitism will point out how expensive tickets are for a Bruce Springsteen concert or a football game. And it's true, opera is a very cheap date.

But, let's be real about how power and privilege work. In fact, let's go back and be real about who we classical music fans were in high school. We were not exactly cool, were we? Maybe some of you were, but most were not. What psychological role did being a fan or maker of classical music play for us in the onslaught of our teenage years? I guess I shouldn't speak for you, but for me, and for many, many of my friends, being a classical music person was a source of inner self-superiority that we carried around with us. Our peers might be having much more fun than we were, and getting beat up much less, but at least we appreciated the highest art! Basically, Finch from American Pie, without the sex.

And so we satisfy ourselves with our custodianship of an ancient tradition, and, if we have the ability, we are thus tricked into giving--not buying, but just giving!--hundreds of millions of dollars to the Met so that they can put on a ridiculous production of the Ring. If popular culture is a kind of opium for the masses that tricks everyone into forking over their money, classical music is just higher grade. Basically, heroin. You don't need as many customers, just some really good ones. Also, tax breaks.

Which brings us, obviously, to the one major species of undead left out there. It's my favorite undead, actually. Undead classical music relies upon its aristocratic patina and social connections to sneak into our bedrooms at night, and when it bites us on the neck we kind of want it. We are all Lucy, Mina, and Sookie Stackhouse. Don't worry, undead classical music will never actually go away. But that doesn't mean it is alive. Check your neck for toothmarks.

*I meant to say ars nova. Can you believe I get paid to teach this stuff?

Wednesday, June 19, 2013

After Father's Day

My father and his father. (And sister.)

Late, but here is a Father's Day anecdote: I am tall, as is my father. It runs in the family; at six feet two inches I'm on the shorter side for the Gentrys. I'm not sure if the tallness is genetics or clean living. One would normally presume the former, but if you meet the extended Gentrys, who are the descendants of homesteaders in Western Nebraska and are inveterate community volunteers, teetotalers, Boy Scouts, Methodists, country doctors, and so on, you get the sense that Mendel was wrong. (Which might explain why I'm so short, for a Gentry.)
But tall in the normal scheme of things, and thus the anecdote. I have no idea how old I was or what the context was, but my father and I were standing in a crowd watching something, and we were near the front. Noticing that we were blocking the view of those behind us, my dad said something to the effect of, "we're tall people, it's important for us to stand in the back so that others can see." And so we moved to the back.
That's pretty much my dad in a nutshell. 
On Father's Day, when everyone is posting pictures of themselves with their fathers, or writing heartfelt status updates about what their fathers meant to them, it seems silly to pile in with more, and in the face of #nodads it seems gauche and naive. But this Father's Day, how can I not? Because as everyone in my social media circles knows, this Father's Day I am, for the first time, officially a father in the technical sense of the word. Our son was born on June 5, and he is currently sitting next to me in his vibrating monkey chair, swaddled up for his morning nap while his mother is upstairs sleeping off a night's worth of nonstop feeding. Objectively and without exaggeration I can say that he is the cutest baby ever born. Meet Wilfred:


So yes, how can I not write about fathers and their sons; it's all I think about right now. The semester and my book project seem very far away. Even the little repairs I need to do on the house we just bought, or the dinner I should cook for tonight, seem off on the horizon. All I can do is look at Wilfred to make sure he's still breathing, and wonder why he's making that strange sound, and dear god surely it's not already time for another diaper change.
That shift in attention makes me think about something I learned from my dad, one of those important lessons that oughta be a status update or something. My father spent the first part of his career as an executive for a health care company, with the daily suit-wearing and company car kind of life you'd expect, and tremendous security. Then, when I was a teenager, he left that job for a series of more experimental pursuits—some consulting jobs, some internet ventures, and teaching classes at local colleges on an adjunct basis. It meant much less consistency, and coupled with my mom losing her job due to downsizing in the 1990s recession, it meant much less financial security.
But even as a teenager, and amplified today, I'm so proud of my father for making the choice. It's not so much the particular choice, but the act of making it. I imagine it would be so easy to do well in school, jump through all the right hoops, get a good job, rise up the ranks, and eventually make it to the top. Certainly that's what we academics hope for, to a tee. We were all good at school, and good at impressing the right people and working hard, and we doggedly cling to our assumption that we can be happy if we just jump through the right hoops. But as plenty of smart people have pointed out, we academics too often conflate "happiness" with the simple act of having a stable middle-class job, with the correct title.
And we are just dreadful at making choices to actually be happy. We roam around the world living in horrid little places, we divorce ourselves from our loved ones on opposite coasts, we spend the best years of our lives in abject poverty. We cultivate a snobbish mentality of personal elevation to hope against hope that there is some sort of uplifting intellectual reason for all of it. Certainly so much of this can apply to me. It's a devil's bargain, where either you spend your decades doing this for nothing, or you spend your decades doing it for that stable middle-class job. I've been on the job market since 2007 or so. I've been lucky to have had full-time work at some great institutions during this time, and even more lucky to have a partner with a real job to fill in the gaps, but the luck has never extended to having the correct job title of "tenure track."
Maybe it will still happen, but I'm kind of over the whole thing. This spring, after the latest round of job market disappointments, we decided to just say, well, fuck it. (Sorry--another way I am like my parents is I don't swear often, but sometimes the sentiment fits.) No more living temporarily. She's got a great job, we live in a great city surrounded by family and friends. My own job is great in its way, even if it's not tenure-track. So we bought a house in our favorite neighborhood in our favorite city. Given the state of the academy, maybe I won't be able to continue in my current position forever--that's what you lose when you're off the tenure track, the security--but if that happens, I'll figure something out.
So that's why I think about my father's choice a lot, and how amazing it was. In the midst of success as defined by career progress, to actually say to yourself: maybe I should be doing something different. How often do we take control of our lives, especially by giving up control? Like my dad, I'm able to do it for a number of reasons, most importantly because I have a loving and supportive family. (The amazingness of my mom is a whole other blog post someday.)
So that's one Father's Day lesson that's particularly apt for me this year. I don't know what sort of big lessons Wilfred will get from me someday. My current master plan is that he should be either a poet or a painter, so I'm occasionally reading modernist poetry to him while he sleeps, and pointing out to him that our new house has an attic with good light, should he need a studio. Oh, and if his parents' genetics succeed at all, he will be a really good rower, and so he should really get on that soon so he can get into a good college.
Actually, I'm beating around the bush. You know how we're always talking to ourselves, in our heads? Maybe carrying on a conversation, or narrating our activities, or asking ourselves questions. For me, it's always felt more specifically like writing in my head. I'm always trying out particular turns of phrase, or crafting little stories to explain what I'm doing, to myself. Just a silly habit.
For the last year or so, as I've been writing away in my head, I've crafted one particular sentence that I keep refusing to ever put down on paper. Sometimes, in the midst of trying to write an email or grade a paper or carry on a conversation, it's the only sentence that comes to me, and I have to swallow it back down and remember where I am. But after Father's Day, it's too much not to write: my father is dying of cancer, which is the hardest sentence I've ever had to write.
This is no great surprise for many who know me, as my dad has been very open about his illness, and in fact has his own blog where he explores the vagaries of diagnosis and treatment. He has sarcoma, which is cancer of the soft tissue, and unfortunately it is rather rare, so there's not much in the way of data for crafting a prognosis. So when I write that my father is dying of cancer—there it is again—it's hard to know what that means with precision. Maybe there will be some miraculous occurrence, but more likely time is limited. The word is "terminal," I believe.
I didn't want to write that, not because writing makes it true—it's true enough without writing—but because of the novelistic clichés of it all. My son being born, my father dying, and my career at a crossroads. The irony is that I feel miraculously self-assured as a parent. I don't mean to be blasé about the upcoming challenges, but I have this strange inner sense of self-confidence that I will mostly do the right thing. And although I don't know where my professional career will go in the future, for the first time in years I'm comfortable with that not knowing. I might not have the correct job title, but there's actually nothing in my life that I would want to change right now.
Except, of course, facing the loss, some day, of the person who made these strange self-confidences possible.

Me, in 1980.

Friday, January 4, 2013

New Year's Day in Philadelphia


I feel distinctly, and uncharacteristically, unable to assemble coherent critical thoughts about the Mummers Parade in Philadelphia. Having lived in this city for five and a half years, I have of course been to see a few of the parades, especially to show out-of-town visitors. The main feature of the parade is that it is rather boring. You hang out on Broad Street, fighting off pretzel vendors but not much in the way of crowds, and wait for a long time. Occasionally, a group of costumed performers ambles up the street, sets down some cheap-looking scenery and props, and does a lackluster little song and dance number easily bested by your average high school marching band. Everything seems a bit ragged and homemade. It doesn't make much of an impression on you, but it's one of those things that make Philadelphia kind of cute in an earnest sort of way.

My experience with the Mummers is a pretty typical one. Let me interrogate that experience a little bit, however, because my engagement is ultimately mediated by some crucial factors. This is going to seem silly, but here's the literal geography of my Mummers experience, a map of me walking from my home to where I viewed the parade:

Philadelphians will look at this map and immediately predict the point I'm about to make. For the rest of the world, here's what this map means: I live in Fitler Square, a small affluent neighborhood in the western half of Center City. My walk down Spruce Street, up 20th, through Rittenhouse Square, and then down Locust takes me through the most famously and visibly wealthy part of the city. The streets I walk on are lined with beautiful old brownstone mansions and expensive apartment buildings. I walk by nearly all of the most noted institutions of high culture in Philadelphia, all built in the 1850–1950 glory days of the city: AVA, Curtis, St. Mark's, the Academy, the Walnut Street shopping district, and the Union League. From my final vantage point I look to my left and see the towering City Hall, and to my right, the Kimmel Center.

On the other hand, here is the official parade route:
As you can see, the Mummers' walk is rather precisely perpendicular to my own walk. Beginning in the depths of South Philly, the Mummers perform their way a solid three miles up Broad Street to the judging platform by City Hall, before adjourning to the Convention Center for more performances. If nothing else, this explains the rather listless performances up in my neighborhood; the guys have been walking for several miles, and are saving their energy for the performances that matter.

Needless to say, however, the real fault line here is the cultural intersection of these two literal routes. South Philadelphia is the traditional home of blue collar Philadelphia, my own neighborhood the traditional home of white collar. While today members of the Union League sit outside on the balconies of their club house to enjoy the festivities passing by (and indeed, hold fundraisers for the perennially underfunded parade organization), that peaceful intersection belies a long history of cross-class conflict, often still very present. Last year, for example, Philadelphia magazine, which caters mostly to the suburban Main Line crowd and wealthy Rittenhouse dwellers (like me), called for "getting rid" of the Mummers. (As the Naked City blog amusingly put it, "Philly Mag Conundrum: Hates Mummers, Loves White People.") As both an outsider to Philadelphia and a card-carrying member of the elite, I find that the Mummers Parade is not for me, but aimed at me.

The Philadelphia Mummers Parade is of course just one in a long line of such carnivalesque spectacles, a history I don't need to rehearse here. Susan Davis's excellent history (extracted in American Quarterly if you have JSTOR) gives the context of this particular parade, which originated in rambunctious nineteenth-century Christmas street celebrations, corralled by the beginning of the twentieth into more orderly New Year's Day pageants.

Most carnivalesque spectacles depend heavily upon upending various oppositions, from political power to race and gender. The traditional Bakhtinian idea is that in a carnivalesque performance, the normal order is reversed. The poor become kings for the day, and masks are worn to disguise true identities. As Davis points out, for the Mummers the upending of these oppositions has expressed itself since the nineteenth century in both drag and blackface performances. And this is where yesterday's Mummers Parade got interesting.

I spent New Year's Day at my in-laws in DC, so I was not a spectator this year. But through various social media outlets—and crucially so; as usual the traditional coverage in the newspapers and local TV stations remained silent on the subject of politics—I began to hear about some interesting performances. The first came from Mark Segal, the publisher of the Philadelphia Gay News, and a longtime local activist, writing on his (public) Facebook page:
OK America, the nations largest Drag Parade is now on the street. It's the Mummers parade in Philly, and we are officially in the "Wench divisions." Thousands of drags, mostly non gay, but this year a surprise. I'm honored to be a judge at Conventon Center at 5p, [sic]
So, we have an interesting acknowledgement on the part of the Mummers that there is some sort of relationship between Mummer drag and gay drag. Thinking somewhat hazily, I would see this as part of the mainstreaming of a certain kind of gay activism in Philadelphia over the past few years. Segal, for example, has been involved in creating a senior housing complex oriented towards LGBT people. Anytime you're successfully working with construction and federal funding in this city, you're going to end up involved in the upper echelons of mainstream municipal political power, and Segal is now a regular player in that sort of world, of which obligatory fealty towards the Mummers is an annual ritual. And, I guess, it can go the other way as well: the Mummers paying a small kind of tribute to the political success of (a certain kind of) gay activism by including Segal, and also, I gather, a small contingent of "actual" drag queens in the parade itself.

The other stories that began trickling my way on Twitter and various blogs were more fraught with unease. There seemed to be one performance involving fake American Indians stealing jobs from an Asian Indian call center, with Gangnam Style somehow mixed in. More striking, however, was the performance by the Ferko stringband, titled "Bringin' Back Those Minstrel Days." Apologies for the quality of this guy's YouTube video shot off a TV screen, but you get the idea.




Whoa, right? As you'll see in a lot of commentary around this performance, actual blackface was apparently banned starting in the 1960s, although it has cropped up occasionally since then. But if actual blacking up has been rare, the discourse of blackface minstrelsy has nevertheless provided the foundation for the world of the Mummers, just as much as drag. As many scholars argue, the entire enterprise of reversing the "natural order" in the nineteenth-century parades hinged upon creating whiteness out of blackness, of melding together working-class immigrants from Ireland and elsewhere into a position of power by defining themselves against black Americans. Just as the drag performances of the Mummer Wenches deepen the inscription of masculinity into their performers, the blackface performances deepen their whiteness. Today they usually paint their faces any color besides black to avoid the official ban, but the point is the same. One often hears the argument that blackface minstrelsy still lies at the heart of a great deal of popular culture in this country, which I think is mostly true, but in the case of the Mummers Parade, minstrelsy lies even closer to the surface. The Ferko performance simply makes this heritage explicit.

I can't read the hearts and minds of the Ferko band to know their intentions, although as Samantha Melamed points out, some of the signs carried by the band were literally copied from a web site called "black-face.com." And it is not hard to suss out the racial context for their performance; simply go to the comments section of any newspaper or community bulletin board having to do with Philadelphia, find an article having to do with a black person, or crime, or schools, or sometimes just the existence of the city itself, and you will find your context. More so than in any other city I have ever lived in, there is a raw, grotesque, openly racist resentment on the part of some white citizens (frankly, it seems to be mostly ex-citizens who now live in South Jersey) of the mostly-black political establishment that is thought to run Philadelphia. Ironically, that attitude puts the Mummers in the reverse of their nineteenth-century position. Blackface is no longer a parody of those below them on the ladder of political power, but a parody of those imagined to be above. Again, I can't read the minds of the Ferkos, nor do I know any of them personally, and for all I know their performance was merely ignorant of history. The context is unavoidable, however.

Needless to say, a lot of people got pissed off, and rightfully so. Samantha Melamed at the City Paper covers the controversy, including a link to some of the original 1960s blackface controversy and a predictable quote from the Ferko people. ("heritage blah blah blah never occurred to us to be offended blah blah blah.") A particularly interesting reaction is that of Joey Sweeney at Philebrity, who makes the provocative move of calling upon all of us nouveau Philadelphians (such as myself), the ones who are gentrifying this city, to reject this peculiar heritage. On the other hand, some have seen the inclusion of identifiably-gay drag as evidence of liberal progress; historian Jonathan Zimmerman's piece for the Inquirer sets up a narrative of such progress, in which this year's inclusion of gay drag is matched with the official ban on blackface as "another welcome blow against bigotry." Zimmerman is uncritical (and his history a little vague), but for another queer defense of the Mummers, see the poet CA Conrad's writing in response to the Philadelphia magazine attack mentioned above.

What to make of all this? This is where the fast-moving landscape of race, class, and sexuality, pressed down by the weight of history, renders me somewhat speechless. The ease with which some members of the gay community gloss over the blackface issue in favor of some limited notion of "progress" troubles me, but so does my own position as someone who is very much the target of the Mummers tradition, and an outsider to that history. Ideally there would, I think, be some sort of "oppositional spectatorship," as Tavia Nyong'o puts it in his writing on racial kitsch, but as he points out, even at its best that kind of oppositionality might not be able to "overcome its ability to reproduce scapegoating" (388). And in my limited experience in Philadelphia, there has not been such oppositional work; I'm not aware of attempts (please let me know if there have been some!) by African American performers to undermine Mummers blackface through further parody and commentary, and that's quite understandable. It's one thing to curate collections of racial kitsch, but another to be faced with actual living racist kitsch.

We find ourselves mired in a political stasis where on New Year's Day in Philadelphia, black and white, poor and wealthy, remain poised to attack one another. It would be one thing were this limited to New Year's Day, but we also have to read these divisions against the myriad other fault lines of the city: the neighborhood battles over gentrification in Point Breeze, the assaults on Asian kids at South Philly High, the labor disputes of the Goldtex project, the continuing defunding (by the state) and privatization (by the city) of the public school system, just to name a few. And the epic minstrel show that is the United States of America continues on.




Wednesday, October 24, 2012

Editing Sound

Mitch Miller and Johnnie Ray listening to the taped playback of a recording session.

As the previous post might indicate, my current book-writing-related obsession is tape recording. This is because the advent of tape recording in the late 1940s is one of the many factors that transformed the music industry after World War II. It was, in fact, directly related to the war: Allied troops entering Germany in 1945 discovered that German scientists had advanced magnetic recording technologies far beyond anything in the United States. A few of these German machines were semi-legally smuggled into the US, where they became the basis for the Ampex machine popularized by Bing Crosby, which by 1954 had completely replaced phonographic lathe machines in recording studios. (Although phonographs remained the dominant distribution medium for several decades.)
Surprisingly, the impact of tape recording has not gotten a ton of scholarly attention. There are a few studies here and there, but nothing like the work that has been done on either of the sound packaging systems that bookend magnetic tape, the phonograph and digital audio. But magnetic tape is just as revolutionary. What particularly interests me is the concept that sound could now not only be captured, but edited. Editing sound previously meant going to difficult and cumbersome lengths with record cutting, or in the case of steel wire recording, literal welding. Now, anybody with a pair of scissors and some tape (ideally Scotch Tape No. 41, which did not ooze up around the sides) could do it. Sound was never the same.
Which, unsurprisingly, caused a certain amount of anxiety. A 1954 consumer manual:
Tape recording allows for considerable latitude in corrective re-recording and sound editing. This fact has been a godsend to many recording artists who cannot or will not present a competent beginning-to-end performance of a work of music for recording....This results in a record which is possibly better than any single public performance ever given by the artist. And it is really a synthetic performance. It may sound fine, but is it music? Is it art? Is it high fidelity?  
My chapter therefore looks at a series of moments in the early 1950s that outline the philosophical issues raised by editing sound: Les Paul's multi-tracking experiments, the controversial 1952 Furtwängler recording of Tristan and Isolde in which a young Elisabeth Schwarzkopf sang the notes an aging Kirsten Flagstad was unable to hit, and the modernist tape compositions of John Cage, Otto Luening, and Vladimir Ussachevsky.

If you're in the Philadelphia area, I'm going to be previewing some of these ideas in a talk at St. Joseph's University on November 8 at 7:00 pm. It's provisionally titled "A Short History of Magnetic Tape in 1952."

Citation: Donald Carl Hoefler, Hi-Fi Manual (Fawcett Books, 1954). Italics in the original. Photo at top from the same volume.


Thursday, October 18, 2012

Imagining the Mix Tape, c. 1948


German-American engineer S. Joseph Begun, writing in 1948 on magnetic recording:

In analyzing the particular advantages of the magnetic phonograph, one will first think about its usefulness as a long-playing instrument. While some of the magnetic recordings are now limited to 1/2 or 1-hour recording time, this is not a basic limitation. The playing time, within reason, could be made as long as one desires, for example, as long as the longest symphony ever written, or, if one wants to go to the extreme, even as long as the longest opera ever written... 
Classical music, however, constitutes only one type of musical program to which people want to listen in their homes. There are still many more dance records and light musical selections sold than symphonies and operas. True enough, one could have small reels, each of which would accommodate one tune, but in this case the user would have to change the reels after each playing....No doubt, in time to come, some ingenious engineer will think of a construction that will permit loading a magnetic recorder with a number of reels in such a fashion that after one reel is exhausted the mechanism will automatically seize the recording medium of the next reel, thread it through the system, and start to play it. 
Particularly when tape is used as the recording medium, editing is simple, and it would not require a major effort on the part of the user to join a number of selections together on a large reel, thus achieving a series of preselected short tunes. This method may be the answer, but it is not believed that the average person using such an instrument will subject himself or herself to such trouble, however small it may be. It is because people are by nature lazy that so many gadgets can be sold today, and only an enthusiastic amateur who is willing to add a personal touch to his equipment will go further and record and edit his reels.
S.J. Begun, Magnetic Recording (New York and Toronto: Murray Hill Books, 1949), 222–23. Picture is of a Tonschreiber Type B, a field version of the Magnetophone used by the Nazis during World War II.

Monday, October 1, 2012

A History of Political Dogs, 1952-1953

Ever curious about the connection between Nixon's famous "Checkers" speech (Fall 1952) and Patti Page's hit "(How Much is That) Doggie in the Window" (Spring 1953)? Short answer: there isn't one. Longer answer: Read my post over at the IASPM-US blog as part of their curated series "Political Machinations of Popular Music" in honor of the presidential election.

Saturday, September 22, 2012

Einstein on the Beach and Preemptive Nostalgia

I was intrigued by NY Times critic Zachary Woolfe's account of experiencing Einstein on the Beach live for the first time after many years of knowing only the music. Fresh off of admitting that he doesn't like Billy Budd all that much, Woolfe proceeds to deftly puncture the NYC music scene's nostalgia for the Downtown heyday of the mid-seventies, when Patti Smiths roamed wild in the streets and Philip Glass made extremely simple diatonicism revolutionary:
I was surprised to discover that for all those years I had been missing so little. The really important component of the opera turns out to be the music, the part that has always been with us, preserved on those recordings. The other, more evanescent elements — the eternally open horizons of Mr. Wilson’s blue-lighted backdrops; his twitchy movement vocabulary; Ms. Childs’s swirling dances — feel, in the final estimation, dazzling yet dispensable...
Actually, though, I suspect the part of Woolfe's piece that will cause the most heartburn for avant-gardists of a certain age is not his boredom with Wilson's staging, but rather his description at the beginning of how he has been listening to Einstein all these years:
I can’t count the number of times I have sat at my computer, opened YouTube and typed a strange phrase into the search box: “knee 1.”
Ouch! Yes, the nightmare scenario has come to pass: even the Times music critic, the last one standing apparently, consumes music much the way my undergraduates do. Not by carefully cultivating a collection of music, let alone CDs, let alone LPs, but through the vagaries of YouTube. In this age of mechanical reproduction, the experience of Einstein goes something like this:

The first hit is actually a video about knee injuries. The second hit indeed gives us Knee Play 1, uploaded by some guy named "MountGoth." In the video, the 1993 recording is presented (without attribution or any credit to the performers or label) to a static picture of a lake in the mountains.


The next hit gives one a version accompanied by some other random guy's artistic interpretation of vaguely Einsteinian themes. After that, a picture of Philip Glass, and then we're back to knee injuries and something about the Battle of Wounded Knee.

As I say, I am aware that this is precisely how my students—born, these days, in the mid-1990s—consume the music I require them to listen to. Theoretically they are supposed to purchase these CD sets that come with our textbook, but it runs upwards of $150 to buy all three sets, and I completely understand that this seems ridiculous, especially when all they are going to do is rip the CDs into iTunes anyways. The conscientious-but-thrifty students, I suspect, band together and purchase one set of CDs to be ripped to a dozen computers. Those who are just thrifty, I think, mostly go onto YouTube to do their required listening, a suspicion confirmed when they are deeply confused by the orchestration and tempo of the "official" recordings when I play them in class. Personally I am somewhat old-fashioned, and in the classroom I often try to instill in my students at least a small amount of affection for the material objects of sound recording. We pass around Edison cylinders, 78s, 45s, fragments of actual magnetic tape, and nowadays, compact discs. It's not a bad thing that music circulates differently today than it once did, I tell them, but it is indeed very different.

However—and this is a big however after all of the above—I totally appreciate Mr. Woolfe's perspective. Or rather, part of it; I haven't yet seen Einstein on the Beach. I was born four years after the original production, and was still only twelve years old for the last one in 1992. I like to think I was a fairly precocious new music kid, but not that much. The Verges/Obennhaus documentary from '84 gives me a sense of the production, enough that I confidently (albeit polemically) tell my students every year that Einstein on the Beach might be the greatest opera of the twentieth century.

But I did, by the end of my teenage years, discover Philip Glass, first in a new music concert when I was 16 (the same concert in which I heard 4'33" for the first time, incidentally), and then more systematically in college, where as a freshman I did a class assignment that involved knocking around in the world of 1+1. Then in graduate school I was lucky enough to be immersed in an entire seminar on minimalist theater led by the indomitable Robert Fink, and although Glass is not a subject I do active research on, it's been a regular part of my classroom ever since. And not that long ago, I saw the Sesame Street video that circulates occasionally on Facebook, in which Philip Glass's music provides the backdrop for a mesmerizing series of geometric abstractions. It first aired on Sesame Street in 1979, the year before I was born. In some ways, Philip Glass has been in my ears for my entire life. And Einstein on the Beach had better not disappoint me.


This, of course, is what Woolfe is speaking to. Those of us too young to have experienced the glory days and battles of New Music in the pre-John Adams era can't help but want to experience those thrills for ourselves. Not to sound like Woody Allen in Midnight in Paris, but there is a feeling that surely things were more alive back then. Woolfe:
With “Einstein,” the fantasy is the return to that bohemian, avant-garde New York, so full of excitement and possibility. Both those who were there and those of us who were not want badly, for our different reasons, to conjure an event, a moment, even an entire city that now exists only as a memory. It is Mr. Glass’s music that is more than that, and it remains as close as your computer.
This is an odd sort of nostalgia, but it is nevertheless nostalgia. Nostalgia usually implies personal association with the sentimentalized remembered past, but it needn't necessarily. And as a form of nostalgia, you have to take the next step and think about what ideologies produce that nostalgia. As Melissa Harris-Perry famously said, "history understands where nostalgia obscures." As I wrote before, I think we need to be more open to nostalgia; it has the potential to reveal good in addition to obscuring past evil. The nostalgia for Einstein's radical purity certainly obscures some basic facts of music history. In the multi-racial, liberatory, free-wheeling world of the 60s and 70s avant-garde, it's not an accident that an opera with music written by a straight, white, Juilliard-trained man remains widely heard four decades later, as does the music of...Steve Reich...John Adams...hmmm. Ask the ghost of Julius Eastman where avant-garde nostalgia has gotten him in the history textbooks.

And yet, Hilton Als reminds us in his fascinating New Yorker piece on Einstein that Robert Wilson's contribution, precisely what Woolfe did not care for, is by far the most critical and challenging part of the opera. Wilson's work confronts those very social issues that Glass tends to portray in abstraction. That abstraction is the source of Glass's longevity; it's why Glass's music works equally well circulating in mainstream movie soundtracks as it does when you're knocking out additive patterns on an amplified tabletop. But my own personal hope is that Wilson's vision might reveal something about the avant-garde past worth remembering, even in the gauzy, soft-focus light of nostalgia.

I've got tickets for the very last show, Sunday's matinee. Here's to hoping.

Monday, September 10, 2012

Nixonian Listening

Factoid of the day: Richard Nixon liked to work while listening to Richard Rodgers' (and Robert Russell Bennett's) score for the early 50s television series Victory at Sea.

As I write in my chapter on performances that remembered the Pacific Front, Senator McCarthy was also fond of dressing up and playing soldier, a role he called "Tailgunner Joe." He even received a belated Distinguished Flying Cross in 1952, largely on the basis of falsified records. His famous war injury was actually, biographers believe, the result of a hazing ritual.



Thursday, August 30, 2012

Rogin's McCarthyism

There's a lot to read about McCarthyism. When I was finishing my dissertation proposal circa 2006, I felt reasonably sure I'd read most everything out there, from old-fashioned catalogues of misdeeds like David Caute's The Great Fear, to many studies of specific industries like Hollywood or academia or even the discipline of anthropology, to rigorous analyses of institutions like those of Ellen Schrecker, whose work is still the gold standard in my opinion. And there's the pro-McCarthy stuff, like M. Stanton Evans's revisionist biography of the man, or Ann Coulter's Treason. Of course, more stuff is always coming out at a rate that's hard to keep up with. In the "currently-on-my-bedside-table-waiting-to-be-finished" list there's Haynes Johnson's history that attempts to connect McCarthyism with the War on Terror, and David Everitt's very promising looking account of blacklisting in the broadcasting industry. I'm also trying to track down a copy of Richard Powers's history of anticommunism, which I gather from reviews is sympathetic to the McCarthy cause.

What most of these works lack, however, is intelligent analysis of the big picture. The trend has definitely been to look at the small details, examining how blacklisting worked in particular contexts. That's fine as far as it goes, but it seems that many scholars have given up trying to explain why McCarthyism as a phenomenon happened the way it did, and when it did.

Perhaps part of the reason is that the early phase of McCarthy historiography dwelt a bit too heavily on the big picture, at the expense of reality. As early as 1954, a group of left-leaning (at the time; many would drift into neoconservatism) scholars convened a faculty seminar at Columbia to try to explain McCarthyism. Their responses were published in a book edited by Daniel Bell, The New American Right. Largely, their focus was on psychology, asking what it was about the psyche of the American people that led to McCarthyism. In essence, they decided it was a combination of authoritarian personalities (borrowing from Adorno) and status anxiety on the part of newly-emboldened middle-class ethnics, revolting against the old WASP elite. In their telling, the old agrarian radical populist tradition was fundamentally transformed into a right-wing movement. The New American Right makes for juicy reading, and its authors went on to great prominence in political and academic worlds, but it's hard to take their analyses seriously today, especially after a critique launched at the book by the then-young scholar, the now-late Michael Rogin.

Rogin's doctoral dissertation, later published as Intellectuals and McCarthy: The Radical Spectre, was a full-throated takedown of the New York intellectuals' work, crucially coming from a leftist perspective. Rather than admit McCarthyism's success as a social movement, he showed (in painstaking empirical detail) that there was no continuity between previous radical populist traditions and McCarthy, nor was there genuine mass-based support. Rather, McCarthyism was pushed by the same right-wing elites who have always existed, aided both by a timid liberal press and by a Republican Party savvy enough to use McCarthyism to win the 1952 election, and then to dump McCarthy as soon as he encroached upon conservative elite institutions such as the Army and the Executive branch. In other words, McCarthyism was not a genuine social movement, but rather a manipulation on the part of conservative elites to create the impression of one. The New American Right authors took McCarthyism as an excuse to reject overheated populism, and started their slide towards Humphrey-Nixonism; Rogin showed that it wasn't democracy that was the problem, but its perversion by the center-right and center-left.

Rogin on McCarthyism is not as fun and sexy as his later work, but it's still quite relevant today, especially since the right wing in this country has very successfully claimed the mantle of agrarian populism: in the conservative evangelical movements of the 70s and 80s, in the red state-blue state binary of the 2000s, and in the Tea Party today. One of the depressing things about reading early Rogin, however, is how much has changed since he wrote the book in the early 1960s. As much as we talk about astro-turfing in the Tea Party today, I don't know of many who would say that right-wing populism isn't a real thing, a genuine social movement. A wrong-headed and dangerous one, but one that has a considerable amount of staying power and traction amongst a large chunk of this country's population. I think Rogin was very right that McCarthyism did not fall into that category, but, depressingly, I do think that by the 1970s the conservative elite had their way, and a right-wing mass-based social movement was born, and, so far, here to stay.

Tuesday, August 14, 2012

Reboot; and, The Investigator

[yeah yeah yeah, apologies for long blog absence, grovel grovel grovel]

Other projects have kept me away from my book this past year, but as the summer comes to an end I'm finally turning back to it. Large chunks are written, and with a slightly less trying teaching schedule this year, I think if I can focus (always difficult for me!) and buckle down, having a complete draft by the end of next summer is not unreasonable. So as I try to think of ways in which this blog space might be useful for me, it seems like rather than providing an outlet for procrastination, it could actually help me narrow and focus my writerly energies a bit. So, blog readers (if any be left), keep me honest: I'm going to post regular updates on my writing progress, and keep the content focused on the subject of my book: music and the cultural politics of McCarthyism. If I write anything here about music after 1954, slap me in the face, electronically or literally depending on your geographic distance. Okay? Alright.

So let me start off with my current task: re-writing the introduction. In addition to doing the usual literary maneuvers of an introduction--a snappy beginning, a discussion of theoretical apparatus, an overview of the chapters, etc.--one of the things my introduction needs to do is talk about "actual" music and McCarthyism. The scare quotes are there because not a whole lot of my work is actually about McCarthyism, per se, in the sense of discussing blacklisting of composers, or musical representations of McCarthyism, or heck, Joseph McCarthy's own personal musical taste. (I've never found any biographical discussion of the last point, but boy would I like to know more.) I'm not going to go into a lengthy discussion here of why that is (short version: it's not very interesting), but I do feel a bit obligated to cover some of that stuff in the intro. I talk a little bit about blacklisting in popular music, but also about this amazing artifact:


There is very, very little music that is literally "about" McCarthyism, in the way that the film and theater worlds very quickly responded to the situation. But here is one! To a point--it is actually a radio play, although it does have an original score. It was written by Reuben Ship, a leftie writer from Canada who was deported from the US in the early 50s for working to unionize the television industry. His response was The Investigator, a very thinly-veiled portrait of McCarthy, broadcast by the Canadian Broadcasting Corporation in 1954. Although it was never broadcast in the US at the time, an LP was released, and I bought a copy on eBay. For more, including the complete recording, see Gerald Gross's short article, or this very thorough Wikipedia entry. Or, read my book someday!