QOTD (2011-01-25); or, Nostalgia and the Homoerotic Literary Tradition

It’s disconcerting how much I identify with a teenage Terry Castle, as represented in her memoir, The Professor:

In high school I had been almost freakishly solitary and skittish, with no idea how to comport myself in ordinary-teenager fashion…. Bizarre as it sounds, by the time I left for college I had never once called anyone on the telephone or invited a classmate over after school. Nor had I myself been so called or invited…. On the contrary: I’d been reclusive, a regular Secret-Garden-Frances-Hodgson-Burnett-Girl-Hysteric-in-Training. At seventeen, I remained passionately (if uneasily) mother devoted; frighteningly watchful, in school and out; abnormally well read in Dumas novels, G.K. Chesterton’s Father Brown stories, H.P. Lovecraft, and the lives of the poets…. I began devouring certain louche modern authors in secret: Gide, Wilde, Thomas Mann, Hermann Hesse, even Yukio Mishima, then at the height of his celebrity in the West. Sexual deviance, or at least what I conceived it to be, began to exert a certain unhallowed, even gothic allure—a glamorous, decayed, half-Satanic romance…. Not least among the attractions that such literary homosexuality proffered: some drastic psychic deliverance from familial dreariness and the general SoCal strip-mall stupor…. As for “homosexual practices”—and I confess I wasn’t exactly sure, mechanically speaking, what they were—they sounded sterile and demonic but also madly titillating…. Anything could happen, it seemed, in the fascinating world of sexual inverts. Lesbianism didn’t figure much, if at all, in these early reveries: one of the oddest parts of the fantasy, I guess, was that I was male, dandified, and in some sort of filial relationship to various 1890s Decadents. I knew more about green carnations, the Brompton Oratory, The Ballad of Reading Gaol, and the curious charms of Italian gondoliers than I did about Willa Cather or Gertrude Stein—not to mention Garbo or Stanwyck or Dusty Springfield.

I had already been feeling nostalgic about high school since earlier this afternoon, when something one of my high-school friends posted on Facebook reminded me of those lonely quiet days of negotiating friendship for the very first time, bumbling step-by-step out of the world of Musketeers into the bright cloudless San Diego shopping-mall sunlight, learning for the first time at the age of 16 or 17 how to socialize. At the very same time as I was learning how to socialize, I was learning how to study sexuality—putting off the terrifying process of having to confront myself and who I was and what I wanted with discoveries no less thrilling and rewarding. I can remember where I was, how I felt, when I read “Reading Gaol” and the Calamus poems and Howl. I remember lugging volumes of Krafft-Ebing and Kinsey, borrowed from the UCSD library, around school because I thought it made me cool. I remember sitting up in a dark bedroom at 3 am on a hot summer night in the dead-quiet suburbs, talking to my high-school friend on AIM while we watched Shortbus together. I remember being trapped by the walls, by the air, by the cars we all were stuck in all the time when we drove down Interstate 15 into or out of the city. I remember reading Viola in Twelfth Night for one English class. Teaching Whitman and Ginsberg in another. Adapting and staging and deliberately cross-casting The Importance of Being Earnest in another.

Like Symonds, like Terry Castle, like me, so many of us believe in another world, a place where glamor and panache and camp and beauty take the place of gender conformity and Hollister clothing and pink stucco houses. So many of us read our Plato, read our Wilde, lived our pretentious little teenage lives in a performative effort to clap! clap if we believed in fairies! So many of us grew up and didn’t quite ever find that the real world lived up to the hazy, lilac-scented Arcadian visions of the earliest chapters of Brideshead Revisited and The Picture of Dorian Gray. (As a teenager, I was never quite able to read all the way to the ends of those books, and watch the paradise slip away into madness.) But I’d like to think we all cultivate our own gardens, all find our paradises within, happier far. I’d like to think that Symonds’ creation of a homosexual culture and his affair with his gondolier at long last helped to put some of his demons to rest; I’d like to think that Castle’s wonderfully dry self-deprecating humor bespeaks contentment with the material niceties of her life as a successful academic, far from the barren stucco San Diegan wilderness (mentally, if not physically. But I swear, Palo Alto’s nicer than University City anytime). And I dream that someday someone will pay me to introduce the homoerotic literary tradition to the young people who desperately need a little camp and glamor in their lives. But the thing is, even if plans B through Z fail and no one ever does, I’ll still be the 14-year-old kid who strutted over the hills of a southern-California suburb one September in floral-patterned knee-breeches, black lace-up boots, a frilly shirt and doublet and a broad-brimmed hat with an enormous purple peacock feather pinned to the brim. “Notes on Camp”? Baby, who needs ’em? I’ve got the text memorized!

I’ll end this reverie on the perfect note: Madonna’s “Vogue” has serendipitously come up on shuffle.

A Word From Your Friendly Neighborhood Peer Academic Advisor

As I spend more time reading the professorial blogosphere, I find myself more frequently tempted to comment on academic questions I, as a college junior, am far underqualified to have an opinion about. I may be awfully opinionated, but you should probably listen to the professionals if you actually want to learn anything useful about life in the academy. That said, though, I was just barely, bureaucratically, ineligible to become a peer academic adviser for freshmen in my college this year, and I have plenty of thoughts about what I did right and wrong in my first two years of university that may be worth sharing with my now-nonexistent advisees.

In the spirit of Historiann’s recent post about undergrad satisfaction and regrets, Tenured Radical’s advice to faculty academic advisors (no, I still don’t know whether “advisors” or “advisers” is correct, so I’m using both), and the multiple letters I’ve already gotten this summer from frosh and sophomores who want some advice on choosing classes; and in order to offer a more constructive tone than that of my whiny distribution requirements post of a couple weeks ago, I offer here some thoughts directed at first- and second-year university students trying to navigate a new academic world. These thoughts are probably better suited to academically serious students for whom college is more about learning and intellectual development than it is about anything else (not to say that’s what college has to be; some students feel that way and some don’t), but I don’t see why they shouldn’t apply to anyone concerned about making the right choices and learning to decipher academia.

Listen to the experts. As Tenured Radical indicated in her post, the online rumor-mill is of limited use in determining which classes to take, especially if you’re looking for good classes and not just easy or fun ones. But many’s the time I’ve ignored the advice of a professor or grad student who knew me, knew the person teaching the class, and knew that I wouldn’t find the professor or the material a good fit for me—and regretted it. Work on politely phrasing questions such that you can ask a professor not what she thinks of her colleague’s teaching, but whether her colleague’s class would be a good fit for you. And if she says it wouldn’t, pay close attention to that recommendation.

Keep an eye out for professors’ names. Often I’ll ask frosh who’s teaching a particular class, and they’ll say they’ve forgotten the professor’s name. But a class taught by a fantastic professor, even if its topic is outside your immediate area of interest, is a better use of your time than a class in your area of interest taught by an unremarkable professor, and so it’s advisable to remember those names. This is where listening to the experts comes in, as some sources on who the best professors are will be more reliable than others. If you’re torn between the professor and the subject matter, take the professor every time. And be aware that a lot of different people teach, e.g., Victorian literature, or the American history survey, or SOC 101, and you might want to wait to take the class until the best professor is teaching it.

Be honest about what you can handle. Starker than the social divide between undergrads and “sketchy” grad students is the divide between the humanities and the sciences. If you’re a humanities major like me, you probably grew up thinking that as a “humanities person,” you couldn’t possibly be any good at math or science. You may have picked your first semester’s courses thinking that since your SAT math score was on the low side, you couldn’t possibly handle a college-level quantitative class, and so you decided to sign up for the easiest quantitative class in the whole university, a computer science class whose syllabus explained quite clearly that it was going to repeat a lot of material you’d already learned in high-school computer science. (This may or may not have happened to me.) That syllabus, dear frosh, is a good indication that you’re not going to learn anything from the class, and that you should consider taking one which will teach some new concepts.

This is not to say, however, that you need to choose the most challenging thing in all areas outside the ones in which you’re confident. To fulfill my lab science requirement, I took physical anthropology and environmental science: not taxing in the same way university-level physics or chemistry is, but nevertheless useful, interesting, and well-taught, and therefore not a waste of time. If, when you’re honest with yourself, it seems that it would take more work to pass intro physics than it would to get an A in your required departmental seminar, it’s probably best to leave yourself the time to get the A in your required departmental seminar.

Plan ahead. It’s probably just a tad neurotic to make a plan for what you’re going to take every semester for the next four years (which is not to say that I haven’t done it…), but you’ll find that it will benefit you to think farther ahead than the next semester. By the time you’re halfway through college, the number of course slots you have left will start to look increasingly finite (especially if, like me, you’re planning on a semester abroad), and you’ll find yourself having to make difficult choices between queer theory and colonial American history, or suddenly realizing that the course you’ve wanted to take since you sent in your matriculation forms is only offered in one of your four years. It might be worth looking through the course catalog, making a list of all the classes you feel as if you can’t possibly graduate without taking, and keeping an eye out for those titles every semester.

Start a new language. Obviously if you’re an engineer or premed or have three majors this is trickier, but college is really the best time in your life to start a language you missed when you were young, and I regret having only continued the ones I began in junior high and high school. You may want to think about which new language will help you most in your future areas of academic or professional interest, but studying a language for which you can’t see any possible “use” is still worth it, and is “useful” for its own sake; I really regret chickening out of starting ancient Greek. Which brings me to my next point:

College doesn’t have to be vocational school. College students seem increasingly to be thinking of their bachelor’s degrees as discipline-specific professional credentials which will prepare them for specific career paths, or which just sound vocational (first-years of the world, academic economics is not the same thing as business or accounting!). There’s nothing inherently wrong in this, but you should know that there’s no reason to feel pressured to study something “useful” or something which has the same name as a profession. Not only can you certainly have any kind of successful professional life with an undergrad degree in any field, but studying what you love is important in and of itself. You should figure out which classes you enjoy the most and find most intellectually stimulating, and then continue to take those classes. You’ve got enough time to develop a career—right now, it’s time to learn how to think, and how to love to think.

This goes doubly for grad-school-bound kids. Just because you’re majoring in a not-usually-vocational subject doesn’t mean you can’t make it vocational by locking yourself into a path focused solely on grad school admissions and on making preparations to succeed in the professional world of academia. Your professors can advise you on what you need to do now to be prepared for grad school (and indeed whether you should apply at all), but it doesn’t hurt to distinguish undergrad from the rest of your life. Your undergraduate thesis is not a dissertation, your A- in a departmental seminar will not sabotage your chances of getting into a top program, and trying out courses across the curriculum won’t prevent you from being good at your intended field of study. You’ve got 5-10 years in grad school to become a specialist and to lose sleep over the job market; undergrad is not the right time.

Try out possible majors early. If your system is like the one at my school and you have to declare a major the spring of your sophomore year, you’ll probably want to take introductory/survey lectures in a variety of different departments your first few semesters. In terms of figuring out what you want to learn about for the next few years and possibly longer, doing this kind of exploration is more important than knocking out core-curriculum requirements just for the sake of knocking out requirements. While I regret some choices I made in my requirement-juggling, pushing the philosophy and science requirements till junior and senior years in order to try out sociology and English was not one of them. By taking sociology early on, I avoided making a terrible mistake when I discovered that I actually don’t like data; by taking English early on, I found a second home which has enriched my study of history in countless ways. And, indeed, don’t just stick to intro classes: by making time in my freshman-year schedule for an upper-division history course, I came in through a back door which got me much more enthusiastic about the discipline than subsequent more intro-level courses have.

However, there’s no need to take this selection process too seriously: your undergrad major does not determine the rest of your life. As per the comments about vocational education above, your undergrad major will probably have very little impact on what you do as an adult, even if you’re grad-school-bound. I know so many academics who have changed fields, it’s not funny—so study what you want to study right now, and let the rest follow.

Be skeptical about all-freshman programs. Your university is probably selling you a line about the “first-year experience,” and about how rewarding taking a freshman seminar would be, but I’ll be frank: a class entirely populated by first-years isn’t going to challenge you very much. This is not to say that just because you’re an academically serious student you’ll be better at college than everyone else in the class, but taking a lot of all-freshman classes, while less scary than being in classes with mostly older students, can limit your opportunities to seek out mentors among the older undergrads and grad students who, in my experience, will make the difference in your undergraduate education.

The bright side of special small classes for first-years, particularly if you’re in a field or at a university which doesn’t otherwise offer a lot of small seminars, is that they can get you in contact with faculty early on, which is much harder to achieve in intro lectures with hundreds of students. I became a research assistant for the professor of the freshman seminar I took my first spring. Helping him do archival research and organize his primary sources that summer not only convinced me I wanted to be a historian and, practically speaking, taught me a lot more about research skills than I’d gotten in my classes so far; it also gave me a lasting mentor on the faculty. Such opportunities are not to be sneezed at, and can be worth 12-15 weeks of not learning a whole lot from your classmates.

Only compare yourself to yourself. In my first year I wasted hours sobbing to myself about whether my comments in class discussion were as clever as my prep-school-educated classmates’, or whether I deserved to be at Princeton even though I couldn’t reference as many post-structuralists in casual conversation as some of my more pretentious classmates could. But I’ve learned not to worry: when professors evaluate your work, they’re not doing so on the basis of how frequently you can name-drop Lacan. As a first- or second-year, you cannot expect yourself to be as well-versed in disciplinary methodology or jargon as older students who have been in your department for a couple years and have done a lot more work in the discipline. Just make sure that you’re consistently putting in the most effort and turning in the best work that you can sanely manage, ignore the students who are obviously just bullshitting, and allow the ones who really know what they’re talking about to teach you how to talk the talk of a budding historian, or whatever it is you should happen to be.

Have fun, but carefully. For the academically serious student, a creative non-fiction writing workshop is a good “fun” class, and a worthwhile addition to your schedule. A 450-person children’s literature lecture largely populated by jocky fraternity and sorority members who spend the entire lecture talking about their upcoming rager may be more frustrating than “fun.” (I’ve done both.) It’s not wise to take only the most challenging classes, especially if you’re taking more than the required number of courses/credits; you’ll burn out. An arts class in which you turn in a painting or a performance can be a much-needed change from a barrage of 8-10-page analytic essays. But “easy” and “fun” are very different things. You’ll regret “easy” halfway through the semester when you’re in discussion section, no one’s done enough of the reading to have a conversation, everyone’s checking Facebook on their phones, and the poor instructor has long since given up holding the entire class’s attention. You’ll find yourself wanting to check Facebook, too, and let me tell you: it’s all downhill from there. If you’re uncertain about whether a class will be “easy” or “fun,” ask for advice.

And the moral of the story is…

Talk to adults. When you start college, you’re still a kid. You think the way you were taught to think in high school; you’re unused to making decisions (whether academic or otherwise) for yourself; unless you’re an academic brat, you’re probably unfamiliar with the arcanities of academic culture. Obviously, this is not your fault; it’s just the way things are, and at times academia can be a bit too impenetrable for its own good. But your next four years will be a lot more pleasant if you can crack the system, and it’s faculty and staff members, graduate students, and older undergrads who can help you make this transition both to adulthood and to an academic community. If you’re an academically serious student, regardless of whether you want to spend your life in academia, I can guarantee you that your life will be changed and your worldview will be opened if you allow your path to cross with those of older friend-mentors. Visit office hours. Accept dinner, lunch, and coffee invitations. (In my first year, I declined a coffee invitation from a grad student. I was shy and hadn’t yet figured out the social rules of meeting people for coffee, and that he was being friendly, not creepy. He could have been my friend, and I regret it to this day.) If you go to the sort of school where grown-ups eat in your dining hall and grad students and faculty members live in your residential system, sit down at their tables or knock on their doors. (If you don’t go to this kind of university, it’s certainly more difficult to meet grown-ups, but I’m given to understand it’s not impossible.) Ask them about your courses, but also talk to them about the books you’re reading, the things you’ve seen in the news, the brave new world you’re just beginning to puzzle through. Ask them about their work: you might discover a new area of interest. 

It’s not every four years that you’ll get the chance to live in a community populated by people in all different stages of life and intellectual development, and this is the most valuable thing you can get out of college. It certainly has made all the difference to my undergraduate education.

In fact, I think Tenured Radical’s academic-advising post made this point most effectively:

Needless to say, I made some spectacular errors in that first two years and had some great successes, all of which had to do with the opportunities and pitfalls of a large university. Would things have been different with a more attentive advisor? I doubt it. It wasn’t until, entirely by accident, I fell in with a group of graduate students and became invested in being regarded as — not a good student, but scholarly — that things straightened out for me.

This is actually the story of my life, so I feel qualified to endorse the strategy of seeking out mentors and not worrying too much about whether you’ve correctly distinguished one core requirement from another. Focus on having the time of your intellectual life and allowing your world to be opened and changed, and the rest will follow.

And dear readers, if you have any of your own advice for the Class of 2014, do leave it in the comments!

QOTD (2010-08-17), Continuity and Change Edition

In A Problem in Modern Ethics, discussing in detail the German sexologist Ulrichs’ arguments for homosexual tolerance, Symonds reminds us just how little has changed in 120-odd years:

As the result of these considerations, Ulrichs concludes that there is no real ground for the persecution of Urnings [his word for men-loving men] except as may be found in the repugnance by the vast numerical majority for an insignificant minority. The majority encourages matrimony, condones seduction, sanctions prostitution, legalises divorce in the interests of its own sexual proclivities. It makes temporary or permanent unions illegal for the minority whose inversion of instinct it abhors. And this persecution, in the popular mind at any rate, is justified, like many other inequitable acts of prejudice or ignorance, by theological assumptions and the so-called mandates of revelation.

This fin-de-siècle argument is, a few stylistic markers aside, barely distinguishable from federal judge Vaughn Walker’s decision in Perry v. Schwarzenegger—whose unveiling the other week reignited a conversation about equality, tolerance, and the nature and role of religion and morals in a society which includes men who love men and women who love women. Since there has been a thing called “homosexuality” (or “sexual inversion,” or “Greek love,” or “Urningliebe,” or other terms more recognizable to a Symonds or an Ulrichs), those who make a life out of thinking and writing about it have tried to puzzle through what its relationship is to the rest of our society. And it is striking that, just as surely as a discussion of men’s love for men will before long come back to Plato (even if by a circuitous, modern, post-classics route), it seems it will come back to marriage and divorce as well.

I remember how surprised I was, a year ago at the Smithsonian, to see a 1963 issue of One magazine with the cover text “Let’s Push Homophile Marriage,” a political rallying cry which, though advanced six years before the advent of gay liberation, though using the vocabulary of a pre-“gay” era, sounded disconcertingly familiar to 21st-century ears. I am even more surprised to see marriage rear its head in the equal-rights discussions of the 19th century: has the movement really, in 120 years, not come so far as all that? Is marriage equality as a 2010 cultural touchstone really so close to the cultural touchstones of 1890 as to make the lasting accomplishments of gay liberation seem like an illusion?

Obviously the past 120 years have brought decriminalization and the eradication of sodomy laws—developments, in 1980s Britain and 2000s America, largely unthinkable in Symonds’ and Ulrichs’ day, however standard they already were in some European countries by the end of the 19th century. And yet despite shifts in public opinion and in the law, we seem to be having the same conversations, still unable to make up our collective cultural mind as to whether male homosexuality is a crime against nature or a psychological problem or just the way some people are; whether it’s fundamentally the same as or fundamentally different from heterosexuality; how the law should respect these categories and whether it should notice them at all; how we define male homosexuality as a cultural as well as a psychological category; and why, indeed, the hell it is that we get so exercised about male homosexuality while female homosexuality fades into the background! To the proverbial Martian anthropologist, our microbiological searches for the “gay gene” and endless academic psychological studies would likely seem as strange as Ulrichs’ obsessive taxonomizing of sexual behavior, Krafft-Ebing’s psychological-physiological quackery, or the touch of the Freud about all their contemporaries’ early-childhood theories. We have come no closer than Symonds and his contemporaries to understanding why people are gay, and yet we seem just as wedded to an idea of the characteristic’s biological immutability as most homosexual-sympathetic sexologists of the late 19th century were. I suppose a good question to ask would be, why do we keep going around in circles? Why is it so difficult to construct a narrative of the history of homosexuality in which the arc of history bends as unremittingly towards progress as it’s supposed to?

And Christ, reader, have I really signed up to write a thesis about all this?

QOTD (2010-08-12), Thesis Research and Queer Theory Edition

We normally think of J.A. Symonds as one of the pioneers of a modern theory of male homosexuality, and my thesis presently (that is, until I change my mind again next week) hopes to discuss how Symonds’ work and life prefigured the gay identities of the 20th and 21st centuries. He was in many ways an extraordinary pioneer, and I don’t believe the importance of his work to modern queer scholarship has been fully realized. Sometimes, however, he says things which mark him as far away indeed from the mainstream of modern queer thought. For example, from Chapter 4 of his short 1891 book A Problem in Modern Ethics:

… as is always the case in the analysis of hitherto neglected phenomena, [German doctor and sexologist Casper’s] classification [of “congenital” and “acquired” sexual inversion] falls far short of the necessities of the problem. While treating of acquired sexual inversion, he only thinks of debauchees. He does not seem to have considered a deeper question—deeper in its bearing upon the way in which society will have to deal with the whole problem—the question of how far these instincts are capable of being communicated by contagion to persons in their fullest exercise of sexual vigour. Taste, fashion, preference, as factors in the dissemination of anomalous passions, he has left out of his account. It is also, but this is a minor matter, singular that he should have restricted his observations on the freemasonry among pæderasts to those in whom the instinct is acquired. That exists quite as much or even more among those in whom it is congenital.

By “freemasonry,” Symonds means the tactics “pæderasts” use to recognize each other (dress, ways of looking at each other, linguistic cues, etc.), which happen to be a major interest of mine, so that’s cool. But I’m actually far more intrigued here by Symonds’ endorsement of Casper’s inclination to break the population of men-loving men into those in whom the trait is inborn or developed in early childhood, and those in whom it is “acquired” and to a certain extent voluntary. To endorse such a taxonomy flies in the face of most of what we think of as standard in modern understandings of homosexuality, and it would be natural to dismiss it out of hand as late-Victorian wackiness. There are three reasons, however, why I think we actually ought to give Symonds’ raising of a “deeper question” more thought.

The first is a fairly straightforward historical-context point: it was commonly understood in 19th- and early-20th-century American and European culture that men not generally disposed to have sex with other men might do so in extraordinary circumstances when there were no women available—the best examples being sailors on long voyages, boys and young men at single-sex boarding schools and universities, and the still-common trope of the men’s prison. Hence the sense Casper and Symonds have that some men’s sex with men does not necessarily stem from any deep-seated physiological or psychological characteristic, even though–as Symonds says at the end of this chapter–“‘the majority of persons who are subject’ to sexual inversion come into the world, or issue from the cradle, with their inclination clearly marked.”

The second is a point of semantics and close-reading: at first glance the kookiest of Symonds’ suggestions in this paragraph is that “these instincts are capable of being communicated by contagion to persons in their fullest exercise of sexual vigour.” It’s a strange sentence, one which seems to embody all the worst quackery of Victorian medical “knowledge” and to bear a disconcerting resemblance to modern warnings about the “homosexual agenda.” But putting Symonds’ suggestion in a slightly different context changes its meaning: how many stories do we continue to hear day by day about adults who come out reasonably late in life? For every modern teenage boy who grows up watching Logo, attending Pride parades, and looking for porn on the Internet, there is a middle-aged man who takes half a lifetime to realize or to determine that he is gay–and sometimes, I would imagine, this is because he has a sexual experience which spurs him to connect feelings he’s had all his life to the larger concept of “homosexuality.” It seems this could be a modern way of phrasing the problem which Symonds raises: how will society taxonomize the man who, though heretofore “normal” (in 19th-century parlance), has sex with a man and determines himself an invert? How would someone’s identity and sense of self be reshaped by having a homosexual sexual experience? It does not strike me as surprising that Symonds, who is fundamentally concerned with ideas of identity and the parts of history, culture, medicine, language, etc. which compose an identity for the sexual invert, would find this question important.

The third point is larger in scope, involving broader implications of Symonds’ observations which I don’t have the queer theory to attack properly, but which I think enormously important to understanding homosexuality both in its natal decade and today. By invoking “Taste, fashion, preference, as factors in the dissemination of anomalous passions,” I like to think of Symonds as raising a point I am fond of making: that there is an extricable division between physiological/psychological, immutable homosexual sexual orientation and the much more mutable entity commonly known as “gay culture.” I often point out that not all (male) homosexuals are participants in “gay (male) culture” (we can quibble about what that means, but regardless of what “gay culture” is, I think the point stands), while not all participants in “gay (male) culture” are (male) homosexuals themselves–I count myself in this group. And I think of a man from Los Angeles I met in a gay bar in Paris who seemed to be enjoying the efforts of a couple guys to hit on him before confessing that he wasn’t gay himself. I think Symonds is right to point to “taste” and “fashion” as elements which make up an identity and a culture as much as something immutable does, and I think this is something which all of us who consider ourselves interested in the matter of queer identities could do a little more thinking about.

I think I’ll end on the note that I personally would like to do a lot more thinking about this word “taste,” because it could represent the simple uncontrollable desire of sexual orientation, but it also quite obviously seems to connote a desire colored by preference. After all, homosexuality may not be a choice, but how many people are given to couching their sexual preferences for partners of a specific race, partners who prefer to engage in specific sexual acts, etc. in this same language of non-choice? Sometimes orientational models of these desires do surface: an orientational model of dominance and submission is beginning to catch on in many circles, and we nearly always speak, both historically and contemporarily, of pedophilic desire as something uncontrollable, condemning in criminal terms only the acting-out of that desire. But I think the language of “taste” colors many discussions of sexuality beyond LGBT activists’ “being gay is not a choice” mantra, and considering it a factor “in the dissemination of anomalous passions” could help us develop still further our understandings of how queer identities are shaped. I would think that “taste” would be incredibly important to men like Symonds, who came to construct a theory of homosexual identity out of an intellectual trajectory heavily influenced by the British aesthetic movement, who joined Wilde and Pater in the study of Greats and the Renaissance, and who had much to say about art and its criticism. That Symonds’ origins in and study of this tradition related directly to his theory of homosexuality is something I hope to weigh in on in my thesis, and I hope that doing so can help us to consider what bearing “taste” has on homosexuality today, removed as it seems from ancient philosophy or Renaissance art.

Foxes and Hedgehogs; or, An Excess of Guilt and What to Do About It

I have been following avidly blogger-historian Notorious Ph.D.’s series of posts on “The Fox and the Hedgehog,” an Aesopesque woodland-creatures metaphor for two kinds of scholars: foxes, whose expertise emphasizes breadth over depth and could encompass projects on completely different thematic areas; and hedgehogs, who burrow deeply into one subject area and become experts in that subfield. Across the historians’ blogosphere in the past couple weeks, there’s been a lot of discussion about areas of specialization, workload, and the gendered manifestations thereof, and Prof. Notorious neatly synthesizes it all in a question I find critical and salient:

… we all choose whether to be foxes or hedgehogs, but women’s/gender/feminist(/queer/?) historians who want to be foxes may feel that there is a moral obligation to be a hedgehog. If we don’t do this very important work, who will?

[…]

Now, I know that there are plenty of women’s historians (and in other fields too, of course) out there who are joyful hedgehogs by choice; we owe them a great deal as both scholars and feminists. And I also know that women’s history is a subfield big enough that you can be a fox within it. But I’m not talking about them — I’m talking about the feminist fox who feels pressured to be a hedgehog, to continue working in a field that is politically and/or personally important to her, when she’d rather be off writing about municipal institutions or poison or siege techniques of the Hundred Years’ War.

Prof. Notorious included “queer” with a question mark, not certain whether the politics of queer history are quite identical to the politics of feminist history, but I think there’s some overlap. Since I decided that I was going to write my undergraduate thesis on gay men, and indeed since I became publicly known as something of a professional queer, I’ve been making my way through closely related dilemmas without clear ethical answers. For months I have been trying to balance my fears of being ghettoized as someone who writes on queer issues (both in my undergraduate department and when I apply to grad schools) with my conviction that the stories I want to write need to be told; and to balance my belief in the good and necessary work of queer studies with the growing sense that I won’t want to work on queer topics for the rest of my professional life. In recent weeks, too, I’ve found these dilemmas compounded by an additional one: I’ve realized that my thesis project will be more focused and easier to pull off in a year and a hundred pages if it only deals with men-loving men, and not women-loving women during the same period—but what political disservice does that do to the Sapphic sisterhood, of which I’m theoretically a member? Do I have an ethical obligation to tell the stories of lesbians and proto-lesbians? If my thesis will be about queer men, do I need to ensure that one of my pieces of junior independent work is about queer women? Am I really just a sexist pig, as one of my friends told me in eleventh grade when I told her I hated being a girl?

Of course, the answer which Prof. Notorious gives, and the answer which I rationally know is the most sensible, is that we should write what we want to write, we should study what we love, and we shouldn’t feel bound by any sort of artificially constructed external standard of what we “ought” to study. Prof. Notorious wisely writes that “Just because you can, doesn’t mean you have to,” and I think that she is right that women can write about men, and teh gayz can write about teh strayhtz, and maybe everyone will end up more sensitive about each other in the process.

I’d add that in my experience, the most important work of visibility and tolerance is not done through academic study of people-who-aren’t-straight-white-men, but rather through convincing by our presences: being there for our peers (in my case) and our students (in the cases of the professors whose blogs I read and who are my mentors in real life) and making it clear that women or queer people—or people of color, or others who are left out of the straight-white-men version of history—really do have a place in the world. Of course, always being there to be the poster child for minority identity can let you in for the pitfalls which the blogger-historians have been describing, of being overwhelmed by a false moral responsibility to spend all your time teaching tolerance, but I am tempted to say that it is the least you can do to spend just a little time being present so that the people around you can take courage from it. I am always the most moved and inspired—and confident—when I spend time in an academic setting (a classroom, a meeting, a departmental function, or even a dining hall meal) with people who look like me. This world does a very good job of tricking me into believing that because I am a woman, or because I make a professional and social identity out of identifying as queer, or because I am not conventionally attractive or feminine, I will not be successful, respected, or taken seriously. Reconstructing how 19th-century men-loving men understood their sexual identities is fun and engaging and illuminating about 19th-century culture, but it has never moved and inspired me the way seeing a woman who looks like me in a position of academic success and power has done. Even if I don’t—as I probably won’t—turn out a hedgehog, writing about the historiography of homosexuality forever, I hope that I can devote the same political energy to being there for my students (when I have some) as my professors have been there for me.

A Few Scattered Thoughts on Tenure

The higher ed blogosphere, to which I’m addicted, has been abuzz this week about the future of tenure, a discussion spurred in large part by a Chronicle of Higher Ed article which cynically predicted the demise of the institution. In addition to a 173-comment-long discussion on that article itself, conversations have spun off onto the Chronicle blog, the NYT, and (of all places!) Megan McArdle’s blog, where she airs that dumb, tired argument that tenure is bad because it prevents the firing of ineffective professors with low output—which is not only wrong when it’s trotted out in the K-12 education reform debate, but even more so when mapped onto higher ed. The practices of scholarship and education cannot be discussed in the language one would use in the business world, in which “outcomes” and “output” and “results” can be easily measured in fiscal terms and used to evaluate the “effectiveness” and “efficiency” of employees. There is no viable way in which to calculate whether a scholar or a teacher is using her time in an efficient way, especially from outside the academic world. From an extramural standpoint, for example, a professor might not have published in ten years, but perhaps that’s just how long it’s taken to carry out the lab experiments necessary to prove her hypothesis so that she can publish; or perhaps she’s been on a decade-long scavenger hunt for primary sources. Perhaps she gets poor teaching evals because she’s a tough grader. The point is that those like McArdle who are inclined to oppose tenure because it would seem like a poor corporate practice have forgotten that universities do not, in fact, have to demonstrate profit to their shareholders, and that what looks like wasted time to anyone who isn’t part of the academic world may not necessarily be (the “summers-off” myth is another good example of that).

The Swarthmore history professor Timothy Burke has a great response to the tenure discussion on his blog, in which he both discusses how tenure has benefited him and raises some of the more legitimate concerns about the institution (it may not really help to protect academic freedom; although it protects tenured professors from having to constantly demonstrate that they’re producing knowledge, it puts intense pressure on junior faculty to have a book in the works straight away, which may not be too different from the “demonstrated outcomes” b.s.). However, Prof. Burke, like those with whom he disagrees, discusses tenure in business-like terms, comparing it to systems of incentives or traditions of institutional investment in the corporate sector and the government. And while I think I see the case he’s trying to make, that tenure isn’t so very different from the employment practices of other professional institutions, to me this is not the way to make the case that the university is not, in fact, a business.

I’m stuck on this idea of anti-corporatism because I see it as vital to ensuring the survival of the university as institution, and also to securing what little is left of a culture of intellectualism in America. The more we treat education like a business, and advocate that those who work in education be treated like corporate employees or like commodities (witness the neoliberal/conservative ed reform movement’s strategy of bringing recent college graduates into the K-12 system, working them on a 14-hour-day schedule until they burn out, and then bringing in new ones), the more we erode any sort of lingering cultural belief that knowledge is good for its own sake. The more we chip away at tenure, the more education will take place in for-profit, online institutions staffed by adjuncts, offering courses of vocational merit but not those which teach methods of humanistic inquiry and aesthetic appreciation. Tenure, despite its problems (publish-or-perish being the most significant one I can think of), is both a visible and a practical reminder that scholars and educators and intellectual culture are necessary to a well-functioning democracy, a bulwark against cuts in funding and skepticism about those weird people who write books no one can understand and get three-month vacations every year (haha! I say to myself as I think about the thesis I’m not writing right now).

As I write this, I am sitting in a coffeeshop at UC San Diego, the university I grew up in before I set out to become an academic on my own. UCSD is a top-ranked research university, struggling along despite its serious funding troubles, and most of its faculty are—for the moment—tenured. But plenty of programs are nevertheless staffed by lecturers and adjuncts who can be eliminated at the drop of a hat; should those serious funding troubles dictate that their programs be cut, they could wake up without jobs. They are second-class citizens, dispensable and disposable, and that means that their programs, their labs, their centers are too: should the State of California decide it no longer cares to support higher education, only those with tenure would have a prayer of holding onto their jobs as experts in their fields, as educators, and as the physical manifestation of the once-so-great, once-so-holy, American university.

Those who advocate the abolition of tenure from outside academia, I suppose, care too little about university culture for it to matter to them whether universities largely staffed by adjuncts can survive in today’s America, where there is little money to be found for education that is not immediately and obviously vocational. But I shudder to think what our country and our world will be like without people whose job it is to study it, its peoples, its cultures, its art; to publish that knowledge; and to teach the next generation—encouraging them to think about themselves, each other, and the world they inhabit. Maybe the Megan McArdles of this world see tenure as inefficient because, dazzled by the sparkling beauty of capitalism, they don’t see the need for humanistic inquiry. But as anyone who has ever read a novel, listened to a piece of music, seen the beautiful pattern of a mathematical algorithm, watched life begin in a petri dish, or observed her fellow humans be human can surely attest, we cannot allow these things to be lost to a culture of economic profitability and demonstrated efficient outcomes. Whatever it takes—whether it’s tenure as we know it today, or something else like it—we need to preserve a method of ensuring the safety of intellectual culture.

Having It Both Ways; or, In Which I Try to Get an American Liberal Arts Education

or, a rant in which I sound like an entitled Ivy League student (my apologies)

Last weekend, Tenured Radical (of whom I’m quite an enormous fan) posted a review of Louis Menand’s most recent book, The Marketplace of Ideas. Amid criticism of Menand’s views on interdisciplinarity which largely went over my head (when I submitted my application to the Princeton Program in American Studies, I certainly didn’t know this was what it meant to talk about interdisciplinarity!), I was struck by TR’s comments on Menand’s discussion of the general education curricula at liberal arts institutions. TR quotes Menand as saying that “A college’s general education curriculum, what the faculty chooses to require of everyone, is a reflection of its overall educational philosophy, even when the faculty chooses to require nothing”; she then goes on to say that she believes it “intellectually lazy not to have a core curriculum of some kind.” And when I read this I knew I had to write a post of my own in partial response. The thing is that I, in theory, would like to believe this too. I set great store by the values of liberal arts education and by the greater social necessity of cultural literacy across disciplines. I have the greatest fondness for the tradition in American higher education of balancing a higher-level general education with specialized training in a chosen field of concentration. And yet, as someone making my way through this system right now (indeed, I’m between my sophomore and junior years, having just declared my major three months ago: precisely on the cusp between general education and specialization), I want to talk about just how difficult it is to achieve this balance in the four years the American higher-ed system allots us, and how even those of us undergraduates most dedicated to the principles of liberal-arts education can find the system failing us.
I find that while discussions like this often feature very productive dialogue between faculty, administrators, and other higher-ed professionals, they rarely involve a student perspective. I hope, then, that the perspective of an undergraduate who genuinely wants to work hard, and who wants to see the system support her in those efforts, can be useful to the adults considering these questions of intellectual development.

I matriculated at Princeton in the fall of 2008, essentially starting my liberal-arts education from zero. Like a fairly significant number of my colleagues, my secondary education was at a large, public California high school: not one of the best in the state, located in an area whose high property taxes provide for smaller class sizes, lots of AP or IB classes, and arts programs; but a good-enough school with a small gifted program which was reasonably safe and sent a lot of its graduates to the local state universities. I’d had enough general education to get me the basics: AP English and history, the “honors” track in lab sciences, community-college calculus, a mishmash of largely self-taught French and Latin, and a few scattered classes here and there in music, theater, and computer programming (I also played the violin and viola outside of school for about ten years). However, while this level of education would have placed me out of nearly all my gen-ed requirements at a University of California campus, Princeton (like most other highly selective colleges and universities) doesn’t take transfer credit, and with good reason: Princeton’s 100-level math sequence is much more difficult than community-college calculus; a passing score on the AP French Language test placed me into the middle of the introductory French sequence. And, most relevantly, you cannot use AP scores or community-college credit to place out of Princeton’s “distribution requirements,” a system which labels nearly every course in the catalog with a subject area like “Literature and the Arts,” “Quantitative Reasoning,” or “Epistemology & Cognition,” and mandates that all undergraduates take one or two classes in each area, preferably in their first two years. 
In the fall of 2008, full of optimism and faith in liberal-arts education—and, critically, as yet without a major—I happily signed up for a freshman seminar in higher-education policy (Social Analysis), the French course into which I’d been placed (no distribution requirement), an easy computer-science class for non-scientists (Quantitative Reasoning), and the required freshman composition seminar, and I waited patiently for those first college classes to give me the university-level general education I knew that high school hadn’t.

But as I’m sure you’ve already guessed if you know anything about these things, it didn’t quite work that way, and my optimism and faith didn’t last long. Heeding advice from my academic advisers to select a non-demanding course schedule, I had picked a joke of a programming class (known, unfortunately, as “emails for females,” and easier than the programming class I’d taken in high school) instead of something useful like social-science statistics or something challenging and rewarding like multivariable calculus, thereby demonstrating my innate lack of enthusiasm about the value of a rigorous general education. And it didn’t take long before my elective interests took precedence over my belief that the Princeton distribution requirements were for my own good. My first spring, I took courses in American history, British literature, American politics, and creative writing—including the ones which caused me to decide that I wanted to be a professional historian. I checked off my Literature and the Arts and Historical Analysis requirements, and I’ve gone on to fulfill them over and over again in the semesters since, choosing the upper-division seminars in history, English, and American studies that have been and continue to be one of the greatest sources of joy and fulfillment in my life. I’ve befriended my professors and become their research assistants; I’ve grown to love things I never before knew existed, like literary theory and cultural studies; and of course I have become convinced that I could not be happy without being able to stay engaged with texts and primary documents for the rest of my life—but I have never taken a single “real” college math or science class.

“Emails for Females” set the trend for my efforts to fulfill the requirements that come less easily. Last fall, to be honest, I put in just enough effort to pull a B in an anthropology course which satisfies the Science and Technology With Lab requirement; this fall (well into the second half of my college career, the part where I’m supposed to focus wholeheartedly on my major), I’m signed up to take an environmental sciences lab pass/fail—and I chose it, I’m ashamed to admit, because I thought it sounded fun, not because it will be an intellectual challenge. I’ve been scared of physics or math: scared of the appearance of the pass/fail indicator on my transcript, and indeed scared of failing. I am not good at math and science; if these quantitative indicators are worth anything, my math SAT score is far below the Princeton median. I would have to work very hard, putting in more hours than I put in for the classes which teach me the kind of work with which I want to spend the rest of my life, in order to pass physics or higher-level math. And so I’ve become a terrible liberal-arts student, picking and choosing the classes I want and not the classes that will make me a more well-rounded person and a better thinker. I feel as if I am being hypocritical and dishonest, as if I am lying to the high-school senior who thought for months that she would go to the University of Chicago for the Core.

But now I’m going to be a junior at Princeton, taking a notoriously easy lab pass/fail alongside three upper-division seminars in history and English, and can you blame me? One seminar is a requirement for my major, and promises to help me produce my first piece of required departmental independent work, a 30-page piece of original historiographical research; the other two are taught by two of the best professors at Princeton, whose opinions of my academic work I value as highly as anyone’s. Why, pragmatically speaking, would I want to limit the amount of time I can spend learning how to be the best historian I can be by spending hours trying to study for the notoriously difficult intro courses in physics, chemistry, or biology? There is no reason to think that my high-school science and math education (most of which I’ve forgotten by now) was sufficient to understand the conversations that my scientist and engineer friends have about their work; by rights I should be working to step outside my disciplinary box. But Princeton also expects me to write a senior thesis in a year, and so it is difficult to know where to focus my attentions. There are only so many hours in the week, and the system seems to force a choice between doing well across the general-education board and doing well in your discipline (which, naturally, entails a higher standard of mastery than does an introductory general-education curriculum). I’m a serious student, and spend very little time not working; I think I spent about three Saturday nights last academic year not doing homework. If I seem lazy about fulfilling requirements outside my comfort zone, or as if I am exaggerating this choice between my major field of study and my general education, I can’t think that it’s entirely my fault.
I can’t think that I am to be blamed, entirely, for leaving my Epistemology & Cognition requirement till my senior year, my thesis year, because I am going abroad next spring. It can’t be just me. It must be the system too.

I am tempted to put the blame on my high-school education. After all, by the standards of the men who created Princeton’s earliest general education curricula, I am woefully undereducated. To be sure, there was much less to learn then, and particularly in the sciences, but it’s certain that in the fall of 2008 I would not have been able to pass the examinations once administered to incoming freshmen in literature, history, philosophy, and the classics. Of course, when those exams were required, a middle-class Jewish girl like me wouldn’t have been able to take them in the first place, and the wealthy young southern gentlemen who did take them tended to have the advantage of a private prep-school education. It was a system that perpetuated elitism in a disgusting and horrible way, and I am glad it no longer exists—but if incoming freshmen today were better read, perhaps we could consider ourselves generally educated and move on to the business of the major field of study. Perhaps wanting to do so wouldn’t make us bad mini-intellectuals.

It occurs to me that the British university system manages to get along just fine without imposing distribution requirements on its students, and perhaps that’s made easier by the GCSE and A-level system, which seems to teach to a higher level of proficiency than the American system does. However, I also understand that if you want to specialize early it is very easy to do so—no doubt if I’d been educated in the British system, the same temptations that have led me to drop serious math and science at Princeton would have led me to drop them at 16 in the UK. In reality, I think it must be as difficult to get a liberal-arts education in the UK as it is in America, if not more so. And while I strongly suspect that I might have come through the British system a better historian, that is only because the system is weighted so heavily towards specialization—I certainly wouldn’t have solved the Princeton problem of taking-the-easy-lab-for-the-requirement, and I certainly wouldn’t have moved beyond my disciplinary box.

To be sure, Princeton distribution requirements have so far pushed me to take one course which I wouldn’t have otherwise taken and for which I’m very grateful. The Ethical Thought & Moral Values requirement led me to continental political theory, and a basic understanding of Kant and Hegel—which I really did need a professor’s help to muddle through—has not only transformed how I engage with the British and American intellectual history I study, it has helped me to engage with my friends (and family members!) who study and talk about philosophy and theory; it has helped me to learn to write clearly about complex philosophical concepts; and it is valuable for its own sake. Educated people, I tell myself, have read Kant and Hegel, not to mention Rousseau, Marx, and Nietzsche. And I think of myself as a more literate and educated person for having struggled through those texts, which I hope I will remember as an experience at least as beneficial as the course I took which cemented my desire to become a historian.

But that was just one course, and pragmatically I could take the time to take political theory seriously in a way I can’t for fields with which I really struggle. In the end, in a system like Princeton’s, there is just not enough time in four years to become culturally literate and to achieve competency in your field. I want to be able to do both well, I really do. Between meeting all Princeton’s requirements and my own high standards, though, I don’t really feel as if either is going to happen. Not quite two years out from my thesis due date, I feel as if I could spend my entire life researching and writing that project and still not be able to turn in a project of sufficient quality or originality come May 2012—and forget my junior independent work, forget my departmental coursework, forget my semester abroad, and forget that one outstanding distribution requirement in Epistemology & Cognition which I’ll have to fulfill my senior year—probably through an easy course I’ll take pass/fail, because do I, grad-school-bound as I am, really want to reduce the amount of time I can spend on my thesis?

I have the sense that educators and administrators sometimes think that those who object to liberal-arts distribution requirements or core curricula just don’t want to do the work. Sometimes I see concerns about distribution requirements held up as evidence of the downfall of intellectual culture, as part of a package with students with entitlement issues who just want to pay for a 3.5 GPA or with students who think college is only important if it serves some sort of immediate and obvious vocational credentialing purpose. I think it’s worth thinking, however, about the fact that distribution requirements can reduce even the most serious students with the highest-minded educational ideals (and the most feminist principles!) to “Emails for Females,” and that a distribution-requirements system may not necessarily be the best way to help these students (who are not always innately skilled at choosing the courses which will challenge and reward them) get the best undergraduate education they possibly can.

Puberty, Perversions, and Allen Ginsberg; or, My Life as a Teenage Sexologist

I was in tenth grade, in my teenage counterculture phase, when I discovered the sexologists. Most teenagers, I think, go through a counterculture phase, when they devour Ginsberg and Kerouac and at least think about drug-taking and free love, even if they’re too dorky to really go through with it. Most teenagers go through a phase of coming to terms with an idea of themselves as sexual beings, as proto-adults, as people newly able to relate to their fellow proto-adults in entirely different ways. Most teenagers today, I think, must see something of themselves in the Beats, because today it is cool to be a misfit and cool to write poetry or songs as an expression of why your misfitedness is how the whole world ought to be. And so did I: Allen Ginsberg has easily been one of the most influential people in my life. Long before I knew what it meant to be “blown by those human seraphim, the sailors, caresses of Atlantic and Caribbean love,” I imagined myself to be an “angelheaded hipster” in my own suburb. I read Ginsberg under the table in American history class until I’d learned “Howl” off by heart; and when we had to do five-minute individual presentations on poems in my English class, I took up half the period subjecting my poor teacher and classmates to a lecture on how “A Supermarket in California” ripped off Whitman and Lorca.

But the piece of Ginsbergiana of which I’m reminded today is a few paragraphs in Bill Morgan’s very long biography of the poet, which I received for Christmas in 2006 and spent another few months reading under the table in class. The book is in a box in Princeton now, but those paragraphs talk about when Ginsberg was living in New York in the ’50s, at the height of the Beat era, and his drug-dealing dropout friend Herbert Huncke introduced him to a researcher named Alfred Kinsey who was writing a book about Sexual Behavior in the Human Male and looking for interesting case studies. Kinsey and Ginsberg had dinner in a diner near Times Square and Kinsey got plenty of juicy material about the poor poet struggling with his sexuality and always falling uncomfortably in love with his straight male friends. It was fascinating not only to realize that Kinsey’s research involved paying underworld types like Huncke to seek out countercultural men with more colorful stories (hardly a scientific way to do a survey of the population!), but also to realize that such stories existed. Wikipedia research, of course, ensued, and I soon learned that individual anonymous case histories were a staple of the history of sexological research in Europe and America. I read tantalizing snippets on the Internet of archaic-sounding sexual autobiography, and I was hooked. For the rest of high school, whenever I went to the university library, I would sneak guiltily off to the wing on the sixth floor where I knew the “sex books” were. I’d take the English translation of Krafft-Ebing’s Psychopathia Sexualis off the shelf and sit on the linoleum poring over one of the earliest systematic studies of sexuality, shame causing me to slip round the corner into the history of marriage books if anyone came by.

Sexology was my puberty, my coming of age, surely as transformative to me as any teenager’s first girlfriend or boyfriend. I leaped into a world of “deviance” and “perversion,” because to a sheltered and easily shocked 16-year-old, any sexual behavior was as weird and perverted as any other. As I branched out—to the 18th-century pornography I brought with me on my orchestra’s trip to Europe, or the Olympia Press books I read for my (totally-embarrassing-not-a-real-piece-of-history-writing) European history research paper—I developed easily the most infamous reputation that any late-blooming girl without a first kiss behind her has ever had. Like teenagers “who chained themselves to subways for the ride from Battery to holy Bronx on benzedrine” without having injected or ingested a single real drug, I carried around Kinsey for the cachet, and I sat alone at lunch and actually slogged all the way through Leopold von Sacher-Masoch’s unbelievably boring novel Venus in Furs so that I could tell people where our word “masochism” (a Krafft-Ebing coinage!) came from. And when I left for college, I guiltily hid my little, growing collection of 18th- and 19th-century sex lit and 20th-century Kinsey-spinoff studies in my backpack; and I slipped away from the bustle of orientation weekend to make my first foray into the university library system, getting lost looking for Havelock Ellis on Firestone B-floor.

Over time, I got used to visiting the stacks shelved under Library of Congress classification HQ, and over time I stopped running away if I saw another person standing there poring over a book about lesbians. As the months went by and I made some friends and found a new home and took a great, great class called “Gender and Sexuality in Modern America,” homosexuality started to separate itself out from all the other strands of perversion. Through the lens of homosexuality, I discovered more memoirs than I’d ever read in my life, by people not too different from me who’d used the primary sources of previous eras to access a subject they didn’t otherwise know how to address. I read more and more, and became conversant in the secret languages of those who lead double lives, and in the not-so-secret languages of those who create new languages in order to bring those lives out into the sunshine. I read about those who discovered Plato the way I’d discovered Krafft-Ebing, and Whitman the way I’d discovered Ginsberg. I grew to care very deeply indeed for the centuries of lonely people who snuck off to libraries as a way to confront shame and fear, and to admire beyond measure those who thought to write about it in the hopes that the next generation might not be so embarrassed by it all. I grew to see it as my 21st-century job—at least for now—to keep their stories alive, and to do the thinking, writing, and talking necessary to bring their work to a new generation living in a new social context. It’s been sexology ever since—and now Ginsberg’s lines about the men “in public parks… scattering their semen freely to whomever come who may” stand out to me more than those about drugs and jazz.

Today “Howl” does not hang over my desk as it once did, and I am not quite so inseparable as I once was from the old City Lights paperback. But Ginsberg is part of the literary lineage through which I have found a sense of purpose, and he is never far away. Today, reading about the 19th-century sexological intellectuals whose story I hope in a few short months to begin to tell, I pieced together the elements of the sexual lineage Ginsberg believed tied him to Whitman: Ginsberg had slept with Neal Cassady, who had slept with a man named Gavin Arthur (incidentally, the grandson of the president), who may or may not have slept with Edward Carpenter, who may or may not have slept with Whitman. These claims get trickier as you get farther back, and it’s harder to figure out what counts as sex at the turn of the 20th century than it was in the Clinton White House. But reading that today, I thought of how Ginsberg used that lineage to make sex a process of literary and spiritual inspiration, and what a tool the intellectualization and academization of human sexuality has been to those who seek to understand themselves, their world, and their place in it.

I am writing this, probably the most explicitly sexual blog post I have written in years, as I look out my bedroom window to a suburban street, young voices shouting at each other and skateboards clattering on the pavement as the neighbors’ kids play in their driveways. I have sat here for the past two weeks, too, on a crash course through the intellectual history of 19th-century sexuality, and I plan to do the same until the end of the month when I take off for Canada. I can’t help thinking, as I look out the window, of freshman-year fall break, the week before the 2008 election, when our neighbors with the skateboarding kids had “Yes on Prop. 8” signs in their front yards, when I waited in the living room to give out candy on Halloween and heard parents telling their children to pass by our house because our sign said “No on 8.” As I go back to reading about intellectuals like John Addington Symonds, who seem to have believed that if they wrote enough about homosexuality it would cease to be a crime or a disease, I can’t help reaffirming my belief in the good work that words do, and the good work that scholarship does. And I can’t help but think, as I go to open yet another book about sodomy, and look out the window once more, that maybe I managed to find in me a shred of real countercultural subversion after all.

QOTD (2010-07-07); or, Another Problem in Greek Ethics

From the second chapter of Jonathan Ned Katz’s Love Stories: Sex Between Men Before Homosexuality:

In December 1837 at Yale, Dodd composed or transcribed a revealing, rhymed ditty, “The Disgrace of Hebe & Preferment of Ganymede,” about a dinner for the gods given by Jove, at which the beautiful serving girl, Hebe, tripped over “Mercury’s wand,” exhibiting, as she fell, that part “which by modesty’s laws is prohibited.” Men’s and women’s privates were on Dodd’s mind—eros was now closer to consciousness. Angry at Hebe’s “breach of decorum,” Jove sent her away, and called Ganymede “to serve in her place. / Which station forever he afterward had, / Though to cut Hebe out… was too bad.”

Considering Dodd’s cutting out Julia Beers [his girlfriend] for John Heath, Anthony Halsey, and Jabez Smith [fellow college students for whom he professed love or intense friendship], his poem shows him employing ancient Greek myth, and the iconic, man-loving Ganymede to help him comprehend his own shifting, ambivalent attractions. At Yale, Dodd read the Greek Anthology and other classic texts and began to use his knowledge of ancient affectionate and sexual life to come to terms with his own—a common strategy of this age’s upper-class college-educated white men.

This passage leads to the sort of thesis-related observations I usually try to keep off this blog, but given the relationship of the work Katz does to the questions of close-reading I contemplated yesterday, and the broader implications of his thesis about the historical contextualization of identity categories, it’s worth discussing here. Briefly, Katz has written this book to talk about men who loved and desired other men in America, but before such a thing as homosexuality existed. Through detailed case studies and work with both literary and more traditionally historical sources, he makes a case for a 19th century in which men’s sexual identities and relationships to sexual identity were very different from those of men in our own time. He fights against an essentialist reading of homosexuality across generations, and focuses instead on how 19th-century American men perceived their own relationships with each other, not what modern Americans might read into them. He finds (well, so far; I’m only on Chapter 4) that men often lacked the appropriate vocabulary to define their love for each other, but that they certainly did not see it as part of a distinct identity, evidence of pathology, or indeed reason not to love, desire sex with, and marry women—and that goes for both Abraham Lincoln and Walt Whitman!

And so I’ve been getting myself in this “men before homosexuality were nothing like men after homosexuality” mindset, which is productive in that it allows me both to find Katz’s book very persuasive and to further my own thinking about what changed, theoretically and culturally, when homosexuality did emerge as an identity. But I found that easy suspension-of-my-own-sense-of-identity-categories challenged by the passage I quote above, simply because it does not sound like the experience of a man from before-homosexuality. It sounds like the experiences of young men from across the history of homosexuality (though particularly in its early history) who came to understand their sexualities as sexualities through the frame of classical literature and art. I titled this post the way I did because one of those men was John Addington Symonds, whose privately printed 1883 tract A Problem in Greek Ethics is about Athenian pederasty, and implicitly makes the argument that because Athenian sexual mores were different from late-Victorian Britain’s, there was no reason why Britain and its legal system shouldn’t change to accept him and his fellow “Uranians” into the fabric of society (this was an argument Symonds certainly expressed in so many words in private, if not so explicitly in print). The secondary “Problem in Greek Ethics,” however, would seem to lie in Symonds’ invocation of ancient Greek sexual practices (which do not map precisely onto modern homosexuality and would not have been considered “homosexual” at the time) to seek cultural and artistic validation for a modern form of sexual deviance. This sort of essentializing is, it seems to me, in some sense endemic to being homosexual in the modern western world—and it is this sort of essentializing which Katz’s book fights against in part precisely because it is so endemic. (You know how academics are.)

And so to read about Albert Dodd’s bawdy Ganymede poem, and Katz’s observations about Dodd’s interest in Greek matters in 1837, far before the word “homosexual” enters the language or before the sundry proto-homosexual scandals of late-19th-century Britain get going, and to read them occurring out in the provinces, in New Haven, far away from the theoretical and academic and cultural work done to create homosexuality in London and Paris and Berlin, is to me to profoundly trouble the neat homosexuality-didn’t-exist-and-then-it-did narrative. It is to question what is homosexual and what isn’t, to challenge my coding of problems in Greek ethics as homosexual, and indeed to question Katz’s thesis (with which I otherwise agree strongly) about the mutability and historical contingency of identity categories. Is a search to understand one’s erotic impulses through ancient Greek literature something enduring across time, no matter what words exist in the language to describe it?

Blogger Historiann wrote yesterday about the importance of using “sideways” methodologies in building the narratives of people(s) and events (such as women, or people of color, or working-class people) whom written sources sometimes leave out. Sometimes these methodologies come with their own problems in ethics: Historiann gives the example of using the recorded lives of men in order to make inferences about the lives of women, but what does it do for women’s history if it’s still only told through the eyes of men? It seems as if the kind of work that Katz does moves similarly sideways, getting around the obvious lack of forthright records of 19th-century men’s sexualities by inferring and reading between the lines; I’m hoping to learn from his books how to employ similar strategies when writing my own thesis. But it seems to me as if there is always a tension between too much inference and too little (something else I learned from literary studies!), and that playing the inference game carries with it problems in ethics, Greek or otherwise. Am I undoing Katz’s work by assuming homosexuality on the basis of Hellenism? Or could he possibly be the one who is reluctant to make a necessary logical leap? (What is truth, anyway?!)

This is a tricky business in which to get involved—and we should never lose sight of the concrete social and political ramifications our quirks of interpretation can have, when they make their way beyond the ivory tower.

Chaucer, Milton, and the Greater Good; or, Happy Belated Fourth of July

Atlantic staff writer Heather Horn has jumped, and jumped hard, on the “I don’t understand the relevance outside the academy of the methodology of humanistic study” bandwagon. Her particular permutation of the bandwagon, while not precisely about the utility of STEM fields or the necessity of building career skills or the frustrating intangibility and apparent impenetrability of humanistic study (as if STEM study is ever accessible to the untrained layperson!), is nevertheless pretty well intertwined with all these issues: she’s arguing for eliminating the teaching of close-reading in high-school English classes:

We should end it. Students almost universally hate close reading, and they rarely wind up understanding it anyway. Forced to pick out meaning in passages they don’t fully grasp to begin with, they begin to get the idea that English class is about simply making things up (Ah yes–the tree mentioned once on page 89 and then never again stands for weakness and loss!) and constructing increasingly circuitous arguments by way of support. (It’s because it’s an elm, and when you think elm, you think Dutch elm disease, and elms are dying out–sort of like their relationship, see?)

[…]

If a few students really want to do close reading, they can do it as an elective or jump in head first in college. Otherwise, let’s chuck the concept. We gain nothing by teaching kids to hate books–and hate them s-l-o-w-l-y.

I shudder to think what English teachers Horn must have had in high school and college to have gained the impression that close-reading entails “simply making things up,” but perhaps the curricula she learned under were not dissimilar from the AP English Language and Composition curriculum used by my English teacher in my junior year of high school. Taught to a test supposedly devised and graded by college professors, we were drilled in recognizing different forms of figurative language, and describing setting and tone. We were taught to write a formulaic five-paragraph essay in 45 minutes, in which we would argue that the author used the stylistic devices we had been taught to recognize in order to convey a specific theme (“the author’s message about life” is the phrase that still rings in my ears), which we had also been taught to pull out and put on display. We were drilled in order to pass a test, mostly reading short expository prose passages instead of novels, plays, or poetry, and developed little understanding of how the work we were doing related to the study of literature outside the context of the exam for which we were preparing. Indeed, I always regarded that year of English as an aberration: in freshman, sophomore, and senior years, we read books and talked about them, and I never had the sense that what we did in those three years was in any way similar to the metaphor scavenger hunt we carried out junior year. And, like Horn, I graduated high school profoundly skeptical of English classes which did that kind of work.

But in the spring of my freshman year of college, I took my first college English class, and not only did my attitude towards the study of literature change, and my understanding of what I had and hadn’t been taught in high school change, but my entire life changed. The class was a survey of 14th-18th-century English literature, which at Princeton all English majors are required to take—they’re fed canon in this course because it’s thought that they wouldn’t elect it otherwise, and because the Chaucer and Milton that the course assigns is good for teaching the fundamentals of close-reading. For this course also aims to prepare prospective majors to do well in their chosen field of study, and—as I think I must have been told in the first lecture of the semester—close-reading forms the backbone of the study of literature. As I soon discovered in my practice reading exercises for precept and in the papers I wrote about Alisoun in the Miller’s Tale and the narrative style in Book 6 of Paradise Lost, close attention to the detail of language completely changed how I engaged with texts. Once I had been persuaded (through admittedly, at first, a little willing suspension of disbelief) that Chaucer and Milton had carefully chosen every word, and that we should pay them the courtesy of reading their texts that way, I found not just important issues in the poems—tackling feminism and sex on one hand, and theology on the other—but true beauty. Neither my English classes which read for theme nor my English class which read for the metaphor scavenger hunt nor my own pleasure reading for plot had prepared me for the recognition of the transcendence of Milton’s blank verse, or the hilarity of Chaucer’s puns. Taking that class gave me—or perhaps restored to me—faith and joy in the beauty of language and the literature which puts language into a form that we can read and enjoy.

I’ve read a lot of books in my life, and that transformation never happened through reading a light novel in an afternoon. It took being assigned weighty texts, and being asked to pay attention to their finest details, and to write on those details and consider what I thought about them. Contrary to Horn’s suggestion, I didn’t “[construct] increasingly circuitous arguments by way of support”: I believed in what I saw in the text, because I’d finally learned the importance of the details, and their connection to plot and themes. And now, Paradise Lost is one of the most wonderful pieces of literature I’ve ever had the pleasure to read. And despite being a history major, I have taken at least one English class every semester since, branching outside of the canon to women writers, African-American writers, literature for children, popular/pulp fiction, biography and autobiography, and (on my own time) queer literature. Since the first close-reading exercise I did on Chaucer, a silly scatological meander through the Miller’s Tale, I have maintained my faith that the careful study of literature will illuminate the high and the low, and that reading carefully and attentively—reading closely—will change the way you thought you understood everything.

For reasons I have gone into before, I am not an English major, but perhaps this has attuned me more closely to the need to involve methodologies taught in literature departments in the work of other disciplines. The history papers where I have deployed my close-reading skills in the service of addressing the subjectivity of primary sources have, I believe, been the ones where I’ve gotten the closest to saying something interesting; treating the themes and theories of the past like those advanced by fiction has spurred on my greatest intellectual passions. I don’t think I had necessarily forgotten the good that learning and knowledge and careful analysis can do, but being taught actual close-reading, and not just the metaphor hunt, reawakened my sense that reading and interpreting texts and text-like media (which is what we all do, in the humanities) is not a game of pleasure-reading, nor one of baseless speculation, nor one of stringing together citations, but rather a route towards understanding the world we live in. This realization, whenever I think about it, produces an attitude of utter pleasure of which I could never have conceived in high school. Just think what could have happened if my close-reading education had been well-executed then!

Today, in the course of having a political discussion/debate/thing, I found myself reduced to tears by the liberal guilt and 20-year-old’s existential confusion which I so often find is never far away from my mind. I asked myself—as I so often do—whether I am doing the morally right thing by choosing an academic life, whether such a life helps other people in addition to myself, and whether I do in fact have a greater obligation towards selflessness and utilitarianism by virtue of being an Ivy League legacy and the child of academic parents. Every time I find myself frustrated by this problem, I desperately seek the validation that will tell me I am not selfish to want things which will make me happy, and that furthermore I can do good from behind a lectern, on one side of a seminar table, across an office desk, in committee in a conference room, or seated next to a student in the dining hall. But when I read about human rights abuses such as those being carried out against LGBT people in Iraq (aside: read that story and then call your Congressional delegation, or, if you’re British, your MP about it. I’m serious), I wonder whether I can continue to talk about unhistoric acts when the world is in such desperate need of some historic ones.

But—as it has proved to be in opening up the world of art—close-reading can serve to illuminate these more pragmatic matters too. Last week, the Supreme Court determined—though by a narrow majority—that the precise semantic significance of the 45 words in the First Amendment to the highest law of the land does not permit religious organizations at public universities to exclude those (like LGBT people) protected by the university’s non-discrimination policy. The Court has read those 45 words time and again, and whether they have been understood to strike down school prayer, or to uphold the right of the Ku Klux Klan to walk down the street, the nine (or more, or fewer) justices have historically achieved their decisions through the very close reading of those 45 words. Likewise, they have read the Fourteenth Amendment to uphold or to overturn public segregation; to uphold or to overturn sodomy laws; to uphold or to overturn miscegenation laws. Reading and misreading and re-reading, those nine (or more, or fewer) men and women deploy the skills of my freshman-year Chaucer-and-Milton class to change the way life is lived in America, to hold back or to push forward the progress of the arc of justice. If we are to take seriously (as we must) the mission and the necessity of the mission of the Supreme Court, we must embrace the value of a practice of reading which examines the significance, the meaning and the placement, the connotations and at times the critical history, of every word in the United States’ founding document. This has been part of the mission of our country since 1787, and continues to be an integral part of making it a fairer and more equitable place for anyone to pursue her or his happiness—as integral, indeed, as close-reading is to the study of literature.

Perhaps Heather Horn never had the luck to enroll in an English class which taught her what I have learned in mine since that one disastrous experience junior year of high school. Perhaps she only experienced dull and poorly-taught classes which put her off the study of literature altogether, as happened to so many of my peers without humanities-professor parents and the odd teacher who really knew what she was doing. For that, I am absolutely sorry, because learning how to read literature (and, with it, art) has been my greatest source of joy and hope. I only ask Horn not to condemn the careful examination of literature out of hand—because without people in the public sphere willing to invest in the necessity of the humanities and the close study of texts which they entail, our society doesn’t just forfeit the personal pleasures of a few nerds like me. Our project of “form[ing] a more perfect Union” (note that “more,” edging the project ever onward—not just “perfect” but “more perfect”!), our very nation, is at stake.