Thoughts on ‘It’s a Sin’

A few years ago I took a position that I will not watch any more Holocaust dramas. Having been assigned to watch several in German language classes, I had grown exhausted and angry with their didactic tone. The intended audience was never, it seemed, those of us whom the Third Reich would have targeted, but rather those who might be inclined to sympathize with Nazis were they not told over and over how evil their crimes were. At some point, I couldn’t sit through another film designed to inform me, as if for the first time, that 6 million people like me were murdered.

Watching the mainstream press get hold of It’s a Sin, Russell T. Davies’s new TV series about the AIDS crisis, and make it about them—remark upon how important it was that the Great British Public be reminded of this episode in its history, which it had forgotten; make clumsy comparisons to the present pandemic—I feared, once again, a story that repackaged the horrors suffered by a marginalized group into clichés to feed a different audience. The people who would watch It’s a Sin at 9pm on a Friday on Channel 4 were, I imagined, the same people shown in its third episode, in front of their televisions in suburbia, reacting with shock and disgust when the famous “Don’t Die of Ignorance” PSA aired on TV in 1986.

It may well be that millions of such people will watch the programme, though there is a part of me that hopes that they will be turned off by the quantity of sex scenes and by the unflinching, brutal truth of it all. Yes, there are AIDS clichés in It’s a Sin, and Russell T. Davies screenwriting clichés that those familiar with his oeuvre will recognize. But the series avoids mere didacticism. It doesn’t babysit the viewer with tedious dialogue that explains Kaposi’s sarcoma or AZT. Its everyman characters—a collection of friends in their 20s living in London through the 80s—simply, inexorably, watch their friends die. They fight against incredible odds for people with AIDS to be treated with dignity, and they process their own complicated emotions around having to deal with a crisis of such enormity so young. People will have different views on the “must gay characters be played by gay actors” question, and I don’t have a strong position on it myself, but the production team’s decision in this case to cast only gay actors in gay roles keeps It’s a Sin feeling like a series by and for queer people.

Indeed, the programme is unsparing in its sharp criticism of those people outside the community who reacted to the crisis with fear and panic. There are some characters who step up: a couple of supportive parents, some caring nurses and doctors. But the series devotes much more screen time to moments of focused, small-minded hate. The villains of the piece are the parents who cannot see that in their final agonizing moments, their young sons would rather be surrounded by chosen family in London than languishing at “home” in the provinces. “You’re just kids,” one mother dismissively says to her dying son’s friends. She then does not listen when they reveal themselves to be far more knowledgeable about how to care for someone with AIDS than she is, when it is clear that her son is only the latest of the friends they have buried. If there is allegory in It’s a Sin, it is perhaps not for the covid pandemic, but rather for the climate emergency and the sins of late-stage capitalism, in which so many older people have refused to listen to the intellectual and moral clarity of the young. A few years ago, when a friend suffered a very significant family crisis, I thought, “no 27-year-old should have to deal with this.” It was nothing compared to what Davies’s characters in their twenties endured, what real people like them in London and New York and Paris and all over the world endured.

I won’t say that, watching, I didn’t sometimes think of the present pandemic. Above all, I reflected on how people latch on to sanitizing surfaces as a practice that gives them a feeling of control, even when there is no evidence that an infection is spread through fomites—in the process, implying that people become ill through unclean habits rather than through viruses. And as I intoned over and over, “Thank God (or medical science) for protease inhibitors and for PrEP,” I thought about the gay men who have been shamed for wanting to have sex in this epidemic. I spent my twenties in New York City: if it were thirty or forty years ago, those would have been my friends dying. It only takes the tiniest bit of imagination and compassion to see how, now that HIV is a manageable chronic health condition and we have drugs that can with near-certainty prevent its transmission, people might want to do what they can to pursue their sex lives, even in the midst of a public health emergency. Why shouldn’t they want—and have a right to—a key form of pleasure that was denied their forebears not only through the ravages of a disease that killed millions, but also through shame and stigma?

I weighed whether it was appropriate for me to say something about It’s a Sin. It’s not my story to tell, and by rights it should be left to the survivors of that time, or the younger gay men who in a more fundamental way still live with its legacy. But Davies offers others of us a way in through the character of Jill, the one woman in the friend group at the story’s center. While many of Jill’s friends spend the early eighties in denial, she avidly seeks out all the information she can find about the strange new illness. She becomes her friends’ advocate to doctors. She runs interference with their parents. She answers the phones at a helpline (one imagines it’s London Gay Switchboard, about which there happens to be an incredible podcast, for those interested in learning more about the history behind the show). She gets arrested at ACT UP demonstrations. She’s the visitor whom every patient and nurse on the AIDS ward knows. There’s a cynical reading of Jill: the asexual fag hag saint, the one woman, a bit pathetic, her life subsumed into those of her friends. She and a disabled single mother who doesn’t disown her dying son—another saintly caricature—are the only significant women characters who aren’t villains in the story. 

But there is also a more generous reading of Jill, and it’s this that I most want to take away from It’s a Sin. Jill is a character with immense moral courage, who steps up when her friends and her community need her. If I had been there in the eighties, would I have been a Jill? I’d like to hope so, but of course it’s easy to say that in retrospect. It’s a Sin is Russell T. Davies’s memorial to his dead friends, it’s a nostalgic evocation of what it might have been like to have sex in the toilets at Heaven in the summer of 1981, and above all it is a heartbreaking, unflinching tragedy. If it offers any kind of didactic lesson, it’s a subtle one. But it’s this: what are you doing now to be a Jill in the present crisis? Are you caring for your community? Are you informing yourself about immunology, while also putting compassion first? Are you campaigning for equal access to medical treatments for all? Are you holding the government to account? Are you mourning the dead, while fighting like hell for the living? I’m not sure that I’m succeeding at this, but I’m trying. I hope that you are too.

Should I do a PhD in history? Which country should I do it in?

Between finishing my PhD last spring and teaching master’s students this term, I have had more conversations than ever before about whether to do a PhD in history and whether to do it in North America or the UK. (Many of the people I have spoken to this year are, like I was, Canadian or American students pursuing master’s degrees in the UK and weighing whether to stay or not.) Finding myself saying the same things repeatedly, I thought I would distill here some of my principles for thinking about these questions.

The usual caveats apply: I benefited from immense structural privilege throughout my higher education, am fortunate to have only a negligible amount of student debt, and do not have any dependents. Your mileage may vary depending on your economic and family situation. 

Principle 1: A PhD is an investment in you, not in a future employment outcome—though it might also be simply the right job for you right now.

Hopefully I am not the first person to tell you this, but there are no jobs. The number of people earning PhDs in history every year far, far outstrips the number of traditional full-time, long-term academic jobs available. This was true before the pandemic and is even more true now. Statistically, the majority of people who earn a PhD in history will not be able to work in academia long-term, no matter how much they might wish to, how elite the program from which they graduated, or how qualified they might be in other respects.

Therefore, pursuing a career in academia should never rank top on the list of reasons why you want to do a PhD. Some compelling reasons to pursue a PhD might be because you feel intrinsically excited about the prospect of studying history at graduate level, because there is a particular long-term, book-size research project about which you feel extremely passionate and motivated, or because—regardless of what you wind up doing for a living after—a PhD is the right job for you right now. Graduate school might be the right job for you for all kinds of reasons. A term of at least five years (North America) or three years (UK) of guaranteed employment at a certain minimum standard of living might be more job security than you’ve had previously in your working life. It might be a vehicle for you to move to a certain city to follow your partner’s career or simply for the satisfaction of living there. You may have a clear sense of how you will balance developing a more precarious career (say, as a writer or artist) alongside your PhD work, which might provide an opportunity to pursue that creative work that you might not otherwise have had. (Though doing a PhD can be very all-consuming, and you might find that multitasking more difficult than you anticipated.) Few of us do the same job all of our working lives these days, and there are all kinds of reasons that starting graduate school might be a sensible, stable, and exciting choice for several years, provided that you are clear-eyed about the reality that it will not easily translate into a specific career outcome afterward, and that the work you will wind up doing after may well not be something for which the PhD has specifically qualified you.

For these reasons, I can think of very few circumstances in which one might reasonably want to pay for a PhD. Maybe if one is very wealthy to the point that money is absolutely no object; or if one is older (retired, say) and has a pot of money available specifically to put towards pursuing a passion project. But if you are thinking of starting a PhD in your twenties or thirties, and there is no reason to think of a PhD as an investment in a guaranteed future career outcome, it simply doesn’t make any sense to go into debt for it. You can’t look forward to a return on your investment in a way that you might, say, for American graduate professional degree programs like law school or medical school. I would recommend only pursuing a PhD if you are funded at a standard that is financially feasible for you to live on (taking into account other relevant circumstances like a partner’s income, etc.).

There are, of course, things that you can do both before and during grad school to make yourself the most competitive candidate that you can for academic jobs. Discussing these is somewhat outside the scope of this post, and would also be presumptuous coming from someone who hasn’t (yet?) secured a permanent job in academia. But when you’re just considering whether to apply, the important thing to remember is that there are no guarantees—and indeed, the odds are stacked against you—even if you do everything right.

Principle 2: There is no huge rush to start a PhD.

I often speak to prospective grad students who are anxious about the time commitment involved in a PhD and are eager to get started right away. I would advise against rushing simply for the sake of rushing. Again, there may well not be a long-term career in academia waiting for you on the other side of the PhD, so it is worth taking your time to consider a wide range of career trajectories and try out possible paths. Right after college is the time that it’s easiest to get a short-term fellowship or paid internship to try out a career in journalism or teach English in another country. You can be a paralegal or an editorial assistant for a year or two, then leave it behind if you don’t care for it. All of that is much harder when you’re in your 30s.

Furthermore, starting a PhD in your late 20s or your 30s might be beneficial. You might have some emotional distance from college, and be more prepared to treat the PhD as a job and not as an all-consuming experience (which will help to guard against burnout). You might have had the opportunity to pursue higher-paid employment, or to live with family rent-free, and thus to put some money aside or pay down some debt. To be sure, there’s a balance to be struck here—by the time I finished my PhD at 30, I was very excited to be rid of the minor indignities of being a student and to make slightly more money (as I am extremely fortunate to do in my present position, though that is of course not guaranteed). And of course not everyone has other job opportunities, and grad school might be the best way to nail down an income right now, to secure a necessary immigration status, or to address other practical considerations.

If you’re able, though, taking a year or two out after undergrad or a standalone master’s will give you the time to make a considered decision about which programs to apply to and what broad area of research you would like to pursue. (If you are applying to UK PhDs, I would think you would definitely need that time to develop a sophisticated and thorough thesis proposal.) It will also allow you to draw on a completed undergraduate or master’s thesis in your application, showcasing the best evidence of your capacity to take on graduate-level original research. The programs will still be there next cycle, and a year is not very much time. If you don’t quite feel ready to give the applications your best shot, and you’re financially able, then wait.

Principle 3: A UK PhD and a North American PhD are not, actually, the same qualification.

As you probably know, the structure and norms of the PhD in the UK and in the US and Canada are very different. The PhD in North America includes two years of coursework, comprehensive exams, language training, and a fairly substantial teaching load that is at best extremely valuable professional experience and at worst exploitative grunt labor (sometimes it’s both at once). It typically takes 6–7 years, with a standard package guaranteeing funding for five of those years and the rest made up with full-time teaching or external fellowships. While at many UK universities grad students do increasingly assume (increasingly exploitative) teaching responsibilities, the degree is still oriented almost exclusively around writing a thesis. There is no coursework and there are no exams (the one-year standalone master’s does not, in my view, approximate the extent of the work one does in the first two or three years of a North American PhD). The degree typically takes around four years, with three of those funded in a standard package. These are different and on some level incomparable experiences, from which one leaves with different skills and having been prepared for different job markets.

My experience of doing my PhD in the US was that the coursework, exams, and teaching experience were absolutely invaluable both in preparing me to write the best dissertation possible and in preparing me to work in academia, including in my current role at a UK university. Like most US PhDs, I did not write the dissertation I imagined that I would write when I applied to my program. The intervening years allowed me to develop a larger and more sophisticated project and to conduct the huge amount of archival research necessary to execute the project. It is difficult to imagine how I could have done as well as I did on the US tenure-track job market last year if I didn’t have the breadth of teaching experience and other professionalization opportunities that I gained through my US PhD; that experience has also served me well in being able to pursue teaching and other professional opportunities within my UK university. My sense, based primarily on my own experience, is that if it is important to you to be legible to and competitive within the academic job markets on both sides of the Atlantic, the North American PhD is the better bet—and well worth the added time commitment, not least because that’s three more years during which one has more secure paid employment than one might have otherwise, which is not something at which any of us can turn up our noses these days.

That said, optimizing your life around the tenure-track academic job market is, for the reasons discussed above, in some sense a fool’s errand, and there are several countervailing reasons one might wish to do a PhD in the UK. A particular supervisor, research group, or large research-council-funded project based at a UK institution might be exactly the right fit for you to work with. You may actually know that you do not wish to pursue a career in academia after the PhD, and that what you need out of grad school is more of a residential fellowship to write a book, for which three years of income and access to a research library is actually more useful than all the trimmings of a North American PhD. There might be personal or family reasons for you to be living in the UK, and settling in the UK long-term (with an academic job or not) might be your desired outcome. It is always fine to prioritize personal considerations in terms of where to live for grad school! Though doing so sometimes involves tradeoffs of various kinds, and it’s important to be clear-eyed about what those tradeoffs are. The important thing to bear in mind is, I think, that if you start a UK PhD with the desire ultimately to seek academic employment in North America, you should be very aggressive from the start about maximizing opportunities such as transferable teaching experience and connecting with North American mentors and peers, so as to counteract some of those transatlantic differences in the nature of the qualification and the academic culture.

Destroying the Nuclear Family

I walked down to the end of Christopher Street Pier on Monday afternoon. It’s not really Christopher Street Pier, of course, not as it used to be; it’s a gentrified park with a neat little rectangle of grass, which in this strange pandemic summer has been marked out in circles of paint spaced six feet apart. Yet nearly every circle was occupied by a man or a small group of men, sunbathing in underwear or a very small bathing suit, social distancing but on display. “Nature is healing,” I texted a friend, in reference to the meme that has been deployed since March to describe everything from wild animals roaming city streets during the lockdown to flour and toilet paper returning to supermarket shelves.

The last day I went to work in person, I distributed my dissertation to my committee. The day after that, I started listening to Maurice—a book I reread about once a year—on audiobook on the daily walks in the local park that I took to get out my nervous energy during those days in March. Since then I have revisited all of the collected works of Hollinghurst, Sedgwick too; caught up on Andrea Long Chu and Daniel Lavery; watched every gay TV show available on a streaming service; watched documentaries about Keith Haring and Roy Cohn and dozens of videos by Nelson Sullivan. Before it got too hot, I walked every day in the park, sometimes with a friend and sometimes on my own. Amidst the small family groups we saw in March and April and May, we always noticed the gay couples, more than we thought we would see in our part of upper Manhattan. And then there were my friend and I, not a family unit, but thrown together by the pandemic. For weeks we kept each other sane: talked about what we had done that day, what we would do tomorrow, and about gay literature. One day in April or May we walked over a long bridge suspended high over the Hudson. We counted the tugboats and kayakers. We walked to Yonkers to buy my friend a bike, and then we biked all the way from the Bronx down to lower Manhattan on an errand. On the way downtown I said to my friend, “My goal for this summer is to destroy the nuclear family.”

I think many of us had the experience at the start of the pandemic of feeling suddenly extremely connected—if anything, perhaps too connected. I have split my life between two countries and know a lot of people. There were Zoom reunions in abundance, emails from people I hadn’t heard from in years, a group text of college friends that sprang into solicitous action when one of our number came down with COVID. Not all of these communiqués lasted, but one re-establishment of connection, with a dear friend in another country with whom I had been keeping in touch less regularly, solidified into a weekly Skype call that I cherish. In the spring and early summer, my friend and I talked about how this time was clarifying for us. Alone in one-bedroom apartments, both long-term single women, we talked about how we couldn’t imagine sharing our hard-won and cherished space with another person in lockdown. Yet at the same time, we talked about the kinds of intimacy we wanted for our futures, the values we shared, the people we loved. Both of us had quickly decided we weren’t really going to spend lockdown on our own. Before public health experts made talk of the coronavirus “bubble” commonplace, we came to ask after each other’s bubbles as if they were family.

I have long wanted a chosen family, and my coronavirus bubble of two New York friends is the closest I have ever come to one. I have come through the pandemic better than most: I have kept my employment and my apartment, and have had a lot of help to manage the challenges of an impending international move. But more than that, my life has been richer during the pandemic than it was before, by dint of the addition of the two members of my bubble. I cherished the intimacy of seeing the same people every day, of knowing what they had been doing, of knowing that they knew what I had been doing. I found meaning in being the person who could be called on to help if any help were needed. In March in New York, when there were refrigerated trucks parked outside the hospital near where we live and it seemed as if there was a possibility that any of us could fall very seriously ill, one of my bubble friends and I exchanged emails with our families’ contact information and informal health directives. That moment was scary and serious. I wrote a will that day, and I felt dizzy. But it opened up a new horizon of possibility for me. As someone who will probably never get married, it had not occurred to me that I might not have to die alone. As it happened, bathetically, I haven’t died of COVID yet, but the other day I did trip in the street and sprain my ankle. While I was disoriented in pain, those same two friends bundled me into a taxi and took me to the urgent care clinic. They waited for me, even as I texted them from the exam room to say I would be fine and they should probably go home.

When I was maybe 16 or 17, my cousin Julia came to stay with us in San Diego. (I hope she won’t mind me telling this story, as the only non-anonymous person mentioned here.) We had gone to one of the used bookstores in Hillcrest, and I walked out of the store but she was still taking her time. When she emerged, she bestowed upon me an entire set of the collected works of Armistead Maupin, the chronicler of queer San Francisco who popularized the phrase “logical family.” Julia has since become famous nationwide, adapting her insights from public health research on HIV to the present pandemic. I tell everyone I meet about her essays in the Atlantic; more often than not, they have heard of her already. She has also profoundly informed my own understanding of how my bubble and I, and my wider social circle, manage risk and harm reduction around COVID transmission. When a friend I had had a drink with a few weeks ago texted me to say that they thought they might have come down with COVID symptoms, I thought, what would Julia do? Chosen family can include biological family too.

Equally, back in early June or so, not long after Larry Kramer had died—the one celebrity death in these months (and not even of COVID) that really shook me—a columnist in the New Statesman opined that people wouldn’t forget the coronavirus pandemic in the way that they had easily forgotten AIDS. I raged and raged on that day’s walk to the park. I have thought about AIDS every day since March. I can’t stop thinking about AIDS. I think about AIDS every time I put on a mask, every time I hear another friend has fallen ill, every time I read a statistic about the worse health outcomes of low-income people and people of color. I think about AIDS every time I wonder what will happen if I get COVID. I think about AIDS every time I wonder whether the risk of COVID means that it is worth leaving my house, or taking the subway, or moving to England. I thought about AIDS when we walked through the streets of lower Manhattan on the last Sunday in June, those streets for the first time in decades filled with angry chanted slogans, not corporate floats. I have thought about AIDS all this long hot summer, my first and probably my last full summer in New York, devouring every scrap of gay literature and film and television I can lay my hands on.

It seems important to emphasize that destroying the nuclear family does not imply any rejection of my parents and my sister, who remain some of the most important people in my life. It means, rather, to say that I love my parents and my sister out of free will, not out of obligation. I should like to think that my love for them is better for knowing that it does not stem from social convention or mere filial piety. Moreover, I have known ever since I was old enough to know such things that I would never have children and very likely would never marry. When you destroy the nuclear family, you have the opportunity to start anew, to imagine other possibilities. The last time I wrote substantively on this blog, last summer, I suggested that the dominance of the “born this way” narrative in framing claims for LGBTQ civil rights in twenty-first-century America may have led us to overrate the potential for choice in queer identities and communities. What is queer community if not a radical openness to loving as we are called to love, to loving out of freedom and not out of obligation? What is it if it is not alternative kinship structures, staying friends with our exes, caring for the young people in our lives regardless of whether they are our own children, the potential to love the one-night stand you never see again as much as you do the partner with whom you spend decades of your life? Of course, twenty-first-century queer people invented none of these things. But a respect for queer heritage can remind us that they are available to us.

The day that I walked down to the end of Christopher Street Pier, I saw another friend, for the first time since the pandemic, as well as the first time since I’d defended my dissertation and gotten a job. “What have you been thinking about?” he asked me, as we did a loop of his block downtown on the hottest day of this long hot New York summer. “I’ve been thinking about destroying the nuclear family,” I found myself saying again. I meant it as a glib and witty remark—I have known this friend since I was nigh on a child, and I had long aspired to make remarks as glib and witty as his could sometimes be—but at the same time I really meant it. From the days in the beginning of April when the three of us in my bubble glanced furtively around to see if anyone was going to shame us for interacting with people who were not our spouses or children, when it sometimes seemed as if we were the only people in our peer group who had not moved back in with their parents, I’ve been thinking about how little it takes, really, to live a life that is quietly disruptive. And I have been thinking about how much of how the pandemic response has been organized, in various countries, has been designed to privilege the nuclear family and to subjugate other forms of affective relations. All this talk about “households,” about “staying home.” All this shaming of low-risk behaviors like meeting your friends in the park or at the beach. All this assumption that one parent or caregiver can be available to homeschool children. The isolation of the most elderly and vulnerable in nursing homes, the lack of social safety nets to support those whose multigenerational families live in overcrowded conditions. The rules in the Schengen area that you can cross national borders to visit your romantic partner, but not your dear friend. Surely we have the capacity, at the very least, to imagine a public health regime that seems less explicitly designed to punish difference. 
And surely, if we are—whether we like it or not, whether we have a choice or not—living our lives, as ordinary as they may otherwise be, outside of normative kinship structures, we have a duty to be brave and bold, cherish every moment we have on this earth, love one another, and convince by our presence.

Wikipedia, hilariously, informs me that St. Christopher is the patron saint of bachelors. He is also, more famously, the patron saint of travelers. When I came back from the pier on Monday afternoon, I walked the entire length of Christopher Street back to Greenwich Ave: ritualistically, committing it to memory out of a sense that I might never be back there again. I wanted to make sure that ages and ages hence I remembered that I was here, this summer, the summer of the pandemic, and that in this summer I had lived my life in the best way that I knew how.

Coming Out, Again and Again; or, In Which I Demonstrate That I Can Spin Out 2,000 Highly Personal Words In Response to a TV Season Finale

I just finished watching Gentleman Jack, and my overall view of the final episode is: cloying Hollywood ending is cloying. Without question the best part of this episode is Sofie Grabol as the Danish queen; second-best is how, in just a few brief well-chosen scenes, the first half of the episode efficiently and effectively conveys how lengthy and arduous all journeys were before trains. The split wedding scene itself was okay, using the language of the BCP to good effect to show how—as was the case generally—people were creative in adapting this familiar, powerful language to the rhythms of their own lives. (I don’t think I have ever actually heard the 1662 Eucharist in my life before? But in those pre-Oxford Movement days how did they just walk into a random church in York and find a communion service?)

Last week at the conference I had the somewhat heady pleasure of being welcomed into a lunch circle of queer women and trans senior scholars who had all devoured this series, most of whom know the text of Lister’s diaries much better than I do, and were talking animatedly about the whole phenomenon. For me, it was an extraordinary experience to be welcomed and included in this conversation, but I did fit in: pretty much none of us liked the presentist depiction of the Annes’ relationship or the way that Lister is represented as making born-this-way arguments about a sexual-object-choice-based identity, though we did think the series gets some things right about Lister’s gender identity and we all loved the estate management/Tory politics stuff. (People thought I was funny when I called it “Gay Poldark,” which I stand by.) On reflection, though, I am kind of struck by the gulf between this collective opinion and a wider one, perhaps, of the “can’t we just have this one thing?” variety, which sees in Lister the potential for an inspiring, exciting, sexy fictional character. As I write this, I apprehend something of how annoying people probably find it to be married to historians, always coming along to spoil a pleasant night hanging out on the sofa by nit-picking about television, which is always of course necessarily fiction. That those of us sitting on the grass outside the Birmingham history department on Friday were nit-picking about how you might act on screen the idea of, in 1832, defining your identity in terms of gender inversion and not sexual object choice—and not, like, the costumes (or, you know, what “death recorded” means)—is perhaps no less annoying to those who admit that the point of fiction is that it can convey something other than what actually happened in the past.

This has been an extraordinary few months for my own sense of identity and political belonging, with June fifty-years-since-Stonewall at its center. The work I did in preparation for our Sexuality & Erudition workshop at Princeton in mid-May brought me, slowly, back towards primary and secondary sources I hadn’t thought about in years, and the whole Naomi Wolf contretemps caused me to reread my senior thesis and remember how much I cared about Symonds and also why it mattered so much to me that both academic pasts and usable pasts get the queer past right—for I feel such great love for the queer past that I hate to see it manipulated or misconstrued, even when that’s in the interest of presenting a happier or a more accessible, less complicated story. The work I presented at the conference last week was kind of a confluence of the Sexuality & Erudition stuff and the axe I was grinding at Wolf—and while I wrote a dissertation chapter in the middle of all this too, about national politics and higher education policy in the 1900s-20s—it sure doesn’t feel as real or as pressing as the Sexuality & Erudition paper or the Wolf review or the paper for this conference or the job materials I have been worrying over wherein I pitch The Second Project.

On Thursday, when I had been sitting in classrooms listening to people speak about postwar British social history all day and was getting a little antsy, I spotted a Misuse of the Queer Past on Twitter and issued a snide and tetchy tweet about it, closely related to the paper I was about to give at 9 the next morning. This had a good reception on Twitter; my friends started making fun of me for bashing Stonewall (the organization); and then there was the paper and the heady queer lunch talking about Gentleman Jack. Somehow all of this caused me at some point, casually, to call myself a “professional queer,” words that I had not uttered in probably a decade. And that act of naming, in turn, unleashed a whole lot. I was still talking to my friends whom I get to see once or twice a year (another reason why this conference was to be cherished), and I heard, as if someone else were speaking it, the clarifying sentence—true, but strange to my ears—“I used to cover LGBT politics for a national DC-based magazine.” Was that really me? Yes, I know that I used to sing in an Oxford chapel choir—but Emily Rutherford, Dyke Reporter? In the days since, more scenes have flashed suddenly and vividly in my mind, some long forgotten. The time I, like every left-wing intern before or since, snuck into CPAC and got thrown out. The time we carried a sign that said “Even Princeton” in the March for LGBT Equality in Washington. The time every student at Princeton woke up to my bowl-cut 12-year-old-boy head adorning a character-assassination piece in Princeton’s conservative magazine. But the one that stayed with me and kept me up late last night was in the spring semester of my freshman year, when I met two women on campus, a professor and a senior administrator, who dressed like I did then, in men’s or men’s-type collared shirts and khakis. 
In those days I had worked in a cinema, as a theater electrician, and as a shelver in the library, places where women regularly wear men’s clothes. But until I met that professor and administrator, I genuinely did not think that it would be possible to get a professional job, or one that involved any kind of public-facing work like lecturing, if you were gender-nonconforming. Thinking about Anne Lister, and about that extraordinary lunch on the grass on the Birmingham campus on Friday, what I get stuck on is the time in the spring of 2009 that I was sitting at a big table in a conference room during a committee meeting, and I turned to the person on my right, who was chairing the meeting, and realized she was dressed just like me. I feel a rush of emotion still, as I write this, many years since anyone has yelled at me for being in the wrong bathroom and so, so much older and more tired.

I have not been in a relationship in many years now—increasingly, and increasingly emphatically, by choice. For me, it in part follows from that state of affairs that I have felt more freedom to present as a woman without feeling that I have to get the illusion exactly right in order to keep the society of women from taking my woman card away. I remember being horrified when I was 14 and my mother suggested to me that I would not be socially ostracized for wearing a dress one day and jeans the next; fifteen years later, that is more or less exactly what my wardrobe looks like. On the other hand, as sex and relationships have faded farther from my purview, I have felt much vaguer about claiming any kind of sexual orientation, any kinship with others on the basis of sexual orientation. I have thrown myself into learning the wider field of British history, into travelling a lot and teaching my students, and into the minutiae of intramural politics at a few universities over a few decades in the past.

This has all, though, these last few months—the workshop in Princeton, and the intense and exciting collaborative experience of planning it with my co-conspirators; nominating myself on the internet as the guardian of John Addington Symonds’ legacy; that lunch on Friday—given me a yearning to belong to queer community again, to actually affirm that I am of my people instead of staying silent out of a kind of reticence or embarrassment or privacy (or just, like, dissociation from my body) and hoping people will assume the right thing. I am grateful to the queer communities who have accepted me through my implications rather than assertions up to this point, though I feel that I have to come clean that I am not, as I think many have sometimes assumed absent any kind of indication or clarification whatever on my part, a lesbian (but, you know, nor was Anne Lister, and she still gets a blue plaque saying she was, so).

In the past I have dated and been in relationships with men and women; the needle on my Kinsey scale has fluctuated wildly and continues to do so on an almost-daily basis. I have, however, been single and celibate for nearly six years, and am increasingly affirmatively committed to celibacy as a concrete lifestyle choice and identity-political category. This does not mean I am asexual, but it does mean that I choose to organize my life around principles other than sex, dating, relationships, marriage, and children. This may change in the future, but it currently holds disproportionate significance for me given that I am nearly 30 and everyone is getting married these days, regardless of the gender of their spouses.

One of those women from college who wore button-up shirts said something beautiful once. We were at a professional dinner and another professor there asked her something about her wife. My mentor clarified that she and her partner are not married: “We’re old-fashioned gays, and we don’t believe in marriage for ourselves. But I’m so happy about marriage equality. There’s no use making a political statement to opt out of something unless there’s something there to opt out of.” I think the “born this way” narrative has led us to underrate the element of choice in queer lives, relationships, and communities. Queer people past and present do, all the time, make daily choices about whether and how to fit in or stand out, to break the law or not, to do so in secret or in the open. When we constantly walk the boundaries of what is acceptable, we know that every aspect of how we dress or how we walk or whom we look at is a choice fraught with meaning, sending a signal about cultural affinity both to those hostile to us and to those on our side. (I remember the electrifying cruising ground that the college dining hall could be, how expert I became at detecting a certain kind of gaze that one man would give another across the tables.) This is no less true in those places today where the medico-legal regulatory regime has been persistently and steadily domesticated to the point of collaboration with queer self-fashioning.

Which is all to say that I am still queer despite my celibacy, and furthermore that (I would contend) my celibacy is a kind of queerness, for all that it is a choice: something which positions me as just as at odds with homonormativity as with heteronormativity, a form of resistance to a kind of totalizing logic about what sex and sexual orientation have to do with one’s personhood. Which is all to say, phrased a different way, that Call the Midwife—and not Gentleman Jack or Fleabag or Killing Eve or The Handmaid’s Tale or Game of Thrones—is the most radical and remarkable show on television, and queerer than you might think.

On (Academic) Writing

For several months now, I have regularly been posting on social media photos and screenshots of my efforts to write my dissertation. Some might call it self-aggrandizing, but I have found this habit—and the “likes” and comments it generates from my Facebook and Instagram friends—to be a powerful source of motivation that keeps me cranking out words daily. Because I post these photos, I often receive requests to describe my writing process, and in particular what I am doing with the hundreds of little slips of paper with which I cover my desk and bedroom floor. So (and really this is just an excuse to procrastinate on cataloguing the endless varieties of misogyny expressed by interwar male undergraduates, which I find rather wearing) I thought I might write something describing how I write.

It’s necessary to begin with several caveats. First, there are no objectively better ways to write—or to do research, or any other academic or creative task. People find the systems that work well for them based on their learning styles, writing styles, and other life circumstances (for example, if you have children, your writing process may look very different to mine!). Second, while I am probably fairly good at being a fifth-year graduate student, I am not the right person to advise peers at the same career stage as mine on how to write a dissertation, because I haven’t written one yet. Furthermore, due to a combination of the structure of my program and an extra year of funding through an additional internal fellowship, I have not had to teach for the last two years and thus have been able to engage in an intensive period of research and writing. While I do part-time work to supplement my income, it amounts to less than ten hours per week, and I mostly do it in the evenings and weekends. I say all this not to brag, but rather to observe that structural privilege accumulated over many years affects our ability to write in the way that we would most like. For example, the flexibility of my schedule allows me to block out a few hours every morning, at the time when my brain is sharpest, to devote only to my dissertation, a circumstance available to few graduate students. I am thus not the right person to advise the many peers whose path through graduate school has been considerably more difficult than mine.

In general, most of the brain-work I do is not very systematic. I have a free-associating mind rather than an analytic one, whose natural tendency is to dive very deeply into one topic rather than assessing patterns across different cases or bodies of material. I mostly do history by collecting very large quantities of archival material and then simply telling the reader, at some considerable length, what it contains, before embarking upon an editing process in which I compress that description to a more manageable size. When I get round to writing, therefore, I have spent many weeks or months in an archive or several, and have hopefully had the time to organize my findings, taking apart the Word document of hundreds of pages in which I have recorded my stream-of-consciousness impressions of the archive and creating an entry for each document in my Zotero database. I don’t do much in Zotero beyond recording the correct citation information and organizing all the documents by repository and by date. But I can then print off that database—the poor trees, but for me the most essential part of the process (enabled by the laser printer I bought out of the research account I am exceptionally fortunate to possess)—and get on with writing.

The writing, then, starts with several hundred pages of source material, and a space like a floor or a big table, large enough to lay it all out and get a visual sense of the shape of it. Even a large computer monitor doesn’t quite provide the scope for this, for me. Spreading it out makes it possible to see patterns—for example, if student politicians across many different institutions all became preoccupied by a particular topic in a particular year—and also to trial different ways of structuring the argument and exposition that I might follow over the course of the text. It’s actually possible to skip this step if the piece of writing is very short (such as a conference paper or a blog post), since in that case I can hold the whole structure in my head at once and/or take a less comprehensive approach to incorporating all the source material. But for a dissertation chapter, it is absolutely essential—I shouldn’t be able to do any actual thinking without it. I used to nail down a structure on the floor and then paste all the slips into a notebook, but more recently as I have been working with larger and more complex quantities of material I have found that I want to revisit and revise the structure as I write. I keep the slips laid out on the floor, or file them in a more mutable way with paper clips and folders.

So that’s what the slips of paper are. They become the raw material out of which the story is written up, and I interpolate, in my own words, context and analysis that lead the reader from one slip of paper to the next and sum up what the whole picture is. To draft, I use a piece of software called Scrivener, which allows one to do various fancy split-screen things that make it possible to see, for example, an outline, multiple chapters, and the footnotes all at once, and to save notes and chaff in different folders as part of the same database as the main text. (Scrivener costs $38 with an educational discount; I am fortunate to have been able to afford to buy it.) I write at the rate of about 600–1,600 words per day, which will take between two and four hours of concentrated, intensive thinking that typically leaves me too exhausted to do brain-work the rest of the day. I do this about six days a week, every week (it has been many years since I took a vacation). I usually write between 10am and 1pm, and do other work, paid work, or housework in the afternoons, but occasionally this varies or I have enough energy to be able to write all day. I try to push myself if I can, but listen to my body and stop if I can’t. I never write in the evenings, and only rarely, under extreme time pressure, do other work at that time.

While on the face of it this quantity of word production in this amount of work time might seem productive, it is actually not efficient at all. With so many slips of paper, it can take weeks to progress through a given section of the text as it is laid out on the floor. Almost every aspect of writing the dissertation so far has taken much longer than I had imagined it would, and my goal of finishing a draft by September has slipped progressively further out of reach. Moreover, this word production is the first in a series of revision stages. The chapters that have so far reached a first-draft stage have come in at 35,000–40,000 words: baggy objects that are not only of an inappropriate length for the genre of thing that they are, but would be much too onerous to ask anyone, even my long-suffering advisor, to wade through. It takes another series of weeks to sculpt this material into something that looks like a chapter. My goal is always to get it under 20,000 words before I show it to anyone, through a painful and painstaking process of winnowing. Then comes the slow round of workshops and seminars, getting feedback from expert and general audiences that will be the basis for further revisions. In previous writing tasks this process, potentially endless, has been put to a stop by a university-imposed deadline or by publication of one sort or another (in a journal, on a blog). But in this case I am several years away from the book’s going to press, which will mark the formal end of writing.

My way of thinking about and doing writing has been influenced—I am still realizing how much—by the semester I spent in John McPhee’s writing workshop in spring 2009. McPhee, a longtime New Yorker correspondent and the author of many works of long-form journalism, has taught a creative writing seminar at Princeton for decades. Most of the students are aspiring journalists, as I was then—though most were much better writers than I was, and I think McPhee was rather tired of my tendency to insert myself into every piece of writing I created for the class. The main lesson McPhee impressed upon us was about structure. Clarity, he said, comes from the structure the writer creates, which leads the reader through the events or concepts discussed in the text and slowly opens out its central meaning to the reader’s understanding. In class, we scrutinized the diagrams McPhee had drawn to visualize the structure of his own essays, often metaphors taken from the natural environment which has been the subject of so much of his writing: a snail shell, a wave. This class was my first realization of how important it is to be deliberate and disciplined about writing, rather than just expressing one’s feelings. I took to heart the structure lesson. It fell to the back of my consciousness when I became disillusioned with journalism and began to study history, however, and I was surprised recently when reading a New York Times review of McPhee’s new book to be reminded that his office, like my bedroom, is filled with little slips of paper that he obsessively arranges and rearranges in an effort to make meaning. Evidently, I had taken away more than I had imagined; evidently, too, there is more than one way to be a writer. But now I, too, tell my students about structure.

I have always had a natural aptitude for writing, though it has improved considerably as I have received more formal instruction in the subject and have been prompted as a professional academic to think more consciously about suiting one’s writing to a specific task. As an adolescent, I perceived writing primarily as a means of self-expression, and rebelled at restrictive formats. But now I see it primarily as a means of communication, and am happy to embrace formats as constructs that convey meaning and aid the reader. A dissertation chapter, a grant application, an online essay for a general audience, and an email all have different audiences and purposes, and the format and style one selects must vary accordingly to be effective at getting one’s message across. I believe it is possible to (strive to) marry beauty and eloquence with clarity and analytic rigor. I admire in others, and try to achieve, a style which is engaging but also articulates a clear argument and makes it possible for the reader to follow it without working too hard. I have had interesting conversations with colleagues who opt for what I might call a more subtle or literary style, to tell a story more than to deliver an argument. It is clear that this is a matter of personal preference that might vary, and that when workshopping others’ writing it is important to respect what their aspirations for their own writing might be.

It is difficult, however, to remain true to personal preference in an academic environment, where the sense that one is constantly being evaluated affects one’s ability to follow one’s instincts. Almost daily, my feeling of pride at having created 1,000 usable, relatively intelligent words in a morning is subsumed in my feelings of guilt, anxiety, and exhaustion: are those words good enough? why am I so tired now? how might I summon up the energy to do an afternoon shift? how will my advisor, or a conference audience, react to them? am I making enough progress to finish on time? will anyone else find this interesting, or just me? If I had to live solely by selling my words instead of by the strange cocktail of things that might see one lucky enough to win the academic jobs lottery, I might be asking different anxious questions. But I have often felt that if I somehow had income coming in regardless and could still be writing for my own gratification, I would be much more confident in my ability to do good work, and to stop when I am tired and it isn’t possible to do more. It does not help that many in permanent jobs seem often to advise advanced graduate students about writing in precisely the same tone they would take with undergraduates ten or fifteen years younger, forgetting that grad students might have extensive professional training and experience in writing and editing—or, as I indicated above, that writers’ own personal processes and styles may differ and that what works for one person may not work for another.

We all, whatever our level of seniority, I think, often forget that work does not always look like work, and that there are all kinds of non-obvious pursuits—baking, going for a walk, sitting in the pub with friends, playing video games, childcare—that for many are actually an integral part of the writing process. For those who do their best thinking while doing something else, it is necessary to set time aside in the day not only for office/desk time but also for these activities. It is necessary also, whether an individual graduate student’s routine includes caring responsibilities or not, to respect that a 9–5 office schedule is not everyone’s best way of achieving work–life balance. To me, the greatest benefit of academic work has always seemed to be the flexibility of its hours. While for some the babysitter’s schedule is the most important constraining factor, or the 9–5 really is an ideal setup, for others the time that has to be set aside is late at night, or as in my case the hours of 10am to 1pm and those alone. For others still, the luxury of being able to do the grocery shopping in the middle of the day on a weekday may be the one thing that allows them to get the housework done alongside their more-than-full-time job. Much as the writing process takes many forms, and not everyone may be best served by drowning themselves in little slips of paper, so too is it the case that different people work best at different paces or at different times or in different ways. We do not guard work–life balance in the way best-suited to making academia accessible to everyone if we insist that everyone conform to a 9–5.

And with that, I am off to play video games. A happy 2019 to all!

On the Centenary of “Nine Lessons and Carols”

Among this year’s many centenaries, English choral music nerds are celebrating the hundredth anniversary of the “Nine Lessons and Carols” liturgy as celebrated in the Advent and Christmas season in many parts of the Anglican Communion. Lessons and Carols has a lot to say about the intellectual, elite cultural, educational, and religious history of modern Britain. It has a prehistory, a prototype of the liturgy having been conceived initially in 1880 by the writer and churchman E.W. Benson, then Bishop of Truro, for use in Truro Cathedral. It was of a piece with the liberal Anglican Incarnational theology of the mid-to-late Victorian period, departing from strict conformity to the Book of Common Prayer to offer a modern, not an early modern, form of plainspoken English prose, wrapped up in one particular story of the Word made innocent flesh.

As the story goes, Dean of King’s College, Cambridge Eric Milner-White picked up Benson’s idea in 1918, and repurposed it for a new era. Everything about the Chapel of King’s is invented tradition: though construction began on it immediately after the foundation of the college in 1441, most of the internal decor comes from the mid-sixteenth century, when construction was completed and Henry VIII re-endowed the foundation. King’s as an academic institution in its own right, and not simply a retirement home for Eton scholars, is also an invented tradition, this time of the 1850s-60s; and the conceit of the Chapel as one of the nation’s great musical institutions is an invented tradition, too, born out of the renaissance of High Anglicanism of the early-to-mid-nineteenth century and, among other things, out of the fact that fellow of King’s Oscar Browning rather fancied choristers. Of course there had been collegiate choral foundations for centuries, and Eton and King’s had always had choirs at their heart (the Eton Choirbook is our best source for pre-Reformation English liturgical music). But with the Victorian period and its romanticization of childhood came a return to the English male-voice choir, with its characteristically heavy treble sound born of having about 18 boys on the top part and four adult men on each of the other three parts. New composers wrote new music for these reconstituted choirs: veterans of Anglican choral music will know C.V. Stanford’s canticles, for example, some of which were premiered by the King’s choir when Stanford was Professor of Music at Cambridge.

Back to Eric Milner-White: Milner-White became Dean at King’s in 1918, fresh out of army service, like a lot of other men who came back into the universities after the war, some of them starting over as mature students and some of whose studies had been interrupted. Many men in universities across Britain in the immediate postwar period had a great longing to return to a romanticized status quo ante, to imagine their universities as ancient foundations, filled with ritual and male bonding. They could be awfully mean to the women with whom they shared those universities. But at King’s there were, of course, no women to worry about (the larger university was another matter—Stanford was against women’s degrees, Milner-White in favor, to give you a flavor). For Milner-White, the task was to create something suited to the experiences and level of religiosity of the new generation of student-veterans, but which had the feel of ancient mystery. So he borrowed that Truro service, and added new elements, like the bidding prayer that (now traditionally) opens the service. It’s not hard to hear 1918 in the language “Lastly let us remember before God all those who rejoice with us, but upon another shore and in a greater light.”

It wasn’t all roaring in the Twenties: there was a slow ebbing of the high tide of immediate-postwar social liberalism, and there was the BBC, that great voice of high-minded national conservatism. There were multiple services of Anglican worship every day on the interwar BBC, and it first broadcast Lessons and Carols at King’s live to the nation and the empire on Christmas Eve, 1928. It has done so ever since, in later years adding a prerecorded television broadcast. The sound of the King’s choir made its way onto records and cassettes and CDs and into the multivolume series Carols for Choirs. Generations at this point have grown up hearing, and perhaps learning, 1950s–70s King’s Director of Music David Willcocks’s arrangements of carols such as “God Rest Ye Merry Gentlemen” or “O Come All Ye Faithful.”

The World Service and the Anglican Communion are two of those puzzling vestiges of British Empire that still linger, conspiring both to make invented traditions appear far more ancient than they are and to perpetuate an image of a chocolate-box, amusement-park quaint Britain, in which the revival of the King’s chapel and choir could certainly have nothing to do with Oscar Browning’s views on prepubescent boys and in which British power’s self-evident manifestation as violence could be forgotten. Nowhere was this more true than in the United States, where twentieth-century conservative Anglicans and Catholics and Anglo-Catholics more than ever revered an imagined, romanticized Britain. T.S. Eliot did, of course, and William Buckley, and many other people I have met in the years I have spent attending Episcopal churches in the United States, many of them my age or younger. If you want to hear a faithful rendition of Lessons and Carols on the far side of the Atlantic, the place to do so is at the corner of 53rd Street and 5th Avenue in Manhattan, where stands the US’s only Episcopal collegiate choral foundation. Most of the clergy and staff are English. The director of music there, late of Magdalen College, Oxford (another of the great medieval collegiate foundations) will be taking over at King’s next year. Entering that space off Fifth Avenue is like stepping into a parallel universe. The music sounds beautiful, but whenever I go I leave not knowing whether I liked it.

Most cathedrals in England and Wales, which daily preserve and rehearse the Anglican choral tradition, have in recent years come to recognize that there is little to no anatomical difference between the voices of prepubescent girls and boys, and manage to create the English treble-heavy sound with mixed choirs. Other churches around the country and the world that are less attached to that sound get on just fine with adult women singing the top parts. But Stephen Cleobury, the outgoing director at King’s, whose daughters have both sung in girls’ and mixed-voice Anglican choirs, has said that there is a distinctive value to the King’s choir remaining single-sex. In this he sounds rather like C.V. Stanford, and other Cambridge men just exactly one hundred years ago, who held that it was important that there be one single-sex university left in England. It sounds like something that could conceivably be true, but that no one manages to prove quite so much as state over and over again. That this is the same conversation that was happening in Cambridge a century ago speaks volumes about the history of English elite institutions and the ways they continue to reinvent themselves.

Lessons and Carols is such a quintessential invented tradition that it is even cited in the introduction to the famous Hobsbawm and Ranger Invention of Tradition volume, which coined the term and which most history undergraduates will at some point be asked to read. What place is there for invented traditions a hundred years on? We need to have conversations anew about the kinds of public ritual that speak to our present condition, and not hesitate to invent new traditions instead of rehearsing old ones out of a kind of misplaced empty piety. But what Benson and Milner-White and all their collaborators got right, I think, was that forms of music and art and ritual and worship created to speak more immediately to our times—to, in the quest of the Church of England ever since the Act of Supremacy, get bums on seats—must not condescend or presume to know their audience better than they know themselves. The Lessons and Carols liturgy is pitched high, but not abstrusely so, and it presumes that its audience can come meet it where it is. It tells us something about the interwar BBC that they thought broadcasting this worldwide was a genius idea, and what it tells us isn’t all good. But that people still tune in on the radio and TV tells us, I think, how many of us are still looking for a peace that is not to be found here on Earth, and not to be found with any certainty in megachurches any more than it is to be found in homeless shelters and war zones and refugee camps. There is no preaching in Lessons and Carols: it creates a space for us to be; and its layered liturgy centers, under the sediment of British history, on a family of refugees cast asunder by the administrative reach of a foreign imperial power. 
It is for candles and for music and for prayer and for peace, and rather oddly—and this is one of the mysteries that is, for me, at the heart of British history—it retains the power to bewitch, despite its deep association with the heart of a nation that has committed great acts of violence and done great evil in its citizens’ name. That, in its intention to tell “the tale of the loving purposes of God from the first days of our disobedience unto the glorious Redemption brought us by this Holy Child,” it should hit upon this particular national problem of evil as well, seems about right.


Note: There is extensive original archival research backing up the information shared in this essay, which is drawn from my Ph.D. dissertation. While I have not included citations in-text, I am happy to provide references upon request.

Had I been alive one hundred years ago, my life might have been a little like Rose Sidgwick’s. Sidgwick was 41 in 1918, and I am 28 now, but otherwise the similarities stand. Born in 1877, the oldest child of an academic family, Sidgwick had access to impressive educational opportunities, and finished her first-class BA in history at Oxford at the age of 22. Structured PhDs were not yet common for lecturers in the UK, and after her first degree Sidgwick lived with her parents in Oxford, pursuing the mix of part-time work, further independent study, and semiformal education common among many young people at the turn of the twentieth century who aspired to an academic career. While working in the Somerville library she met and began a relationship with the librarian and maths tutor, Margery Fry. Fry was shortly to take up a new position as warden of the new women’s hostel at Birmingham University; she negotiated a history lectureship—the spousal hire of its day—so that Sidgwick could accompany her. Sidgwick began her first (and what would turn out to be her only) full-time academic job in 1905, at age 28.

University House, the Birmingham hostel, must have been an extraordinary place to live and work. Fry, Sidgwick, and the like-minded women they hired to join their resident academic and pastoral staff sought to build a new kind of vibrant community for the young women in their care. Most of the first women’s halls of residence at UK universities, run by wardens of an older generation, were deeply worried about respectability, preoccupied by the need to assure parents that it was safe to allow their daughters to live away from home, and aware of their marginal (and sometimes contested) status within the university. The residents, mostly in their early twenties, complained that they were treated like schoolgirls. But Fry, Sidgwick, and colleagues such as Marjorie Rackstraw and Bertha Orange were part of a younger generation of women who had been to university themselves, and who were often inspired by the freer pace of academic and social life at North American women’s and coeducational colleges. Benefiting from the support of a vice-chancellor who prioritized women’s education and gave them a free hand, the University House staff treated their students with dignity while still looking after their welfare. Inspired, perhaps, in part by Sidgwick’s father, who had fostered a similar kind of community among his men students at his Oxford college, they opted for a kind of controlled silliness that had an implicit higher purpose. Students put on plays; they made a snowman in the image of the vice-chancellor; the staff could be relied upon to do a comic song-and-dance routine in the end-of-term entertainment. Through this, they knit together bonds that might sustain these young women in the difficult life-course they were undertaking. 
The overwhelming majority of women university graduates in this period did not marry; a strict binary pertained between marriage and a career; women who desired to work in professional occupations or otherwise pursue a public life faced an uphill battle. Single-sex community—and a physical space where young women could be themselves—was a compensation, but also an essential form of nourishment. The letters that Fry, Sidgwick, Rackstraw, and other Birmingham friends exchanged throughout their lives testify to the richness of the love that these women felt for each other.

A new-built brick building on a hill outside of Britain’s great industrial city is not exactly the archetype called to mind when one imagines nostalgia for the summer of 1914. But the safety and surety of this microcosm, too, would be fractured by the war. Shortly after the outbreak of hostilities, the building was requisitioned for a hospital. The male side of the university emptied, and many women students elected to train as nurses or pursue other forms of war work. The staff each faced a difficult decision about whether and how to help, which entailed consideration of deeply personal questions about religion and ethics, and about gender, that many found torturous. Fry, a devout Quaker, seems to have fairly easily concluded that it was her duty to engage in nonviolent war work, joining the Friends War Victims Relief Committee to bring aid directly to those civilians whose homes and sources of income had been ravaged due to their unfortunate location near the front lines of the conflict. As crop fields became battlefields, the FWVRC’s volunteers distributed food to starving villagers, and set up schools for traumatized children. Educated women had usually received extremely thorough training in modern European languages, and their ability to communicate with French or Russian peasants was in as high demand as their organizational skills. Fry—whose social circle seems widely to have perceived her as a paragon of selflessness—inspired many of her friends, including non-Quakers, to follow her into the FWVRC, where they spent the duration of the war in camp conditions encountering famine and devastation firsthand.

It had been decided that Sidgwick would remain in Birmingham to keep open University House, now squatting in some rented rooms that were not needed for medical purposes. Numbers of women undergraduates throughout Britain remained steady or rose during the war, even as their male colleagues, like other elite men, made up disproportionate numbers of the casualties on the Western Front. Communities of university women could be important sources of momentum for volunteer aid on the home front, as students undertook first aid courses, worked in national kitchens, picked crops in the university holidays, or used their degrees to enter graduate occupations previously closed to them. But Sidgwick struggled with guilt at staying behind. Her youngest brother, Hugh, fed up with what he perceived as the uselessness of his work as a civil servant in the Education Department, had joined up; her friends doing relief work were enduring daily hardships equivalent to his. As she received their letters and posted to France the blankets and toothpaste and candy they requested, she joined the League of Nations Union and lectured to her students and the Birmingham public about how Britain might participate in building a better postwar world—as well, of course, as doing a day job that before the war would have been done by three people. She also traveled back to Oxford very often to join the rota of mother and sisters caring full-time for her father, whose dementia was steadily worsening. But none of this seemed to her like a sufficiently noble sacrifice—and she keenly felt the widening gulf between her and her brother and closest friends, who were being traumatized by experiences she could not imagine. Her grief only worsened in September 1917, when Hugh was killed at Passchendaele, leaving behind his family and his fiancée, who worked adjacent to Sidgwick as a nurse at the University House hospital. 
Sidgwick felt obliged to put on a brave face, telling friends that she and her family were much more fortunate than others: the word that came back from Hugh’s comrades was that his death had been quick and relatively painless. But she knew that in the last year of his life Hugh had been expressing increasing anger about the pointlessness of the war and the duplicity with which politicians had sold it to the nation, and I read a hollowness into how she sought, by repeating it to others, to tell herself that her brother’s death had to be accepted.

While fighting continued on the Western Front until the proverbial eleventh hour, the wartime coalition that had governed Britain since 1916 did so with an eye also to building the postwar order. It was with this in mind that, in September 1918, University House received an invitation from the Foreign Office: would they like to send a representative to join a British Educational Mission to the United States, a delegation of academics who would meet with American colleagues and politicians to determine how universities might participate in a postwar Anglo-American alliance? Fry was the first choice, but, exhausted from her war work and with responsibilities to her aging parents, she suggested Sidgwick go in her stead. Everything happened very quickly: the five men members of the Mission had already set sail (only belatedly had someone in the Foreign Office suggested that some women delegates would be a useful addition), and Sidgwick scrambled to find someone to cover her teaching for the autumn term, ensure her father’s care was in hand, and tie up her various responsibilities at Birmingham. She and her counterpart, Bedford College English professor Caroline Spurgeon, sailed from Southampton; at the beginning of October they were met at the dock in New York by the senior woman member of the official welcoming committee, Dean of Barnard College Virginia Gildersleeve. With Gildersleeve as their host and guide, they embarked on a whirlwind tour of over thirty campuses in the northeast and midwest, an itinerary more crowded than that of the men, since they had to squeeze in visits to women’s colleges alongside the predetermined official list of institutions. Sidgwick initially doubted her ability to do the trip justice, but she quickly rose to the challenge, gaining a reputation as an effective public speaker about women’s role in building a new, modern, internationally-connected educational landscape.
She marveled at the sense of possibility and optimism in America, at the willingness to invest in women’s higher education, and at the social ease that existed among the students she met at women’s and coeducational colleges. She toured everything she could, museums and hospitals as well as campuses. It was October, and she looked out the train window, admiring the foliage. (A couple of weeks ago, in late October, I took a train from New York to New England and admired a not-so-different view.) The trip was life-changing: a chance to do something, be something, make something away from the violent forces that had ruptured her family and her friendships.

I am not an epidemiologist, but it seems possible that, had Sidgwick not traveled to the US, she might not have caught the deadly strain of influenza that, in 1918–1919, claimed far more military and civilian lives worldwide than the war. By late 1918, the infection had mutated to a less virulent form in Britain, but American soldiers returning from Europe re-imported the worst strain. The last entry in Sidgwick’s diary was a visit to the Metropolitan Museum in New York, a week before Christmas. Her immune system no doubt compromised by the punishing travel itinerary, she was admitted to the Columbia University Hospital with influenza, and died on December 28. She received the academic equivalent of a state funeral in the Columbia chapel, a High Anglican service (at that time, Columbia was officially affiliated to the US Episcopal Church) with high-ranking politicians, diplomats, and university administrators among the attendees. Her coffin—just like her brother’s—was draped in the Union Jack. Casualties of the First World War who died in the service of their country could take many different forms.

Though the chair of the British Educational Mission did not mention Sidgwick or Spurgeon in the memoir he wrote of the trip, Sidgwick’s death sent shockwaves throughout the community of university women. Numb with grief just after the funeral, perched on trunks in a New York hotel room, Gildersleeve and Spurgeon (who had just begun a decades-long romantic relationship of their own) resolved to found the International Federation of University Women in Sidgwick’s memory, tying together groups of graduates across Europe and North America in the name of internationalism and world peace. Its first act was to found the Rose Sidgwick Memorial Fellowship for a British woman pursuing graduate study in the US. At memorial services in Oxford and Birmingham, the tributes to Sidgwick poured in. Sidgwick had left her estate to University House; the new warden, Bertha Orange, decided to spend the money on a fund for books and travel for low-income students. A year later, after their father had died, Sidgwick’s sister Ethel made the trip out to New York to visit her grave at Woodlawn Cemetery in the Bronx. Ethel and their mother had written the inscription on the headstone, and Gildersleeve had seen that it was installed. It read, very simply, “In loving memory of Rose Sidgwick, of the Universities Mission of amity to the United States, died Dec 28, 1918, aged 41 years.”

I have told you Rose Sidgwick’s story on the eve of the centenary of the 1918 Armistice because the casualties of war are not only those who take up arms. Sidgwick did not have the pacifist convictions of her beloved friend Margery Fry, and remained genuinely ambivalent about whether the Great War was a just conflict. Her pride in Hugh’s sacrifice only turned sour after his death. But she, too, was a casualty of war, one whom we might care to remember this special Remembrance Day. The Sidgwick family lost two children to the war. Their loss was as real as that of any other family who lost a child—and as real as the loss that accompanied the fracturing of the University House community; as real as the deprivations endured by the families in France whom Fry and other Friends War Victims Relief volunteers sought to help, who saw their crops burned and trenches dug through their fields; as real as the grief of German-speaking parents who lost their children. Like many of the most ardent members of the League of Nations Union, Sidgwick sought a forgiving settlement with Germany, and a postwar order that would quickly incorporate it within a liberal community of nations. Like them, too, she probably gave less serious thought to the ways that postwar internationalism might align all too neatly with British and French imperial ambitions. But there can be no doubt that she, and the family and friends who survived her, knew that modern warfare does not discriminate among those on whose lives it drops its (literal, figurative) bombs.

The Rose Sidgwick Memorial Fellowship was still going strong in 1933, when members of the Women’s Cooperative Guild adopted the practice of displaying white instead of red poppies to protest the ways that remembrance observances were being used to stoke nationalist fervor and justify rearmament. The following year, the symbol was adopted by the newly-founded Peace Pledge Union (nondenominational but founded in connection with the Church of England), which continues to encourage its use to this day. It is above my pay grade to re-litigate the story of the appeasement of Nazi Germany, and the place that popular opposition to rearmament had in that story. And it is somewhat outside the scope of this essay to consider the ways in which this kind of women’s political activism intersected with popular imperialism (which it did, like basically every other form of British politics). But what the Women’s Cooperative Guild understood—like, I think, the International Federation of University Women, like Rose Sidgwick and her friends—is that if war is in some times and places a necessary evil, it is always and everywhere an evil. There is nothing glorious about war. Mostly it is death, and it is mud. And even when we imagine the antithesis of this particular vision of war bequeathed to us by the Great War—the sanitized, automated drone killings of today’s wars—the pervasive stink of evil endures.

When I was the age Rose Sidgwick was when she took her degree, and I too was living in Oxford, I sang in a Remembrance Sunday service in which the processional hymn was “I Vow to Thee My Country.” I know it’s a state church, but the conflation of nationalist battle fervor with Christianity sickened me, and has sickened me since, all throughout these years when the red poppy has become ever more commercialized and ever more mobilized as a litmus test of empty, amoral patriotism. For people like Rose Sidgwick and Margery Fry, and many like them who lived through it or didn’t, the Great War gave the lie to patriotism. This year I hope you will join me in remembering them, as well as the countless unnamed casualties of war before and since who were not given a choice about whether to identify emotionally with a conflict fought in their name or on their land or with their bodies. I hope you will join me in doing what you can to resist war and violence in your own lives and communities; in informing yourselves about the wars fought overseas, including those fought in your name, and whether you want to be party to such conflicts. There is far too much anger in our world, as there was a century ago. I hope that, like me, you seek to commit yourselves to peace.


Advice to Those Starting Graduate School

(geared—due to a friend’s request for such advice—to those starting a PhD in the humanities at an elite program in the US)

Always keep a sense of proportion and perspective, and keep alive your contacts with the world outside academia. Few problems you will encounter are unique to academia or uniquely bad within it; most have much more to do with negotiating the working world and adult life generally, especially in our present blah blah neoliberal precarious gig economy, which affects most of us under 40. Limit your engagement with academic social media and the social life of your department if you find that stuff only dials up your anxiety.

Take advantage of one of the really special things about academia—the opportunity for flexible working—but make it work for you. There will be times in the course of graduate school when your timetable is not your own and you won’t be able to impose limits on your work, but make sure you’re only working 24 hours/day when it is absolutely necessary. If you need to work 9-5 in order to impose boundaries around your work life, then great. But if you work better at other times, that’s great too. You do NOT have to work 40 hours a week, or any other set amount. Intellectual work happens at different paces for different people. You might find it more helpful to make to-do lists/set goals, and stop when you’ve finished them.

Some actors—university administrators, maybe your advisor or other professors—will try to treat you as if the category “student” means you’re not really an adult. But always act like the professional adult you want to be treated as, and insist on that treatment if it isn’t automatically given you. You have the right to dignity and respect, as do others—don’t defer to someone else simply because they are professionally senior to you, but be polite and professional if you disagree. Act like the dream colleague you want to have when you’re in your dream post-PhD job, including with your grad-student peers. While you may well become good friends with many of your peers, you aren’t obliged to hang out with (or even like) your whole cohort. You are obliged to be civil—unless someone has done something so egregious as not to merit your civility, in which case you shouldn’t give it them. (This will happen at some point—you may want to think in advance about where your boundaries are.) Like any office, the department will have a rumor mill. If you’re a budding academic politician you may wish to use its powers for good—but otherwise remember that if, for example, you date a classmate, everyone else will soon find out.

When you’re teaching, you are a staff member of the university. Dress professionally (within whatever remit makes you feel comfortable according to cultural norms/gender identity/whatever—I am continually surprised by how flexible these norms seem to be now, even for women). Don’t sleep with your students. Do join the union and insist that the university recognize you as the worker that you are.

Always remember that you have agency and power and independence. Disagree with your advisor if you want to take your dissertation in a different direction from what she prefers, or if you have aspirations for your post-PhD life that he hasn’t considered. Disagree with your cohort-mates if they’re causing drama about something that doesn’t merit it. Never lose sight of the job market, but I don’t mean freak out about it. Constantly ask yourself where you’d like to be in 10 or 20 years, and while recognizing that absolutely nothing is guaranteed or absolutely within your control, do your best to stack the odds to make that vision possible. If you want an academic job, it is your responsibility to seek out as many sources of information as you can, attend every professional development workshop your department puts on, stack up your publications and your teaching experience and your progress on your diss and your network of contacts. No one else is going to hand all that to you. If you don’t want an academic job, great—and your PhD journey and the scholarship you will produce along the way is just as valid and valuable as anyone else’s. But do what you need to do to gain a realistic picture of how to enter the sector that interests you when you graduate. Have back-up plans. You have the right to make whatever tradeoffs you need to make, and no one cares if you disappoint your advisor by not following the path she has imagined for you—but don’t expect you can both land a tenure-track job and keep your family in your current city where your non-academic partner is employed (or whatever).

There are limits to what it is reasonable to feel entitled to. If you are fully funded at a top program at a private US university, always remember that you are the most privileged of all possible grad students, and that if you don’t have dependents you are making a comfortable middle-class wage; and think about how you can help those both within your profession and outside it who are less fortunate than you.

You will encounter many people trying to peddle their crack system for note-taking or memorizing or organizing information or archive workflow. I can tell you about my generals study strategy or my archives system, but these sorts of things are highly individualized and through trial and error you will find the system that works for you. Coursework seems important when you start the PhD, but a year after generals you’ll have forgotten all about it. Remember that you spend most of your time as a grad student researching and writing your dissertation and teaching. Anything you find stressful or annoying about coursework will soon pass.

In conclusion, I really cannot emphasize enough perspective, autonomy, self-confidence, agency, dignity, and dialing down the drama whenever possible!

A Doubter’s Sermon for Easter Sunday

This Facebook post from Paul Raushenbush is the one Easter message I’ve seen this year that’s really resonated with me. I’ve been going to church for seven years, but in the last couple I’ve felt more as if I was going through the motions, as if the mystery and wonder of the Christian message was being drowned out by the evil in the world, and as if the gulf between me and my friends and co-congregants who are actually Christian and actually believe was wider than ever. This morning in church, while the preacher—an academic theologian—was hard-selling a message of joy and celebration, I was thinking about Mary Magdalene, about the anguish in “They have taken away my Lord, and I do not know where they have laid him.” If you had lost a close friend, perhaps the only one you had, to a painful and humiliating public execution, I don’t know how comforted you would be by the idea that he was “ascending to my Father and your Father, to my God and your God.” Surely you would still feel a great chasm in your life, and surely you would still feel doubt as to whether that great cause, for which your friend made such a big song and dance about sacrificing himself, was really worth his abandoning the friends who loved him.

I watched Casablanca for the umpteenth time the other week, and I still think it’s one of the great artistic products of the twentieth century. But thinking about it as a version, in a way, of what’s going on between Jesus and the disciples helps to illuminate the doubting side both of the film’s message of moral clarity and of that in John’s Gospel. We are told in the film—which was created to encourage Americans to support the Allied war effort—that American self-interest and isolationism (personified by Rick) must be sacrificed to the higher end of helping Victor Laszlo to carry on his great work in the Resistance. In Casablanca, as in the Bible, women are represented as pawns, and it takes some work to recuperate them as fully-realized, strong figures with agency. So Rick, having said that he will make the decisions for everyone, tells Ilsa that it is her duty as a woman and a wife to go with Victor and support his noble struggle for justice and righteousness, instead of giving in (perhaps?) to her baser desires to rekindle a sexual and romantic affair with someone who isn’t constantly spouting vague messages about what a hero he is in the great struggle. Of course, the film makes us want to believe that Victor will win against the forces of darkness, it makes us want to continue that work, it makes us want to leap to our feet with tears in our eyes when they sing the Marseillaise in the café. But we also—at least, those of us who are modern European historians by trade—are aware of how much rhetorical and political work went into making World War II into a moral war of good against evil, and how much has been done since in the name of that narrative to paper over the sins of racism and imperialism and fascism deep within American and French and British politics and culture before, during, and long after the war.
It is the work that story of the war has done that makes it hard today, in our communities, perhaps among those with whom we have celebrated the resurrection of our Lord, or the freedom of our people from bondage, these last days, even to see the evil and sin that is still co-constitutive with all that stuff about holy miracles and triumphing over death.

If any of the stuff in those stories that we rehearse year in, year out—those stories I have memorized word-for-word after seven years of churchgoing—ever really happened, I wonder too if Mary Magdalene, when she arrived at the tomb early that morning to take care of her dead friend’s body, was irritable and frustrated, empty and bereft, and if the disappearance of Jesus’s body felt like just another thing that she and the other Mary and any of the men who could be bothered to stick around and help had been left behind to deal with; while their great friend, who was as difficult and self-absorbed as he was charismatic and kind, had preferred heroics to the down-in-the-dirt daily work of sticking around to care for the sick and the poor. For, these days, when I kneel and someone recites a familiar list of prayers for church leaders, political leaders, zones of conflict, the homeless, those in the community who are sick and dying, I think it feels a bit like what Mary Magdalene might have felt when she saw the empty tomb—the divine presence has left us here, rather haplessly, to carry on its work, with little guidance and without even a body to mourn. It’s hard to imagine that a ghostly, glowing presence popping up from time to time to offer elliptical neo-Platonic remarks about ascending to the Father could have substituted in the minds of the disciples for the kinds of practical, down-and-dirty work for which Jesus became famous during his lifetime. What I still find compelling about the Jesus story is how practically-minded his miracles are. They’re about making sure people have enough to eat, a lot of the time; about easing their physical pain; about being kind to people—I think especially women—who were used to fading into the background; and along the way about giving life to a new kind of radical political message. 
I think, if I were Mary Magdalene, and it was that practical message—making a real difference in ordinary people’s lives, not least my own—that had drawn me to follow Jesus, I might be thinking, “Why did you have to go and make everything so much more difficult? Now how am I going to feed every beggar who calls at the door? What am I going to tell everyone? Yeah, yeah, I know, you have to ascend to the Father, you said, but does that mean you’re too good to do the washing up? We had to feed a lot of beggars this week, and there’s a lot of it.”

Whether Jesus lived or died, whether he was or wasn’t co-equal with the Father, whether we celebrate his resurrection or not, makes awfully little difference to whether all the beggars on this planet stuffed full to bursting have enough to eat, or whether we are capable of passing on to our children an earth worth inheriting. I am still inspired by what Jesus called on his followers to do, who he called on them to be. But even on the first Easter Sunday (inasmuch as there was one, etc.) Mary Magdalene still had to go home and get dinner on the table, perform emotional labor for all the male disciples as well as process her own grief, and get on with doing the work of living here on earth. She didn’t get to—we don’t get to—ascend to the Father. We have so much work to do, and somehow we have to find the strength to carry on.

Me, Too; or, Gender as a Category of Historical Analysis

The first week of my PhD, I was entering the elevator in the history department building when a male grad student—someone whom I had already identified as a bit weird, a bit icky, to be around—followed me. We nodded hello. The doors closed. There was a silence, which he broke. “So, you’re the one who works on sex,” he said. “No,” I said flatly, “I don’t”—for not only did every instinct I possessed scream that being alone in an elevator with a man who has just said the word “sex” is a situation that has to be shut down as quickly and coldly as possible; in those days, having just come back to the States from Oxford, I had absorbed certain preconceptions about working on gender and sexuality. I thought that leading with these subjects would mean being taken less seriously as a scholar, that the only way in with the Big Guns was to purport to be an intellectual historian. And I knew from experience, honed since I first started going to conferences and talking about my work in public fora online as an undergrad, that the “sexuality” research-interest box is so all-consuming that it is very easy to get typecast, very easy for colleagues to forget that you do anything else, very easy to find yourself only talking to interlocutors who do exactly the same thing—and who maybe take just a little too much prurient interest in rehearsing, say, the content of nineteenth-century pornography or the exploits of the men who populated London’s cruising grounds of yore.

This is, of course, a “me too” story, one of the large number of anecdotes women have been sharing—mostly to sympathetic audiences on Facebook—in the last couple of days, I suppose as a consciousness-raising and solidarity-building exercise. Others have expressed sentiments that are variations on the theme: that they feel guilty saying “me too” because their daily experiences of sexism have never shaded into the horrific, the violent; that they are looking for a form of action that is concrete rather than symbolic; that they are experiencing cognitive dissonance between the righteous moral outrage of left-leaning metropolitan-elite social media and the impossible task of how they would ever broach subjects like this with their fathers or their brothers or their high-school classmates. Scrolling through my news feed and reading the litany of “me too” and the periodic expansions upon it, though, what came to my mind were the other things I think my anecdote about the Fayerweather elevator suggests: the ambivalence I’ve had throughout grad school about whether to identify as a gender historian and my simultaneous frustration with sexism in intellectual history; the ways that my research into gender and sexuality has often either symbolized or sublimated my personal relationship to gender and sexuality; and the way that experience has trained me to flatly shut down, run away from, and view with deep suspicion forever any man with whom I have a slightly unpleasant interaction. It doesn’t feel safe—for one’s career or for one’s emotional and mental health—to leave oneself open to the possibility that something might go wrong.
(And, indeed, years later I learned that this particular grad student had a history of violating women’s boundaries in far more inappropriate ways, and that he did so consciously and not through social ineptitude—much as I later learned that the senior academic I backed away from because he really, really wanted to stand in the corner of a classroom and talk to me about how we had attended the same Oxford college fifty years apart also had a history of harassment.)

So, three years on, I lead with my research interest in gender, because I am not in Oxford anymore, and also because I think times have changed across different institutions since I first started doing research in modern British history. I was at a discussion yesterday about a new monograph in economic and political history, and multiple participants—even those who don’t consider themselves to work on gender—suggested ways that the book might have incorporated a discussion of gender. That would simply not have happened five, six, seven years ago, when I first started to sit in the corner and listen to faculty and grad students talk about new historical research. (Maybe some will contradict me, but it didn’t happen in the places where I was at that time.) A colleague once said that when deciding which of her historical interests to pursue in graduate study, she chose based on which group of historians she’d most like to be in rooms talking with. The openness of modern British historians to a wide variety of thematic and methodological approaches, particularly at the present moment, is one reason why I’m loyal to my field. My colleagues who work on political thought can also think about gender; my colleagues who work on neoliberalism are alive to culture as much as to economics. (It strikes me that this is one reason to continue to promote national histories, even after the “global/international turn”—they can serve as useful containers for an eclectic set of methodological approaches, which wouldn’t be the case in a field organized primarily around a methodology, such as intellectual history.)

I also find, as I gain more expertise in my subject, that I have more to offer to the wider public discussion we are currently having about how particular norms and practices surrounding gender structure our society. In due course, naturally, the outraged Twittersphere will move on to something else, but right now gender is having its moment. I became interested in the ways that gender difference, and gender segregation, loom large in elite British culture, and in the culture of elite educational institutions in particular, because I have spent a significant part of my adult life living within that culture and those institutions, and it turns out that there are highly specific historical explanations for why this culture works the way it does. But once you’re attuned to reading documents for evidence of how gender works, you see it as much in the Greek-letter organizations of American universities, in the Boy Scouts, in the President, in your own social and professional networks, as you do in nineteenth-century single-sex colleges and student societies. The disadvantage, of course, is that you can’t unsee it—and I find that, as a woman, it makes it harder for me to relate normally to men, to see them as something other than either research subjects or potential predators. That is an unpleasant state of mind to live within, and I’d like to find some way of moving past it.

They say the public have had enough of experts. And that may be true, but sometimes I think we experts are not sufficiently imaginative about how we might reach people with our expertise and use it for good. Expertise is not something best promulgated through broadsheet op-eds or through written work intended largely for already sympathetic audiences; it is not only a matter of theorizing about something abstract, like global economics, that seems removed from the experience of daily life; it is not always about having more facts and disseminating them from the top down. Instead, experts can encourage non-academics to think analytically about something they might not have thought about before, and then think alongside them as they bring that analytic tool to bear on new pieces of information or experiences or feelings. Coincidentally, this practice has a name: it’s called teaching.

Thus far in my academic career I have given only three undergraduate lectures. If I do not win the academic-jobs lottery, I may never give another. But in each one, I have found a way of telling the story about the past dearest to my heart, the one where my interest in the past began. In each one, I have told fifty-odd young people new to thinking analytically about the past that our modern conception of homosexuality is historically constructed and contingent. I have given them some heroes overcoming adversity that they can take away with them, but also some ways in which the story is more complicated and less satisfying than that. Every semester there are a couple of students who actually care—who use the content from that lecture in their final paper, who come up to me after class and ask a follow-up question. I hope, though, that in still other students the presentation of a new way of thinking about a familiar topic plants a seed that they might, unconsciously, come back to later, even if they no longer connect the thought to that particular lecture or to the history of nineteenth-century Europe. Many forms of psychotherapy hold that becoming consciously aware of patterns of behavior or ideas or emotions that you may have unconsciously held since early childhood, and of how those tendencies originated, can help you to move on from the ones that are unhealthy or unhelpful and lay down new patterns. The more we can point to the things we read in the news—or, more important, to every single interaction we have every day—and say “look, gender is happening here,” the more we can recognize harmful behavior when we see it, and move outside boxes of gendered behavior that imprison and hurt everyone.