Coming Out, Again and Again; or, In Which I Demonstrate That I Can Spin Out 2,000 Highly Personal Words In Response to a TV Season Finale

I just finished watching Gentleman Jack, and my overall view of the final episode is: cloying Hollywood ending is cloying. Without question the best part of this episode is Sofie Gråbøl as the Danish queen; second-best is how, in just a few brief, well-chosen scenes, the first half of the episode efficiently and effectively conveys how lengthy and arduous all journeys were before trains. The split wedding scene itself was okay, using the language of the BCP to good effect to show how—as was the case generally—people were creative in adapting this familiar, powerful language to the rhythms of their own lives. (I don’t think I have ever actually heard the 1662 Eucharist in my life before? But in those pre-Oxford Movement days, how did they just walk into a random church in York and find a communion service?)

Last week at the conference I had the somewhat heady pleasure of being welcomed into a lunch circle of queer women and trans senior scholars who had all devoured this series, most of whom know the text of Lister’s diaries much better than I do, and who were talking animatedly about the whole phenomenon. For me, it was an extraordinary experience to be included in this conversation, but I did fit in: pretty much none of us liked the presentist depiction of the Annes’ relationship or the way that Lister is represented as making born-this-way arguments about a sexual-object-choice-based identity, though we did think the series gets some things right about Lister’s gender identity and we all loved the estate management/Tory politics stuff. (People thought I was funny when I called it “Gay Poldark,” which I stand by.) On reflection, though, I am kind of struck by the gulf between this collective opinion and a wider one, perhaps, of the “can’t we just have this one thing?” variety, which sees in Lister the potential for an inspiring, exciting, sexy fictional character. As I write this, I apprehend something of how annoying people probably find it to be married to historians, always coming along to spoil a pleasant night hanging out on the sofa by nit-picking about television, which is, of course, necessarily fiction. That those of us sitting on the grass outside the Birmingham history department on Friday were nit-picking about how you might act on screen the idea of, in 1832, defining your identity in terms of gender inversion and not sexual object choice—and not, like, the costumes (or, you know, what “death recorded” means)—is perhaps no less annoying to those who grant that the point of fiction is that it can convey something other than what actually happened in the past.

This has been an extraordinary few months for my own sense of identity and political belonging, with June’s fifty-years-since-Stonewall commemorations at its center. The work I did in preparation for our Sexuality & Erudition workshop at Princeton in mid-May brought me, slowly, back towards primary and secondary sources I hadn’t thought about in years, and the whole Naomi Wolf contretemps caused me to reread my senior thesis and remember how much I cared about Symonds and also why it mattered so much to me that both academic histories and usable pasts get the queer past right—for I feel such great love for the queer past that I hate to see it manipulated or misconstrued, even when that’s in the interest of presenting a happier or a more accessible, less complicated story. The work I presented at the conference last week was kind of a confluence of the Sexuality & Erudition stuff and the axe I was grinding at Wolf. I wrote a dissertation chapter in the middle of all this too, about national politics and higher education policy in the 1900s–20s, but it sure doesn’t feel as real or as pressing as the Sexuality & Erudition paper or the Wolf review or the paper for this conference or the job materials I have been worrying over wherein I pitch The Second Project.

On Thursday, when I had been sitting in classrooms listening to people speak about postwar British social history all day and was getting a little antsy, I spotted a Misuse of the Queer Past on Twitter and issued a snide and tetchy tweet about it, closely related to the paper I was about to give at 9 the next morning. This had a good reception on Twitter; my friends started making fun of me for bashing Stonewall (the organization); and then there was the paper and the heady queer lunch talking about Gentleman Jack. Somehow all of this caused me at some point, casually, to call myself a “professional queer,” words that I had not uttered in probably a decade. And that act of naming, in turn, unleashed a whole lot. I was still talking to my friends whom I get to see once or twice a year (another reason why this conference was to be cherished), and I heard, as if someone else were speaking it, the clarifying sentence—true, but strange to my ears—“I used to cover LGBT politics for a national DC-based magazine.” Was that really me? Yes, I know that I used to sing in an Oxford chapel choir—but Emily Rutherford, Dyke Reporter? In the days since, more scenes have flashed suddenly and vividly in my mind, some long forgotten. The time I, like every left-wing intern before or since, snuck into CPAC and got thrown out. The time we carried a sign that said “Even Princeton” in the March for LGBT Equality in Washington. The time every student on campus woke up to my bowl-cut 12-year-old-boy head adorning a character-assassination piece in Princeton’s conservative magazine. But the one that stayed with me and kept me up late last night came from the spring semester of my freshman year, when I met two women on campus, a professor and a senior administrator, who dressed like I did then, in men’s or men’s-type collared shirts and khakis. In those days I had worked in a cinema, as a theater electrician, and as a book-shelver in the library, all places where women regularly wear men’s clothes. But until I met that professor and administrator, I genuinely did not think that it would be possible to get a professional job, or one that involved any kind of public-facing work like lecturing, if you were gender-nonconforming. Thinking about Anne Lister, and about that extraordinary lunch on the grass on the Birmingham campus on Friday, what I get stuck on is the time in the spring of 2009 when I was sitting at a big table in a conference room during a committee meeting, turned to the person on my right, who was chairing the meeting, and realized she was dressed just like me. I feel a rush of emotion still, as I write this, many years after anyone last yelled at me for being in the wrong bathroom, and so, so much older and more tired.

I have not been in a relationship in many years now—increasingly, and increasingly emphatically, by choice. For me, it in part follows from that state of affairs that I have felt more freedom to present as a woman without feeling that I have to get the illusion exactly right in order to keep the society of women from taking my woman card away. I remember being horrified when I was 14 and my mother suggested to me that I would not be socially ostracized for wearing a dress one day and jeans the next; fifteen years later, that is more or less exactly what my wardrobe looks like. On the other hand, as sex and relationships have faded farther from my purview, I have felt much vaguer about claiming any kind of sexual orientation, any kinship with others on the basis of sexual orientation. I have thrown myself into learning the wider field of British history, into travelling a lot and teaching my students, and into the minutiae of intramural politics at a few universities over a few decades in the past.

All of this, though, these last few months—the workshop in Princeton, and the intense and exciting collaborative experience of planning it with my co-conspirators; nominating myself on the internet as the guardian of John Addington Symonds’ legacy; that lunch on Friday—has given me a yearning desire to belong to queer community again, to actually affirm that I am of my people instead of staying silent out of a kind of reticence or embarrassment or privacy (or just, like, dissociation from my body) and hoping people will assume the right thing. I am grateful to the queer communities who have up to this point accepted me on the strength of implication rather than assertion, though I feel I have to come clean that I am not, as I think many have sometimes assumed absent any kind of indication or clarification whatever on my part, a lesbian (but, you know, nor was Anne Lister, and she still gets a blue plaque saying she was, so).

In the past I have dated and been in relationships with men and women; the Kinsey-scale needle has fluctuated wildly and continues to do so almost daily. I have, however, been single and celibate for nearly six years, and am increasingly affirmatively committed to celibacy as a concrete lifestyle choice and identity-political category. This does not mean I am asexual, but it does mean that I choose to organize my life around principles other than sex, dating, relationships, marriage, and children. This may change in the future, but it currently holds disproportionate significance for me given that I am nearly 30 and everyone is getting married these days, regardless of the gender of their spouses.

One of those women from college who wore button-up shirts said something beautiful once. We were at a professional dinner and another professor there asked her something about her wife. My mentor clarified that she and her partner are not married: “We’re old-fashioned gays, and we don’t believe in marriage for ourselves. But I’m so happy about marriage equality. There’s no use making a political statement to opt out of something unless there’s something there to opt out of.” I think the “born this way” narrative has led us to underrate the element of choice in queer lives, relationships, and communities. Queer people past and present do, all the time, make daily choices about whether and how to fit in or stand out, to break the law or not, to do so in secret or in the open. When we constantly walk the boundaries of what is acceptable, we know that every aspect of how we dress or how we walk or whom we look at is a choice fraught with meaning, sending a signal about cultural affinity both to those hostile to us and to those on our side. (I remember the electrifying cruising ground that the college dining hall could be, how expert I became at detecting a certain kind of gaze that one man would give another across the tables.) This is no less true in those places today where the medico-legal regulatory regime has been persistently and steadily domesticated to the point of collaboration with queer self-fashioning.

Which is all to say that I am still queer despite my celibacy, and furthermore that (I would contend) my celibacy is a kind of queerness, for all that it is a choice: something that positions me just as much at odds with homonormativity as with heteronormativity, a form of resistance to a kind of totalizing logic about what sex and sexual orientation have to do with one’s personhood. Which is all to say, phrased a different way, that Call the Midwife—and not Gentleman Jack or Fleabag or Killing Eve or The Handmaid’s Tale or Game of Thrones—is the most radical and remarkable show on television, and queerer than you might think.

On (Academic) Writing

For several months now, I have regularly been posting on social media photos and screenshots of my efforts to write my dissertation. Some might call it self-aggrandizing, but I have found this habit—and the “likes” and comments it generates from my Facebook and Instagram friends—to be a powerful source of motivation that keeps me cranking out words daily. Because I post these photos, I often receive requests to describe my writing process, and in particular to explain what I am doing with the hundreds of little slips of paper with which I cover my desk and bedroom floor. So (and really this is just an excuse to procrastinate on cataloguing the endless varieties of misogyny expressed by interwar male undergraduates, which I find rather wearing) I thought I might write something describing how I write.

It’s necessary to begin with several caveats. First, there is no objectively better way to write—or to do research, or to carry out any other academic or creative task. People find the systems that work well for them based on their learning styles, writing styles, and other life circumstances (for example, if you have children, your writing process may look very different to mine!). Second, while I am probably fairly good at being a fifth-year graduate student, I am not the right person to advise peers at the same career stage as me on how to write a dissertation, because I haven’t written one yet. Furthermore, thanks to a combination of the structure of my program and an extra year of funding from an additional internal fellowship, I have not had to teach for the last two years and have thus been able to engage in an intensive period of research and writing. While I do part-time work to supplement my income, it amounts to less than ten hours per week, and I mostly do it in the evenings and on weekends. I say all this not to brag, but rather to observe that structural privilege accumulated over many years affects our ability to write in the way that we would most like. For example, the flexibility of my schedule allows me to block out a few hours every morning, at the time when my brain is sharpest, to devote only to my dissertation, a circumstance available to few graduate students. I am thus not the right person to advise the many peers whose path through graduate school has been considerably more difficult than mine.

In general, most of the brain-work I do is not very systematic. I have a free-associating mind rather than an analytic one, whose natural tendency is to dive very deeply into one topic rather than assessing patterns across different cases or bodies of material. I mostly do history by collecting very large quantities of archival material and then simply telling the reader, at some considerable length, what they contain, before embarking upon an editing process in which I compress that description to a more manageable size. When I get round to writing, therefore, I have spent many weeks or months in an archive or several, and have hopefully had the time to organize my findings, taking apart the Word document of hundreds of pages in which I have recorded my stream-of-consciousness impressions of the archive and creating an entry for each document in my Zotero database. I don’t do much in Zotero beyond recording the correct citation information and organizing all the documents by repository and by date. But I can then print off that database—the poor trees, but for me the most essential part of the process (enabled by the laser printer I bought out of the research account I am exceptionally fortunate to possess)—and get on with writing.

The writing, then, starts with several hundred pages of source material, and a space like a floor or a big table, large enough to lay it all out and get a visual sense of the shape of it. Even a large computer monitor doesn’t quite provide the scope for this, for me. Spreading it out makes it possible to see patterns—for example, if student politicians across many different institutions all became preoccupied by a particular topic in a particular year—and also to trial different ways of structuring the argument and exposition that I might follow over the course of the text. It’s actually possible to skip this step if the piece of writing is very short (such as a conference paper or a blog post), as in that case it might be possible to hold the whole structure in my head at once and/or to take a less comprehensive approach to incorporating all the source material. But for a dissertation chapter, it is absolutely essential—I wouldn’t be able to do any actual thinking without it. I used to nail down a structure on the floor and then paste all the slips into a notebook, but more recently, as I have been working with larger and more complex quantities of material, I have found that I want to revisit and revise the structure as I write. So I keep the slips laid out on the floor, or file them in a more mutable way with paper clips and folders.

So that’s what the slips of paper are. They become the raw material out of which the story is written up, and I interpolate, in my own words, the context and analysis that lead the reader from one slip of paper to the next and sum up what the whole picture is. To draft, I use a piece of software called Scrivener, which allows one to do various fancy split-screen things that make it possible to see, for example, an outline, multiple chapters, and the footnotes all at once, and to save notes and chaff in different folders as part of the same database as the main text. (Scrivener costs $38 with an educational discount; I am fortunate to have been able to afford to buy it.) I write at the rate of about 600–1,600 words per day, which takes between two and four hours of concentrated, intensive thinking and typically leaves me too exhausted to do brain-work for the rest of the day. I do this about six days a week, every week (it has been many years since I took a vacation). I usually write between 10am and 1pm, and do other work, paid work, or housework in the afternoons, but occasionally this varies or I have enough energy to be able to write all day. I try to push myself if I can, but listen to my body and stop if I can’t. I never write in the evenings, and only rarely, under extreme time pressure, do other work at that time.

While on the face of it this quantity of word production in this amount of work time might seem productive, it is actually not efficient at all. With so many slips of paper, it can take weeks to progress through a given section of the text as it is laid out on the floor. Almost every aspect of writing the dissertation so far has taken much longer than I had imagined it would, and my goal of finishing a draft by September has slipped progressively further out of reach. Moreover, this word production is only the first in a series of revision stages. The chapters that have so far reached a first-draft stage have come in at 35,000–40,000 words: baggy objects that are not only of an inappropriate length for the genre of thing that they are, but would be much too onerous to ask anyone, even my long-suffering advisor, to wade through. It takes another series of weeks to sculpt this material into something that looks like a chapter. My goal is always to get it under 20,000 words before I show it to anyone, through a painful and painstaking process of winnowing. Then comes the slow round of workshops and seminars, getting feedback from expert and general audiences that will be the basis for further revisions. In previous writing tasks this potentially endless process has been brought to a stop by a university-imposed deadline or by publication of one sort or another (in a journal, on a blog). But in this case I am several years away from the book going to press, which will mark the formal end of writing.

My way of thinking about and doing writing has been influenced—I am still realizing how much—by the semester I spent in John McPhee’s writing workshop in spring 2009. McPhee, a longtime New Yorker correspondent and the author of many works of long-form journalism, has taught a creative writing seminar at Princeton for decades. Most of the students are aspiring journalists, as I was then—though most were much better writers than I was, and I think McPhee was rather tired of my tendency to insert myself into every piece of writing I created for the class. The main lesson McPhee impressed upon us was about structure. Clarity, he said, comes from the structure the writer creates, which leads the reader through the events or concepts discussed in the text and slowly opens out its central meaning to the reader’s understanding. In class, we scrutinized the diagrams McPhee had drawn to visualize the structure of his own essays, often metaphors taken from the natural environment that has been the subject of so much of his writing: a snail shell, a wave. This class was the first time I realized how important it is to be deliberate and disciplined about writing, rather than just expressing one’s feelings. I took the structure lesson to heart. It fell to the back of my consciousness when I became disillusioned with journalism and began to study history, however, and I was surprised recently, when reading a New York Times review of McPhee’s new book, to be reminded that his office, like my bedroom, is filled with little slips of paper that he obsessively arranges and rearranges in an effort to make meaning. Evidently, I had taken away more than I had imagined; evidently, too, there is more than one way to be a writer. But now I, too, tell my students about structure.

I have always had a natural aptitude for writing, though it has improved considerably as I have received more formal instruction and, as a professional academic, have been prompted to think more consciously about suiting one’s writing to a specific task. As an adolescent, I had perceived writing primarily as a means of self-expression, and rebelled against restrictive formats. But now I see it primarily as a means of communication, and am happy to embrace formats as constructs that convey meaning and aid the reader. A dissertation chapter, a grant application, an online essay for a general audience, and an email all have different audiences and purposes, and the format and style one selects must vary accordingly to be effective at getting one’s message across. I believe it is possible to (strive to) marry beauty and eloquence with clarity and analytic rigor. I admire in others, and try to achieve, a style which is engaging but also articulates a clear argument and makes it possible for the reader to follow it without working too hard. I have had interesting conversations with colleagues who opt for what I might call a more subtle or literary style, to tell a story more than to deliver an argument. This is clearly a matter of personal preference, and when workshopping others’ writing it is important to respect what their aspirations for their own writing might be.

It is difficult, however, to remain true to personal preference in an academic environment, where one has the sense that one is constantly being evaluated, and where that sense erodes one’s ability to follow one’s instincts. Almost daily, my feeling of pride at having created 1,000 usable, relatively intelligent words in a morning is subsumed in my feelings of guilt, anxiety, and exhaustion: are those words good enough? why am I so tired now? how might I summon up the energy to do an afternoon shift? how will my advisor, or a conference audience, react to them? am I making enough progress to finish on time? will anyone else find this interesting, or just me? If I had to live solely by selling my words, instead of by the strange cocktail of things that might see one lucky enough to win the academic jobs lottery, I might be asking different anxious questions. But I have often felt that if I somehow had income coming in regardless and could still be writing for my own gratification, I would be much more confident in my ability to do good work, and to stop when I am tired and it isn’t possible to do more. It does not help that many in permanent jobs seem often to advise advanced graduate students about writing in precisely the same tone they would take with undergraduates ten or fifteen years younger, forgetting that grad students might have extensive professional training and experience in writing and editing—or, as I indicated above, that writers’ own personal processes and styles may differ and that what works for one person may not work for another.

We all, whatever our level of seniority, I think, often forget that work does not always look like work, and that there are all kinds of non-obvious pursuits—baking, going for a walk, sitting in the pub with friends, playing video games, childcare—that for many are actually an integral part of the writing process. For those who do their best thinking while doing something else, it is necessary to set time aside in the day not only for office/desk time but also for these activities. It is necessary also, whether an individual graduate student’s routine includes caring responsibilities or not, to respect that a 9–5 office schedule is not everyone’s best way of achieving work–life balance. To me, the greatest benefit of academic work has always seemed to be the flexibility of its hours. While for some the babysitter’s schedule is the most important constraining factor, or the 9–5 really is an ideal setup, for others the time that has to be set aside is late at night, or, as in my case, the hours of 10am to 1pm and those alone. For others still, the luxury of being able to do the grocery shopping in the middle of the day on a weekday may be the one thing that allows them to get the housework done alongside their more-than-full-time job. Much as the writing process takes many forms, and not everyone may be best served by drowning themselves in little slips of paper, so too do different people work best at different paces, at different times, and in different ways. We do not guard work–life balance in the way best suited to making academia accessible to everyone if we insist that everyone conform to a 9–5.

And with that, I am off to play video games. A happy 2019 to all!

On the Centenary of “Nine Lessons and Carols”

Among this year’s many centenaries, English choral music nerds are celebrating the hundredth anniversary of the “Nine Lessons and Carols” liturgy observed in the Advent and Christmas season in many parts of the Anglican Communion. Lessons and Carols has a lot to say about the intellectual, elite cultural, educational, and religious history of modern Britain. It has a prehistory: a prototype of the liturgy was conceived in 1880 by the writer and churchman E.W. Benson, then Bishop of Truro, for use in his cathedral. It was of a piece with the liberal Anglican Incarnational theology of the mid-to-late Victorian period, departing from strict conformity to the Book of Common Prayer to offer a modern, not an early modern, form of plainspoken English prose, wrapped up in one particular story of the Word made innocent flesh.

As the story goes, Eric Milner-White, Dean of King’s College, Cambridge, picked up Benson’s idea in 1918 and repurposed it for a new era. Everything about the Chapel of King’s is invented tradition: though construction began on it soon after the foundation of the college in 1441, most of the internal decor comes from the mid-sixteenth century, when construction was completed and Henry VIII re-endowed the foundation. King’s as an academic institution in its own right, and not simply a retirement home for Eton scholars, is also an invented tradition, this time of the 1850s–60s; and the conceit of the Chapel as one of the nation’s great musical institutions is an invented tradition, too, born out of the renaissance of High Anglicanism of the early-to-mid-nineteenth century and, among other things, out of the fact that Oscar Browning, a fellow of King’s, rather fancied choristers. Of course there had been collegiate choral foundations for centuries, and Eton and King’s had always had choirs at their heart (the Eton Choirbook is our best source for pre-Reformation English liturgical music). But with the Victorian period and its romanticization of childhood came a return to the English male-voice choir, with its characteristically heavy treble sound, born of having about 18 boys on the top part and four adult men on each of the other three parts. New composers wrote new music for these reconstituted choirs: veterans of Anglican choral music will know C.V. Stanford’s canticles, for example, some of which were premiered by the King’s choir when Stanford was Professor of Music at Cambridge.

Back to Eric Milner-White, who became Dean of King’s in 1918, fresh out of army service, like a lot of other men who came back into the universities after the war, some of them starting fresh as mature students and some returning to interrupted studies. Many men in universities across Britain in the immediate postwar period had a great longing to return to a romanticized status quo ante, to imagine their universities as ancient foundations, filled with ritual and male bonding. They could be awfully mean to the women with whom they shared those universities. But at King’s there were, of course, no women to worry about (the larger university was another matter—Stanford was against women’s degrees, Milner-White in favor, to give you a flavor). For Milner-White, the task was to create something suited to the experiences and level of religiosity of the new generation of student-veterans, but which had the feel of ancient mystery. So he borrowed Benson’s Truro service and added new elements, like the bidding prayer that (now traditionally) opens the service. It’s not hard to hear 1918 in the language “Lastly let us remember before God all those who rejoice with us, but upon another shore and in a greater light.”

It wasn’t all roaring in the Twenties: there was a slow ebbing of the high tide of immediate-postwar social liberalism, and there was the BBC, that great voice of high-minded national conservatism. There were multiple services of Anglican worship every day on the interwar BBC, and it first broadcast Lessons and Carols at King’s live to the nation and the empire on Christmas Eve, 1928. It has done so ever since, in later years adding a prerecorded television broadcast. The sound of the King’s choir made its way onto records and cassettes and CDs and into the multivolume series Carols for Choirs. Generations have by now grown up hearing, and perhaps learning, the arrangements by David Willcocks, King’s Director of Music from the 1950s to the 1970s, of carols such as “God Rest Ye Merry Gentlemen” and “O Come All Ye Faithful.”

The World Service and the Anglican Communion are two of those puzzling vestiges of British Empire that still linger, conspiring both to make invented traditions appear far more ancient than they are and to perpetuate an image of a chocolate-box, amusement-park-quaint Britain, in which the revival of the King’s chapel and choir could certainly have nothing to do with Oscar Browning’s views on prepubescent boys and in which British power’s self-evident manifestation as violence could be forgotten. Nowhere was this more true than in the United States, where twentieth-century conservative Anglicans and Catholics and Anglo-Catholics more than ever revered an imagined, romanticized Britain. T.S. Eliot did, of course, and William Buckley, and many other people I have met in the years I have spent attending Episcopal churches in the United States, many of them my age or younger. If you want to hear a faithful rendition of Lessons and Carols on the far side of the Atlantic, the place to do so is at the corner of 53rd Street and 5th Avenue in Manhattan, where stands the US’s only Episcopal collegiate choral foundation. Most of the clergy and staff are English. The director of music there, late of Magdalen College, Oxford (another of the great medieval collegiate foundations), will be taking over at King’s next year. Entering that space off Fifth Avenue is like stepping into a parallel universe. The music sounds beautiful, but whenever I go I leave not knowing whether I liked it.

Most cathedrals in England and Wales, which daily preserve and rehearse the Anglican choral tradition, have in recent years come to recognize that there is little to no anatomical difference between the voices of prepubescent girls and boys, and manage to create the English treble-heavy sound with mixed choirs. Other churches around the country and the world that are less attached to that sound get on just fine with adult women singing the top parts. But Stephen Cleobury, the outgoing director at King’s, whose daughters have both sung in girls’ and mixed-voice Anglican choirs, has said that there is a distinctive value to the King’s choir remaining single-sex. In this he sounds rather like C.V. Stanford and other Cambridge men exactly one hundred years ago, who held that it was important that there be one single-sex university left in England. It sounds like something that could conceivably be true, but that no one manages to prove quite so much as state over and over again. That this is the same conversation that was happening in Cambridge a century ago speaks volumes about the history of English elite institutions and the ways they continue to reinvent themselves.

Lessons and Carols is such a quintessential invented tradition that it is even cited in the introduction to the famous Hobsbawm and Ranger Invention of Tradition volume, which coined the term and which most history undergraduates will at some point be asked to read. What place for invented traditions a hundred years on? We need to have conversations anew about the kinds of public ritual that speak to our present condition, and not hesitate to invent new traditions instead of rehearsing old ones out of a kind of misplaced empty piety. But what Benson and Milner-White and all their collaborators got right, I think, was that forms of music and art and ritual and worship created to speak more immediately to our times—to get bums on seats, the quest of the Church of England ever since the Act of Supremacy—must not condescend or presume to know their audience better than they know themselves. The Lessons and Carols liturgy is pitched high, but not abstrusely so, and it presumes that its audience can come meet it where it is. It tells us something about the interwar BBC that they thought broadcasting this worldwide was a genius idea, and what it tells us isn’t all good. But that people still tune in on the radio and TV tells us, I think, how many of us are still looking for a peace that is not to be found here on Earth, and not to be found with any certainty in megachurches any more than it is to be found in homeless shelters and war zones and refugee camps. There is no preaching in Lessons and Carols: it creates a space for us to be; and its layered liturgy centers, under the sediment of British history, on a family of refugees cast asunder by the administrative reach of a foreign imperial power. It is for candles and for music and for prayer and for peace, and rather oddly—and this is one of the mysteries that is, for me, at the heart of British history—it retains the power to bewitch, despite its deep association with the heart of a nation that has committed great acts of violence and done great evil in its citizens’ name. That, in its intention to tell “the tale of the loving purposes of God from the first days of our disobedience unto the glorious Redemption brought us by this Holy Child,” it should hit upon this particular national problem of evil as well, seems about right.

Remembrance

Note: There is extensive original archival research backing up the information shared in this essay, which is drawn from my Ph.D. dissertation. While I have not included citations in-text, I am happy to provide references upon request.

Had I been alive one hundred years ago, my life might have been a little like Rose Sidgwick’s. Sidgwick was 41 in 1918, and I am 28 now, but otherwise the similarities stand. Born in 1877, the oldest child of an academic family, Sidgwick had access to impressive educational opportunities, and finished her first-class BA in history at Oxford at the age of 22. Structured PhDs were not yet common for lecturers in the UK, and after her first degree Sidgwick lived with her parents in Oxford, pursuing the mix of part-time work, further independent study, and semiformal education common among many young people at the turn of the twentieth century who aspired to an academic career. While working at the Somerville library she met and began a relationship with the librarian and maths tutor, Margery Fry. Fry was shortly to take up a new position as warden of the new women’s hostel at Birmingham University; she negotiated a history lectureship—the spousal hire of its day—so that Sidgwick could accompany her. Sidgwick began her first (and what would turn out to be her only) full-time academic job in 1905, at age 28.

University House, the Birmingham hostel, must have been an extraordinary place to live and work. Fry, Sidgwick, and the like-minded women they hired to join their resident academic and pastoral staff sought to build a new kind of vibrant community for the young women in their care. Most of the first women’s halls of residence at UK universities, run by wardens of an older generation, were deeply worried about respectability, preoccupied by the need to assure parents that it was safe to allow their daughters to live away from home, and aware of their marginal (and sometimes contested) status within the university. The residents, mostly in their early twenties, complained that they were treated like schoolgirls. But Fry, Sidgwick, and colleagues such as Marjorie Rackstraw and Bertha Orange were part of a younger generation of women who had been to university themselves, and who were often inspired by the freer pace of academic and social life at North American women’s and coeducational colleges. Benefiting from the support of a vice-chancellor who prioritized women’s education and gave them a free hand, the University House staff treated their students with dignity while still looking after their welfare. Inspired, perhaps in part, by Sidgwick’s father, who had fostered a similar kind of community among his men students at his Oxford college, they opted for a kind of controlled silliness that had an implicit higher purpose. Students put on plays; they made a snowman in the image of the vice-chancellor; the staff could be relied upon to do a comic song-and-dance routine in the end-of-term entertainment. Through this, they knit together bonds that might sustain these young women in the difficult life-course they were undertaking. The overwhelming majority of women university graduates in this period did not marry; a strict binary obtained between marriage and a career; women who desired to work in professional occupations or otherwise pursue a public life faced an uphill battle. Single-sex community—and a physical space where young women could be themselves—was a compensation, but also an essential form of nourishment. The letters that Fry, Sidgwick, Rackstraw, and other Birmingham friends exchanged throughout their lives testify to the richness of the love that these women felt for each other.

A new-built brick building on a hill outside of Britain’s great industrial city is not exactly the archetype called to mind when one imagines nostalgia for the summer of 1914. But the safety and surety of this microcosm, too, would be fractured by the war. Shortly after the outbreak of hostilities, the building was requisitioned for a hospital. The male side of the university emptied, and many women students elected to train as nurses or pursue other forms of war work. The staff each faced a difficult decision about whether and how to help, which entailed consideration of deeply personal questions about religion and ethics, and about gender, that many found torturous. Fry, a devout Quaker, seems to have fairly easily concluded that it was her duty to engage in nonviolent war work, joining the Friends War Victims Relief Committee to bring aid directly to those civilians whose homes and sources of income had been ravaged because of their unfortunate location near the front lines of the conflict. As crop fields became battlefields, the FWVRC’s volunteers distributed food to starving villagers, and set up schools for traumatized children. Educated women had usually received extremely thorough training in modern European languages, and their ability to communicate with French or Russian peasants was in as high demand as their organizational skills. Fry—whose social circle seems widely to have perceived her as a paragon of selflessness—inspired many of her friends, including non-Quakers, to follow her into the FWVRC, where they spent the duration of the war in camp conditions, encountering famine and devastation firsthand.

It had been decided that Sidgwick would remain in Birmingham to keep open University House, now squatting in some rented rooms that were not needed for medical purposes. Numbers of women undergraduates throughout Britain remained steady or rose during the war, even as their male colleagues, like other elite men, made up disproportionate numbers of the casualties on the Western Front. Communities of university women could be important sources of momentum for volunteer aid on the home front, as students undertook first aid courses, worked in national kitchens, picked crops in the university holidays, or used their degrees to enter graduate occupations previously closed to them. But Sidgwick struggled with guilt at staying behind. Her youngest brother, Hugh, fed up with what he perceived as the uselessness of his work as a civil servant in the Education Department, had joined up; her friends doing relief work were enduring daily hardships equivalent to his. As she received their letters and posted to France the blankets and toothpaste and candy they requested, she joined the League of Nations Union and lectured to her students and the Birmingham public about how Britain might participate in building a better postwar world—as well, of course, as doing a day job that before the war would have been done by three people. She also traveled back to Oxford very often to join the rota of her mother and sisters caring full-time for her father, whose dementia was steadily worsening. But none of this seemed to her like a sufficiently noble sacrifice—and she keenly felt the widening gulf between her and her brother and closest friends, who were being traumatized by experiences she could not imagine. Her grief only worsened in September 1917, when Hugh was killed at Passchendaele, leaving behind his family and his fiancée, who worked, adjacent to Sidgwick, as a nurse at the University House hospital. Sidgwick felt obliged to put on a brave face, telling friends that she and her family were much more fortunate than others: the word that came back from Hugh’s comrades was that his death had been quick and relatively painless. But she knew that in the last year of his life Hugh had been expressing increasing anger about the pointlessness of the war and the duplicity with which politicians had sold it to the nation, and I read a hollowness into how she sought, by repeating it to others, to tell herself that her brother’s death had to be accepted.

While fighting continued on the Western Front until the proverbial eleventh hour, the wartime coalition that had governed Britain since 1916 did so with an eye also to building the postwar order. It was with this in mind that, in September 1918, University House received an invitation from the Foreign Office: would they like to send a representative to join a British Educational Mission to the United States, a delegation of academics who would meet with American colleagues and politicians to determine how universities might participate in a postwar Anglo-American alliance? Fry was the first choice, but, exhausted from her war work and with responsibilities to her aging parents, she suggested Sidgwick go in her stead. Everything happened very quickly: the five men members of the Mission had already set sail (only belatedly had someone in the Foreign Office suggested that some women delegates would be a useful addition), and Sidgwick scrambled to find someone to cover her teaching for the autumn term, ensure her father’s care was in hand, and tie up her various responsibilities at Birmingham. She and her counterpart, Bedford College English professor Caroline Spurgeon, sailed from Southampton; at the beginning of October they were met at the dock in New York by the senior woman member of the official welcoming committee, Virginia Gildersleeve, Dean of Barnard College. With Gildersleeve as their host and guide, they embarked on a whirlwind tour of over thirty campuses in the northeast and midwest, an itinerary more crowded than that of the men, since they had to squeeze in visits to women’s colleges alongside the predetermined official list of institutions. Sidgwick initially doubted her ability to do the trip justice, but she quickly rose to the challenge, gaining a reputation as an effective public speaker about women’s role in building a new, modern, internationally-connected educational landscape. She marveled at the sense of possibility and optimism in America, at the willingness to invest in women’s higher education, and at the social ease that existed among the students she met at women’s and coeducational colleges. She toured everything she could, museums and hospitals as well as campuses. It was October, and she looked out the train window, admiring the foliage. (A couple of weeks ago, in late October, I took a train from New York to New England and saw a not-so-different view.) The trip was life-changing: a chance to do something, be something, make something away from the violent forces that had ruptured her family and her friendships.

I am not an epidemiologist, but it seems possible that, had Sidgwick not traveled to the US, she might not have caught the deadly strain of influenza that, in 1917–1919, claimed far more lives worldwide, military and civilian, than the war itself. By late 1918, the infection had mutated to a less virulent form in Britain, but American soldiers returning from Europe re-imported the worst strain. The last entry in Sidgwick’s diary records a visit to the Metropolitan Museum in New York, a week before Christmas. Her immune system no doubt compromised by the punishing travel itinerary, she was admitted to the Columbia University Hospital with influenza, and died on December 28. She received the academic equivalent of a state funeral in the Columbia chapel, a High Anglican service (at that time, Columbia was officially affiliated to the US Episcopal Church) with high-ranking politicians, diplomats, and university administrators among the attendees. Her coffin—just like her brother’s—was draped in the Union Jack. In the First World War, dying in the service of one’s country could take many different forms.

Though the chair of the British Educational Mission did not mention Sidgwick or Spurgeon in the memoir he wrote of the trip, Sidgwick’s death sent shockwaves throughout the community of university women. Numb with grief just after the funeral, perched on trunks in a New York hotel room, Gildersleeve and Spurgeon (who had just begun a decades-long romantic relationship of their own) resolved to found the International Federation of University Women in Sidgwick’s memory, tying together groups of graduates across Europe and North America in the name of internationalism and world peace. Its first act was to found the Rose Sidgwick Memorial Fellowship for a British woman pursuing graduate study in the US. At memorial services in Oxford and Birmingham, the tributes to Sidgwick poured in. Sidgwick had left her estate to University House; the new warden, Bertha Orange, decided to spend the money on a fund for books and travel for low-income students. A year later, after their father had died, Sidgwick’s sister Ethel made the trip out to New York to visit her grave at Woodlawn Cemetery in the Bronx. Ethel and their mother had written the inscription on the headstone, and Gildersleeve had seen that it was installed. It read, very simply, “In loving memory of Rose Sidgwick, of the Universities Mission of amity to the United States, died Dec 28, 1918, aged 41 years.”

I have told you Rose Sidgwick’s story on the eve of the centenary of the 1918 Armistice because it is not only those who take up arms who are the casualties of war. Sidgwick did not share the pacifist convictions of her beloved friend Margery Fry, and remained genuinely ambivalent about whether the Great War was a just conflict. Her pride in Hugh’s sacrifice only turned sour after his death. But she, too, was a casualty of war, one whom we might care to remember this special Remembrance Day. The Sidgwick family lost two children to the war. Their loss was as real as that of any other family who lost a child—and as real as the loss that accompanied the fracturing of the University House community; as real as the deprivations endured by the families in France whom Fry and other Friends War Victims Relief volunteers sought to help, who saw their crops burned and trenches dug through their fields; as real as the grief of German-speaking parents who lost their children. Like many of the most ardent members of the League of Nations Union, Sidgwick sought a forgiving settlement with Germany, and a postwar order that would quickly incorporate it within a liberal community of nations. Like them, too, she probably gave less serious thought to the ways that postwar internationalism might align all too neatly with British and French imperial ambitions. But there can be no doubt that she, and the family and friends who survived her, knew that modern warfare does not discriminate among those on whose lives it drops its (literal, figurative) bombs.

The Rose Sidgwick Memorial Fellowship was still going strong in 1933, when members of the Women’s Cooperative Guild adopted the practice of displaying white instead of red poppies to protest the ways that remembrance observances were being used to stoke nationalist fervor and justify rearmament. The following year, the symbol was adopted by the newly founded Peace Pledge Union (nondenominational but founded in connection with the Church of England), which continues to encourage its use to this day. It is above my pay grade to re-litigate the story of the appeasement of Nazi Germany, and the place that popular opposition to rearmament had in that story. And it is somewhat outside the scope of this essay to consider the ways in which this kind of women’s political activism intersected with popular imperialism (which it, like basically every other form of British politics, did). But what the Women’s Cooperative Guild understood—like, I think, the International Federation of University Women, like Rose Sidgwick and her friends—is that if war is in some times and places a necessary evil, it is always and everywhere an evil. There is nothing glorious about war. Mostly it is death, and it is mud. And even when we imagine the antithesis of this particular vision of war bequeathed to us by the Great War—the sanitized, automated drone killings of today’s wars—the pervasive stink of evil endures.

When I was the age Rose Sidgwick was when she took her degree, and I too was living in Oxford, I sang in a Remembrance Sunday service in which the processional hymn was “I Vow to Thee My Country.” I know it’s a state church, but the conflation of nationalist battle fervor with Christianity sickened me then, and has sickened me since, all throughout these years when the red poppy has become ever more commercialized and ever more mobilized as a litmus test of empty, amoral patriotism. For people like Rose Sidgwick and Margery Fry, and many like them who lived through it or didn’t, the Great War gave the lie to patriotism. This year I hope you will join me in remembering them, as well as the countless unnamed casualties of war before and since who were not given a choice about whether to identify emotionally with a conflict fought in their name or on their land or with their bodies. I hope you will join me in doing what you can to resist war and violence in your own lives and communities; in informing yourselves about the wars fought overseas, including those fought in your name; and in asking whether you want to be party to such conflicts. There is far too much anger in our world, as there was a century ago. I hope that, like me, you seek to commit yourselves to peace.


Advice to Those Starting Graduate School

(geared—due to a friend’s request for such advice—to those starting a PhD in the humanities at an elite program in the US)

Always keep a sense of proportion and perspective, and keep alive your contacts with the world outside academia. Few problems you will encounter are unique to academia or uniquely bad within academia; most have much more to do with negotiating the working world and adult life generally, especially in our present blah blah neoliberal precarious gig economy, which affects most of us under 40. Limit your engagement with academic social media and the social life of your department if you find that stuff only dials up your anxiety.

Take advantage of one of the really special things about academia—the opportunity for flexible working—but make it work for you. There will be times in the course of graduate school when your timetable is not your own and you won’t be able to impose limits on your work, but make sure you’re only working 24 hours/day when it is absolutely necessary. If you need to work 9-5 in order to impose boundaries around your work life, then great. But if you work better at other times, that’s great too. You do NOT have to work 40 hours a week, or any other set amount. Intellectual work happens at different paces for different people. You might find it more helpful to make to-do lists/set goals, and stop when you’ve finished them.

Some actors—university administrators, maybe your advisor or other professors—will try to treat you as if the category “student” means you’re not really an adult. But always act like the professional adult you want to be treated as, and insist on that treatment if it isn’t automatically given to you. You have the right to dignity and respect, as do others—don’t defer to someone else simply because they are professionally senior to you, but be polite and professional if you disagree. Act like the dream colleague you want to have when you’re in your dream post-PhD job, including with your grad-student peers. While you may well become good friends with many of your peers, you aren’t obliged to hang out with (or even like) your whole cohort. You are obliged to be civil—unless someone has done something so egregious as not to merit your civility, in which case you shouldn’t give it to them. (This will happen at some point—you may want to think in advance about where your boundaries are.) Like any office, the department will have a rumor mill. If you’re a budding academic politician you may wish to use its powers for good—but otherwise remember that if, for example, you date a classmate, everyone else will soon find out.

When you’re teaching, you are a staff member of the university. Dress professionally (within whatever remit makes you feel comfortable according to cultural norms/gender identity/whatever—I am continually surprised by how flexible these norms seem to be now, even for women). Don’t sleep with your students. Do join the union and insist that the university recognize you as the worker that you are.

Always remember that you have agency, power, and independence. Disagree with your advisor if you want to take your dissertation in a different direction from the one she prefers, or if you have aspirations for your post-PhD life that he hasn’t considered. Disagree with your cohort-mates if they’re causing drama about something that doesn’t merit it. Never lose sight of the job market, though by that I don’t mean freak out about it. Constantly ask yourself where you’d like to be in 10 or 20 years, and, while recognizing that absolutely nothing is guaranteed or absolutely within your control, do your best to stack the odds to make that vision possible. If you want an academic job, it is your responsibility to seek out as many sources of information as you can, attend every professional development workshop your department puts on, and stack up your publications, your teaching experience, your progress on your diss, and your network of contacts. No one else is going to hand all that to you. If you don’t want an academic job, great—and your PhD journey and the scholarship you will produce along the way are just as valid and valuable as anyone else’s. But do what you need to do to gain a realistic picture of how to enter the sector that interests you when you graduate. Have back-up plans. You have the right to make whatever tradeoffs you need to make, and no one cares if you disappoint your advisor by not following the path she has imagined for you—but don’t expect that you can both land a tenure-track job and keep your family in your current city where your non-academic partner is employed (or whatever).

There are limits to what it is reasonable to feel entitled to. If you are fully funded at a top program at a private US university, always remember that you are the most privileged of all possible grad students, and that if you don’t have dependents you are making a comfortable middle-class wage. Think about how you can help those, both within your profession and outside it, who are less fortunate than you.

You will encounter many people trying to peddle their crack system for note-taking or memorizing or organizing information or archive workflow. I can tell you about my generals study strategy or my archives system, but these sorts of things are highly individualized, and through trial and error you will find the system that works for you. Coursework seems important when you start the PhD, but a year after generals you’ll have forgotten all about it. Remember that you will spend most of your time as a grad student researching and writing your dissertation, and teaching. Anything you find stressful or annoying about coursework will soon pass.

In conclusion, I really cannot emphasize enough perspective, autonomy, self-confidence, agency, dignity, and dialing down the drama whenever possible!

A Doubter’s Sermon for Easter Sunday

This Facebook post from Paul Raushenbush is the one Easter message I’ve seen this year that’s really resonated with me. I’ve been going to church for seven years, but in the last couple I’ve felt more as if I was going through the motions, as if the mystery and wonder of the Christian message was being drowned out by the evil in the world, and as if the gulf between me and my friends and co-congregants who are actually Christian and actually believe was wider than ever. This morning in church, while the preacher—an academic theologian—was selling real hard a message of joy and celebration, I was thinking about Mary Magdalene, about the anguish in “They have taken away my Lord, and I do not know where they have laid him.” If you had lost a close friend, perhaps the only one you had, to a painful and humiliating public execution, I don’t know how comforted you would be by the idea that he was “ascending to my Father and your Father, to my God and your God.” Surely you would still feel a great chasm in your life, and surely you would still doubt whether that great cause your friend made such a big song and dance about sacrificing himself to was really worth his abandoning the friends who loved him.

I watched Casablanca for the umpteenth time the other week, and I still think it’s one of the great artistic products of the twentieth century. But thinking about it as a version, in a way, of what’s going on between Jesus and the disciples helps to illuminate the doubting side both of the film’s message of moral clarity and of the message in John’s Gospel. We are told in the film—which was created to encourage Americans to support the Allied war effort—that American self-interest and isolationism (personified by Rick) must be sacrificed to the higher end of helping Victor Laszlo to carry on his great work in the Resistance. In Casablanca, as in the Bible, women are represented as pawns, and it takes some work to recuperate them as fully realized, strong figures with agency. So Rick, having said that he will make the decisions for everyone, tells Ilsa that it is her duty as a woman and a wife to go with Victor and support his noble struggle for justice and righteousness, instead of giving in (perhaps?) to her baser desires to rekindle a sexual and romantic affair with someone who isn’t constantly spouting vague messages about what a hero he is in the great struggle. Of course, the film makes us want to believe that Victor will win against the forces of darkness; it makes us want to continue that work; it makes us want to leap to our feet with tears in our eyes when they sing the Marseillaise in the café. But we also—at least, those of us who are modern European historians by trade—are aware of how much rhetorical and political work went into making World War II into a moral war of good against evil, and how much has been done since in the name of that narrative to paper over the sins of racism and imperialism and fascism deep within American and French and British politics and culture before, during, and long after the war. It is the work that story of the war has done that makes it hard today, in our communities, perhaps among those with whom we have celebrated the resurrection of our Lord, or the freedom of our people from bondage, these last days, even to see the evil and sin that is still co-constitutive with all that stuff about holy miracles and triumphing over death.

If any of the stuff in those stories that we rehearse year in, year out—those stories I have memorized word-for-word after seven years of churchgoing—ever really happened, I wonder too if Mary Magdalene, when she arrived at the tomb early that morning to take care of her dead friend’s body, was irritable and frustrated, empty and bereft, and if the disappearance of Jesus’s body felt like just another thing that she and the other Mary and any of the men who could be bothered to stick around and help had been left behind to deal with; while their great friend, who was as difficult and self-absorbed as he was charismatic and kind, had preferred heroics to the down-in-the-dirt daily work of sticking around to care for the sick and the poor. For, these days, when I kneel and someone recites a familiar list of prayers for church leaders, political leaders, zones of conflict, the homeless, those in the community who are sick and dying, it feels a bit like what Mary Magdalene might have felt when she saw the empty tomb—the divine presence has left us here, rather haplessly, to carry on its work, with little guidance and without even a body to mourn. It’s hard to imagine that a ghostly, glowing presence popping up from time to time to offer elliptical neo-Platonic remarks about ascending to the Father could have substituted in the minds of the disciples for the kinds of practical, down-and-dirty work for which Jesus became famous during his lifetime. What I still find compelling about the Jesus story is how practically-minded his miracles are. They’re about making sure people have enough to eat, a lot of the time; about easing their physical pain; about being kind to people—I think especially women—who were used to fading into the background; and along the way about giving life to a new kind of radical political message. I think, if I were Mary Magdalene, and it was that practical message—making a real difference in ordinary people’s lives, not least my own—that had drawn me to follow Jesus, I might be thinking, “Why did you have to go and make everything so much more difficult? Now how am I going to feed every beggar who calls at the door? What am I going to tell everyone? Yeah, yeah, I know, you have to ascend to the Father, you said, but does that mean you’re too good to do the washing up? There’s a lot of it; we had to feed a lot of beggars this week.”

Whether Jesus lived or died, whether he was or wasn’t co-equal with the Father, whether we celebrate his resurrection or not, makes awfully little difference to whether all the beggars on this planet, stuffed full to bursting as it is, have enough to eat, or whether we are capable of passing on to our children an earth worth inheriting. I am still inspired by what Jesus called on his followers to do, and by who he called on them to be. But even on the first Easter Sunday (inasmuch as there was one, etc.) Mary Magdalene still had to go home and get dinner on the table, perform emotional labor for all the male disciples as well as process her own grief, and get on with doing the work of living here on earth. She didn’t get to—we don’t get to—ascend to the Father. We have so much work to do, and somehow we have to find the strength to carry on.

Me, Too; or, Gender as a Category of Historical Analysis

The first week of my PhD, I was entering the elevator in the history department building when a male grad student—someone whom I had already identified as a bit weird, a bit icky, to be around—followed me in. We nodded hello. The doors closed. There was a silence, which he broke. “So, you’re the one who works on sex,” he said. “No,” I said flatly, “I don’t”—for not only did every instinct I possessed scream that being alone in an elevator with a man who has just said the word “sex” is a situation that has to be shut down as quickly and coldly as possible; in those days, having just come back to the States from Oxford, I had also absorbed certain preconceptions about working on gender and sexuality. I thought that leading with these subjects would mean being taken less seriously as a scholar, that the only way in with the Big Guns was to purport to be an intellectual historian. And I knew from experience, honed since I first started going to conferences and talking about my work in public fora online as an undergrad, that the “sexuality” research-interest box is so all-consuming that it is very easy to get typecast, very easy for colleagues to forget that you do anything else, very easy to find yourself only talking to interlocutors who do exactly the same thing—and who maybe take just a little too much prurient interest in rehearsing, say, the content of nineteenth-century pornography or the exploits of the men who populated London’s cruising grounds of yore.

This is, of course, a “me too” story, one of the large number of anecdotes women have been sharing—mostly to sympathetic audiences on Facebook—in the last couple of days, I suppose as a consciousness-raising and solidarity-building exercise. Others have expressed sentiments that are variations on the theme: that they feel guilty saying “me too” because their daily experiences of sexism have never shaded into the horrific, the violent; that they are looking for a form of action that is concrete rather than symbolic; that they are caught in the cognitive dissonance between the righteous moral outrage of left-leaning metropolitan-elite social media and the seemingly impossible task of broaching subjects like this with their fathers or their brothers or their high-school classmates. Scrolling through my news feed and reading the litany of “me too” and the periodic expansions upon it, though, what came to mind were the other things I think my anecdote about the Fayerweather elevator suggests: the ambivalence I’ve felt throughout grad school about whether to identify as a gender historian, and my simultaneous frustration with sexism in intellectual history; the ways that my research into gender and sexuality has often either symbolized or sublimated my personal relationship to gender and sexuality; and the way that experience has trained me to flatly shut down, run away from, and view with deep suspicion forever any man with whom I have a slightly unpleasant interaction. It doesn’t feel safe—for one’s career or for one’s emotional and mental health—to leave oneself open to the possibility that something might go wrong. (And, indeed, years later I learned that this particular grad student had a history of violating women’s boundaries in far more inappropriate ways, and that he did so consciously and not through social ineptitude—much as I later learned that the senior academic I backed away from because he really, really wanted to stand in the corner of a classroom and talk to me about how we had attended the same Oxford college fifty years apart also had a history of harassment.)

So, three years on, I lead with my research interest in gender, because I am not in Oxford anymore, and also because I think times have changed across institutions since I first started doing research in modern British history. I was at a discussion yesterday about a new monograph in economic and political history, and multiple participants—even those who don’t consider themselves to work on gender—suggested ways that the book might have incorporated a discussion of gender. That would just not have happened five, six, seven years ago, when I first started to sit in the corner and listen to faculty and grad students talk about new historical research. (Maybe some will contradict me, but it didn’t happen in the places where I was at that time.) A colleague once said that, when deciding which of her historical interests to pursue in graduate study, she made her decision based on which group of historians she’d most like to be in rooms talking with. The openness of modern British historians to a wide variety of thematic and methodological approaches, particularly at the present moment, is one reason why I’m loyal to my field. My colleagues who work on political thought can also think about gender; my colleagues who work on neoliberalism are alive to culture as much as to economics. (It strikes me that this is one reason to continue to promote national histories, even after the “global/international turn”—they can serve as useful containers in which to put an eclectic set of methodological approaches, which wouldn’t be the case in a field that is primarily organized around a methodology, such as intellectual history.)

I also find, as I gain more expertise in my subject, that I have more to offer to the wider public discussion we are currently having about how particular norms and practices surrounding gender structure our society. In due course, naturally, the outraged Twittersphere will move on to something else, but right now gender is having its moment. I became interested in the ways that gender difference, and gender segregation, loom large in elite British culture and the culture of elite educational institutions in particular because I have spent a significant part of my adult life living within elite British culture and its educational institutions, and it turns out that there are highly specific historical explanations for why this culture works the way it does. But once you’re attuned to reading documents for evidence of how gender works, you see it as much in the Greek-letter organizations of American universities, in the Boy Scouts, in the President, in your own social and professional networks, as you do in nineteenth-century single-sex colleges and student societies. The disadvantage of this, of course, is that you can’t unsee it—and I find that, as a woman, it makes it harder for me to relate normally to men, to see them as anything other than either research subjects or potential predators. This is a state of mind I find unpleasant to live within, and I’d like to find some way of moving past it.

They say the public have had enough of experts. And that may be true, but sometimes I think we experts are not sufficiently imaginative about how we might reach people with our expertise and use it for good. Expertise is not something best promulgated through broadsheet op-eds or through written work intended largely for already sympathetic audiences; it’s not only a matter of theorizing about something abstract like, say, global economics, which seems removed from the experience of daily life; nor is it always about having more facts and disseminating them from the top down. Instead, experts can encourage non-academics to think analytically about something they might not have thought about before, and then think with them as they bring that analytic tool to bear on new pieces of information or experiences or feelings. As it happens, this practice has a name: it’s called teaching.

Thus far in my academic career I have given only three undergraduate lectures. If I do not win the academic-jobs lottery, I may never give another. But in each one, I have found a way of telling the story about the past dearest to my heart, the one where my interest in the past began. In each one, I have told fifty-odd young people new to thinking analytically about the past that our modern conception of homosexuality is historically constructed and contingent. I have given them some heroes overcoming adversity that they can take away with them, but also some ways in which the story is more complicated and less satisfying than that. Every semester there are a couple of students who actually care—who use the content from that lecture in their final paper, who come up to me after class and ask a follow-up question. Hopefully, though, in still other students the presentation of a new way of thinking about a familiar topic plants a seed that they might, unconsciously, come back to later, even if they no longer connect the thought to that particular lecture or to the history of nineteenth-century Europe. Many forms of psychotherapy hold that becoming consciously aware of patterns of behavior or ideas or emotions that you may have unconsciously held since early childhood, and of how those patterns might have originated, can help you to move on from the ones that are unhealthy or unhelpful and lay down new ones. The more that we can point to the things we read in the news—or, more important, to every single interaction we have every day—and say “look, gender is happening here,” the more we can recognize harmful behavior when we see it, and move outside the boxes of gendered behavior that imprison and hurt everyone.