University Days–Redux

Photo by Rakicevic Nenad

When I was teaching college English courses, my best students, the ones who really paid attention and were hungry for knowledge and ideas, would often come up to me after a class and say something like, “You brought up the French Revolution today while we were talking about William Wordsworth. This morning, in my history class, Professor X also talked about it. And yesterday, in Sociology, Professor Y mentioned it, too. Did you guys get together and coordinate your lectures for this week?”

Of course, the answer was always “no.” Most professors I know barely have time to prepare their own lectures, much less coordinate them along the lines of a master plan for illustrating Western Civilization. It was hard, however, to get the students to believe this; they really thought that since we all brought up the same themes in our classes, often in the same week, we must have done it on purpose. But the truth was simple, and it wasn’t magic or even serendipity. The students were just more aware than they had been before, and allusions that had passed by them unnoticed in earlier days were now noteworthy.

I’ve experienced something of this phenomenon myself in recent days, while reading Colin Tudge’s book The Tree and listening to Karl Popper’s The Open Society and Its Enemies–two books, one on natural science and the other on philosophy, that would seem to have few if any common themes. In this case, the subject both authors touched on was nomenclature and definitions. Previously, I would have never noticed this coincidence, but now I find myself in the same position as my former students, hyper-aware of the fact that even seemingly unrelated subjects can have common themes.

There’s a good reason why I am experiencing what my students did; I am now myself a student, so it makes sense that I’d see things through their eyes. All of which leads me to my main idea for this post: University Redux, or returning to college in later life. It’s an idea that I believe might just improve the lives of many people at this very strange point in our lives.

I happened upon the concept in this way: after five or so years of retirement, I realized that I had lost the sense of my ikigai–my purpose in life. I am not exactly sure how that happened. When I took early retirement at the end of the school year in 2015, I had grand ideas of throwing myself into writing and research projects. But somehow I lost the thread of what I was doing, and even more frightening, why I was doing it. The political climate during the past few years certainly didn’t help matters, either. And so I began to question what it was that I actually had to offer the wide world. I began to realize that the answer was very little indeed.

Terrified at some level, I clutched at the things that made me happy: gardening, pets, reading. But there was no unifying thread between these various pursuits, and I began to feel that I was just a dilettante, perhaps even a hedonist, chasing after little pleasures in life. Hedonism is fine for some people, but I’m more of a stoic myself, and so the cognitive dissonance arising from this lifestyle was difficult for me to handle. And then, after drifting around for three or four years, I discovered a solution.

A little background information first: I have a Ph.D. in English, and my dissertation was on the representation of female insanity in Victorian novels. I’ve published a small number of articles, but as a community college professor, I did not have the kind of academic career that rewarded research. (I should say I tried to throw myself into academic research as a means of finding my ikigai, to no avail. I wrote about that experience here.) As a professor, I taught freshman English, as well as survey courses, at a small, rural community college. Most of my adult life revolved around the academic calendar, which, as a retiree, usually left me feeling aimless, even bereft, when my old colleagues returned to campus in the fall, while I stayed at home or headed off on a trip.

A year and a half ago, however, I found my solution, and although I’ve had a few bumps in the road, I am generally satisfied with it. Armed with the knowledge that I was, intellectually at least, most fulfilled when I was a college student, I have simply sent myself back to college. Now, I don’t mean by this that I actually enrolled in a course of study at a university. I did, in fact, think about doing so, but it really made little sense. I don’t need another degree, certainly; besides, I live in an area that is too remote to attend classes. Yet I realized that if there was one thing I knew how to do, it was how to create a course. I also knew how to research. So, I convinced myself that living in the world of ideas, that cultivating the life of the mind, was a worthy pursuit in and of itself, and I gave myself permission to undertake my own course of study. I sent myself back to college without worrying how practical it was. I relied on my own knowledge and ability (Emerson would be proud!), as well as a certain degree of nosiness (“intellectual curiosity” is a nicer term), and I began to use my time in the pursuit of knowledge–knowing, of course, that any knowledge gained would have no value in the “real” world. It wouldn’t pay my rent, or gain me prestige, or produce anything remotely valuable in practical terms.

This last bit was the hardest part. I was raised to believe, as are most people in American society, that one must have practical skills, the proof of which is whether one can gain money by exercising them. If you study literature, you must be a teacher of some kind. If you play music, you must get paying gigs. If you like numbers, then you should consider engineering, accounting, or business. The rise of social media, where everyone is constantly sharing their successes (and academics are often the worst in this respect), makes it even more difficult to slip the bonds of materialism, to escape the all-consuming attention economy. My brainwashing by the economic and social order was very nearly complete: it was, in other words, quite hard for me to give myself permission to do something for the sake of the thing itself, with no ulterior motives. I had to give myself many stern lectures in an effort to recreate the mindset of my naive twenty-year-old self, saying, for example, that just reading Paradise Lost to know and understand it was enough; I didn’t have to parlay my reading and understanding into an article, a blog, or a work of fiction. (Full disclosure: having just written that, I will point out that I did indeed write a blog about Paradise Lost. You can’t win them all.) One additional but unplanned benefit of this odd program of study is that it fit in quite well with the year of Covid lockdown we’ve all experienced. Since I was already engaged in a pursuit with no practical aim, the enforced break in social life really didn’t affect me that much.

What does my course of study look like? Reading, mainly, although I know YouTube has many fine lectures to access. I read books on natural science (trying to fill a large gap produced during my first time at college), as well as history; this year, the topic has been the Franco-Prussian War and the Paris Commune. I study foreign languages on Duolingo (German, French, a bit of Spanish) while occasionally trying to read books in those languages. I have participated in a highly enjoyable two-person online reading group of The Iliad and The Odyssey (thanks, Anne!). Thanks to my recent discovery of Karl Popper, I foresee myself studying philosophy, perhaps beginning with Plato and Aristotle. I’ve taken FutureLearn classes on Ancient Rome, Coursera classes on The United States through Foreign Eyes, and several others. I’ve listened and re-listened to various In Our Time podcasts. I have taxed the local library with my requests for books from other network libraries, and I swear some of those books haven’t been checked out in a decade or more. To be honest, I don’t understand a good part of what I read, but this doesn’t bother me as it did the first time around. If I’ve learned one thing from serving on the local city council, it’s that you don’t have to understand everything you read, but you do have to read everything you’re given. Sometimes understanding comes much later, long after the book is returned–and that’s okay.

I’m not sure where this intellectual journey will lead, or if it will in fact lead anywhere. But I’m satisfied with it. I think I’ve chanced upon something important, something which society with its various pressures has very nearly strangled in me for the last thirty years: the unimpeded desire for knowledge, the childlike ability to search for answers just because, and the confidence to look for those answers freely, unattached to any hope of gain or prestige. It takes some getting used to, rather like a new diet or exercise program, but I’m pleased with the results at last, and I am enjoying my second dose of college life.

How the Study of Literature Could Save Democracy

Beowulf MS, picture from Wikipedia

Usually, I am not one to make grand claims for my discipline. There was a time, back when I was a young graduate student in the 1980s, that I would have; perhaps even more recently, I might have argued that understanding ideology through literary theory and criticism is essential to understanding current events and the conditions we live in. But I no longer believe that.

Perhaps in saying this publicly, I’m risking some sort of banishment from academia. Maybe I will have to undergo a ritual in which I am formally cashiered, like some kind of academic Alfred Dreyfus, although instead of having my sword broken in half and my military braids ripped to shreds, I will have my diploma yanked from my hands and trampled on the ground before my somber eyes. Yet unlike Dreyfus, I will have deserved such treatment, because I am in fact disloyal to my training: I don’t believe literary theory can save the world. I don’t think it’s necessary that we have more papers and books on esoteric subjects, nor do I think it’s realistic or useful for academics to participate in a market system in which the research they produce becomes a commodity in their quest for jobs, promotions, or grant opportunities. In this sense, I suppose I am indeed a traitor.

But recently I have realized, with the help of my friend and former student (thanks, Cari!), that literature classes are still important. In fact, I think studying literature can help save our way of life. You just have to look at it this way: it’s not the abstruse academic research that can save us, but rather the garden-variety study of literature that can prove essential to preserving democracy. Let me explain how.

I’ll begin, as any good scholar should, by pointing out the obvious. We are in a bad place in terms of political discourse–it doesn’t take a scholar to see that. Polarizing views have separated Americans into two discrete camps with very little chance of crossing the aisle to negotiate or compromise. Most people are unwilling to test their beliefs, for example, preferring to cling to them even in the face of contradictory evidence. As social psychologists Elliot Aronson and Carol Tavris point out in a recent article in The Atlantic, “human beings are deeply unwilling to change their minds. And when the facts clash with their preexisting convictions, some people would sooner jeopardize their health and everyone else’s than accept new information or admit to being wrong.” They use the term “cognitive dissonance,” which means the sense of disorientation and even discomfort one feels when considering two opposing viewpoints, to explain why it is so hard for people to change their ideas.

To those of us who study literature, the term “cognitive dissonance” may be new, but the concept certainly is not. F. Scott Fitzgerald writes, in an essay which is largely forgotten except for this sentence, “the test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function” (“The Crack-Up,” Esquire Magazine, February 1936). In addition, cognitive dissonance isn’t that far removed from an idea expressed by John Keats in a letter he wrote to his brothers back in 1817. He invents the term “Negative Capability” to describe the ability to remain in a liminal state of doubt and uncertainty without being driven to come to any conclusion and definitive belief. Negative capability, in other words, is the capacity to be flexible in our beliefs, to be capable of changing our minds.

I believe that the American public needs to develop negative capability, lots of it, and quickly, if we are to save our democracy.

But there’s a huge problem. Both Fitzgerald and Keats believe that this function is reserved only for geniuses. In their view, a person is born with this talent for tolerating cognitive dissonance: you either have it–in which case you are incredibly gifted–or you don’t. In contrast, Aronson and Tavris clearly believe it’s possible to develop a tolerance for cognitive dissonance: “Although it’s difficult, changing our minds is not impossible. The challenge is to find a way to live with uncertainty…” While their belief in our ability to tolerate cognitive dissonance and to learn from it is encouraging, it is sobering that they do not provide a clear path toward fostering this tolerance.

So here’s where the study of literature comes in. In a good English class, when we study a text, whether it’s To Kill a Mockingbird or Beowulf, students and teacher meet as more or less equals over the work of literature in an effort to find its meaning and its relevance. Certainly the teacher has more experience and knowledge, but this doesn’t–or shouldn’t–change the dynamic of the class: we are all partners in discovering what the text has to say in general, and to us, specifically. That is our task. In the course of this task, different ideas will be presented. Some interpretations will be rejected; some will be accepted. Some will be rejected, only to be later accepted, even after the space of years (see below for an example).

If we do it well, we will reach a point in the discussion where we consider several different suggestions and possibilities for interpretation. This is the moment during which we become experts in cognitive dissonance, as we relish interpretive uncertainty, examining each shiny new idea and interpretation with the delight of a child holding up gorgeously colored beads to the light. We may put a bead down, but it is only to take up another, different one–and we may well take up the discarded bead only to play with it some more.

The thing that makes the study of literature so important in this process is that it isn’t really all that important in the grand scheme of things. To my knowledge, no one has ever been shot for their interpretation of Hamlet; the preservation of life and limb does not hang on a precise explanation of Paradise Lost. If we use the study of literature as a classroom designed to increase our capacity for cognitive dissonance, in other words, we can dissipate the highly charged atmosphere that makes changing our minds so difficult. And once we get used to the process, when we know what it’s like to experience cognitive dissonance, it will be easier for us to tolerate it in other parts of our lives, even in the sphere of public policy and politics.

If I seem to be writing with conviction (no cognitive dissonance here!), it’s because I have often experienced this negative capability in real time. I will give just two examples. The first one occurred during a class on mystery fiction, when we were discussing the role of gossip in detective novels, which then devolved into a discussion on the ethics of gossip. The class disagreed violently about whether gossip could be seen as good or neutral, or whether it was always bad. A loud (and I mean loud!) discussion ensued, with such force that a janitor felt compelled to pop his head into the classroom–something I had never seen happen before or since–to ask if everything was okay. While other teachers might have felt that they had lost control of the classroom, I, perversely, believe that this might have been my most successful teaching moment ever. That so many students felt safe enough to weigh in, to argue and debate passionately about something that had so little real importance, suggested to me that we were exercising and developing new critical aptitudes. Some of us, I believe, changed our minds as a result of that discussion. At the very least, I think many of us saw the topic in a different way than we had to begin with. This, of course, is the result of experiencing cognitive dissonance.

My second example is similar. At the end of one very successful course on Ernest Hemingway, my class and I adjourned for the semester to meet at a local bar, at which we continued our discussion about The Sun Also Rises. My student Cari and I got into a very heated discussion about whether the novel could be seen as a pilgrimage story. Cari said it was; I vehemently disagreed. The argument was fierce and invigorating–so invigorating, as a matter of fact, that at one point a server came to inquire whether there was something wrong, and then a neighboring table began to take sides in the debate. (For the record, I live in Hemingway country, and everyone here has an opinion about him and his works.) Cari and I left the bar firmly ensconced in our own points of view, but a couple of years ago–some three years after the original argument occurred–I came to see it from Cari’s point of view, and I now agree with her that The Sun Also Rises can be seen as a sort of pilgrimage tale. It took a while, but I was able to change my mind.

It is this capacity to change one’s mind, I will argue, that is important, indeed, indispensable, for the democratic process to thrive.

In the end, it may well be that the chief contribution that good teachers of literature make to culture is this: we provide a safe and accessible place for people to learn what cognitive dissonance feels like, and in doing so, we can help them acquire a tolerance for it. This tolerance, in turn, leads to an increase in the ability to participate in civil discourse, which is itself the bedrock of democratic thought and process. In other words, you can invest in STEAM classes all you want, but if you really want to make people good citizens, do not forget about literature courses.

In view of this discovery of mine, I feel it’s my duty to host a noncredit literature class of sorts in the fall, a discussion-type newsletter that covers the great works of English literature–whatever that means–from Beowulf to the early Romantic period, in which discussion is paramount. If you’re interested or have suggestions, please let me know by commenting or messaging me, and I’ll do my best to keep you in the loop.

And in the meantime, keep your minds open! Cognitive dissonance, uncomfortable as it is, may just be what will keep democracy alive in the critical days to come.

Elegy for Eavan Boland, 1944-2020

The only modern poet I have ever understood is Eavan Boland.

If you recognize that sentence as an echo of Boland’s wonderful poem “The Pomegranate,” you might share my feelings for her work. Boland’s death will probably not get much attention outside of Ireland, but I feel it’s right for me to acknowledge it here, where I talk about the things that are important to me.

In a time of so many losses, perhaps it’s silly to focus on one death, yet I do it out of selfishness, for myself and for what this poet’s work has meant to me. First, a confession: I am not a poet, nor am I really a great reader of poems. As a professor of literature, I have studied poetry, but I feel much more comfortable with the works of Wordsworth, Arnold, Shakespeare, even (dare I say it?) Milton than with contemporary poetry. To be honest, despite my elaborate education, I really don’t understand contemporary poetry–so I must not really “get” it. I’m willing to accept that judgment; after all, there are a lot of things I do get, so it’s a kind of trade-off. I realize I’m not a Michael Jordan of literary studies, which is why I rarely comment on poetry that was written after, say, 1850. But I feel it’s only right to mention here my attraction to, and reverence for, Boland’s poems, one of which (“This Moment“) I used for years to teach poetic language to my freshman and sophomore college students.

I first noticed Boland’s poems in the mid-90s, when I was teaching full time as an adjunct professor, still hoping to make my mark–whatever that was supposed to be–on the world. I had subscribed to the New Yorker, back in the days when it was read for literary, not political, reasons. This was during a period when poets and writers who had submitted their work and not gotten it accepted for publication actually protested outside the offices of the magazine, stating that their work was just as bad as what was being published within the pages of the New Yorker and demanding equal time. (I thought about looking this story up on the internet, because, in an age of so much fake news, everything is easily verifiable, but forgive me–I decided not to. If the story about these outraged mediocre writers is not true, I don’t want to know about it. I love it and cling to it, and it does no one any harm, after all.)

I was very much aware of the opacity of much that was published in the New Yorker, and one evening after the children were in bed, having recently heard that story about the protesters, I shared it with my husband. To demonstrate how unreadable the stuff that was being published was, I grabbed a copy off our end table, thumbed through it until I found a poem, and started to read it out loud. After two or three lines, however, I stopped in mid-sentence. My husband said, “What? Why did you stop?” I looked up slowly, reluctant to pull my eyes away from the poem, and said, “It started to make sense to me. Actually, this is really good.”

I am not sure which poem of hers I was reading that evening. Perhaps it’s best that I don’t know, because it drives me to read so many of her poems, always searching for the Ur-poem, that first poem of hers that drove me to appreciate so much more of what she’s written. Boland’s poetry seems to me to explore the intersection of place and person, of history and modernity, in simple, sometimes stark, language. I love it for its depth, not for its breadth (sorry, Elizabeth Barrett Browning). I love the way it sinks its roots deep into the past, all the way back to myths and legends sometimes, yet still manages to retain a hold on the very real present.

Eavan Boland died yesterday, April 27, at the age of 75. You can read about her influence here, in an article by Fintan O’Toole of the Irish Times. Her poems can be found online.


Chief Petoskey sporting an Ecohat in Petoskey, Michigan.


We are facing an ecological emergency, and too little is being done to address the consequences of climate change. As individuals, our actions may seem inadequate. Yet every action, however small, can lead to something bigger. Change comes only as a result of collective will, and we can demonstrate that will by showing that we desire immediate political, social, and economic action in the face of global climate change.

Ecohats are not a solution, but they are a manifestation of will made public. Based on the pussyhat, a public display of support for women’s rights, Ecohats display support for immediate political action to address the need for systemic change to deal with climate change. They are easy to create; simply knit, crochet, or sew a hat in any shade of green to show your support for the people and organizations that are dedicated to addressing the issue of climate change, then wear it or display it with pride and dedication.

We can’t all be Greta Thunberg, but we can show our support for those people and organizations that are tirelessly working to address climate change, like Citizens’ Climate Lobby, Center for Biological Diversity, and others.

Please consider making, wearing, and displaying an Ecohat to show your support!


Ernest Hemingway is also wearing an Ecohat!

Six Rules for Reading (and Enjoying) Julius Caesar

I have always assumed that the best example of my argument that most people get Shakespeare plays all wrong would be Romeo and Juliet. But I have to admit I was mistaken. In fact, I think it is safe to posit that no other Shakespeare play is so maligned and misunderstood as Julius Caesar.

I think this is largely due to the way we teach the play in the United States. Of course, because we do teach the play in high school, Julius Caesar has always gotten tremendous exposure: almost everyone I’ve met has been forced to read the play during their high school career. In fact, I think it’s still on high school reading lists today. But that’s probably also exactly why it’s so misunderstood.

I’m not blaming high school teachers, because by and large they’re told to teach these plays without any adequate preparation. I suppose if anyone deserves blame, it’s the colleges that train teachers. But all blame aside, before I talk about what a great play it really is, and what a shame it is that most people summarily dismiss  Julius Caesar without ever really considering it, let’s look at why this has happened.

First of all, it goes without saying that making someone read a play is not a great way to get him or her to like it. Especially when that play is over 400 years old and written in (what seems to be) archaic language. But a still greater problem is that there is a tendency to use the play to teach Roman history, which is a serious mistake. (American high schools are not alone in this; Samuel Taylor Coleridge, for example, criticized the play for not being realistic in its portrayal of Roman politics back in the early 1800s.) In short, far too many people associate this play with a bunch of men showing a great deal of thigh or swathed in endless yards of material, flipping their togas around like an adolescent girl tosses her hair over her shoulder. It’s all too distracting, to say the least.

So, in order to set us back on the right track and get more people to read this fine play,  I’ve made a little list of rules to follow that will help my readers get the most enjoyment, emotional and intellectual, from the play.

Rule Number One: Forget about Roman history when you read this play. Forget about looking for anachronisms and mistakes on the part of Shakespeare’s use of history. Forget everything you know about tribunes, plebeians, Cicero, and the Festival of Lupercalia. The fact is, the history of the play hardly matters at all. Rather, the only thing that matters is that you know in the beginning moments that Caesar will die and that, whatever his motives and his character, Marcus Brutus will pay for his part in Caesar’s assassination with his own life and reputation.

Rule Number Two: Recognize that this is one of Shakespeare’s most suspenseful plays. Our foreknowledge of events in the play, far from making it predictable and boring, provides an element of suspense that should excite the audience. Here we can point to Alfred Hitchcock’s definition of suspense, in which he explains that it’s the fact that the audience knows there’s a bomb hidden under a table that makes the scene so fascinating to watch, that makes every sentence, every facial expression count with the audience. It’s the fact that we know Julius Caesar is going to die on the Ides of March that makes his refusal to follow the advice of the soothsayer, his wife Calpurnia, and Artemidorus so interesting. We become invested in all of his words and actions, just as our knowledge that Brutus is going to lose everything makes us become invested in him as a character as well. A good production of this play, then, would highlight the suspenseful nature within it, allowing the audience to react with an emotional response rather than mere intellectual curiosity.

Rule Number Three: Understand that this play is, like Coriolanus, highly critical of the Roman mob. Individuals from the mob may be quite witty, as in the opening scene, when a mere cobbler gets the better of one of the Roman Tribunes, but taken as a whole, the mob is easily swayed by rhetoric, highly materialistic, and downright vicious. (In one often-excluded scene–III.iii–a poet is on his way to Caesar’s funeral when he is accosted by the crowd, mistaken for one of the conspirators, and carried off to be torn to pieces.) It’s almost as if this representation of mob mentality–the Elizabethan equivalent of populism, if you will–is something that Shakespeare introduces in 1599 in Julius Caesar, only to return to it nine years later to explore in greater detail in Coriolanus.

Rule Number Four: Recognize that this play, like many of Shakespeare’s plays, is misnamed. It is not about Julius Caesar. It’s really all about Marcus Brutus, who is the tragic hero of the play. He is doomed from the outset, because (1) it is his patriotism and his love of the Roman Republic, not a desire for gain, that drives him to commit murder; (2) he becomes enamored of his own reputation and convinces himself that it is his duty to commit murder and to break the law; (3) he falls victim to this egotism and loses everything because of it. Audience members really shouldn’t give a hoot about Julius Caesar; he’s a jerk who gets pretty much what he deserves. But Brutus is a tragic hero with a tragic flaw, a character whose every step, much like Oedipus’s, takes him further and further into his own doom. The soliloquies Brutus speaks are similar to those in Macbeth, revealing a character that is not inherently bad but rather deficient in logic, self-awareness, and respect for others. In fact, in many ways, it’s interesting to look at Julius Caesar as a rough draft not only of Coriolanus but of Macbeth as well.

Rule Number Five: Appreciate the dark comedy in the play. Shakespeare plays with his audience from the outset, in the comic first scene between the workmen and the Roman Tribunes, but another great comedic scene is Act IV, scene iii, when Brutus and Cassius meet up before the big battle and end up in an argument that resembles nothing more than a couple of young boys squabbling, even descending into a “did not, did so” level. This scene would be hilarious if the stakes weren’t so high, and if we didn’t know that disaster was imminent.

Rule Number Six: Experience the play without preconceptions, without the baggage that undoubtedly is left over from your tenth-grade English class. Once you do this, you’ll realize that the play is timely. It explores some really pertinent questions, ones which societies have dealt with time and time again, and which we are dealing with at this very moment. For example, when is it permissible to commit a wrong in order for the greater good to benefit? (Surely Immanuel Kant would have something to say about this, along with Jeremy Bentham.) How secure is a republic when its citizens are poor thinkers who can be swayed by mere rhetoric and emotionalism instead of reason? What course of action should be taken when a megalomaniac takes over an entire nation, and no one has the guts to stop him through any legal or official means?

In the end, Brutus’s tragedy is that he immolates his personal, individual self in his public and civic responsibilities. Unfortunately, it is the inability to understand this sacrifice and the conflict it creates, not the play’s historical setting in a distant and hazy past, that has made it inaccessible for generations of American high school students. Too many decades have gone by since civic responsibility has been considered an important element in our education, with the sad but inevitable result that several generations of students can no longer understand the real tragedy in this play, which is certainly not the assassination of Julius Caesar.

But perhaps this is about to change. In the last few months, we’ve been witnessing a new generation teaching themselves about civic involvement, since no one will teach it to them. And as I consider the brave civic movement begun by the students from Marjory Stoneman Douglas High School, I am hopeful that from now on it’s just possible that reading Julius Caesar could become not a wasted module in an English class, but the single most important reading experience in a high-school student’s career.



Teaching Behind the Lines

French Resistance Fighters putting up posters.  Image from “History in Photos” Blog (available here)

It’s been a year now since the election, and here I am, still fighting off a sense of futility and hopelessness about the future. During that time, the United States has pulled out of the Paris Accord in an astounding demonstration of willful ignorance about climate change, suffered a spate of horrific mass murders due to lax gun laws, and threatened nuclear war with North Korea. Suffice it to say that things are not going well.

But I should point out that the emphasis in my first sentence should be on the word “fighting,” because that’s what I’m doing these days: in my own small way, I’m waging a tiny war on some of the ignorance and egotism that seem to be ruling my country. Somewhere (I can’t find it anymore, and perhaps that’s just as well), the French novelist Léon Werth said that any action taken against tyranny, no matter how small, no matter how personal, helps to make things better. I’ve taken his words to heart, and I’m using this space to take stock of what I’ve done in the last year. I do this not to brag–far from it, because I know I’ve done far too little–but to remind myself that although I feel powerless too much of the time, I am not quite as powerless as I seem.

Let me begin, however, by saying what I haven’t done. I have not run for office. I did that in 2012, perhaps having had an inkling that things were not going well in my part of the country, but I was crushed by an unresponsive political system, apathy, and my own supreme unsuitability for the task. I am not ready to run for office again. In fact, I may never be ready to run again. I did write about my experience, however, and over the past year, I have encouraged other people, specifically women, to run for office. I’ve talked to a few activist groups about my experiences, and perhaps most important of all, I’ve donated to campaigns.

The thing I’ve done that merits any kind of discussion, however, is what I would call “resistance teaching”: going behind the lines of smug, self-satisfied ignorance, and using any tools I have to fight it. I still believe, naive as I am, that education can fight tyranny, injustice, and inequality. So I have engaged in a few activities that will, I hope, result in creating discussions, examining benighted attitudes, and opening up minds. I haven’t done anything too flamboyant, mind you–just a few actions that will hopefully develop into something more tangible in the months to come.

Here is my list:

  1. In spite of feeling gloomy about the future, I’ve continued with my writing, because I felt that even in difficult times, people should concentrate on making art. I self-published my second novel, and I wrote about it here, explaining why self-publishing can be an act of resistance in and of itself.
  2. I began to translate a novel about World War I, written by Léon Werth. I am now nearing the end of my second revision of the translation. I have submitted a chapter of it to several fine magazines and received some nice rejection letters. I will be using my translation to present a short paper on World War I writing and Hemingway at the International Hemingway Conference in Paris this summer.
  3. I’ve traveled–quite a bit. I went to Italy, to Wales, to France, to Dallas, to Boston, and some other places that I can’t remember now. Traveling is important because it breaks down barriers, intellectual as well as political. For example, in France I learned that while we Americans thought of Emmanuel Macron as a kind of savior for the French, he was viewed with some real skepticism and even fear by his electorate. Sure, he was better than Marine Le Pen–but he was still an unknown quantity, and most French people I met expressed some degree of hesitation about endorsing him.
  4. I directed a play for my community theatre group. Although it was hard and very time-consuming, I discovered that I really believe in the value of community theatre, where a group of individuals come together in a selfless (for the most part) effort to bring the words and ideas of a person long dead back to life. So what if audiences are tiny? It’s the work that matters, not the reception of it.
  5. I gave a talk at the C.S. Lewis Festival, which you can read here. It was fun and stimulating, and I remembered just how much I enjoy thinking and exploring literature and the ideas that shape it.

All of these things are fine, but I think the most important thing I’ve done in the past year is going back into the classroom again, this time as a substitute to help out some friends, but also to engage in what I think of as “resistance teaching.” As a substitute professor and a lifelong learning instructor, I can engage students and encourage them to think without being bound by a syllabus or any other requirements. I can get behind the lines of bureaucratic structures and work to create an atmosphere of free discussion and intellectual exploration. It is small work, and it may not be very effective, but I have taken it on as my own work, my own idiosyncratic way of combating the heartless ignorance, the dangerous half-assed education that prevails in our society.

I have always loved the idea of Resistance Fighters. I just never thought I’d be one myself.

My Short, Tragic Career as an Independent Scholar



Several months ago, I had what seemed like a fantastic idea: now that I was retired from teaching English at a community college, I could engage in critical research, something I’d missed during those years when I taught five or more classes a semester. I had managed to write a couple of critical articles in the last few years of my tenure at a small, rural two-year college in Northern Michigan, but it was difficult, not only because of the heavy demands of teaching, but also because I had very limited access to scholarly resources. Indeed, it is largely due to very generous former students who had moved on to major research institutions that I was able to engage in any kind of scholarly research, a situation which may seem ironic to some readers, but which is really just closing the loop of teacher and student in a fitting and natural way.

And so last fall, on the suggestion of a former student, I decided to throw my hat in the ring and apply to a scholarly conference on Dickens, and my proposal was chosen. In time, I wrote my paper (on Dickens and music–specifically on two downtrodden characters who play the flute and clarinet in David Copperfield and Little Dorrit, respectively) and prepared for my part in the conference.

It had been close to 25 years since I had read a paper at a conference, and so I was understandably nervous. Back then, there was no internet to search for information about conference presentations, but now I was able to do my homework, and thus I found a piece of advice that made a lot of sense: remember, the article emphasized, that a conference paper is an opportunity to test out ideas, to play with them in the presence of others, and to learn how other scholars respond to them, rather than a place to read a paper, an article, or a section of a book out loud before a bored audience. Having taught public speaking for over a decade, I could see that this made a lot of sense: scholarly articles and papers are not adapted to oral presentations, since they are composed of complex ideas buttressed by a great many references to support their assertions. To read such a work to an audience seemed to me, once I reflected on it, a ridiculous proposition, and would surely bore not only the audience, but any self-respecting speaker as well.

I wrote my paper accordingly. I kept it under the fifteen-minute limit that the moderator practically begged the panelists to adhere to in a pre-conference email. I made sure I had amusing anecdotes and witty bon mots. I concocted a clever PowerPoint presentation to go with the paper, just in case my audience got bored with the ideas I was trying out. I triple-spaced my copy of the essay, and I–the queen of eye contact, as my former speech students can attest–I practiced it just enough to become familiar with my own words, but not so much that I would become complacent with them and confuse myself by ad-libbing too freely. In short, I arrived at the conference with a bit of nervousness, but with the feeling that I had prepared myself for the ordeal, and that my paper would meet with amused interest and perhaps even some admiration.

It was not exactly a disaster, but it was certainly not a success.

To be honest, I consider it a failure.

It wasn’t that the paper was bad. In fact, I was satisfied with the way I presented it. But my audience didn’t know what to do with my presentation. This might be because it was very short compared to all the other presentations (silly me, to think that academics would actually follow explicit directions!). Or it could be because it wasn’t quite as scholarly as the other papers. After all, my presentation hadn’t been published in a journal; it was, as C.S. Lewis might have called it, much more of a “supposal” than a fully-fledged argument. Perhaps as well there was something ironic in my stance, as if I somehow communicated my feeling that research in the humanities is a kind of glorified rabbit hunt that is fun while it lasts but that rarely leads to any tangible, life-changing moment of revelation.

Yet this is not to say that humanities research is useless. It isn’t. It develops and hones all sorts of wonderful talents that enrich the lives of those who engage in it and those who merely dip into it from time to time. I believe in the value of interpreting books and arguing about those interpretations; in fact, I believe that engaging in such discussions can draw human beings together as nothing else can, even at the very moments when we argue most fiercely about competing and contrasting interpretations. This is something, as Mark Slouka points out in his magnificent essay “Dehumanized,” that STEM fields cannot do, no matter how much administrators and government officials laud them, pandering to them with ever-increasing budgets at the expense of the humanities.

And this is, ultimately, why I left the conference depressed and disappointed. I had created, in the years since I’d left academia, an idealized image of it that was inclusive, one that recognized its own innate absurdity. In other words, sometime in the last two decades, I had recognized that research in the humanities was valuable not because it produced any particular thing, but because it produced a way of looking at the world we inhabit with a critical acuity that makes us better thinkers and ultimately better citizens. The world of research, for me, is simply a playground in which we all can exercise our critical and creative faculties. Yet the conference I attended seemed to be focused on research as object: indeed, as an object of exchange, a widget to be documented, tallied, and added to a spreadsheet that measures worth.

Perhaps it’s unfair of me to characterize it in this way. After all, most of the people attending the conference were, unlike me, still very much a part of an academic marketplace, one in which important decisions like tenure, admission to graduate programs, promotions, and departmental budgets are decided, at least in part, by things like conference attendance and presentations. It is unfair of me to judge them when I am no longer engaged in that particular game.

But the very fact that I am not in the game allows me to see it with some degree of clarity, and what I see is depressing. One cannot fight the dehumanization of academia, with its insistent mirroring of capitalism, by replicating that capitalism inside the ivory tower; one cannot expect the humanities to maintain any kind of serious effect on our culture when those charged with propagating the study of humanities are complicit in reducing humanities research to mere line items on a curriculum vitae or research-laden objects of exchange.

I can theorize no solution to this problem beyond inculcating a revolution of ideas within the academy in an effort to fight the now ubiquitous goal of bankrupting the study of arts and humanities, a sordid goal which now seems to characterize the age we live in. And I have no idea how to bring about such a revolution. But I do know this: I will return to my own study with the knowledge that even my small, inconsequential, and isolated critical inquiries are minute revolutions in and of themselves. As we say in English studies, it’s the journey that’s important, not the destination. And in the end, I feel confident that it will take far more than one awkward presentation at a conference to stop me from pursuing my own idiosyncratic path of research and inquiry into the literature I love.

On Becoming Professor Brulov


There’s a part in one of my favorite Hitchcock movies, Spellbound (1945), in which Ingrid Bergman, a clinical psychologist, takes Gregory Peck, a man who has amnesia and may have committed a murder, to her former psychology professor’s house to hide out from the authorities. Professor Brulov is the epitome of a German academic: eccentric, kind, and highly intelligent, he is genuinely happy to see Ingrid Bergman, who, he says, was his best assistant. It’s a wonderful part of an interesting movie, but lately it’s taken on even greater significance for me.

I first watched the movie on television as a teenager, at which time I identified with Ingrid Bergman (of course I did–the movie is all about Freudian wish fulfillment, after all). Some years ago, as a middle-aged professor, I watched it again with my students when I taught a course on the films of Alfred Hitchcock, and I realized with a rather unpleasant shock that I had evolved without realizing it from the young, attractive, and inquisitive Dr. Constance Peterson into the aged, almost-but-not-quite-grumpy Professor Brulov. (In Mel Brooks’ hilarious spoof of Alfred Hitchcock’s movies, High Anxiety [1977], Professor Brulov is transformed into Professor Lilloman, which the protagonist mistakenly pronounces as “Professor Little Old Man.”) And, while it has taken me a few years to accept this transformation, I’m now fairly comfortable with my new, much less glamorous, role as mentor to my former students.

The reason is simple. Constance Petersons are a dime a dozen. The world is filled with beautiful young people making their mark on the world. But Brulov–he’s different. In fact, he’s quite special. Think of it this way: When Peterson is in trouble, she seeks him out, and Brulov helps her without asking any difficult questions, despite the fact that he knows she’s lying to him. He trusts her even more than she trusts him, which is touching, in a way. And so one thing that this very complex movie does is set up the idea of a mentor relationship between Brulov and his former student. It’s an interesting side angle to the movie that I never really noticed before.

And now, in my retirement, I am learning to embrace this new Brulovian stage of life. I have had very few, if any, mentors in my own career, so while I’m not too proficient at it yet, I hope to grow into the role in the years to come. The way I see it, we need more Professor Brulovs in this world; we can’t all be Ingrid Bergman or Gregory Peck, after all. I’m happy that my students remember me with something other than aversion, and so becoming Professor Brulov is, at least for now, quite enough for me.


My Short, Unhappy Life as a Politician



Four years ago, I made a decision that turned out to be a mistake–a real whopper of one. I ran for state representative. It is a decision that I still regret today.

Why would I, a political innocent, so to speak, decide to run for office? The extent of my experience in the political world at that time was knocking on doors for Barack Obama back in 2008. I knew next to nothing about the real political climate. I was naive and optimistic, and, excited by the events in Zuccotti Park (the Occupy Wall Street movement), I thought I saw real possibility to be the change that President Obama had called for. I thought I could make a difference–that my very innocence in the realm of politics might make me more credible and hence more attractive to voters. Of course, I can see now, at a distance of almost four years, that I was not just naive, but downright stupid. It probably isn’t the first time a candidate has been motivated by silly, misguided ideas.

The real question is this, however: why did I, a person with a more than adequate supply of humility, decide to run in the first place? What made me think I could make any kind of a difference? I’ve been considering that question for the last three years. Looking back, I see there was a perfect storm of situations that made me believe that it was not only my right, but my duty to run for office. First of all, as a community college professor, I was teaching writing and public speaking to a population of largely underprivileged students. I realized that not only could I gain valuable experience as a teacher if I ran, but that I could also serve as an example to my students. After all, at the end of every semester, I would offer both my writing and my speech students this parting advice: “Now you know how to raise your voices. Go out and do it. Make trouble for other people. Be good citizens.” Running for office was a chance for me to practice what I preached, and it would help make me a better teacher.

Primary Night Tally

(Side Note: This much was true. I did learn a lot from campaigning, which I tried to incorporate into the tiny textbook I wrote for my speech students. I used what skills I had in writing and in public speaking several times a day, and I found that teaching those practical skills was both meaningful and necessary. I do think I became a better teacher by running for office.)

Second, my position as a community college professor in a very rural area allowed me to see that the people whom recent legislation hurt were my students. I felt obliged to help them as much as I could. Third, as union president, I was also able to see the willful ignorance and arrogance of those in office. Fourth, my husband is a whiz at numbers and finance, and I knew he would make a fine campaign manager, and that by sharing the experience with me, we could be partners in a greater good. Fifth, I knew many people in my town, and they all told me they thought my candidacy was a good idea.

All of these things added up to a feeling of excited inevitability, which then turned into a sense of obligation to run. The only way I can describe the result of this transformation is to compare it to a statement found not once, but three times in Astrid Lindgren’s The Brothers Lionheart: “There are things you have to do even if they are dangerous, otherwise you aren’t a human being but just a bit of filth.” This may be overstating the case a bit, but at the time, I really felt that if I didn’t run, I would be shirking an important responsibility, and that I would be avoiding an unsavory but necessary duty.

The result of all this will not be a surprise. I lost–and by a hefty margin. I don’t mind losing the election; it was probably the best thing that could happen to me personally. But I lost more than the election, and that’s what really bothers me. Somewhere along the way, I lost my faith in a political system that unabashedly favors those with large campaign coffers. I lost my desire to talk to and get to know people, which had been so useful in the classroom, and which I am only now regaining. I lost a good deal of self-confidence, too, because I saw the limits of my own ability. Most tragic of all, I lost what had started me on my journey in the first place: hope.

Even a wonderful group of supporters can’t guarantee a victory.

For three years, I have tried not to talk about my abortive foray into politics. To be honest, I look upon it as if it were a stupid stunt I pulled while on a bender, as if I woke up one morning to remember that pulling off my clothes and jumping into the fountain was not a dream, but a horrible reality. And, like a hungover college student on Monday morning, I now regard my actions while under the influence with a sense of bemused shame: I’m impressed by the enormity of my mistake, because I should have known better than to have exposed myself.

But I am healing from my experience, and perhaps the best evidence of this is my willingness to analyze my feelings about running for office. I offer up this post as a testament to a person’s ability to recover, if not to learn, from unpleasant experiences.

A collector’s item



The End of Something

My career as a full-time teacher is drawing to a close, and I’m having some trouble getting used to the idea.

Several weeks ago, I decided to take advantage of an early retirement program at my college, so I will be leaving at the end of this semester. Of course, I’m really excited about the prospect of having free time to read whatever I want, in whatever order I want to read it; to focus on my writing, music and knitting; and, most of all, to do a bit of traveling. Teaching, as I told one of my colleagues, was getting in the way of my own learning, and so I’m grateful to be able to step back from a career that, despite its frustrations, has been a central and valued part of my life. I have learned more about teaching in the dozen or so years I’ve spent at this small community college in rural Northern Michigan than I ever thought possible, which is part of the reason I have mixed feelings about leaving.

In some ways, I feel I’m at the top of my game as a teacher. I don’t have to take a lot of time to prepare for each class, and most classroom situations don’t really throw me. (Of course, there are a few that were pretty funny, and, once I retire, I look forward to sharing these stories, like the one about the time a speech student tried to bring a goat to class.) Grading papers, of course, is still a tremendous burden, and like most writing instructors, I greatly resent it. But it turns out that grading is not as heavy a burden for me as the human burden. By this, I mean that I try to see each of my students as an individual; everyone, I told myself as I began my teaching career, is someone’s child. I asked myself, how would I want my child treated by their professors? The answer was clear. So I have always tried to be open, inviting, and encouraging with my students, and it’s made for some great moments as a teacher. But it’s also made it possible for me to see the real pain in my students’ lives. From the student who schleps her infant to class, no matter the subzero temperatures, to the student whose grip on religion is ironclad because he’s found no other outlet or support, to the student who suffers from a laundry list of health problems as a result of serving in Afghanistan–each of these students has a claim on me, because I have always felt it’s more important to be a human being first and a professor second.

It’s a noble idea, but now, as I move towards my last days of teaching (at least full-time), I can see its flaws. In essence, teaching as a human being is like “hearing the grass grow and the squirrel’s heartbeat,” in the famous words of George Eliot in Middlemarch (Chapter 20), whose narrator predicts that those who can attend to such emotional minutiae are likely to “die of that roar which lies on the other side of silence.” I am not in danger of dying from being exposed to that roar, but I am pained by my work these days. I’m stricken by the sadness of seeing the eccentric student who has no friends, sitting alone in the cafeteria; I’m depressed by the difficulties facing young mothers and fathers as they try to gain an education to make life better for their children; and I’m overwhelmed by the challenges, and, yes, the tragedies, that lie behind the eyes of many of my students who stare up at me as I try to dish out some wisdom to help them in their journey through life.

It’s a tough job, and I’ve given it what I could over the years. One consolation I’ve always had, however, is that if I do a poor job one semester, I could always improve the next time around. This semester is different, though. There is no next time. Does that mean I’ll finally get it right and teach well this term? The answer has become clear over the past few weeks. This semester will be like all the other semesters I’ve had: some successes in the classroom, but many more failures. I’m satisfied that I’ve made the right decision, though. It’s time for someone else to step up and try their hand at this job. I’m ready to take my ball and go home, even if that means that I leave a career I love.
