How We Got Here: A Theory

The United States is a mess right now. Beset by a corrupt president and his corporate cronies, plagued by a — um — plague, Americans are experiencing an attack on democracy from within. So just how did we get to this point in history?

I’ve given it a bit of thought, and I’ve come up with a theory. Like many theories, it’s built on a certain amount of critical observation and a large degree of personal experience. Marry those things to each other, and you can often explain even the most puzzling enigmas. Here, then, is my stab at explaining how American society became so divisive that agreement on any political topic has become virtually impossible, leaving a vacuum so large and so empty that corruption and the will to power can ensure political victory.

I maintain that this ideological binarism in the United States is caused by two things: prejudice (racism has, in many ways, always determined our political reality) and a lack of critical thinking skills (how else could so many people fail to see Trump for what he really is and what he really represents?). Both of these problems result from poor education. Prejudice certainly exists in all societies, but the job of a proper education in a free society is to eradicate, or at least to combat, prejudice and flawed beliefs. Similarly, critical thinking skills, while amorphous and hard to define, can be acquired through years of education, whether by conducting experiments in a chemistry lab or by explicating Shakespeare’s sonnets. It follows, then, that something must be radically wrong with our educational system for close to half of the population of the United States to be fooled into thinking that Donald Trump can actually be good for this country, much less for the world at large.

In short, there has always been a possibility that a monster like Trump would appear on the political scene. Education should have saved us from having to watch him for the last four years, and the last month in particular, as he tried to dismantle our democracy. Yet it didn’t. So the question we have to ask is this: Where does the failure in education lie?

The trendy answer would be that this failure is a feature, not a bug, in American education, which was always designed to mis-educate the population in order to make it more pliable, more willing to follow demagogues such as Trump. But I’m not satisfied with this answer. It’s too easy, and more important, it doesn’t help us get back on track by addressing the failure (if that’s even possible at this point). So I kept searching for an explanation.

I’ve come up with the following premises. First, the divisions in the country are caused by a lack of shared values–this much is clear. For nearly half the American people, Trump is the apotheosis of greedy egotism, a malignant narcissist who is willing to betray, even to destroy, his country in order to get what he wants, so that he can “win” at the system. For the other half, Trump is a breath of fresh air, a non-politician who was willing to stride into the morass of Washington in order to clean it up and set American business back on its feet. These two factions will never be able to agree–not on the subject of Trump, and very likely, not on any other subject of importance to Americans.

It follows that these two views are irreconcilable precisely because they reflect a dichotomy in values. Values are the intrinsic beliefs that an individual holds about what’s right and wrong; when those beliefs are shared by a large enough group, they become an ethical system. Ethics, the shared sense of right and wrong, seems to be important in a society; as we watch ours disintegrate, we can see that without a sense of ethics, society splinters into factions. In many other countries, ethics is a required subject in high school; in the United States, however, hardly anyone outside a university philosophy department ever takes a class on ethics. Most Americans, we might once have said, don’t need such classes, since they experience their ethics every day. If that ever was true, it certainly isn’t so any more.

Yet I would argue that Americans used to have an ethical belief system. We certainly didn’t live up to it, and it was flawed in many ways, but it did exist, and that’s very different from having no ethical system at all. It makes sense to postulate that sometime around the turn of the 21st century, ethics began to disappear from society. I’m not saying that people became unethical, but rather that ethics ceased to matter, and as it faded away, it ceased to function as a kind of social glue that could hold Americans together.

I think I know how this happened, but be warned: my view is pretty far-fetched. Here goes. Back in the 1970s and 1980s, literary theory poached upon the realm of philosophy, resulting in a collection of theories that insisted a literary text could be read in any number of ways, and that no single reading of a text was the authoritative one. This kind of reading and interpretation amounted to an attack on the authority of the writer and the dominant ideology that produced him or her, as it destabilized the way texts were written, read, and understood. I now see that just as the text became destabilized with this new way of reading, so did everything else. In other words, when an English professor argues that Shakespeare no longer belongs in the literary canon, that all texts are equally valid and valuable (I’ve argued this myself at times), the result is an attack not only on authority (which was the intention), but also on communality, by which I mean society’s shared sense of what it values, whether it’s Hamlet or Gilligan’s Island. This splintering of values was exacerbated by the advent of cable television and internet music sources; no one was watching or listening to the same things any more, and it became increasingly hard to find any shared ideological place to begin discussions. In other words, the flip side of diversity and multiplicity–noble goals in and of themselves–is a dark one, and now, forty years on, we are witnessing the social danger inherent in dismantling not only the canon but also any system of judgment for assessing its contents.

Here’s a personal illustration. A couple of years ago, I taught a college Shakespeare class, and on a whim I asked my students to help me define characters from Coriolanus using Dungeons and Dragons character alignment patterns. It was the kind of exercise that would have been a smashing success in my earlier teaching career, the very thing that garnered me three teaching awards within five years. But this time it didn’t work. No one was watching the same television shows, reading the same books, or remembering the same historical events, and so there was no way to come up with examples that could illustrate character types for the entire class. I began to see then that a splintered society might be freeing–but at what cost, if we had lost the ability to communicate effectively?

It’s not a huge leap to get from that Shakespeare class to the fragmentation of a political ideology that leaves, in the wreckage it’s produced, the door wide open to oligarchy, kleptocracy, and fascism. There are doubtless many things to blame, but surely one of them is the kind of socially irresponsible literary theory that we played around with back in the 1980s. I distinctly remember one theorist saying something to the effect that no one has ever been shot for being a deconstructionist, and while that may be true, it does not follow that deconstructionist theory, or any kind of theory that regards its work as mere play, is safe for the society it inhabits. Indeed, we may well be witnessing how very dangerous unprincipled theoretical play can turn out to be, even decades after it has held sway.

Convent-ional Trends in Film and Television

Lately I’ve been spending quite a bit of time with the Aged Parent, and one thing we do together–something we’ve rarely done before–is watch television shows. My mother, deep in the throes of dementia, perks up when she sees Matt Dillon and Festus ride over the Kansas (it is Kansas, isn’t it?) plains to catch bad guys and rescue the disempowered from their clutches. Daytime cable television is filled with Westerns, and I find this fascinating, although I’ve never been a fan of them in the past. Part of my new-found fascination is undoubtedly inspired by Professor Heather Cox Richardson’s theory–presented in her online lectures as well as her Substack newsletter–that the United States’s fascination with the Western genre has a lot to do with the libertarian, every-man-for-himself ideal most Westerns present. I think she’s got a point, but I don’t think that this alone explains our fascination with Westerns. This, however, is an argument I’ll have to return to at a later date, because in this blog post, what I want to talk about is nuns.

Yes–that’s right–Catholic nuns. What was going on in the 1950s and ’60s that made the figure of the young, attractive nun so prevalent in films and television? Here, for example, is a short list of movies featuring nuns from the late 1950s and the 1960s:

  1. The Nun’s Story (1959) with Audrey Hepburn
  2. The Nun and the Sergeant (1962), itself a remake of Heaven Knows, Mr. Allison (1957)
  3. Lilies of the Field (1963) with Sidney Poitier
  4. The Sound of Music (1965), no comment needed
  5. The Singing Nun (1966) starring Debbie Reynolds
  6. The Trouble with Angels (1966) with Rosalind Russell and Hayley Mills
  7. Where Angels Go, Trouble Follows (1968), the sequel to #6
  8. Change of Habit (1969), starring the strangely matched Mary Tyler Moore and Elvis Presley (!)

The fascination with nuns even bled over into television, with the series The Flying Nun (1967-1970), starring a post-Gidget Sally Field. This show, with its ridiculous premise of a nun who can fly, seems to have ended the fascination with nuns; perhaps its bald stupidity simply killed the trend outright. From 1970 until 1992, when Sister Act appeared, there seemed to be a lull in American movies featuring nuns. Incidentally, the films I’ve mentioned here all feature saccharine-sweet characters and simple plots; in a typically American fashion, many of the difficult questions and problems involved in choosing a cloistered life are elided or simply ignored. There are, however, other movies featuring nuns that are not so wholesome; Wikipedia actually has a page devoted to what it terms “Nunsploitation.” These films, mostly foreign, seem more troubling and edgier. I leave an analysis of such films to another blogger, however, because what I really want to investigate is this: why was American culture so enamored, for the space of a decade, with nuns and convent life? I’ve argued previously that popular culture performs the critical task of reflecting and representing dominant ideologies, so my question goes deeper than just asking, “Hey, what’s with all these nuns?” Rather, it seeks to examine what conditions caused this repetitive obsession with nuns in a country that prided itself on the distance between religion and politics and, at least superficially, on religion’s exclusion from American ideology.

I have some ideas, but nothing that could be hammered together neatly enough to call a theory of this obsession, and so I will be looking to my readers to provide additional explanations. Surely the box-office success of films starring Audrey Hepburn, Debbie Reynolds, Sidney Poitier, and Julie Andrews counts for something: Hollywood has always been a fan of the old “if it worked once, it should work again” creative strategy. But I think this might be too simple an explanation. I’ll have another go: perhaps in an era when women were beginning to explore avenues to power, self-expression, and sexual freedom, the image of a contained and circumscribed nun was a comfort to the conservative forces in American society. It’s just possible that these nuns’ stories were a representation of the desire to keep women locked up, contained, and submissive. On the other hand, the image of the nun could be just the opposite, one in which women’s struggle for independence and self-actualization was most starkly rendered by showing religious women asserting their will despite all the odds against them.

I think it’s quite possible that both these explanations, contradictory as they seem, might be correct. Certainly the depiction of women who submit to being controlled and defined by religion presents a comforting image of a hierarchical past to an audience that fears not only the future but the present as well (we should remember that the world was experiencing profoundly threatening social and political upheaval in the late 1960s). Yet at the same time, the struggle many of these nun-characters undergo in these films might well be representative of non-religious women’s search for meaning, independence, and agency in their own lives.

As I said, I have more questions than answers, and I will end this post with an obvious one: what effect did these films have on the general public? We’ve briefly explored the idea of where such movies came from and what they represent in the American ideology that produced them, but what did they do to their audiences? Was there any increase in teenage girls joining convents in the 1970s, after these films played in theatres and, later, on television? What did the religious orders themselves have to say about such films? I’d be interested in learning the answers to these questions, so readers, if you have any ideas, or if you just want to compare notes and share your impressions, please feel free to comment!

How the Study of Literature Could Save Democracy

Beowulf MS, picture from Wikipedia

Usually, I am not one to make grand claims for my discipline. There was a time, back when I was a young graduate student in the 1980s, when I would have; perhaps even more recently, I might have argued that understanding ideology through literary theory and criticism is essential to understanding current events and the conditions we live in. But I no longer believe that.

Perhaps in saying this publicly, I’m risking some sort of banishment from academia. Maybe I will have to undergo a ritual in which I am formally cashiered, like some kind of academic Alfred Dreyfus, although instead of having my sword broken in half and my military braids ripped to shreds, I will have my diploma yanked from my hands and trampled on the ground before my somber eyes. Yet unlike Dreyfus, I will have deserved such treatment, because I am in fact disloyal to my training: I don’t believe literary theory can save the world. I don’t think it’s necessary that we have more papers and books on esoteric subjects, nor do I think it’s realistic or useful for academics to participate in a market system in which the research they produce becomes a commodity in their quest for jobs, promotions, or grant opportunities. In this sense, I suppose I am indeed a traitor.

But recently I have realized, with the help of my friend and former student (thanks, Cari!), that literature classes are still important. In fact, I think studying literature can help save our way of life. You just have to look at it this way: it’s not the abstruse academic research that can save us, but rather the garden-variety study of literature that can prove essential to preserving democracy. Let me explain how.

I’ll begin, as any good scholar should, by pointing out the obvious. We are in a bad place in terms of political discourse–it doesn’t take a scholar to see that. Polarizing views have separated Americans into two discrete camps with very little chance of crossing the aisle to negotiate or compromise. Most people are unwilling to test their beliefs, preferring to cling to them even in the face of contradictory evidence. As social psychologists Elliot Aronson and Carol Tavris point out in a recent article in The Atlantic, “human beings are deeply unwilling to change their minds. And when the facts clash with their preexisting convictions, some people would sooner jeopardize their health and everyone else’s than accept new information or admit to being wrong.” They use the term “cognitive dissonance”–the disorientation and discomfort one feels when holding two conflicting beliefs at the same time–to explain why it is so hard for people to change their ideas.

To those of us who study literature, the term “cognitive dissonance” may be new, but the concept certainly is not. F. Scott Fitzgerald writes, in an essay now largely forgotten except for this sentence, “the test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function” (“The Crack-Up,” Esquire, February 1936). In addition, cognitive dissonance isn’t that far removed from an idea expressed by John Keats in a letter he wrote to his brothers back in 1817. There he coins the term “Negative Capability” to describe the ability to remain in a liminal state of doubt and uncertainty without being driven to reach any conclusion or definitive belief. Negative capability, in other words, is the capacity to be flexible in our beliefs, to be capable of changing our minds.

I believe that the American public needs to develop negative capability, lots of it, and quickly, if we are to save our democracy.

But there’s a huge problem. Both Fitzgerald and Keats believed that this faculty was reserved for geniuses. In their view, a person is born with the talent for tolerating cognitive dissonance: you either have it–in which case you are incredibly gifted–or you don’t. In contrast, Aronson and Tavris clearly believe it’s possible to develop a tolerance for cognitive dissonance: “Although it’s difficult, changing our minds is not impossible. The challenge is to find a way to live with uncertainty…” While their belief in our ability to tolerate cognitive dissonance and to learn from it is encouraging, it is sobering that they do not provide a clear path toward fostering this tolerance.

So here’s where the study of literature comes in. In a good English class, when we study a text, whether it’s To Kill a Mockingbird or Beowulf, students and teacher meet as more or less equals over the work of literature in an effort to find its meaning and its relevance. Certainly the teacher has more experience and knowledge, but this doesn’t–or shouldn’t–change the dynamic of the class: we are all partners in discovering what the text has to say in general, and to us, specifically. That is our task. In the course of this task, different ideas will be presented. Some interpretations will be rejected; some will be accepted. Some will be rejected, only to be later accepted, even after the space of years (see below for an example).

If we do it well, we will reach a point in the discussion where we consider several different suggestions and possibilities for interpretation. This is the moment during which we become experts in cognitive dissonance, as we relish interpretive uncertainty, examining each shiny new idea and interpretation with the delight of a child holding up gorgeously colored beads to the light. We may put a bead down, but it is only to take up another, different one–and we may well take up the discarded bead only to play with it some more.

The thing that makes the study of literature so important in this process is that it isn’t really all that important in the grand scheme of things. To my knowledge, no one has ever been shot for their interpretation of Hamlet; the preservation of life and limb does not hang on a precise explanation of Paradise Lost. If we use the study of literature as a classroom designed to increase our capacity for cognitive dissonance, in other words, we can dissipate the highly charged atmosphere that makes changing our minds so difficult. And once we get used to the process, when we know what it’s like to experience cognitive dissonance, it will be easier for us to tolerate it in other parts of our lives, even in the sphere of public policy and politics.

If I seem to be writing with conviction (no cognitive dissonance here!), it’s because I have often experienced this negative capability in real time. I will give just two examples. The first one occurred during a class on mystery fiction, when we were discussing the role of gossip in detective novels, which then devolved into a discussion on the ethics of gossip. The class disagreed violently about whether gossip could be seen as good or neutral, or whether it was always bad. A loud (and I mean loud!) discussion ensued, with such force that a janitor felt compelled to pop his head into the classroom–something I had never seen happen before or since–to ask if everything was ok. While other teachers might have felt that they had lost control of the classroom, I, perversely, believe that this might have been my most successful teaching moment ever. That so many students felt safe enough to weigh in, to argue and debate passionately about something that had so little real importance, suggested to me that we were exercising and developing new critical aptitudes. Some of us, I believe, changed our minds as a result of that discussion. At the very least, I think many of us saw the topic in a different way than we had to begin with. This, of course, is the result of experiencing cognitive dissonance.

My second example is similar. At the end of one very successful course on Ernest Hemingway, my class and I adjourned for the semester to meet at a local bar, where we continued our discussion about The Sun Also Rises. My student Cari and I got into a very heated discussion about whether the novel could be seen as a pilgrimage story. Cari said it was; I vehemently disagreed. The argument was fierce and invigorating–so invigorating, as a matter of fact, that at one point a server came to inquire whether there was something wrong, and then a neighboring table began to take sides in the debate. (For the record, I live in Hemingway country, and everyone here has an opinion about him and his works.) Cari and I left the bar firmly ensconced in our own points of view, but a couple of years ago–some three years after the original argument occurred–I came to see it from Cari’s point of view, and I now agree with her that The Sun Also Rises can be seen as a sort of pilgrimage tale. It took a while, but I was able to change my mind.

It is this capacity to change one’s mind, I will argue, that is important, indeed, indispensable, for the democratic process to thrive.

In the end, it may well be that the chief contribution that good teachers of literature make to culture is this: we provide a safe and accessible place for people to learn what cognitive dissonance feels like, and in doing so, we can help them acquire a tolerance for it. This tolerance, in turn, leads to an increase in the ability to participate in civil discourse, which is itself the bedrock of democratic thought and process. In other words, you can invest in STEAM classes all you want, but if you really want to make people good citizens, do not forget about literature courses.

In view of this discovery of mine, I feel it’s my duty to host a noncredit literature class of sorts in the fall: a discussion-driven newsletter that covers the great works of English literature–whatever that means–from Beowulf to the early Romantic period. If you’re interested or have suggestions, please let me know by commenting or messaging me, and I’ll do my best to keep you in the loop.

And in the meantime, keep your minds open! Cognitive dissonance, uncomfortable as it is, may just be what will keep democracy alive in the critical days to come.

Choosing Optimism

Photo credit: Daniel Shumway

I haven’t been writing much lately, even though Heaven knows I have the time for it these days. I suppose the main reason is that I haven’t had anything positive to say for a couple of weeks. The political outlook, as well as the growing realization that social distancing will become the new norm for the next three to five years, has taken its toll on my usual optimism.

Having said that, I have to add that I must be the most cautious optimist who ever walked the earth. Several years ago, when my mother was facing a fairly dire medical diagnosis, I told my daughter that until there was definitive proof of it, I would continue to hope for the best. Granted, this was a conscious choice on my part; like everyone else, I can always see the worst possibilities, but on this occasion, I had deliberately decided not to panic. “After all,” I added, “I have absolutely nothing to lose by being an optimist.” Immediately after the words came out of my mouth, I started to laugh; I could not think of a more pessimistic way of expressing my optimism. It’s almost as if I were some mashup of Ernie and Bert, of Winnie the Pooh and Eeyore, existing in the same body at the same time.

(Incidentally, I turned out to be right: my mother was misdiagnosed and recovered, but not before a young doctor, visiting her in her hospital room on his rounds, said to her, “You’re doing so much better! And you’re looking very good for a woman who is 70 years old.” My mother smiled and replied, “Thank you! Actually, I’m 80 years old.” He checked her chart and nodded. “Yes, so you are. Well, you’re looking quite good, aren’t you!” It must have cost my mother a bit to have answered him in that way, because she’s self-conscious about her age, but I assume the temptation to put the young doc in his place was simply too great for her to resist.)

This is simply a long-winded way of saying that I often don’t write for this blog unless I’m either outraged or optimistic, and I’ve been neither for the past week or so. But now I think I have something good, something positive, to offer my readers–whoever you may be. It’s not entirely good, but it’s a sunny day today, after several days of wintry weather, and for the moment, at least, I’m able to see some bright spots in our landscape.

It comes in a bad news/good news package. So, here’s the bad news: we’ve tanked our economy, globally, because of Covid-19, trashing productivity, jeopardizing livelihoods, causing mass unemployment. And now, here’s the good news: we’ve tanked our economy, globally, because of Covid-19. How is that good? Think of it this way: whatever happens from here on out, we should never forget that we were willing to sacrifice a great deal, perhaps as much as any generation has ever sacrificed in so short a time, not for a war, but to protect segments of our population that we might ordinarily never even consider: the aged, the infirm, the immunocompromised. This is remarkable–so remarkable, in fact, that we might think this kind of altruism has never happened before in the history of humankind.

But if we did think this, we’d be wrong, because it has. Over and over again.

The anthropologist Margaret Mead is often said to have considered the earliest sign of civilization to be a healed femur, because it demonstrated that compassion and caring existed within a society: it takes at least six weeks for the thighbone to heal, and during that time the injured person would be totally dependent on others for survival. And, despite our modern tendency to believe, along with Thomas Hobbes, that life in a natural state must be “nasty, brutish, and short,” we are gaining more and more evidence of the existence of compassion in prehistoric human societies. For example, anthropologists have discovered that Neanderthals cared for injured people, nursing them into old age–and this despite infirmities that would have precluded those individuals from making useful contributions to the group.

Like many other people, I’ve been taught that nature was a rough business, and that only the fittest survive. Americans especially have been nurtured on that old chestnut, it seems, even before Darwin’s theories were misappropriated and twisted to create Social Darwinism. We’ve been taught to see the world in this way because it fits our view of ourselves as “rugged individuals” who conquer the environment and make their own destiny. But the era in which this view has held sway is about to end, I hope, and we have Covid-19 to thank for its demise.

One thing we have to understand is that, Hollywood blockbusters and dystopian fiction notwithstanding, disasters don’t always bring out the worst in people; in fact, much of the time, they bring out the very best in humans, as many theorists have pointed out. At least in the early stages of disasters, people tend to act rationally and altruistically. In the last two months, many of us have seen heroic and caring actions performed by people in our neighborhoods and communities. It’s these things we need to focus on, I’d argue, hard as that may be when we are fed a never-ending supply of fear and anxiety.

Don’t get me wrong — I’m both afraid and anxious. I should, to be honest, add a few more adjectives to the mix: terrified, frustrated, angry, sad, antsy, hysterical. But I am learning to fight against the media, and perhaps my own nature, which has learned to feed on bad news and fear. In fact, this blog post is just my way of sharing my most recent discovery about the way we live now: We have been spoon-fed bad news for so long that we are addicted to it. Like the teenager who loves to ride the scariest roller coasters or watch the most terrifying horror flicks, we want to scare ourselves with stories of the disasters that lie ahead of us, of tragedies waiting to jump out at us. Fear, it turns out, is just as thrilling in a news report as it is in a terrifying ride we cannot get off. I will leave it to another blogger, or to my readers (please comment below!), to explain why fear is so compelling and addictive. My point for now is that many of us cannot do without such fear; it has become, in the last ten years especially, part of the fabric of our lives.

But it is dangerous to give in to our addiction to fear in the form of news reports and dire projections about the future, for at least two reasons. First, such reports and predictions may be wrong. Media reporting of human behavior in disasters often is wrong, concentrating on the bad rather than the good. Murder and mayhem sell: “if it bleeds, it leads,” according to an old journalistic saw. Second, these dark views, in addition to their potential inaccuracy, feed our desire for the negative, which I’d argue exists in all of us, even the most optimistic. If we think of this desire as an addiction, perhaps we can begin to see the danger of it and wean ourselves off our negative viewpoints. We may not be more productive (and the very nature of productivity will be questioned and redefined in the coming years, I’d guess), but we may be happier, more satisfied, and ready to work hard to create a better world than the one that lies in shambles around us. After all, we have nothing to lose by being optimists about the future.

Of course, the challenges that face us are enormous, perhaps greater than any other generation has faced. And I don’t always feel optimistic about the likelihood that we can change things substantially. But I know that change is possible, although admittedly it sometimes comes at a great cost. And I know as well that in order to create necessary changes, the work must start well before they actually occur, sometimes centuries before. In other words, we must often imagine the possibility for change long before we can expect to effect it. (This kind of imagining, after all, is exactly what Virginia Woolf does so beautifully at the end of A Room of One’s Own in regard to women’s writing.) Incremental change, in this sense, is likely just as valuable as visible change, though it often goes unnoticed, swimming just below the surface of current events. Without it, real change could never occur.

So I will just end by referring you to the last scene of Charlie Chaplin’s masterpiece of satire, The Great Dictator (1940, though begun in 1937), in which he makes fun of Adolf Hitler and Nazism. Chaplin reportedly ad-libbed the speech he gives as the Hitler lookalike, which is perhaps why it rings as true now as it did 80 years ago, when the world was facing another catastrophe, one which it survived and continues to learn from to this day. Take a look at it and see if it makes you feel just a little bit better as you face the future that lies ahead.

Inappropriate Songs from the Past (brought to you by SiriusXM Radio)

Benny Goodman and his band in Stage Door Canteen
By Film screenshot – Stage Door Canteen film, Public Domain, https://commons.wikimedia.org/w/index.php?curid=1194330

I am a fairly devoted listener of ’40s Junction, a channel on Sirius Radio that plays songs from the 1940s–except when it doesn’t. (To my lasting fury and frustration, I discovered this year that the station ceases its normal operations on November 1 and, for the next six weeks, plays “holiday music.” Now, I don’t know if the program managers decided that the people who listen to 1940s music are the same ones who enjoy endless Christmas carols, but if anyone from Sirius is reading this, here’s a hint: they aren’t.) One thing I’ve found out by listening to ’40s Junction is that if one listens long enough, one can discover some real gems. I mean, we all know “Blues in the Night,” “I’ll Be Seeing You,” “Don’t Fence Me In,” and “String of Pearls,” but occasionally this station plays some songs I’ve never heard of, despite living in a certifiable time warp for my entire life. And so today I’m taking a break from politics and pessimism to discuss four of these little oddities from the past, complete with YouTube links, so that you can listen to them and judge for yourself. Above all, I’m curious about my readers’ reactions, so please do leave your comments on some or all of these songs.

I will start out with a song that has a catchy rhythm and some very interesting lyrics: “The Lady from 29 Palms.” It seems to have been recorded in 1946 or so, and there is an interesting reference to the explosive attraction of the lady in question, who is compared to “a load of atom bombs,” which, coming so soon after the bombing of Hiroshima and Nagasaki, is highly insensitive, to say the least. Yet it has a great sax line in the beginning, and with its really clever rhymes, I’d say this is a song that’s worth listening to.

Next on my list is a very odd little song that shocked me when I first heard it. Unlike “The Lady from 29 Palms,” “Who’s Yehoodi” is famous enough to have its own Wikipedia entry, which certainly bears checking out. Hop over there, and you can read about the song’s origins on the Bob Hope Radio Program, when announcer Jerry Colonna got tickled as he introduced one of Hope’s guests, the young violin prodigy Yehudi Menuhin. Colonna made fun of the name, continued his joke through later programs, and eventually, it became a popular meme, although memes hadn’t been invented yet. In 1940, songwriters Bill Seckler and Matt Dennis made a song out of it. The U.S. Navy also got some mileage out of the situation, naming one of its research programs “Project Yehudi.” I’ve linked to the Cab Calloway version, but there are several other versions, including an astoundingly antisemitic one (I know–the whole thing’s kind of antisemitic, but this version really takes the cake). As with “The Lady from 29 Palms,” the song itself is catchy, with the kind of finger-snapping rhythm that makes tunes from this era so appealing. In addition, the song’s many references to secretive, spooky people who are there, but not there, remind me of those Doctor Who beings, the Silence, who watch and influence human events but are forgotten the moment humans look away. Added to these odd but intriguing lyrics, there’s some enjoyable big-band music, with the necessary saxophone solos and brass rhythms creating a memorable song. How it disappeared is a mystery–unless, of course, the Silence got involved in the whole thing and wiped it from our collective memories.

The next two songs are about body-shaming. In a way, I’m surprised they find their way onto the air at all in the present day, given the changing climate and hyper-awareness about body image we’ve seen in recent years. The first, “Lean Baby,” was recorded by Frank Sinatra and was very popular. But the Dinah Washington version is even better, so you should check that one out, too. Clever yet brutal lyrics make the song interesting, and once again the music is quite catchy. On the flip side of this “appreciation” of thinness is “Mr. Five-by-Five,” arguably the most successful of the songs I’ve mentioned here. At least seven singers have recorded versions of it, and it appeared most recently in a 2013 movie (Gangster Squad), according to its Wikipedia entry. Here’s a version by Ella Mae Morse recorded in 1942. Again, there are some devilishly funny lyrics that, inappropriate as they are, make you laugh out loud–if you’re by yourself.

So, readers, what do you think about these four songs? Politically incorrect, a fascinating trip down memory lane, historical footnotes, or just oddities? I’m not sure what to make of them, but I am grateful that they are preserved, inappropriate or not, for us to listen to, consider, and critique. So, thanks, Sirius Radio, for the memories–even if I have to put up with two months of Christmas music to get them.


Ecohats

Chief Petoskey sporting an Ecohat in Petoskey, Michigan.


We are facing an ecological emergency, and too little is being done to address the consequences of climate change. Our individual actions may seem inadequate. Yet every action, however small, can lead to something bigger. Change comes only as a result of collective will, and we can demonstrate that will by showing that we desire immediate political, social, and economic action in the face of global climate change.

Ecohats are not a solution, but they are a manifestation of will made public. Based on the pussyhat, a public display of support for women’s rights, Ecohats display support for immediate political action to address the need for systemic change to deal with climate change. They are easy to create: simply knit, crochet, or sew a hat in any shade of green to show your support for the people and organizations that are dedicated to addressing the issue of climate change, then wear it or display it with pride and dedication.

We can’t all be Greta Thunberg, but we can show our support for those people and organizations that are tirelessly working to address climate change, like Citizens’ Climate Lobby, 350.org, Center for Biological Diversity, and others.

Please consider making, wearing, and displaying an Ecohat to show your support!


Ernest Hemingway is also wearing an Ecohat!

Against Cynicism

Scout and the mob


Like many other people, I have been very worried about the direction we’re going–not just the United States, but the world in general. Populism–which in its most innocuous form is little more than cheering for the home team but which can be so much more insidious and damaging–is on the rise, and partisanship seems to have infiltrated many western governments, causing us to question the efficacy of democracy itself. Couple this with the imminence of climate change, and the future of human civilization seems dire indeed.

So I understand why many of us might live in a state of worry, of fearful suspense. I know how it feels to wake up each morning, wondering what new terrible thing has happened while I slept, and how it feels to wait impatiently for the slow-moving wheels of democratic rule to right themselves, and to hope for a period in which government works for its citizenry rather than for corporations and billionaires. I, too, have lost hope at times, allowing myself to be convinced that the struggle against injustice and oligarchy is fruitless; like so many other people, I have frequently succumbed to cynicism and inertia, telling myself that any action is doomed to failure.

But that attitude is wrong. I know that now. More importantly, I feel it’s my duty to lead a crusade against this type of cynicism, even if I do so all by myself.

If you’re reading this blog post, excellent. We can work together. On the other hand, if this essay slips unacknowledged into cyberspace, read by no one at all, so be it–that doesn’t matter. The struggle is important, and it must be waged, even if by only one person, and even if I myself am that person. What follows, then, is my own small contribution to the war on cynicism–my manifesto, if you like.

The time for idle anger is over. The time for pessimism is past. We do not have the luxury to sit back and watch knowingly as the world falls apart, nodding as we say, “Of course, we always knew it would be this way. The system is rigged.” Whether it is or is not rigged is beside the point. This is a question that can be debated by future historians, much like the question “How many angels can dance on the head of a pin?” Saints and scholars can debate such a question; they have that luxury. But we no longer have the luxury to debate whether government is or is not effective. We have to act, and we have to act now, if we want to save our way of life.

And actions start with beliefs, primarily the belief that when we act, our actions have effects. I believe–rather, I know, with certainty–that they do have effects. At the local level, our actions–indeed, our mere presence at meetings and councils–do have an effect; I have seen this demonstrated in the past year in my role as a city councilmember. Government, at least at the most local level, works–but only if we work hard at it by electing the right people and by holding them accountable. In short, democracy is not compatible with either cynicism or complacency; yet throughout much of the last generation, our citizenry has been guilty of both these things.

Why? Because cynicism is easy. Cynicism is seductive. Cynicism is cool–so cool, in fact, that many voters in 2016 agreed that “draining the swamp” was the best thing going for a candidate who had no other ideas or attractions. So here is my advice: do not give in to the lure of cynicism. It leads nowhere but to the self, to an inflated view of one’s own cleverness and perception, to a self-satisfied egocentrism that congratulates itself on seeing the worst at all times, in all places. 

Close to the climax of To Kill a Mockingbird, Scout Finch prevents a lynching merely by demonstrating her own naive lack of cynicism. As the crowd of angry white men encircles Atticus, who is guarding the innocent Tom Robinson in his cell, Scout does more than anyone else to quell the murderous mob and send it home. Her simple, naive words, her attempt to connect with Mr. Cunningham on a human and neighborly level, represent a belief in innate goodness and the power of community, and it is just enough to disarm a group of angry men bent on taking the life of an innocent African-American man. (Click here to watch this pivotal scene; my apologies for the commercial at the beginning.)

That Tom Robinson ends up dying at the end of the novel for a crime he didn’t commit is part of To Kill a Mockingbird’s tragic impact. As readers, perhaps we can be cynical about that tragic message; but as actors, as characters in our own human drama and, most of all, as real-life community members, we cannot afford such cynicism. We must be like Scout if we want to survive.

Reader, I implore you: give up your cynicism. Today, I ask you to believe in something grander than your own cleverness in discovering the duplicity of others, and to act in good faith, even though no discernible good may come out of your actions in your lifetime. Be naive, if you have to. But say good-bye to cynicism today, this minute. I am certain the generation that comes after us will thank you for it.


An Unexpected Masterpiece: Felix Holt

Turducken, quartered cross-section–Image from Wikipedia


One of the joys of being a retired English professor is that you never really leave your work behind: you just leave all the parts of it that aren’t that much fun. This means that while I don’t have to grade papers or go to pointless committee meetings, I still get to do what inspired me to go to graduate school in the first place: read.

And I do read–a lot. I read all sorts of things, but of course my favorite thing to read is (guess what) Victorian novels. I have taken a lot of pleasure in re-reading the Victorian novels that I studied in depth, like David Copperfield and Jane Eyre, but there is a special sort of pleasure in discovering a new favorite novel. It’s like finding a new star hidden in a constellation you’ve looked at for years, or in a more mundane manner of speaking, like finding that lost sock that went missing in the last load of wash you did.

My lost sock is, to mix metaphors, a turducken of a Victorian novel. We all know that George Eliot is probably the most brilliant of the Victorian novelists; if we didn’t know it ourselves, we have Virginia Woolf declaring, in her autocratic way, that Middlemarch is “one of the few English novels written for grown-up people” (The Common Reader, “George Eliot”). We also have New Yorker writer Rebecca Mead’s take on the novel in her book My Life in Middlemarch, which apparently qualified her to write a nice essay on the book in the magazine.

But then there are the Eliot works that are too seldom read these days: Romola, Scenes of Clerical Life, Daniel Deronda. And who among us has actually read Felix Holt, the Radical? I confess that I have had a copy of it on my bookshelf since 1999, yet I never opened it until last week. I am thankful that I did, because I now think it’s one of the best Victorian novels I’ve read, despite the fact that, according to Wikipedia, it is one of the least popular of Eliot’s novels–so unpopular, in fact, that although it would make a fantastic mini-series (PBS or BBC, are you listening?), the last time it was adapted for film was in 1915. Yes, 1915.

What makes the novel so wonderful is not just a handful of fantastic, eminently quotable statements, although the book does contain a couple of real gems. Here is one, from the Introduction:

“Posterity may be shot, like a bullet through a tube, by atmospheric pressure from Winchester to Newcastle: that is a fine result to have among our hopes; but the slow old-fashioned way of getting from one end of our country to the other is the better thing to have in the memory. The tube-journey can never lend much to picture and narrative; it is as barren as the exclamatory O!”

Eliot goes on to explain that it is a slow, surface journey that allows the traveler to see and experience the varieties of life, not a quick, subterranean (we might add “aerial” here) journey. How prescient of Eliot to have seen what would happen to travel in the coming generations, to have understood the way in which “getting there” would no longer be fun or important. She makes us understand that the saying “it’s the journey that counts, not the destination” now refers only to some kind of moral or experiential journey, and sadly, no longer to a real, actual one.

Here is another famous quote, from Chapter 3, in which Eliot displays a remarkable sensitivity to social life:

“…there is no private life which has not been determined by a wider public life, from the time when the primeval milkmaid had to wander with the wanderings of her clan, because the cow she milked was one of a herd that had made the pastures bare.”

The insight revealed in this novel is remarkable, but these selected quotations are not the chief strengths of Felix Holt. What is absolutely amazing to me is that in this one book Eliot combines a variety of different Victorian novels and still manages to create an incredibly good story, one which pulls you back to it day after day because you cannot wait to find out how the characters will respond to the events they become caught up in.

Here’s a simple way of putting it: In George Eliot’s Felix Holt, the Radical (ironically, we are halfway through the book before we realize that Felix Holt is no Radical), we find a turducken of a novel, one which combines and recombines aspects of several different subgenres of the 19th-century novel, fitting many novels, miraculously, into one organic whole. For example, we see the re-education of Esther Lyon, in a Mansfield Park (Jane Austen) narrative; we have the political machinations that are redolent of Anthony Trollope’s Palliser novels; we have the emphasis on hidden secrets and parentage, on madness and eccentricity that Dickens loved to play with; we are treated to a look at a kind of Orientalism, which is worthy of Wilkie Collins; and we have a legal plot about long-hidden heirs and family trusts that blends both Trollope and Dickens with Thomas Hardy. And at the center of it all, we find a difficult love story, starring Esther Lyon and Felix Holt, who are clearly borrowed from some of Sir Walter Scott’s best romances.

With all these things going on, you’d think this would be a mess of a novel, but Eliot is a master craftsman, and she manages to create a wonderful story from these disparate threads, replete with excellent character depictions and some memorable scenes. In short, this is a fine novel, probably just as good as Middlemarch, and quite a bit shorter. It deserves to be read. I certainly wish I hadn’t waited almost 20 years to read it, but I’m very glad I finally have.

So go out and get a copy and read it. Or, if you want, you can always wait for the BBC Miniseries to air. Julian Fellowes or Emma Thompson, it’s time for you to get to work on the script!

Six Rules for Reading (and Enjoying) Julius Caesar

I have always assumed that the best example of my argument that most people get Shakespeare plays all wrong would be Romeo and Juliet. But I have to admit I was mistaken. In fact, I think it is safe to posit that no other Shakespeare play is so maligned and misunderstood as Julius Caesar.

I think this is largely due to the way we teach the play in the United States. Of course, because we do teach the play in high school, Julius Caesar has always gotten tremendous exposure: almost everyone I’ve met has been forced to read the play during their high school career. In fact, I think it’s still on high school reading lists today. But that’s probably also exactly why it’s so misunderstood.

I’m not blaming high school teachers, because by and large they’re told to teach these plays without adequate preparation. I suppose if anyone deserves blame, it’s the colleges that train teachers. But all blame aside, before I talk about what a great play it really is, and what a shame it is that most people summarily dismiss Julius Caesar without ever really considering it, let’s look at why this has happened.

First of all, it goes without saying that making someone read a play is not a great way to get him or her to like it. Especially when that play is over 400 years old and written in (what seems to be) archaic language. But a still greater problem is that there is a tendency to use the play to teach Roman history, which is a serious mistake. (American high schools are not alone in this; Samuel Taylor Coleridge, for example, criticized the play for not being realistic in its portrayal of Roman politics back in the early 1800s.) In short, far too many people associate this play with a bunch of men showing a great deal of thigh or swathed in endless yards of material, flipping their togas around the way an adolescent girl tosses her hair over her shoulder. It’s all too distracting, to say the least.

So, in order to set us back on the right track and get more people to read this fine play, I’ve made a little list of rules to follow that will help my readers get the most enjoyment, emotional and intellectual, from the play.

Rule Number One: Forget about Roman history when you read this play. Forget about looking for anachronisms and mistakes in Shakespeare’s use of history. Forget everything you know about tribunes, plebeians, Cicero, and the Festival of Lupercalia. The fact is, the history of the play hardly matters at all. Rather, the only thing that matters is that you know in the beginning moments that Caesar will die and that, whatever his motives and his character, Marcus Brutus will pay for his part in Caesar’s assassination with his own life and reputation.

Rule Number Two: Recognize that this is one of Shakespeare’s most suspenseful plays. Our foreknowledge of events in the play, far from making it predictable and boring, provides an element of suspense that should excite the audience. Here we can point to Alfred Hitchcock’s definition of suspense, in which he explains that it’s the fact that the audience knows there’s a bomb hidden under a table that makes the scene so fascinating to watch, that makes every sentence, every facial expression count with the audience. It’s the fact that we know Julius Caesar is going to die on the Ides of March that makes his refusal to follow the advice of the soothsayer, his wife Calpurnia, and Artemidorus so interesting. We become invested in all of his words and actions, just as our knowledge that Brutus is going to lose everything makes us become invested in him as a character as well. A good production of this play, then, would highlight the suspenseful nature within it, allowing the audience to react with an emotional response rather than mere intellectual curiosity.

Rule Number Three: Understand that this play is, like Coriolanus, highly critical of the Roman mob. Individuals from the mob may be quite witty, as in the opening scene, when a mere cobbler gets the better of one of the Roman Tribunes, but taken as a whole, the mob is easily swayed by rhetoric, highly materialistic, and downright vicious. (In one often-excluded scene–III.iii–a poet is on his way to Caesar’s funeral when he is accosted by the crowd, mistaken for one of the conspirators, and carried off to be torn to pieces.) It’s almost as if this representation of mob mentality–the Elizabethan equivalent of populism, if you will–is something that Shakespeare introduces in 1599 in Julius Caesar, only to return to it nine years later to explore in greater detail in Coriolanus.

Rule Number Four: Recognize that this play, like many of Shakespeare’s plays, is misnamed. It is not about Julius Caesar. It’s really all about Marcus Brutus, who is the tragic hero of the play. He is doomed from the outset, because (1) it is his patriotism and his love of the Roman Republic, not a desire for gain, that drives him to commit murder; (2) he becomes enamored of his own reputation and convinces himself that it is his duty to commit murder and to break the law; (3) he falls victim to this egotism and loses everything because of it. Audience members really shouldn’t give a hoot about Julius Caesar; he’s a jerk who gets pretty much what he deserves. But Brutus is a tragic hero with a tragic flaw, a character whose every step, much like Oedipus’s, takes him further and further into his own doom. The soliloquies Brutus speaks are similar to those in Macbeth, revealing a character who is not inherently bad but rather deficient in logic, self-awareness, and respect for others. In fact, in many ways, it’s interesting to look at Julius Caesar as a rough draft not only of Coriolanus but of Macbeth as well.

Rule Number Five: Appreciate the dark comedy in the play. Shakespeare plays with his audience from the outset, in the comic first scene between the workmen and the Roman Tribunes, but another great comedic scene is Act IV, scene iii, when Brutus and Cassius meet up before the big battle and end up in an argument that resembles nothing more than a couple of young boys squabbling, even descending into a “did not, did so” level. This scene would be hilarious if the stakes weren’t so high, and if we didn’t know that disaster was imminent.

Rule Number Six: Experience the play without preconceptions, without the baggage that undoubtedly is left over from your tenth-grade English class. Once you do this, you’ll realize that the play is timely. It explores some really pertinent questions, ones which societies have dealt with time and time again, and which we are dealing with at this very moment. For example, when is it permissible to commit a wrong for the sake of the greater good? (Surely Immanuel Kant would have something to say about this, along with Jeremy Bentham.) How secure is a republic when its citizens are poor thinkers who can be swayed by mere rhetoric and emotionalism instead of reason? What course of action should be taken when a megalomaniac takes over an entire nation, and no one has the guts to stop him through any legal or official means?

In the end, Brutus’s tragedy is that he immolates his personal, individual self to his public and civic responsibilities. Unfortunately, it is the inability to understand this sacrifice and the conflict it creates, not the play’s historical setting in a distant and hazy past, that has made it inaccessible to generations of American high school students. Too many decades have gone by since civic responsibility was considered an important element of our education, with the sad but inevitable result that several generations of students can no longer understand the real tragedy in this play, which is certainly not the assassination of Julius Caesar.

But perhaps this is about to change. In the last few months, we’ve been witnessing a new generation teaching themselves about civic involvement, since no one will teach it to them. And as I consider the brave civic movement begun by the students of Marjory Stoneman Douglas High School, I am hopeful that reading Julius Caesar might yet become not a wasted module in an English class but the single most important reading experience in a high-school student’s career.

[Image: Donald Trump as Julius Caesar, from https://angiesdiary.com/featured/donald-trump-as-julius-caesar/]

The Ideological Work of Television and the Zombie Apocalypse

I have long argued that television programs, particularly situation comedies, perform an important piece of ideological work in our culture. Far from being pure entertainment, they introduce ideas that society may not want to confront. Of course, no one who can remember All in the Family or Murphy Brown will dispute this; but we may well be surprised to realize that television has always done this, even from its earliest days.

The two examples I have chosen to demonstrate this theory come from The Honeymooners (1955) and Bewitched (1964-1972). Back in the 1950s and ’60s, these sitcoms had to code their messages, making them available only to subtle and clever television viewers. In fact, the entire premise of both series rests on the implicit understanding that while women may have to kowtow to their husbands, they are in fact the brains in their marriages. After all, Samantha is presumably all-powerful, yet she chooses to remain with the awkward and pouty Darrin. Alice Kramden’s situation is less enviable–she is constrained by the 1950s dictum that proclaims women to be subservient to their husbands–but at the same time, she demonstrates to herself, to Ralph, and most importantly, to the audience, that she is in fact much more capable than Ralph and that he is head of the household only because society awards him that position.

Ideological work is hidden, or coded, in early sitcoms, but it’s still there. For example, in Episode 4 of The Honeymooners (“A Woman’s Work Is Never Done”), Alice decides to get a job after Ralph berates her for not keeping up with the housework, telling him that it’s easier to work outside the home than within it. Ralph ridicules the notion, but Alice succeeds quite well, and even earns enough money to hire a maid to carry out the household chores–a maid who turns out to be so efficient and so sarcastic that Ralph begs Alice to quit and return to being a homemaker. The message here, years before either That Girl or The Mary Tyler Moore Show appeared on television, is that women can indeed be successful in the professional world. That message might have been too revolutionary to appear without coding, but it is delivered nonetheless through this subtle means.

Perhaps more interesting is Episode 7 of the first season of Bewitched (“The Witches Are Out”), in which Samantha critiques Darrin’s work on an advertising campaign featuring witches as clichéd and, even worse, rife with prejudice. She takes to the streets to spearhead protests against the campaign, joining a picket line in scenes that clearly reflect the actual protests taking place in 1964, when the episode first aired. Since it was too dangerous to talk openly about racial prejudice, the show used a fictional prejudice–against witches–that viewers would still understand, though perhaps unconsciously.

Neither of these episodes was intentional in its ideological work: the writers of early situation comedies merely reflected and refracted the social reality they observed. In other words, during the early years of television, shows didn’t consciously represent the women’s movement or the civil rights movement. They simply reflected and displaced the social trends of their time, presenting them in a non-threatening, palatable form for their viewers.

But by the mid-1970s and beyond, television became more outspoken, taking on a more direct role in society and growing much less afraid to stand on a soapbox. The velvet gloves came off, and television grappled openly with all sorts of issues, from bigotry (All in the Family) to homosexuality (Will and Grace). However, I believe that television still uses coded messages from time to time, and I think I’ve found an example in one genre that horrifies me–and not for its intended reason.

Since the mid-2000s, zombie-themed shows and books have proliferated. I first noticed a fascination with zombies among my students in about 2005, and I found it strange that a genre that had lain dormant for so long was coming back to life (pardon the pun, please). Since then, we’ve had World War Z, Pride and Prejudice and Zombies, and The Walking Dead. Ever the cultural analyst, I wondered what this preoccupation with zombie infestation might represent: just what kind of ideological work was it performing? At first, I thought it might indicate a fear of contagion, of a swift-moving and deadly pandemic. After all, we’ve seen, in the last twenty years, outbreaks of swine and bird flu, SARS, and Ebola. It would certainly make sense for a fear of virulent and lethal illness to express itself as a zombie invasion.

But recently it dawned on me that the imagined zombie invasion might represent something far worse: an invasion of migrants. And, before you dismiss this idea, let me pose a question: Is it possible that the populist rhetoric directed against immigrants is connected, through a subtle, ideological sleight-of-hand, to the rise of the zombie genre in film and television?

After all, so much of the typical zombie plot resembles the imagined threat of uncontrolled immigration: the influx of great numbers of threatening beings who are completely foreign to our way of thinking, who are willing to fight for resources, who will not give up easily, who make us just like them–and who must be destroyed at any cost. I think it’s just possible, in other words, that the present social climate of suspicion, of protectionism, of hostility towards outsiders has been fostered and cultivated by our ideological immersion in the zombie genre. Again, as with early television situation comedies, I don’t think this linkage is intentional on the part of the writers; but intentional or not, the ideological work gets done, and suddenly we find our culture and civilization hostile to the very force that made us Americans what we are.

About ten years ago, I had a student who adored horror films and books. I asked him how he could stand to be frightened by the very things he loved and spent so much time on. His answer haunts me today: “This isn’t what frightens me,” he said, pointing to a Lovecraft novel. “What frightens me is the day-to-day things, such as how I’m going to pay my rent.” In the same vein, I’ll end by asking this question: What if the really frightening thing about zombie shows isn’t what happens to their characters, but what happens to us when we watch them?