It may be my bad luck, and my generation’s bad luck, to be alive at a time when we are witnessing the limits of democracy. We’ve had a good run–over two hundred years now–but it may be time to call it a day and start over with some new form of government.
I suppose I am as patriotic as anyone. There are two times in my life when I felt tears well up in my eyes solely because of my pride in being an American. One was after a three-week trip to Iceland, Scotland, and England in 1996, when I returned with my young family to Houston Intercontinental Airport. Waiting in customs, I noticed a babble of languages, and looking around, I saw myself surrounded by people of color, dressed in a variety of ways, many with headscarves or turbans. At that time, it was easy to imagine that these people, if not Americans themselves, would be welcomed as visitors to the United States, or perhaps as potential citizens. That was enough to make me sentimental about the diversity of my country, to be thankful to live in a country that valued all people.
(I will pass over for now the very real possibility—indeed, the near certainty—that this was simply a fiction, even at that time. My belief, however, was real enough to draw tears of pride from my eyes, which of course I quickly wiped away.)
The second time I became emotional with pride in my country was in about 2003 or 2004, when, as a union member from the local community college, I stood in solidarity with nurses who were striking at the hospital. I was proud to do so—it is our right as Americans to stand and protest, as so many of us have recently found out. I was proud to be a citizen of a country that allows its citizens to congregate for such a purpose, despite the inconveniences that may be caused by it.
In the last couple of years, I’ve seen protests, but I haven’t taken part in them. I’ve supported them, but I have not been able to make myself participate in them. During the Women’s March, I stayed home, dissolved into a teary mess most of the day. But these were not tears of pride. Perhaps there was some pride mixed in, and admiration for the women who dedicated themselves to the cause, but there was also a feeling of profound despair at the need for such a march. It was the same thing with the March for Our Lives. What a beautiful expression of solidarity, but why should the people of this country need to march in order to protect our children, in order to stand up against an organization that should have no part in our electoral process, to protest the very electoral process that has been shown to be corrupt—not only because of foreign interference, but because of outrageously large campaign donations that fund and sway our elected officials?
Don’t get me wrong. To those of you who are participating in these movements, I want to say that I admire and love you for what you are doing. Yet I cannot help but feel that the need for such movements marks the decline of democracy, the end of this glorious experiment in civic rule that began over 200 years ago.
(Again, I will pass over the fact that this glorious experiment probably started, as so many others have, with the desire for personal gain on the part of the architects of the experiment.)
Democracy cannot work when it is corrupted by the desire for financial gain. It cannot work when the electorate is divided along the lines of hard-held, incontestable beliefs that brook no argument or discussion. It cannot work when our elected officials are, like the people who elect them, small-minded, fearful, and utterly dependent on large corporations who try to direct every facet of their lives and thoughts and are free to do so if they spend enough money on licit and illicit media campaigns.
Recently, retired Supreme Court Justice John Paul Stevens called for the repeal of the Second Amendment. It may in fact be time for such a step. But I fear it may be time for a more drastic step: to admit that our democracy, such as it is, has failed, and that it is time to go back to the drawing board to find a new, more equitable, more humane way of living together in this world that we have created for ourselves.
Please note: This is a very long post. It is based on a talk I gave yesterday (October 28, 2017) at the C.S. Lewis Festival in Petoskey, Michigan. Consider yourself warned!
The study of myth seems to me to take three different paths:
Anthropological / Archeological: the study of classical mythologies (Bulfinch’s Mythology, Edith Hamilton)
Religious / Transcendent: the spiritual meaning of myth (Karen Armstrong, Joseph Campbell, Sigmund Freud)
Structuralist: the study of the same structures that recur in myths (Northrop Frye, Joseph Campbell, Roland Barthes)
This is all interesting, but I would like to back up a moment. I feel like I’ve arrived at a dinner party and somehow missed the first two courses. I feel as if I might get some kind of mental indigestion if I don’t start over at the very beginning.
The fact is, I want to know something more fundamental about myth and its function.
I want to know what it is and how it works.
Specifically, I want to know what distinguishes myth from other forms of story-telling.
Because for me, Story-Telling is what distinguishes human beings, homo sapiens, from all other species on this planet, as far as we know.
Studies have shown that crows have memories.
Studies have shown that chimpanzees use tools.
Philosophers are now beginning to agree that animals do indeed have consciousness.
But we—we should be known not as homo sapiens (wise man, the man who knows), but as homo narrans—the speaking man, the man who tells, who narrates—story-telling man. Because it is clear to me that we humans communicate largely through story-telling, and this story-telling function, this tendency to rely on narration, is what makes us human.
I’m going to ask you to bear with me for a little while as I tease this out. I’d like to say that by the end of this essay, I’ll have some answers to the questions I posed (what is myth, and how does it work, and what is the difference between a really good story and a myth)—but I’m pretty sure I won’t. I may, however, ask some more questions that might eventually lead me to some answers.
So here goes. To begin with, a few people who weigh in on what myth is and what it does:
Roland Barthes, the French post-structuralist literary theorist, says that myth is a type of speech, a system of communication, a kind of message. In a way, Barthes and J.R.R. Tolkien are not really different on this point, incredible as it is to think of Barthes and Tolkien agreeing on anything at all, much less something so important to each of them.
They are both incredibly passionate and devoted to the concept of language
Barthes, in his book Mythologies, which I have shamelessly cherry-picked for this essay, says that a myth’s message is not really important in itself; what matters is the way in which the myth conveys that message.
He says that “the knowledge contained in a mythical concept is confused, made of yielding, shapeless associations” (119).
But this isn’t as bad as it sounds, because myths actually don’t need to be deciphered or interpreted.
While they may work with “Poor, incomplete images” (127), they actually do their work incredibly efficiently. Myth, he says, gives to its story “a natural and eternal justification…a clarity which is not that of an explanation but that of a statement of fact” (143).
Myth is a story in its simple, pure form. “It acts economically: it abolishes the complexity of human acts, it gives them the simplicity of essences…” (143).
You can see how this view of myth kind of works with the myth-building that Tolkien does in The Lord of the Rings, which works with simple efficiency, whose very images are incomplete to the point of needing clarification in Appendices and further books like the Silmarillion. Yet even without having read these appendices and other books, we grasp what Tolkien is getting at. We know what Middle-Earth is like, because the myth that Tolkien presents needs no deciphering, no real interpretation for us to grasp its significance.
Tolkien, I think we can all agree, was successful in creating a myth specifically for England, as Jane Chance and many other scholars have now shown to be his intention. But is it a novel? Some might argue it isn’t—myself included. In fact, what Tolkien created in The Lord of the Rings is less a myth (I would argue that we only use that term because Tolkien himself used it to describe his work and his object—think of the poem “Mythopoeia,” which he dedicated to C.S. Lewis) than it is a full-blown epic.
For my definition of epic versus novel, I’m going to my personal literary hero, Mikhail Bakhtin, a great thinker, a marvelous student of literature, a man who wrote with virtually no audience at all for many years because he was sent into internal exile in the Soviet Union. In his essay “Epic and the Novel,” Bakhtin attributes these characteristics to epic:
It deals with an absolute past, where there is little resemblance to the present;
It is invested with national tradition, not personal experience, arousing something like piety;
There is an absolute, unbridgeable distance between the created world of epic and the real world.
The novel, says Bakhtin, is quite the opposite. It is new, changing, and it constantly “comes into contact with the spontaneity of the inconclusive present; this is what keeps the genre from congealing. The novelist is drawn toward everything that is not yet completed” (27).
I think the three characteristics of epic described by Bakhtin do in fact match up nicely with The Lord of the Rings: absolute past, national tradition, distance between the actual and the created world. But here’s another thing about epic as described by Bakhtin: “The epic world knows only a single and unified world view, obligatory and indubitably true for heroes as well as for authors and audiences” (35). It would be hard, indeed impossible, to imagine The Lord of the Rings told from a different point of view. We need that distant narrator, who becomes more distant as the book goes on. As an example, imagine The Lord of the Rings told from Saruman’s point of view, or from Gollum’s. Or even from Bilbo or Frodo’s point of view. Impossible! Of course, we share some of the point of view of various characters at various points in the narrative (I’m thinking specifically of Sam’s point of view during the Cirith Ungol episode), but it couldn’t be sustained for the whole of the trilogy.
The interesting thing here is that in The Lord of the Rings, Tolkien took the novel form and invested it with epic. And I think we can say that against all odds, he was successful. On the other hand, C.S. Lewis, in his last book Till We Have Faces, took a myth (the story of Cupid and Psyche), which is certainly more closely related to epic than it is to novel, and turned it into a successful novel. This isn’t the time and place to talk about Till We Have Faces, although I hope someday that we can come together in the C.S. Lewis Festival to do that very thing, but I couldn’t help mentioning this, because it’s striking that Lewis and Tolkien, while they clearly fed off each other intellectually and creatively, started from opposite ends in writing their greatest creative works, as they did in so many other things. It’s almost amazing that you can love both of them at the same time, but of course you can. It’s the easiest thing in the world to do.
But I’m losing the thread of my questions here. What is myth? Can we actually have modern myths? Can someone actually set out with the intention of creating a myth? And can a mythic work spontaneously just happen? Another question needs to be posed here: if this long book, which is probably classified in every bookstore and library as a novel, touches on myth but is really an epic, can a novel, as we know it, become a myth? This forces us to tighten up our definition of what a myth is and asks us to think about what myth does.
Karen Armstrong, I think, would say yes to all three of these questions. In her book A Short History of Myth, Armstrong follows the trajectory of myths through time and argues that the advent of printing and widespread literacy changed how we perceive and how we receive myth. These developments changed myth’s object and its function—and ultimately, they changed the very essence of myth.
Armstrong points out that myths and novels have similarities:
They are both meditative
They can both be transformative
They both take a person into another world for a significant period of time
They both suspend our disbelief
They break the barriers of time and space
They both teach compassion
Inspired by Armstrong and by Bakhtin, I’m going to go out on a limb here and make a stab at answering my questions. And I’ll start by defining a modern myth as a super-story of a kind: a novel (or a film, because let’s open this up to different kinds of story-telling) that exerts its power on a significant number of people. These stories then provide, in film professor and writer Stuart Voytilla’s words, “the guiding images of our lives.”
In short, a modern myth has these characteristics:
It belongs to a certain place and time. Like epic, it is rooted in a time and a place. It might not be far removed from the actual, but it cannot be reached from the actual.
It unites a group of readers, often a generation of readers, by presenting an important image that they recognize.
It unites a group of readers by fostering a similar reaction among them.
It contains identifiable elements that are meaningful to its readers/viewers. Among these might be important messages (“the little guy can win after all,” “there’s no place like home,” “the American Dream has become a nightmare”).
In other words, a mythic story can be made intentionally, as Star Wars was by George Lucas after he considered the work of Joseph Campbell; or it can happen accidentally. Surely every writer dreams of writing a mythic novel—the Great American novel—but it’s more or less an accident. The Adventures of Huckleberry Finn was a mythic novel of America, until it was displaced by To Kill a Mockingbird. And I would note here that having your novel go mythic (as we might term it—it is, in a way, like “going viral,” except mythic stories tend to last longer than viral ones) is not really such a good thing after all. Look at Harper Lee—one mythic novel, and that was the end of her artistic output—as far as we know. A mythic novel might just be the last thing a great writer ever writes.
Anyway, back to our subject: a modern myth gets adopted rather than created. Great myths are not made; they become. So let’s think of a few mythic novels and see how they line up with my four characteristics:
The Wizard of Oz
The Great Gatsby or Death of a Salesman—take your pick.
The Case of Local Myths—family or friend myths, references you might make to certain films or novels that only a small number of people might understand. A case in point would be the re-enactments of The Rocky Horror Picture Show that take place each year around Halloween.
In essence, my answer, such as it is, to the questions I posed earlier comes down to this:
Modern myths are important stories that unite their readers or viewers with similar emotional and intellectual reactions. Modern mythology works by presenting recognizable and significant images that unite the people who read or view them. As for what distinguishes modern myths from other forms of story-telling, what tips a “normal” novel or film over into the realm of “mythic”—I don’t have an answer for this. I only have a couple of vague, unformed theories. One of my theories is this: Could one difference between myth and the novel (“mere” story-telling as such) be that myth allows the reader/listener to stay inside the story, while the novel pushes the reader back out, to return to the actual world, however reluctantly?
And let’s not forget what Karen Armstrong wrote about myth: “It has been writers and artists, rather than religious leaders, who have stepped into the vacuum [created by the loss of religious certainty and despair created by modernism] and attempted to reacquaint us with the mythological wisdom of the past” (138). Armstrong’s closing sentence is perhaps the most important one in the book: “If professional religious leaders cannot instruct us in mythical lore, our artists and creative writers can perhaps step into this priestly role and bring fresh insight to our lost and damaged world” (149). With this in mind, perhaps it’s time to go and read some more, and find more myths that can help us repair and restore ourselves, our faith in our culture, and in doing so, the world itself.
I had actually planned this post a couple of days before my favorite living writer, Kazuo Ishiguro, won the Nobel Prize in Literature (announced on October 5th). So, along with the satisfaction and sense of vindication I felt when I woke up last Thursday morning and discovered that he’d been awarded the Prize, I also felt a sense of chagrin at being late in making this post. After all, I could have gone on record about Ishiguro’s talent days before the Nobel committee made its announcement. Still, better late than never, so I will offer my belated post now, and explain the three most important things I’ve learned from Ishiguro over the years.
The most important thing I’ve learned from Kazuo Ishiguro is this: great writing often goes unnoticed by readers. (This point, of course, is now somewhat diluted by the fact that Ishiguro has indeed won acclaim for his work, but I think it deserves to be made all the same.) I remember reading Never Let Me Go about eight years ago and being gob-smacked by its subtle narrative brilliance and its emotional resonance. And yet I’ve met many readers of the book who, while affected by the narrative, seemed unimpressed by Ishiguro’s writerly achievement. It’s almost embarrassing that my reaction to the novel was so different from other people’s. Could I have gotten it wrong, somehow? Was it possible that Never Let Me Go really wasn’t the masterpiece I thought it was? While I considered this, I never once really believed I had made a mistake in my estimation: it is a tremendous book. The fact that few other people see it as such does not change my view of it. It simply means that I see something in it that other people don’t. Hence my first object lesson from reading Ishiguro: genius isn’t always obvious to the mass of readers out there. Perhaps it just isn’t that noticeable with so many other distracting claims for our attention.
The second thing I’ve learned from Ishiguro also stems from Never Let Me Go: genre doesn’t matter. When you really think about it, categorizing a work based on its plot is a silly thing to do, and yet we are firmly locked into that prison of categorization, since almost all bookstores and libraries, as well as readers, demand that every work fit into a narrow slot. I commend Ishiguro for defying the convention of genre, incorporating elements from both science fiction and fantasy into realist narratives. In my view, the sooner we break the shackles of genre, the better. Good, responsible readers should never restrict themselves to a certain genre any more than good, imaginative writers should. A certain amount of artistic anarchy is always a good thing, releasing creative juices and livening things up.
And finally, the third thing I’ve learned is this: a good writer does not hit the bull’s eye every time he or she writes. The Remains of the Day and Never Let Me Go are truly wonderful books. An Artist of the Floating World is promising, but not nearly as good as Ishiguro’s later works. The Buried Giant, I’d argue, is a failure–but it is a magnificent failure, one whose flaws emanate from the very nature of the narrative itself, and thus it transcends its own inability to tell a coherent story. I’ve learned from this that a writer should never be afraid to fail, because failing in one way might be succeeding in another, less obvious, way. This is as good a place as any other to admit that I have never been able to get through The Unconsoled. And as for When We Were Orphans–well, the less said about that disaster of a book, perhaps the better. I can’t imagine what Ishiguro was thinking there–but I will certainly defend his right to fail. And I am thankful that even a writer with such talent as Ishiguro does, from time to time, fail–and fail big. It certainly gives the rest of us hope that while we fail, we can still aspire to success.
I will close by saying that I am grateful to Kazuo Ishiguro for the wonderful books he’s written. If you haven’t read any of them, you should–and not just because some panel gave him an award. But I am just as grateful to him for the three important lessons he has taught me about the nature of writing.
Let me get one thing straight right from the get-go: I know self-publishing is not the same thing as publishing one’s work through a legitimate, acknowledged publishing company. I also know that self-publishing is looked down upon by the established writing community and by most readers. In fact, for the most part I agree with this estimation. After all, I spent much of last year writing freelance book reviews for Kirkus Reviews, so I know what’s being published by indie authors: some of it is ok, but much more of it is not very good at all.
Knowing this, then, why would I settle for publishing my novels on Amazon and CreateSpace? This is a tricky question, and I have thought about it a great deal. Whenever anyone introduces me as an author, I am quick to point out that I am, in fact, just a self-published author, which is very different from being a commercial writer. (And if at any time I am liable to forget this important fact, there are enough bookstores in my area that will remind me of it, stating that they don’t carry self-published books.) When I meet other writers who are looking for agents, I do my best to encourage them, largely by sharing with them the only strategy I know: Be patient, and persist in sending your queries out.
So why, since I know all this, do I resort to self-publishing my work? I’ve boiled it down to four main reasons.
First of all, I self-publish because I am not invested in becoming a commercially successful writer. I write what I want, when I want, and when I decide my work is complete, I submit it to an electronic platform that makes it into a book, which I can then share with family and friends and anyone else who cares to read it. In other words, for me writing is not a means by which to create a career, celebrity, or extra income. I have long ago given up the fantasy of being interviewed by Terry Gross on Fresh Air; my fantasies are more mundane these days.
Second, I do not need to be a commercial writer, with a ready-made marketing machine to sell my books, because I am not hoping to make any money from them. Rather, I look upon writing as a hobby, just as I look upon my interest in Dickens, Hardy, and the Brontës as a hobby. I am helped here by having spent many years engaged in academic research, a world in which publications may win their authors momentary notice, but certainly not any money, unless one happens to sell out to the lure of literary celebrity, as Stephen Greenblatt has. I have a few publications out in the academic world, but no celebrity and certainly no money to show for them–and I am totally fine with that. In my creative writing, I am lucky enough to have a hobby that satisfies me and costs me relatively little–far less, in fact, than joining a golf or tennis club would cost.
The third reason that I self-publish my work is that I actually enjoy doing so. There are some aspects of publication that have surprised me. For example, I have found that I really enjoy working with a great graphic designer (thanks, Laura!) to develop the cover of my novels. It is an extension of the creative process that is closely related to my work but something that I could never do myself, and this makes me all the more grateful and fascinated as I watch the cover come to life and do its own crucial part to draw readers into the world I have created.
As a retired writing professor, I realize how important revision and proofreading are, and to be honest, this is the only part of self-publishing that gives me pause, because I worry about niggling little errors that evade my editorial eye. But for the most part, I am old enough now to have confidence in my writing. Plus, the beauty of self-publishing is that it is electronic: if there are errors (and there are always errors, even in mainstream published books), I can fix them as soon as a kind reader points them out. So I suppose the fourth reason to self-publish lies in the fact that it is so very easy to do it these days.
These are four good reasons for me to self-publish, but the most important reason is that I apparently love to write, and self-publishing allows me to do this without worrying about submitting the same piece of work over and over again to agents and publishers, stalling out my creativity. While at the Bronte Parsonage Museum this past summer, I picked up a card that expresses how I feel about it, a quote from Charlotte Brontë: “I’m just going to write because I cannot help it.” (It is a testament to my literary nerdiness that I happen to know that this quotation comes from Brontë’s Roe Head Journal, but strangely enough, before I encountered it on a greeting card I never realized that it applied to myself as well as to Brontë.) In my idiosyncratic view, self-publishing allows the reader to decide whether a novel is worth reading, rather than punting that responsibility over to an overworked and market-fixated literary agent or editorial assistant. I am willing to trust that reader’s judgment, even if it means I will never sell many books.
And so today, as I am releasing my second self-published novel (Betony Lodge, available on Amazon and CreateSpace–and this is my last attempt at marketing it here on my blog), I am fully aware of the stigma of self-publishing, but I realize that what’s right for other writers may not be right for me. Today, then, I am taking my courage into my own hands and pushing that key to make the book go live.
And tonight I will be making my own champagne toast: here’s to living in the 21st century, when digital publishing makes authors of us all!
Several months ago, I had what seemed like a fantastic idea: now that I was retired from teaching English at a community college, I could engage in critical research, something I’d missed during those years when I taught five or more classes a semester. I had managed to write a couple of critical articles in the last few years of my tenure at a small, rural two-year college in Northern Michigan, but it was difficult, not only because of the heavy demands of teaching, but also because I had very limited access to scholarly resources. Indeed, it is largely due to very generous former students who had moved on to major research institutions that I was able to engage in any kind of scholarly research, a situation which may seem ironic to some readers, but which is really just closing the loop of teacher and student in a fitting and natural way.
And so last fall, on the suggestion of a former student, I decided to throw my hat in the ring and apply to a scholarly conference on Dickens, and my proposal was chosen. In time, I wrote my paper (on Dickens and Music– specifically on two downtrodden characters who play the flute and clarinet in David Copperfield and Little Dorrit, respectively) and prepared for my part in the conference.
It had been close to 25 years since I had read a paper at a conference, and so I was understandably nervous. Back then, there was no internet to search for information about conference presentations, but now I was able to do my homework, and thus I found a piece of advice that made a lot of sense: remember, the article emphasized, that a conference paper is an opportunity to test out ideas, to play with them in the presence of others, and to learn how other scholars respond to them, rather than a place to read a paper, an article, or a section of a book out loud before a bored audience. Having taught public speaking for over a decade, I could see that this made a lot of sense: scholarly articles and papers are not adapted to oral presentations, since they are composed of complex ideas buttressed by a great many references to support their assertions. To read such a work to an audience seemed to me, once I reflected on it, a ridiculous proposition, and would surely bore not only the audience, but any self-respecting speaker as well.
I wrote my paper accordingly. I kept it under the fifteen-minute limit that the moderator practically begged the panelists to adhere to in a pre-conference email. I made sure I had amusing anecdotes and witty bon mots. I concocted a clever PowerPoint presentation to go with the paper, just in case my audience got bored with the ideas I was trying out. I triple-spaced my copy of the essay, and I–the queen of eye contact, as my former speech students can attest–I practiced it just enough to become familiar with my own words, but not so much that I would become complacent with them and confuse myself by ad-libbing too freely. In short, I arrived at the conference with a bit of nervousness, but with the feeling that I had prepared myself for the ordeal, and that my paper would meet with amused interest and perhaps even some admiration.
It was not exactly a disaster, but it was certainly not a success.
To be honest, I consider it a failure.
It wasn’t that the paper was bad. In fact, I was satisfied with the way I presented it. But my audience didn’t know what to do with my presentation. This might be because it was very short compared to all the other presentations (silly me, to think that academics would actually follow explicit directions!). Or it could be because it wasn’t quite as scholarly as the other papers. After all, my presentation hadn’t been published in a journal; it was, as C.S. Lewis might have called it, much more of a “supposal” than a fully-fledged argument. Perhaps as well there was something ironic in my stance, as if I somehow communicated my feeling that research in the humanities is a kind of glorified rabbit hunt that is fun while it lasts but that rarely leads to any tangible, life-changing moment of revelation.
Yet this is not to say that humanities research is useless. It isn’t. It develops and hones all sorts of wonderful talents that enrich the lives of those who engage in it and those who merely dip into it from time to time. I believe in the value of interpreting books and arguing about those interpretations; in fact, I believe that engaging in such discussions can draw human beings together as nothing else can, even at the very moments when we argue most fiercely about competing and contrasting interpretations. This is something, as Mark Slouka points out in his magnificent essay “Dehumanized,” that STEM fields cannot do, no matter how much administrators and government officials laud them, pandering to them with ever-increasing budgets at the expense of the humanities.
And this is, ultimately, why I left the conference depressed and disappointed. I had created, in the years since I’d left academia, an idealized image of it that was inclusive, one that recognized its own innate absurdity. In other words, sometime in the last two decades, I had recognized that research in the humanities was valuable not because it produced any particular thing, but because it produced a way of looking at the world we inhabit with a critical acuity that makes us better thinkers and ultimately better citizens. The world of research, for me, is simply a playground in which we all can exercise our critical and creative faculties. Yet the conference I attended seemed to be focused on research as object: indeed, as an object of exchange, a widget to be documented, tallied, and added to a spreadsheet that measures worth.
Perhaps it’s unfair of me to characterize it in this way. After all, most of the people attending the conference were, unlike me, still very much a part of an academic marketplace, one in which important decisions about tenure, admission to graduate programs, promotions, and departmental budgets are made, at least in part, on the basis of things like conference attendance and presentations. It is unfair of me to judge them when I am no longer engaged in that particular game.
But the very fact that I am not in the game allows me to see it with some degree of clarity, and what I see is depressing. One cannot fight the dehumanization of academia, with its insistent mirroring of capitalism, by replicating that capitalism inside the ivory tower; one cannot expect the humanities to have any kind of serious effect on our culture when those charged with propagating the study of humanities are complicit in reducing humanities research to mere line items on a curriculum vitae or research-laden objects of exchange.
I can theorize no solution to this problem beyond inculcating a revolution of ideas within the academy in an effort to fight the now ubiquitous goal of bankrupting the study of arts and humanities, a sordid goal which now seems to characterize the age we live in. And I have no idea how to bring about such a revolution. But I do know this: I will return to my own study with the knowledge that even my small, inconsequential, and isolated critical inquiries are minute revolutions in and of themselves. As we say in English studies, it’s the journey that’s important, not the destination. And in the end, I feel confident that it will take far more than one awkward presentation at a conference to stop me from pursuing my own idiosyncratic path of research and inquiry into the literature I love.
I have fairly sloppy reading habits these days, moving randomly from one book to the next, choosing them for the slightest of reasons. A couple of weeks ago, I was in Wales, and I stopped in a bookstore. This bookstore was not in Hay-on-Wye, which is noted for its bookstores and its annual literary festival; frankly, I found Hay-on-Wye to be too commercial and couldn’t get out of there soon enough. Rather, it was a small bookstore in Crickhowell, in South Wales, which, it turns out, was a place that Tolkien visited on a holiday as a young adult and whose influence can be found in The Hobbit.
Whenever I go into a bookstore, I feel obligated to purchase something. For me, it’s like getting a table in a restaurant: you wouldn’t go in at all if you didn’t mean to buy something. And, because I was in Wales, and because the bookstore had a wonderful collection of Welsh books written in English, I picked up a novel by Raymond Williams called Border Country. I chose it because I am a retired English professor and am familiar with some of Williams’s critical work. I was hoping it would be a good book, because I always root for scholars who write fiction, being one myself.
I will simply say here that Border Country surpassed any hope I had that it would be an interesting book to read on vacation. It really is a fine novel, a beautiful and thoughtful narrative in which Welsh village life is depicted against the background of labor struggles, the clash of generations, and the difficulty involved in leaving one’s home and then returning to it.
Williams creates a subtle story with a strong narrative pull, largely because of the lively, interesting characters he presents. The protagonist is a professor of economics who lives in London with his wife and two sons; he must return to the Welsh border country, however, because his father has had a stroke. But “border country” also refers to the space that Matthew Price (called “Will” back in his hometown of Glynmawr) occupies within his own world: neither fully in the cosmopolitan world of London intellectuals (we get only a glimpse of his life there) nor in the village of his birth, Matthew is caught between worlds and a strange, palpable dysphoria ensues.
Yet the novel does not dwell on this unease. Rather, it provides flashbacks to an earlier time, when Matthew’s father Harry first arrives in Glynmawr to work as a railway signalman with his young wife Ellen, and in doing so it recounts the struggles involved in making a life in that beautiful and rugged country. The novel, true to its form (and no one would know that form better than Williams, who was a literary scholar of the highest merit), presents a varied and beautiful mix of narratives, woven together so subtly and with such artistry that the reader moves effortlessly between them.
I am new to Welsh literature, but I have learned this from Border Country: reading Welsh novels means reading about the Welsh landscape, with its rough yet welcoming mountains, where life is difficult but well worth living. Williams manages to get that feeling across to the reader in his simple, almost elegiac tone. The threads of the story keep us turning the pages, but the message of the book will stay with us long after we finish reading.
This is a novel that deserves to be read. It is both a pleasure and a pain to say that: a pleasure to discover a hidden gem, and a pain to realize that this gem has been obscured by newer, less deserving but flashier novels, and has only been revealed by the undisciplined, random choice of a reader strolling into a bookstore looking for something to read while on holiday in Wales. So I’m doing my part to gain it the readership it deserves by saying here: get this book and read it. You will be glad that you did.
The title is a misnomer of sorts: most contemporary book reviews, I’ve noticed, are little more than marketing ploys designed to get you to buy the book they’re reviewing. If the reviewer is quite brave, the review might actually critique the book, but the point remains the same: to weigh in on a book that has grabbed, or wants to grab, the attention of a large body of readers.
That is not my goal in writing book reviews.
Am I alone in bemoaning the lost art of reading? Certainly not. Yet I am advocating here a certain kind of reading, one which demands thoughtful yet emotional responses to a book. This kind of reading and critiquing is not systematic, like a college paper; it is not formulaic and profit-generating, like a Kirkus book review; and it is certainly not aimed at gaining a readership for a book, or for this blog, for that matter. I am simply modeling the behavior I would like to see in other readers. I want to log my emotional and intellectual responses to certain books, to join or create a critical discussion about the works I’m reading. Some of these works will be current, but many more will be older. As I used to tell my literature students, I specialize in works written by long-dead people. Long mesmerized by works from the nineteenth century and before, I have, one might say, a severe case of century deprivation.
But today I am starting with a book by Susan Sontag, The Volcano Lover: A Romance. Published in 1992, it is a historical novel set in Naples, Italy, at the end of the eighteenth century, focusing on Sir William Hamilton and his second wife Emma, destined to become the mistress of Horatio Nelson.
Let me say that I have not read many of Sontag’s essays, and now I feel I don’t really have to, because this book seems in many ways much more an essay than a novel. There’s a good story in the lives of Sir William, Lady Hamilton, and Lord Nelson, but Sontag pushes this story into the background, eclipsing it by allowing her narrator’s cynical distance to diminish the reader’s ability to connect with the characters and events portrayed in the novel. Sontag gets in the way of the story a great deal too much. Egotism has no place in the act of telling a story; unfortunately, this lesson is something many writers are slow to learn, and indeed, some writers never learn it at all.
The true protagonist of the novel emerges only in the last eight pages. Sontag has had her revenge on the prurient reader who has picked up this novel only to delve into the lurid details of one of the most famous threesomes in British history. She pulls out a minor character, one who has received only the most fleeting mention, and gives her some of the best scenes to narrate. By playing hide-and-seek games with her story in this way, Sontag regrettably implodes her own narrative.
In the end, Sontag is much too clever a story-teller, and this hurts her novel–irreparably, in my view. There is one sentence in the novel that I think is worth remembering, however. Speaking of Sir William long after her own death (yes, Sontag does this, time-hopping with impunity, apparently), his first wife describes him like this in a single-sentence paragraph: “Talking with him was like talking with someone on a horse” (376). That’s a clever description, and I will give Sontag her due by calling attention to it.
In the end, though, I am left feeling frustrated and annoyed by The Volcano Lover. I have no idea how it can be construed as a romance, just as I have no idea why this novel, with its sly undercurrent of critical attitudes–towards the characters, the actions, and perhaps even the very nature of novel-writing–should hold a reader’s attention. Sontag’s work, described on the jacket as “a book of prismatic formal ingenuity, rich in speculative and imaginative inventiveness and alive with delicious humor,” is in reality a self-absorbed narrative, filled with annoying commentary, strained attempts at originality, and a smug disregard for its readers’ desire to like the book they’re reading.
Fair warning: this post is not political. It is for all the writers out there who hate revising their work.
Guys, I know the feeling. You labor over something for weeks, months, even years, and when you reach the end, or what you think is the end, it’s so very tempting to stop, put down your pen or push aside your keyboard, and break out the champagne. You love what you’ve written, if only because (1) it’s finished and (2) it meets your expectations, which, let’s be honest, have been systematically lowered throughout the duration of your project. The last thing you want to do is pick over every word and line you’ve sweated over in a pointless effort to tear it apart.
I used to feel that way, too. In fact, I suppose a pretty substantial part of me still does. But today, on the eve of 2017, at the end of a year that so many people are calling a very bad year, if not a catastrophic one, I pause in my own revision work to offer other writers a new way of looking at revision.
I am learning to love this part of writing, because I see it as a perfect marriage between creativity and analysis. Note that I am using the word “analysis,” not the word “criticism,” because that’s too negative for what I think we do in revision. The job of revision is to help make something better, not to tear it apart. (Tearing it apart should come later, during the critical review, but only in as much as the critic must tear something apart in order to see what it’s made of and how it works. A good critic will always put the work back together again after she does the work of criticism.)
My secret to loving revision, then, is this: Revising a work must involve a willing, enthusiastic attitude. The writer must regard the task of revising with excitement, because it is this part of writing that really shows the essence of craftsmanship, that separates those who write for fun (whether they are published authors or not) from those who write because they are compelled to do so.

But how can a writer change their attitude about this pain-in-the-ass time sink? I’ve devised a very simple solution. Instead of hoping that your work contains few mistakes and needs minimal revision, assume that it houses many mistakes, some of them not easy to find. Rather than bewailing the need to revise, growing bored and frustrated with finding surface errors, learn to use revision as a sonar device to locate the buried as well as the superficial mistakes. Once found, even deep mistakes are usually fairly easy to fix–much easier than most writers would think.

I’ve found that when you let go of the inherent desire not to have to fix anything, and give yourself over to the idea that fixing it is not only a good thing to do but an entertaining and satisfying part of the job, revision loses its drudgery. It becomes a pleasant and in some ways delightful stage in the work of creation, and it invites the best use of the problem-solving tactics–and creativity–a writer possesses.
There you have it. Stop avoiding revision. (You know you have.) Change your attitude–for real. Love revision and all it offers. Because it’s revision, and not the mere act of writing itself, that makes us real artists. Any third-grader can write. Only a real writer has the ability, and the drive, to revise.
–Offered on this last day of 2016 with a minimum of revision
For me, discovering an important book that I’ve overlooked is one of the most pleasurable parts of the reading life. I used to use the classroom to share my findings with students–who, I’ll admit, for the most part didn’t really care about my discoveries–but now, since I’ve retired, I’m forced to use The Tabard Inn to record them for a posterity which probably doesn’t really exist. That’s ok, because I feel it’s my duty, if not my destiny, to read forgotten books, to encourage these literary wallflowers and buried masterpieces to take their place in the spotlight, so to speak, even if no one is in the audience.
I’ve discovered a number of fine books through having absolutely no discipline in my reading the last few years. But I count Laughing Whitefish, by Robert Traver (McGraw Hill, 1965), among the most significant of my discoveries. My readers may recognize Robert Traver as the author of Anatomy of a Murder, which was made into a racy film starring James Stewart in 1959: the star’s father, believing the film to be immoral, actually took out an advertisement in his paper to ask people not to see it.
Much attention has been given to Anatomy of a Murder, but I’ve seen virtually nothing about Laughing Whitefish, which is actually a great deal more important than Traver’s earlier book. In fact, I will make the claim here that this novel is every bit as important in its way as To Kill a Mockingbird, which was published five years earlier. Laughing Whitefish is based on real events and set in Michigan’s Upper Peninsula; it takes place in the late nineteenth century and focuses on a lawsuit in which a Native American sues a mining company for breach of contract. Like Lee’s mythic condemnation of the inequalities between blacks and whites in the first half of the twentieth century, Traver’s book addresses the evils done to Native Americans during the settlement of the United States. And it does this in impassioned language. Take, for example, these words spoken by the first-person narrator:
It seems passing strange that we whites in our vast power and arrogance cannot now leave the vanishing remnants of these children of nature with the few things they have left….Can we not relent, for once halt the torment? Must we finally disinherit them from their past and rob them of everything? Can we not, in the name of the God we pray to, now let them alone in peace to live out their lives according to their ancient customs, to worship the gods of their choice, to marry as they will, to bring forth their children, and finally to die? Can we, who for centuries have treated the Indians as dogs, only now treat them as equals when they dare seek relief from injustice in our courts?….I am the first to concede that whatever you may decide here will be but a passing footnote in the long history of jurisprudence, that the pittance we are jousting over is but a minor backstairs pilfering in the grand larceny of a continent. (202)
These are difficult words for a white person to read, but I believe it is important for every American to read them, because they present the situation as clearly as Harper Lee did in To Kill a Mockingbird. The question is, why is it that we know Lee’s work, but not Traver’s? I would suggest that Laughing Whitefish be made required reading in public schools, because it is just as important a book as To Kill a Mockingbird.
No one has a monopoly on misery in this country. But the first step in solving a problem is admitting it exists. The second step is exploring its origins. What a different world we might be living in today if, instead of making a film of Anatomy of a Murder, Otto Preminger had made one of Laughing Whitefish.
I read a lot. Not as much as my husband seems to think, but a respectable amount nonetheless. This year I am keeping track, and since January 1st, I’ve read fifteen books. That’s three books a month, a figure that includes one audio book but does not include the four books I’ve read for reviewing purposes. And among those books, I’ve found two books that I think are actually bad novels. Surprisingly, these two bad novels are by acclaimed authors–authors whose works I have enjoyed, recommended, and highly admired. Hence today’s topic: why reading a bad novel isn’t an utter waste of time.
Many of us have had those moments in which we spend a good chunk of time resolutely plowing through a New Yorker short story only to complain afterwards, muttering something like, “That’s an hour of my life I’ll never get back.” And the same could be said about these two novels. Reading Kazuo Ishiguro’s When We Were Orphans and listening to Umberto Eco’s The Mysterious Flame of Queen Loana left me frustrated and perplexed until I began to think about bad novels. After several days of thought, I began to see the value of reading books that simply don’t measure up to our standard of writerly quality.
Don’t get me wrong: while in the midst of these two books I kept reading and listening precisely because, knowing the authors’ other works, I expected things to take a turn for the better. When they didn’t, I grumbled and complained, and marveled at the insipidness of the stories being told. I finished Ishiguro’s novel thinking, “That’s strange–it never did get any better. Where is the writer who produced two of the finest novels of the last thirty years?” I finished Eco’s in even worse shape, thinking, “At least I knitted several dishcloths while I spent fifteen hours [!] listening to this thing.”
So why would I celebrate bad novels? There are a number of reasons. First, there’s value in reading a body of a writer’s work, just as it’s worthwhile to watch a body of a director’s films. Watching the ebb and flow of good writing within one author’s body of work is instructive: it shows us readers that all writing is experimental, even the writing created by excellent and talented writers. Second, it makes us question our values. What makes a novel bad rather than good? Is it predictability and relying on telling rather than showing, as in When We Were Orphans? Or could it be long-winded musings that interrupt and detract from the real narrative, leaving readers with a shaggy-dog story rather than an enriching experience, as in The Mysterious Flame of Queen Loana? Would we judge these books as harshly if we didn’t know the authors’ other works, masterpieces in their own right? These questions may not have clear answers, but they are certainly worth considering.
And for those writers out there (and aren’t all of us writers, even those of us who don’t regularly produce manuscripts or succeed in getting our work published?), I’d offer this thought: considering bad novels gives us hope. If Kazuo Ishiguro can miss the bull’s-eye, even after he wrote The Remains of the Day, then we can certainly forgive ourselves for not coming up to snuff. We can continue to labor at our work, trusting that, like Ishiguro, we can still produce some wonderful work, a heart-breaking novel like Never Let Me Go, jaw-dropping in its artistry. Using Eco’s example, we can say to ourselves that our present work may not be quite the thing, but that another, beautiful piece of writing lies within us, struggling to come out.
And most important of all, we can remind ourselves that all stories are significant, and that even the not-so-good ones deserve to be told–and read.