Writing and Authenticity, Part I

Legacy’s Lady Camilla, fresh from her Crufts win in March, 2019.

You may notice, if you are a regular reader of this blog, that I have posted much less frequently in the last few months. The reason is this: I have taken some time to stop writing and really think about what writing does, what it can do, and what it should do. In other words, I have given myself a sabbatical from writing while I contemplate the job of writing, and, more personally, how–and even if–I want to continue writing at all.

I have sorted some ideas out in my head, and I’m beginning to get to a point where things are making a bit more sense than they did a few months ago. One thing that galvanized me was an experience I had with a good friend, an experienced writer who kindly volunteered to help me with a short story I was working on. He gave me some excellent advice on how to make the story better: more polished, more focused, and ultimately more ready for publication. I could tell that his advice was spot on. I knew that he was right about the changes he suggested. And yet, almost as soon as I heard his suggestions, I also knew I would not take his advice. Despite knowing that he was right about these suggested improvements, I could not bring myself to make them.

Now, every writer knows that you are supposed to “kill your darlings”: writers should never get so attached to their work that they are not willing to chop it all up in order to mix it back together, or even trash it and begin anew if necessary. I knew that my story wasn’t perfect, so my reason for not making those changes wasn’t that I thought it was good enough as it was. At the time, I didn’t know why I resisted my friend’s excellent advice. In fact, as it turned out, I had to think a good, long time before I could discover why I had had such a profound and visceral reluctance to tinker with it. And now, some three months later, I think I have found the answer.

But in order to explain it, I have to refer to a world that is far removed from writing. My husband shows dogs (you can find his website here), and over the years we have noted something interesting about the way kennel clubs and dog shows operate. The winning dogs all correspond closely to a perceived (but not fully agreed upon) standard. No surprise here: this is, of course, to be expected. The dog who looks closest to the standard, in the judge’s opinion, is the dog that takes home the trophy. Of course, the key words are “in the judge’s opinion”: there can be a wide variety of opinions, which is why different dogs might win in different shows with different judges. Yet it is the corollary to this rule which is most interesting, and most troubling for the future of all pedigree show dogs. If dogs are penalized for deviating from the norm, then that inherently means all the show winners must look more alike–must be more alike–than different. And because it is largely the show dogs that are responsible for propagating the breed, it naturally follows that genetic diversity is always shrinking, because of this desire to create a puppy that follows the breed standard to a tee. In other words, the very act of judging the dog makes it so that all people participating in showing will want their dog to look just like the “ideal” dog–the breed standard–and they will take great pains to make sure that the puppies they produce, and sell, and buy, will be more similar to this perceived ideal than different from it. (This has dire consequences for the sustainability, and even the survivability, of pedigree dogs, but that is a matter for other blogs.)

It’s human nature to want to win, whether in dog shows (where surely the dogs don’t care if they are Best in Show) or in the world of writing, which we know as publishing. Publishers–and by extension, readers–are like the dog show judges: they are looking for the best combination of words and anecdotes to hit a home run in the marketplace. They have an ideal standard, which is why all the fiction published in literary journals and The New Yorker ends up feeling the same over time. In other words, what is published will always come to look a great deal more like everything else that is published than it will look like something individual and unique.

And so, being “good,” in the sense of getting published, means that a writer may have to close off options that will divert his or her work into an unfamiliar, perhaps even an uncomfortable, form. It could mean that a writer has to compromise on whatever artistic integrity he or she has developed, getting rid of archaic words, semicolons, and–yes–even adverbs in favor of a more widely accepted style of writing. In short, it means that a writer might have to second-guess his or her own writerly instincts in order to fit into a “breed standard” that is instantly recognizable and appreciated by publishers, readers, and critics alike.

I am not saying that writers should write what they want and not worry about revision. Nor am I saying that all writing is good writing. I am just saying that with the market set up as it is today, it could be very easy to miss unique and talented writing in favor of writing that resembles what we’ve already seen. The danger in this situation is that we may, tragically, fail to recognize authentic writing, and worse still, fail to cultivate writers who strive for authenticity.

It’s time for another clarification. I remember the first time I ever heard the expression, “I’d rather be lucky than good.” I recall the momentary surprise I felt when I thought about it and then realized it actually made sense. One could be as good as possible, only to be in the wrong place at the wrong time. Luck, as Malcolm Gladwell points out in his book Outliers, is often an integral component of success. I want to offer a variation of this saying for writers–or rather, for those writers who are serious about exploring the world of the imagination, about the craft of writing (whatever that may be), about creating something that is meaningful rather than successful. Here goes:

It is better to be authentic than to be good.

I’ve come to this maxim by thinking about the novels I love, those books that I have re-read throughout a half-century of reading: Jane Eyre, David Copperfield, The Sun Also Rises, Never Let Me Go, Till We Have Faces, Mrs. Dalloway–and the list goes on. These books are not perfect. In some places, they are not even good. It is not difficult to find passages in any of them (with the possible exception of Never Let Me Go) that make readers cringe with frustration and/or embarrassment for the author. But each one of these novels is, to a certain degree, memorable and authentic, which is why I am compelled to read them again and again throughout the years. 

Certainly the term “authentic” is fraught, and I will need to define what I mean by it. I will try to do this in a timely manner; readers may look forward to a subsequent post in which I take a stab at my definition of authenticity in writing. But for now, I simply pave the way for that post by explaining why I am resisting the idea of writing good stories for the time being, even if that means rejecting the advice of a talented and well-meaning friend. And I invite my readers to weigh in on this topic, half-formed though it is at the present time, as I try to figure out just what it means for writing to be authentic.


Mere Democracy

In 1952, Oxford don C.S. Lewis, famous now for having written his seven-book series about Narnia, published a book called Mere Christianity, which remains one of his most popular works. Lewis himself was no theologian; although he had a Bachelor’s degree in philosophy from Oxford in 1922, he never pursued the study of it, focusing instead on English literature. As a scholar, he is remembered for his contributions to Renaissance and medieval literary studies, not for his forays into theology. In fact, some critics find fault with his works on Christianity; while it is true that he successfully boiled Christian theology down to its most important features, his simplification of difficult concepts may have gone too far for some heavy-hitting theologians. And yet despite these criticisms, Mere Christianity is celebrated and beloved today for being the book that brought countless non-believers to accept Christianity.

I am not interested in Mere Christianity for its Christian message, however, but rather for its ideological goal and impact. I believe that what Lewis did for Christianity–boiling it down to its major premises, its essential elements–is a brilliant tactic and could, if used correctly, help save civilization as we know it. In short, I want to urge one of my readers to write a similar book. This book, however, would be called Mere Democracy.

Why is such a book needed? The answer is obvious: in the wake of decades of corruption, party politics, winner-take-all contests, and win-at-any-cost stratagems, American democracy is ailing. Indeed, some pundits have even declared it dead. Lewis probably feared the same end for Christianity, yet, instead of giving up, he set to work and succeeded in revitalizing the Christian religion with his book.

What would Mere Democracy look like? Here’s my idea: It would be a modest book written in plain language that spelled out the basic tenets of democracy. Rather than providing a lengthy history of democracy and a comparison of different types of government, Mere Democracy would explain to the masses–to those very people who should be safeguarding democracy–what democracy looks like without the corrupting shadow of gerrymandered districts, unlimited corporate lobbying, and mindless populism. It would work to educate and inform those people who are put off by elitism, arrogance, and entitlement. In short, Mere Democracy would spell out the very least a society must do in order to remain democratic. In doing so, it would of course be incomplete and reductive; in its drive towards simplicity and clarity, it would not satisfy political scientists or sociologists; but it could, like Lewis’s book, help millions of people see their world in a new and vital way and convert them to a new understanding of the best form of government humanity has yet discovered.

Somewhere in the blogosphere today is the person who could write this book. Is it you? If so, I urge you to get started. I don’t mean to be alarmist, but the clock is ticking, and we’re running out of time.


Collecting Crumbs


Why do we write? This is a question that few of us writers consider seriously. It’s a question we can almost always evade, because most of us feel compelled to write, almost as if this strange pastime were some kind of powerful addiction, driving us to write novels, poems, plays, and–of course–blog essays like this without any real thought about why we do so. Certainly there are plenty of answers to the question “why write?” For example: “Because no one can tell your story exactly as you can”; “Because the world deserves to hear your story”; “Because you have a responsibility to engage in that great conversation we call literature.” I have myself discussed some of these answers in an earlier blog, but my favorite response to the question comes from Charlotte Bronte: “I’m just going to write because I cannot help it.”

However, the awful truth–and it is awful for us writers–is that there is no good answer to this question, because our work is completely unnecessary. There are already enough novels, poems, blogs, plays–you name it–to keep the entire world busy with reading for generations. This is a hard truth to accept, but I am convinced that it is the truth, and that all writers know it; they just refuse to accept it most of the time. The world doesn’t need our writing, because there are plenty of people engaged in the same task we are, making our work superfluous and generally unwanted.

If anyone doubts this, consider how large a part marketing and publicity play in every book that we read. Things seemed different a decade ago, when self-publishing through Amazon became possible for writers. In that moment, it seemed like the locked gates of publishing were ready to be stormed and broken. However, although the iron bars may have been shaken a bit, the hinges were not broken, and the gates remain closed to those who cannot muster up the money, the resolve, or the chutzpah to play the marketing game. This means that most of us will continue to write in obscurity, never making it onto any best-selling list–indeed, never making it onto any list at all.

It’s been hard to school myself to accept this situation. The wisest thing to do would be to stop writing, but like all addictions, the writing addiction is a hard one to break. I have indeed taken a sabbatical from writing, that dangerous pastime that sucks up too much time and gives much too little in return. I hate the fact that I find it so hard to write in an echo chamber, but after all, everyone wants recognition; everyone wants, once in a while, to be noticed.

For example, in a pathetic letter to a teacher with whom she had fallen in love, Charlotte Bronte wrote, “Monsieur, the poor do not need a great deal to live on — they ask only the crumbs of bread which fall from the rich man’s table — but if they are refused these crumbs — they die of hunger…” Yesterday, a good friend and neighbor remarked in passing that he really enjoyed my last novel. Startled, I did not thank him enough, and I’m sure he had no idea how much those words meant to me (though he might if he reads this). Yet through his simple words, I received a crumb of bread so big and so unexpected that I am still happily digesting it today, and will be, I’m sure, for weeks to come. Indeed, it was a large enough crumb to compel me to write this blog, to make me think of completing another writing project, and maybe–though I know it to be yet another futile task–to undertake new ones.

So let me end this blog by saying that if you know an indie writer and have enjoyed reading his or her work, take a moment and tell him or her so. It costs you next to nothing, and it may mean more to him or her than you’ll ever know. Scatter those crumbs, readers! Scrape them off of your table, take them into your hands, and toss them out as far as you can into the wind! By doing so, you may well keep a person from starving.

And Marc, if you’re reading this, thank you.

How I Became a Writer, Part 3

Image from https://www.librarything.com/topic/132994

And now, as promised, the last installment on how I became a writer.

By the time I was in high school I knew I wanted to be a writer. I also knew that I needed to read as much as I could, and, with an older brother in college who evicted me from my bedroom each summer when he came home and left his previous semester’s English syllabi lying around, it was not difficult for me to devise a reading plan to fill out my knowledge of literature. For example, I declared tenth grade the year of the Russian novel; during that year, I read War and Peace, Anna Karenina, The Brothers Karamazov, and The Idiot. It was an ambitious undertaking, and I neglected my math and science classes to achieve it.

But I worked hard at the task I set myself. For example, one day in Band class (I was an underachieving clarinet player), the instructor was going through a piece with the flute section. Earlier that month, I had found a fantastic copy of The Brothers Karamazov–hardbound, with two columns of print on each page–and I found that it fit perfectly on my music stand.

Usually I would put my sheet music on top of the book to camouflage my reading, but I had reached a really riveting section (a chapter called “Lacerations”) that morning and I was oblivious to pretty much everything around me. I didn’t realize that Mr. Wren had crept up behind me and was, along with everyone else in the band, watching me read. When I finally realized the entire room was silent, with no flutes playing dissonant notes and no baton clicking out a rhythm on the conductor’s stand, I looked up to see what was going on, and met Mr. Wren’s small blue eyes peering at me. I expected to be duly chastised, but all he said was, “Lacerations? Do I need to send you to the counselor?” Mortified, I shook my head and shoved my book beneath my seat.

This is merely a long-winded way of demonstrating that I was a dedicated reader at a fairly young age. I tried to create a system, a reading method, but when I reached college, I realized how very inadequate my system was. My subsequent years in graduate school were probably an attempt to fill in the gaps of my literary knowledge. That attempt also ended in relative failure. I got a master’s degree and filled in a few of the many gaps left by my undergraduate education, then continued on to the Ph.D. level and filled in a few more. I was still very imperfectly educated in terms of English literature by the time I received my Ph.D., but thankfully education has no definitive endpoint. And if one becomes a generalist, as one must as a community college professor, then one can continue to add to one’s knowledge year after year after year. Even now, some years after retiring, I am still working hard to fill in those gaps.

But of course all this reading derailed me from becoming the writer I had originally planned to be. In other words, the preparatory work I set myself that was designed to make me a good writer eclipsed the desire to write for a great many years. There was, after all, so very much to learn and to read! I decided that if I had to choose between writing and reading, I would opt for reading, because I wanted to know what was out there. I guess you could say that my quest to perfect my knowledge of English literature (certainly an impossible task) has never been anything more than mere nosiness.

I would still pick reading over writing any day. In fact, most days I do. There is still so much to read, so many gaps to fill. For me, reading comes first, and it always will. I write to show that I am reading, that I am paying attention to what is out there. In the end, I write not because I love story-telling, but rather because I love the stories we’ve told throughout the ages so much that I cannot keep myself from adding to the ever-growing collection of them that makes up human culture.

How I Became a Writer, Part 2

As I recall, my first real attempt at critical writing involved a book review of Jack London’s The Sea Wolf, written in third or fourth grade–I’m not sure which. Why did I pick an obscure novel by a largely forgotten American writer? I believe it was because I had read White Fang (or was it The Call of the Wild? or perhaps both?) earlier that year and felt that writing a book review of a book I’d already read would be cheating, so I found another book by Jack London. Perhaps this was my first foray into literary studies. I didn’t really get much from The Sea Wolf, unfortunately. My book review basically argued that London’s use of curse words within the narrative was a distinctive feature of his writing. I have no idea whether or not this is true, never having gone back to read The Call of the Wild, White Fang, or The Sea Wolf again.

Some more recent critical writing

This was, if I recall, the year I had requested to be allowed to bring in the Bible for independent reading. My request was denied, which was a good thing. Mrs. Cirillo (or was it Mrs. Moss?) was right to curtail my outrageous desire to be a precocious reader. No one who spells the word “universe” with almost every letter of the alphabet, as I did back then, has earned the right to be a waywardly precocious reader. As for writing, we were allowed to make a book that spring, and I chose to write an elegy about my parakeet Dinky, who had dropped dead on Christmas morning. (This, coupled with the fact that during the Easter pageant at my church that year I was chosen to play the donkey that Jesus rode into Jerusalem on, may account for the fact that I later converted to Judaism.) The little book itself, which I can no longer locate, is nothing special, but it may be telling that the “Note About the Author” (written in a pretentious third person) at the end of the book refers to its author’s ardent desire to become a writer.

After this, there was a long spell of forgettable short stories, poems, and other forced writing assignments. But then, in my senior year of high school, I was nominated by my long-suffering English teachers to compete in a nationwide writing contest held by the National Council of Teachers of English (NCTE). The contest had two components: first, a prepared story (mine was some atrocious story about Brian Boru, King of Ireland–even the passage of fifty years cannot erase my shame at having concocted it) sent in ahead of time, and second, an on-demand essay. I remember being pulled out of algebra class on an April morning in 1976, ushered into an empty classroom, given pencil and paper, and losing myself in an essay in which I mused on my relation to my birthplace–Brooklyn, New York–a place I had moved away from some nine years earlier. I also remember that I left the classroom feeling somewhat pleased that I had mentioned my grandmother, who still lived in Brooklyn, noting that she was in fact the last thread that drew me back to my birthplace summer after summer. Somehow, I was surprised but not shocked when, an hour or so after I got home from school that day, my father called to tell me that my grandmother had died that morning. It didn’t take me long to figure out that she had probably died while I was actually writing my essay. (Two days later, I received an Easter card in the mail from Grandma. She always had great timing.)

I won the contest despite my dreadful story about Brian Boru, and was chosen as one of 26 students from Texas to win the NCTE Writing Award in 1977. It wasn’t such a big deal. While it may have helped me get into college, I have to admit that I completely forgot that I’d won such an award until a few years ago, when I was cleaning out some old papers. It came as something of a shock to me to realize that I had been involved with the NCTE years and years before I myself became a teacher of English and a member of that organization.

I have not gone into detail about the short stories I wrote in my high school years, because they are too pedestrian to stand out. Everyone writes those kinds of stories. I was, however, quite a letter writer in those days, stealing funny bits from P.G. Wodehouse and other comic writers and inserting them into my letters to my parents and siblings. Below is a letter I found while visiting my mother last year. (Obviously, she recognized my genius–or she decided to save it as evidence that time travel really happens.)

[Image: the letter in question]

Once again, I’ve made less progress on this project than I had anticipated. That leaves one more post (I promise–just one more!) to bridge the gap between my young adult and middle age years, and how I postponed my writing career (such as it is) by making a study of literature and becoming a professor of English.

How I Became a Writer, Part 1

I cannot remember a time in my life when I didn’t want to be a writer. Perhaps I didn’t care about writing back when I was too young to understand what being a writer meant–before I’d really learned to read, in those days when, as a young child, I read only the books that were placed in my willing hands, those rhyming, oddly illustrated children’s books that were so common in the 1960s. It’s quite possible that back then I didn’t have a hankering to join what I have come to consider the Great Conversation, that I was content to look and pass, not feeling compelled to offer something–some small tidbit at least–to the exchange of stories and ideas that has gone on for centuries now.

My first memory of reading was from the Little Bear books, which my father, an accountant, got by the cartload, since he worked for the publisher (I think?). I am confused about this, however. It’s just as likely that we had a surplus of these books lying around our house. I was the youngest of three children, after all, so it makes sense that children’s books would pile up, and that they would be handed off to me. I don’t remember actually learning to read, but I do remember the laughter that ensued when I tried to sound out “Chicago,” as well as having to struggle with the word “maybe,” which I pronounced incorrectly, with the accent on the “be” and not the “may.”

By Source, Fair use, https://en.wikipedia.org/w/index.php?curid=38403132

But these books certainly didn’t enchant me. That would have to wait for some years. In the meantime, I remember seeing a copy of Julius Caesar on our dining room table, its cover illustration of a lurid, bloody toga attracting more than a mere glance, and although I didn’t try to read Shakespeare’s misnamed tragedy, it couldn’t have been mere coincidence that I became enamored of the story of Caesar and Cleopatra, to such an extent that I would wrap myself in striped beach towels and stomp through our Brooklyn duplex declaring, in all seriousness, “I wish to be buried with Mark Antony.” My elaborately crafted Cleopatra-fantasy imploded, however, when I convinced my second-grade class to put on a short play about Cleopatra, Julius Caesar, and Mark Antony. (Is it possible that I wrote the play myself? That seems unlikely, but I cannot imagine that many age-suitable plays on that subject were available.) I was over the moon–until I got the news that I was to play Julius Caesar. And that was the end of that fantasy, much to the relief of my family.

The books that did grab my attention were a set of great books that my grandmother had bought for her two children back in the 1930s: all of Dickens, all of Twain, and some odds and ends, such as William M. Thackeray’s Vanity Fair, as well as a full set of encyclopedias. (I still have a few of the Dickens works, but most of the books were destroyed in a flooded warehouse back in the 1990s.) I am sure that my grandmother’s purchase was an investment in wishful thinking: I would swear an oath that neither my uncle nor my father ever read a word of these books. I am equally sure that I, feeling sorry for the books (which is something I still do–and explains why I sometimes check out books from the library that I have no interest in but will read because I think someone should pay them some attention), picked a few of them off the dusty shelf one summer and began to read them. I remember reading, and delighting in, Mark Twain’s Innocents Abroad long before I ever read Tom Sawyer or Huckleberry Finn.

As the youngest child in the family, I was frequently left to my own devices, and that was fine with me. But I think I might have been a little lonely, a little too strange for children my own age, and this was something that my parents wouldn’t have noticed, not back in the 1960s and ’70s, when there was less attention paid to the lives of children. So it’s natural that the books became my friends. When I visited my father after my parents got divorced (there was no joint custody back then, which was delightful for me, as it meant that I was able to stay with my father in NYC for a huge swath of the summer vacation), I started reading through the set of Dickens. In doing so, I found a whole new set of friends and family. Even today, when I open a Dickens novel–any Dickens novel–I feel like I am at a family reunion full of quirky, oddball relatives. It is a wonderful feeling.

This oddly rambling blog post is doing a fine job of explaining how I became a reader, but it is completely missing what I set out to do: explain how I became a writer. That, I can see now, will have to wait for another post.


On the Relationship of Myth and Story

Image from lotr.wiki.com

Please note: This is a very long post. It is based on a talk I gave yesterday (October 28, 2017) at the C.S. Lewis Festival in Petoskey, Michigan. Consider yourself warned!


The study of myth seems to me to take three different paths:

  • Anthropological / Archeological: the study of classical mythologies (Bulfinch’s Mythology, Edith Hamilton)
  • Religious / Transcendent: the spiritual meaning of myth (Karen Armstrong, Joseph Campbell, Sigmund Freud)
  • Structuralist: the study of recurring structures in myths (Northrop Frye, Joseph Campbell, Roland Barthes)

This is all interesting, but I would like to back up a moment. I feel like I’ve arrived at a dinner party, and that somehow I missed the first two courses. I feel as if I might get some kind of mental indigestion if I don’t start over at the very beginning.

The fact is, I want to know something more fundamental about myth and its function.

  • I want to know what it is and how it works.
  • Specifically, I want to know what distinguishes myth from other forms of story-telling.

Because for me, story-telling is what distinguishes human beings, homo sapiens, from all other species on this planet, as far as we know.

  • Studies have shown that crows have memories
  • Studies have shown that chimpanzees use tools
  • Philosophers are now beginning to agree that animals do indeed have consciousness

But we—we should be known not as homo sapiens (wise man, the man who knows), but as homo narrans—the speaking man, the man who tells, who narrates—story-telling man.  Because it is clear to me that we humans communicate largely through story-telling, and this story-telling function, this tendency to rely on narration, is what makes us human.

I’m going to ask you to bear with me for a little while as I tease this out. I’d like to say that by the end of this essay, I’ll have some answers to the questions I posed (what is myth, and how does it work, and what is the difference between a really good story and a myth)—but I’m pretty sure I won’t. I may, however, ask some more questions that might eventually lead me to some answers.

So here goes. To begin with, a few people who weigh in on what myth is and what it does:

Roland Barthes, the French post-structuralist literary theorist, says that myth is a type of speech, a system of communication, a kind of message. In a way, Barthes and J.R.R. Tolkien are not really different on this point, incredible as it is to think of Barthes and Tolkien agreeing on anything at all, much less something so important to each of them.

  • They are both incredibly passionate about, and devoted to, the concept of language
  • Barthes, in his book Mythologies, which I have shamelessly cherry-picked for this essay, says that the myth’s objective in being told is not really important; it is the way in which it conveys its message that is important.
  • He says that “the knowledge contained in a mythical concept is confused, made of yielding, shapeless associations” (119).
    • But this isn’t as bad as it sounds, because myths actually don’t need to be deciphered or interpreted.
    • While they may work with “Poor, incomplete images” (127), they actually do their work incredibly efficiently. Myth, he says, gives to its story “a natural and eternal justification…a clarity which is not that of an explanation but that of a statement of fact” (143).
    • Myth is a story in its simple, pure form. “It acts economically: it abolishes the complexity of human acts, it gives them the simplicity of essences…” (143).
  • You can see how this view of myth kind of works with the myth-building that Tolkien does in The Lord of the Rings, which works with simple efficiency, whose very images are incomplete to the point of needing clarification in Appendices and further books like The Silmarillion. Yet even without having read these appendices and other books, we grasp what Tolkien is getting at. We know what Middle-Earth is like, because the myth that Tolkien presents needs no deciphering, no real interpretation for us to grasp its significance.

Tolkien, I think we can all agree, was successful in creating a myth specifically for England, as Jane Chance and many other scholars have now shown to be his intention. But is it a novel? Some might argue it isn’t—myself included. In fact, what Tolkien created in The Lord of the Rings is less a myth (I would argue that we only use that term because Tolkien himself used it to describe his work and his object—think of the poem “Mythopoeia,” which he dedicated to C.S. Lewis) than it is a full-blown epic.

For my definition of epic versus novel, I’m turning to my personal literary hero, Mikhail Bakhtin, a great thinker, a marvelous student of literature, a man who wrote with virtually no audience at all for many years because he was sent into internal exile in the Soviet Union. In his essay “Epic and Novel,” Bakhtin attributes these characteristics to epic:

  1. It deals with an absolute past, where there is little resemblance to the present;
  2. It is invested with national tradition, not personal experience, arousing something like piety;
  3. There is an absolute, unbridgeable distance between the created world of epic and the real world.

The novel, says Bakhtin, is quite the opposite. It is new, changing, and it constantly “comes into contact with the spontaneity of the inconclusive present; this is what keeps the genre from congealing. The novelist is drawn toward everything that is not yet completed” (27).

I think the three characteristics of epic described by Bakhtin do in fact match up nicely with The Lord of the Rings: absolute past, national tradition, distance between the actual and the created world. But here’s another thing about epic as described by Bakhtin: “The epic world knows only a single and unified world view, obligatory and indubitably true for heroes as well as for authors and audiences” (35). It would be hard, indeed impossible, to imagine The Lord of the Rings told from a different point of view. We need that distant narrator, who becomes more distant as the book goes on. As an example, imagine The Lord of the Rings told from Saruman’s point of view, or from Gollum’s. Or even from Bilbo’s or Frodo’s point of view. Impossible! Of course, we share some of the point of view of various characters at various points in the narrative (I’m thinking specifically of Sam’s point of view during the Cirith Ungol episode), but it couldn’t be sustained for the whole of the trilogy.

The interesting thing here is that in The Lord of the Rings, Tolkien took the novel form and invested it with epic. And I think we can say that against all odds, he was successful. On the other hand, C.S. Lewis, in his last novel, Till We Have Faces, took a myth (the story of Cupid and Psyche), which is certainly more closely related to epic than it is to novel, and turned it into a successful novel. This isn’t the time and place to talk about Till We Have Faces, although I hope someday that we can come together in the C.S. Lewis Festival to do that very thing, but I couldn’t help mentioning this, because it’s striking that Lewis and Tolkien, while they clearly fed off each other intellectually and creatively, started from opposite ends in writing their greatest creative works, as they did in so many other things. It’s almost amazing that you can love both of them at the same time, but of course you can. It’s the easiest thing in the world to do.

But I’m losing the thread of my questions here. What is myth? Can we actually have modern myths? Can someone actually set out with the intention of creating a myth? And can a mythic work spontaneously just happen? Another question needs to be posed here: if this long book, which is probably classified in every bookstore and library as a novel, touches on myth but is really an epic, can a novel, as we know it, become a myth? This forces us to tighten up our definition of what a myth is and asks us to think about what myth does.

Karen Armstrong, I think, would say yes to all three of these questions. In her book A Short History of Myth, Armstrong follows the trajectory of myths through time and argues that the advent of printing and widespread literacy changed how we perceive and how we receive myth. These developments changed myth’s object and its function—and ultimately, they changed the very essence of myth.

Armstrong points out that myths and novels have similarities:

  • They are both meditative
  • They can both be transformative
  • They both take a person into another world for a significant period of time
  • They both suspend our disbelief
  • They break the barriers of time and space
  • They both teach compassion

Inspired by Armstrong and by Bakhtin, I’m going to go out on a limb here and make a stab at answering my questions. And I’ll start by defining a modern myth as a super-story of sorts: a novel (or a film, because let’s open this up to different kinds of story-telling) that exerts its power on a significant number of people. These stories then provide, in film professor and writer Stuart Voytilla’s words, “the guiding images of our lives.”

In short, a modern myth has these characteristics:

  1. It belongs to a certain place and time. Like epic, it is rooted in a time and a place. It might not be far removed from the actual, but it cannot be reached from the actual.
  2. It unites a group of readers, often a generation of readers, by presenting an important image that they recognize.
  3. It unites a group of readers by fostering a similar reaction among them.
  4. It contains identifiable elements that are meaningful to its readers/viewers. Among these might be important messages (“the little guy can win after all,” “there’s no place like home,” “the American Dream has become a nightmare”).

In other words, a mythic story can be made intentionally, as Star Wars was by George Lucas after he considered the work of Joseph Campbell; or it can happen accidentally. Surely every writer dreams of writing a mythic novel—the Great American novel—but it’s more or less an accident. The Adventures of Huckleberry Finn was a mythic novel of America, until it was displaced by To Kill a Mockingbird. And I would note here that having your novel go mythic (as we might term it—it is, in a way, like “going viral,” except mythic stories tend to last longer than viral ones) is not really such a good thing after all. Look at Harper Lee—one mythic novel, and that was the end of her artistic output—as far as we know. A mythic novel might just be the last thing a great writer ever writes.

Anyway, back to our subject: a modern myth gets adopted rather than created. Great myths are not made; they become. So let’s think of a few mythic novels and films and see how they line up with my four characteristics:

  1. Frankenstein
  2. Star Wars
  3. The Wizard of Oz
  4. The Great Gatsby or Death of a Salesman—take your pick.
  5. Casablanca
  6. The Case of Local Myths—family or friend myths, references you might make to certain films or novels that only a small number of people might understand. A case in point would be the re-enactments of The Rocky Horror Picture Show that take place each year around Halloween.

In essence, my answer, such as it is, to the questions I posed earlier comes down to this:

Modern myths are important stories that unite their readers or viewers with similar emotional and intellectual reactions. Modern mythology works by presenting recognizable and significant images that unite the people who read or view them. As for what distinguishes modern myths from other forms of story-telling, what tips a “normal” novel or film over into the realm of “mythic”—I don’t have an answer for this. I only have a couple of vague, unformed theories. One of my theories is this: Could one difference between myth and the novel (“mere” story-telling as such) be that myth allows the reader/listener to stay inside the story, while the novel pushes the reader back out, to return to the actual world, however reluctantly?

And let’s not forget what Karen Armstrong wrote about myth: “It has been writers and artists, rather than religious leaders, who have stepped into the vacuum [created by the loss of religious certainty and despair created by modernism] and attempted to reacquaint us with the mythological wisdom of the past” (138). Armstrong’s closing sentence is perhaps the most important one in the book: “If professional religious leaders cannot instruct us in mythical lore, our artists and creative writers can perhaps step into this priestly role and bring fresh insight to our lost and damaged world” (149). With this in mind, perhaps it’s time to go and read some more, and find more myths that can help us repair and restore ourselves, our faith in our culture, and in doing so, the world itself.


Three Things I’ve Learned from Kazuo Ishiguro

Image from the New York Times (October 5, 2017)


I had actually planned this post a couple of days before my favorite living writer, Kazuo Ishiguro, won the Nobel Prize in Literature (announced on October 5th). So, along with the satisfaction and sense of vindication I felt when I woke up last Thursday morning and discovered that he’d been awarded the Prize, I also felt a sense of chagrin at being late in making this post. After all, I could have gone on record about Ishiguro’s talent days before the Nobel committee made its announcement. Still, better late than never, so I will offer my belated post now, and explain the three most important things I’ve learned from Ishiguro over the years.

The most important thing I’ve learned from Kazuo Ishiguro is this: great writing often goes unnoticed by readers. (This point, of course, is now somewhat diluted by the fact that Ishiguro has indeed won acclaim for his work, but I think it deserves to be made all the same.) I remember reading Never Let Me Go about eight years ago and being gob-smacked by its subtle narrative brilliance and its emotional resonance. And yet I’ve met many readers of the book who, while affected by the narrative, seemed unimpressed by Ishiguro’s writerly achievement. It’s almost embarrassing that my reaction to the novel was so different than other people’s. Could I have gotten it wrong, somehow? Was it possible that Never Let Me Go really wasn’t the masterpiece I thought it was? While I considered this, I never once really believed I had made a mistake in my estimation: it is a tremendous book. The fact that few other people see it as such does not change my view of it. It simply means that I see something in it that other people don’t. Hence my first object lesson from reading Ishiguro: genius isn’t always obvious to the mass of readers out there. Perhaps it just isn’t that noticeable with so many other distracting claims for our attention.

The second thing I’ve learned from Ishiguro also stems from Never Let Me Go: genre doesn’t matter. When you really think about it, categorizing a work based on its plot is a silly thing to do, and yet we are firmly locked into that prison of categorization, since almost all bookstores and libraries, as well as readers, demand that every work fit into a narrow slot. I commend Ishiguro for defying the convention of genre, incorporating elements from both science fiction and fantasy into realist narratives. In my view, the sooner we break the shackles of genre, the better. Good, responsible readers should never restrict themselves to a certain genre any more than good, imaginative writers should. A certain amount of artistic anarchy is always a good thing, releasing creative juices and livening things up.

And finally, the third thing I’ve learned is this: a good writer does not hit the bull’s eye every time he or she writes. The Remains of the Day and Never Let Me Go are truly wonderful books. An Artist of the Floating World is promising, but not nearly as good as Ishiguro’s later works. The Buried Giant, I’d argue, is a failure–but it is a magnificent failure, one whose flaws emanate from the very nature of the narrative itself, and thus it transcends its own inability to tell a coherent story. I’ve learned from this that a writer should never be afraid to fail, because failing in one way might be succeeding in another, less obvious, way. This is as good a place as any other to admit that I have never been able to get through The Unconsoled. And as for When We Were Orphans–well, the less said about that disaster of a book, perhaps the better. I can’t imagine what Ishiguro was thinking there–but I will certainly defend his right to fail. And I am thankful that even a writer with such talent as Ishiguro does, from time to time, fail–and fail big. It certainly gives the rest of us hope that while we fail, we can still aspire to success.

I will close by saying that I am grateful to Kazuo Ishiguro for the wonderful books he’s written. If you haven’t read any of them, you should–and not just because some panel gave him an award. But I am just as grateful to him for the three important lessons he has taught me about the nature of writing.


On Self-Publishing and Why I Do It


Let me get one thing straight right from the get-go: I know self-publishing is not the same thing as publishing one’s work through a legitimate, acknowledged publishing company. I also know that self-publishing is looked down upon by the established writing community and by most readers. In fact, for the most part I agree with this estimation. After all, I spent much of last year writing freelance book reviews for Kirkus Reviews, so I know what’s being published by indie authors: some of it is ok, but much more of it is not very good at all.

Knowing this, then, why would I settle for publishing my novels on Amazon and CreateSpace? This is a tricky question, and I have thought about it a great deal. Whenever anyone introduces me as an author, I am quick to point out that I am, in fact, just a self-published author, which is very different from being a commercially published writer. (And if at any time I am liable to forget this important fact, there are enough bookstores in my area that will remind me of it, stating that they don’t carry self-published books.) When I meet other writers who are looking for agents, I do my best to encourage them, largely by sharing with them the only strategy I know: Be patient, and persist in sending your queries out.

So why, since I know all this, do I resort to self-publishing my work? I’ve boiled it down to four main reasons.

First of all, I self-publish because I am not invested in becoming a commercially successful writer. I write what I want, when I want, and when I decide my work is complete, I submit it to an electronic platform that makes it into a book, which I can then share with family and friends and anyone else who cares to read it. In other words, for me writing is not a means by which to create a career, celebrity, or extra income. I have long ago given up the fantasy of being interviewed by Terry Gross on Fresh Air; my fantasies are more mundane these days.

Second, I do not need to be a commercial writer, with a ready-made marketing machine to sell my books, because I am not hoping to make any money from them. Rather, I look upon writing as a hobby, just as I look upon my interest in Dickens, Hardy, and the Brontes as a hobby. I am helped here by having spent many years engaged in academic research, a world in which publications may win their authors momentary notice, but certainly not any money, unless one happens to sell out to the lure of literary celebrity, as Stephen Greenblatt has. I have a few publications out in the academic world, but no celebrity and certainly no money to show for them–and I am totally fine with that. In my creative writing, I am lucky enough to have a hobby that satisfies me and costs me relatively little–far less, in fact, than joining a golf or tennis club would cost.

The third reason that I self-publish my work is that I actually enjoy doing so. There are some aspects of publication that have surprised me. For example, I have found that I really enjoy working with a great graphic designer (thanks, Laura!) to develop the cover of my novels. It is an extension of the creative process that is closely related to my work but something that I could never do myself, and this makes me all the more grateful and fascinated as I watch the cover come to life and do its own crucial part to draw readers into the world I have created.

As a retired writing professor, I realize how important revision and proofreading are, and to be honest, this is the only part of self-publishing that gives me pause, because I worry about niggling little errors that evade my editorial eye. But for the most part, I am old enough now to have confidence in my writing. Plus, the beauty of self-publishing is that it is electronic: if there are errors (and there are always errors, even in mainstream published books), I can fix them as soon as a kind reader points them out. So I suppose the fourth reason to self-publish lies in the fact that it is so very easy to do it these days.

These are four good reasons for me to self-publish, but the most important reason is that I apparently love to write, and self-publishing allows me to do this without worrying about submitting the same piece of work over and over again to agents and publishers, stalling out my creativity. While at the Bronte Parsonage Museum this past summer, I picked up a card that expresses how I feel about it, a quote from Charlotte Brontë: “I’m just going to write because I cannot help it.” (It is a testament to my literary nerdiness that I happen to know that this quotation comes from Brontë’s Roe Head Journal, but strangely enough, before I encountered it on a greeting card I never realized that it applied to myself as well as to Brontë.) In my idiosyncratic view, self-publishing allows the reader to decide whether a novel is worth reading, rather than punting that responsibility over to an overworked and market-fixated literary agent or editorial assistant. I am willing to trust that reader’s judgment, even if it means I will never sell many books.

And so today, as I am releasing my second self-published novel (Betony Lodge, available on Amazon and CreateSpace–and this is my last attempt at marketing it here on my blog), I am fully aware of the stigma of self-publishing, but I realize that what’s right for other writers may not be right for me. Today, then, I am taking my courage into my own hands and pushing that key to make the book go live.

And tonight I will be making my own champagne toast: here’s to living in the 21st century, when digital publishing makes authors of us all!

My Short, Tragic Career as an Independent Scholar


Several months ago, I had what seemed like a fantastic idea: now that I was retired from teaching English at a community college, I could engage in critical research, something I’d missed during those years when I taught five or more classes a semester. I had managed to write a couple of critical articles in the last few years of my tenure at a small, rural two-year college in Northern Michigan, but it was difficult, not only because of the heavy demands of teaching, but also because I had very limited access to scholarly resources. Indeed, it is largely due to very generous former students who had moved on to major research institutions that I was able to engage in any kind of scholarly research, a situation which may seem ironic to some readers, but which is really just closing the loop of teacher and student in a fitting and natural way.

And so last fall, at the suggestion of a former student, I decided to throw my hat in the ring and apply to a scholarly conference on Dickens, and my proposal was chosen. In time, I wrote my paper (on Dickens and music–specifically, on two downtrodden characters who play the flute and clarinet in David Copperfield and Little Dorrit, respectively) and prepared for my part in the conference.

It had been close to 25 years since I had read a paper at a conference, and so I was understandably nervous. Back then, there was no internet to search for information about conference presentations, but now I was able to do my homework, and thus I found a piece of advice that made a lot of sense: remember, the article emphasized, that a conference paper is an opportunity to test out ideas, to play with them in the presence of others, and to learn how other scholars respond to them, rather than a place to read a paper, an article, or a section of a book out loud before a bored audience. Having taught public speaking for over a decade, I could see that this made a lot of sense: scholarly articles and papers are not adapted to oral presentations, since they are composed of complex ideas buttressed by a great many references to support their assertions. To read such a work to an audience seemed to me, once I reflected on it, a ridiculous proposition, and would surely bore not only the audience, but any self-respecting speaker as well.

I wrote my paper accordingly. I kept it under the fifteen-minute limit that the moderator practically begged the panelists to adhere to in a pre-conference email. I made sure I had amusing anecdotes and witty bon mots. I concocted a clever PowerPoint presentation to go with the paper, just in case my audience got bored with the ideas I was trying out. I triple-spaced my copy of the essay, and I–the queen of eye contact, as my former speech students can attest–practiced it just enough to become familiar with my own words, but not so much that I would become complacent with them and confuse myself by ad-libbing too freely. In short, I arrived at the conference with a bit of nervousness, but with the feeling that I had prepared myself for the ordeal, and that my paper would meet with amused interest and perhaps even some admiration.

It was not exactly a disaster, but it was certainly not a success.

To be honest, I consider it a failure.

It wasn’t that the paper was bad. In fact, I was satisfied with the way I presented it. But my audience didn’t know what to do with my presentation. This might be because it was very short compared to all the other presentations (silly me, to think that academics would actually follow explicit directions!). Or it could be because it wasn’t quite as scholarly as the other papers. After all, my presentation hadn’t been published in a journal; it was, as C.S. Lewis might have called it, much more of a “supposal” than a fully-fledged argument. Perhaps as well there was something ironic in my stance, as if I somehow communicated my feeling that research in the humanities is a kind of glorified rabbit hunt that is fun while it lasts but that rarely leads to any tangible, life-changing moment of revelation.

Yet this is not to say that humanities research is useless. It isn’t. It develops and hones all sorts of wonderful talents that enrich the lives of those who engage in it and those who merely dip into it from time to time. I believe in the value of interpreting books and arguing about those interpretations; in fact, I believe that engaging in such discussions can draw human beings together as nothing else can, even at the very moments when we argue most fiercely about competing and contrasting interpretations. This is something, as Mark Slouka points out in his magnificent essay “Dehumanized,” that STEM fields cannot do, no matter how much administrators and government officials laud them, pandering to them with ever-increasing budgets at the expense of the humanities.

And this is, ultimately, why I left the conference depressed and disappointed. I had created, in the years since I’d left academia, an idealized image of it that was inclusive, one that recognized its own innate absurdity. In other words, sometime in the last two decades, I had recognized that research in the humanities was valuable not because it produced any particular thing, but because it produced a way of looking at the world we inhabit with a critical acuity that makes us better thinkers and ultimately better citizens. The world of research, for me, is simply a playground in which we all can exercise our critical and creative faculties. Yet the conference I attended seemed to be focused on research as object: indeed, as an object of exchange, a widget to be documented, tallied, and added to a spreadsheet that measures worth.

Perhaps it’s unfair of me to characterize it in this way. After all, most of the people attending the conference were, unlike me, still very much a part of an academic marketplace, one in which important matters like tenure, admission to graduate programs, promotions, and departmental budgets are decided, at least in part, by things like conference attendance and presentations. It is unfair of me to judge them when I am no longer engaged in that particular game.

But the very fact that I am not in the game allows me to see it with some degree of clarity, and what I see is depressing. One cannot fight the dehumanization of academia, with its insistent mirroring of capitalism, by replicating that capitalism inside the ivory tower; one cannot expect the humanities to maintain any kind of serious effect on our culture when those charged with propagating the study of humanities are complicit in reducing humanities research to mere line items on a curriculum vitae or research-laden objects of exchange.

I can theorize no solution to this problem beyond fomenting a revolution of ideas within the academy in an effort to fight the now ubiquitous goal of bankrupting the study of arts and humanities, a sordid goal which now seems to characterize the age we live in. And I have no idea how to bring about such a revolution. But I do know this: I will return to my own study with the knowledge that even my small, inconsequential, and isolated critical inquiries are minute revolutions in and of themselves. As we say in English studies, it’s the journey that’s important, not the destination. And in the end, I feel confident that it will take far more than one awkward presentation at a conference to stop me from pursuing my own idiosyncratic path of research and inquiry into the literature I love.