Inappropriate Songs from the Past (brought to you by SiriusXM Radio)

Benny Goodman and band in Stage Door Canteen. By Film screenshot – Stage Door Canteen film, Public Domain, https://commons.wikimedia.org/w/index.php?curid=1194330

I am a fairly devoted listener of '40s Junction, a channel on Sirius Radio that plays songs from the 1940s–except when it doesn't. (To my lasting fury and frustration, I discovered this year that the station ceases its normal operations on November 1 and, for the next six weeks, plays "holiday music." Now, I don't know if the program managers decided that the people who listen to 1940s music are the same ones who enjoy endless Christmas carols, but if anyone from Sirius is reading this, here's a hint: they aren't.) One thing I've found out is that if you listen to '40s Junction long enough, you can discover some real gems. I mean, we all know "Blues in the Night," "I'll Be Seeing You," "Don't Fence Me In," and "String of Pearls," but occasionally this station plays songs I've never heard of, even though I've lived in a certifiable time warp my entire life. And so today I'm taking a break from politics and pessimism to discuss four of these little oddities from the past, complete with YouTube links, so that you can listen to them and judge for yourself. Above all, I'm curious about my readers' reactions to these songs, so please do leave a comment on some or all of them.

I will start out with a song that has a catchy rhythm and some very interesting lyrics: "The Lady from 29 Palms." It seems to have been recorded in 1946 or so, and it contains a striking reference to the explosive attraction of the lady in question, who is compared to "a load of atom bombs"–a comparison that, coming so soon after the bombings of Hiroshima and Nagasaki, is highly insensitive, to say the least. Yet the song opens with a great sax line, and with its really clever rhymes, I'd say it's worth listening to.

Next on my list is a very odd little song that shocked me when I first heard it. Unlike "The Lady from 29 Palms," "Who's Yehoodi?" is famous enough to have its own Wikipedia entry, which certainly bears checking out. Hop over there, and you can read about the song's origins on Bob Hope's radio program, when announcer Jerry Colonna got tickled as he introduced one of Hope's guests, the young violin prodigy Yehudi Menuhin. Colonna made fun of the name, continued the joke through later programs, and eventually it became a popular meme, although memes hadn't been invented yet. In 1940, songwriters Bill Seckler and Matt Dennis made a song out of it. The U.S. Navy also got some mileage out of the situation, naming one of its research programs "Project Yehudi." I've linked to the Cab Calloway version, but there are several others, including an astoundingly antisemitic one (I know–the whole thing's kind of antisemitic, but this version really takes the cake). As with "The Lady from 29 Palms," the song itself is catchy, with the kind of finger-snapping rhythm that makes tunes from this era so appealing. In addition, the song's many references to secretive, spooky people who are there, but not there, remind me of those Doctor Who beings, the Silence, who watch and influence human events but are forgotten the moment you look away from them. Added to these odd but intriguing lyrics, there's some enjoyable big-band music, with the necessary saxophone solos and brass rhythms creating a memorable song. How it disappeared is a mystery–unless, of course, the Silence got involved in the whole thing and wiped it from our collective memories.

The next two songs are about body-shaming. In a way, I'm surprised they find their way onto the air at all in the present day, given the changing climate and the hyper-awareness about body image we've seen in recent years. The first, "Lean Baby," was recorded by Frank Sinatra and was very popular. But the Dinah Washington version is even better, so you should check that one out, too. Clever yet brutal lyrics make the song interesting, and once again the music is quite catchy. On the flip side of this "appreciation" of thinness is "Mr. Five-by-Five," arguably the most successful of the songs I've mentioned here. At least seven singers have recorded versions of it, the most recent appearing in a 2013 movie (Gangster Squad), according to its Wikipedia entry. Here's a version by Ella Mae Morse recorded in 1942. Again, there are some devilishly funny lyrics that, inappropriate as they are, make you laugh out loud–if you're by yourself.

So, readers, what do you think about these four songs? Politically incorrect, a fascinating trip down memory lane, historical footnotes, or just oddities? I'm not sure what to make of them, but I am grateful that they are preserved, inappropriate or not, for us to listen to, consider, and critique. So, thanks, Sirius Radio, for the memories–even if I have to put up with two months of Christmas music to get them.

 

My Short, Tragic Career as an Independent Scholar

 


Several months ago, I had what seemed like a fantastic idea: now that I was retired from teaching English at a community college, I could engage in critical research, something I'd missed during those years when I taught five or more classes a semester. I had managed to write a couple of critical articles in the last few years of my tenure at a small, rural two-year college in Northern Michigan, but it was difficult, not only because of the heavy demands of teaching, but also because I had very limited access to scholarly resources. Indeed, it was largely thanks to very generous former students who had moved on to major research institutions that I was able to engage in any kind of scholarly research at all–a situation that may seem ironic to some readers, but one that really just closes the loop of teacher and student in a fitting and natural way.

And so last fall, at the suggestion of a former student, I decided to throw my hat in the ring and apply to a scholarly conference on Dickens, and my proposal was accepted. In time, I wrote my paper (on Dickens and Music–specifically, on two downtrodden characters who play the flute and the clarinet in David Copperfield and Little Dorrit, respectively) and prepared for my part in the conference.

It had been close to 25 years since I had last read a paper at a conference, and so I was understandably nervous. Back then, there was no internet to search for information about conference presentations, but this time I was able to do my homework, and I found a piece of advice that stuck with me: remember, the article emphasized, that a conference paper is an opportunity to test out ideas, to play with them in the presence of others, and to learn how other scholars respond to them–not a place to read a paper, an article, or a section of a book out loud before a bored audience. Having taught public speaking for over a decade, I could see that this made a lot of sense: scholarly articles and papers are not suited to oral presentation, since they are composed of complex ideas buttressed by a great many references to support their assertions. To read such a work aloud seemed to me, once I reflected on it, a ridiculous proposition, one that would surely bore not only the audience but any self-respecting speaker as well.

I wrote my paper accordingly. I kept it under the fifteen-minute limit that the moderator practically begged the panelists to adhere to in a pre-conference email. I made sure I had amusing anecdotes and witty bon mots. I concocted a clever PowerPoint presentation to go with the paper, just in case my audience got bored with the ideas I was trying out. I triple-spaced my copy of the essay, and I–the queen of eye contact, as my former speech students can attest–practiced it just enough to become familiar with my own words, but not so much that I would become complacent with them and confuse myself by ad-libbing too freely. In short, I arrived at the conference with a bit of nervousness, but with the feeling that I had prepared myself for the ordeal, and that my paper would meet with amused interest and perhaps even some admiration.

It was not exactly a disaster, but it was certainly not a success.

To be honest, I consider it a failure.

It wasn't that the paper was bad. In fact, I was satisfied with the way I presented it. But my audience didn't know what to do with the presentation. This might be because it was very short compared to all the others (silly me, to think that academics would actually follow explicit directions!). Or it could be because it wasn't quite as scholarly as the other papers. After all, my presentation hadn't been published in a journal; it was, as C.S. Lewis might have called it, much more of a "supposal" than a fully fledged argument. Perhaps, too, there was something ironic in my stance, as if I had somehow communicated my feeling that research in the humanities is a kind of glorified rabbit hunt–fun while it lasts, but rarely leading to any tangible, life-changing moment of revelation.

Yet this is not to say that humanities research is useless. It isn't. It develops and hones all sorts of wonderful talents that enrich the lives of those who engage in it and those who merely dip into it from time to time. I believe in the value of interpreting books and arguing about those interpretations; in fact, I believe that engaging in such discussions can draw human beings together as nothing else can, even at the very moments when we argue most fiercely about competing and contrasting interpretations. This is something, as Mark Slouka points out in his magnificent essay "Dehumanized," that STEM fields cannot do, no matter how much administrators and government officials laud them, pandering to them with ever-increasing budgets at the expense of the humanities.

And this is, ultimately, why I left the conference depressed and disappointed. I had created, in the years since I’d left academia, an idealized image of it that was inclusive, one that recognized its own innate absurdity. In other words, sometime in the last two decades, I had recognized that research in the humanities was valuable not because it produced any particular thing, but because it produced a way of looking at the world we inhabit with a critical acuity that makes us better thinkers and ultimately better citizens. The world of research, for me, is simply a playground in which we all can exercise our critical and creative faculties. Yet the conference I attended seemed to be focused on research as object: indeed, as an object of exchange, a widget to be documented, tallied, and added to a spreadsheet that measures worth.

Perhaps it's unfair of me to characterize it in this way. After all, most of the people attending the conference were, unlike me, still very much a part of the academic marketplace, one in which important matters like tenure, admission to graduate programs, promotions, and departmental budgets are decided, at least in part, by things like conference attendance and presentations. It is unfair of me to judge them when I am no longer engaged in that particular game.

But the very fact that I am not in the game allows me to see it with some degree of clarity, and what I see is depressing. One cannot fight the dehumanization of academia, with its insistent mirroring of capitalism, by replicating that capitalism inside the ivory tower; one cannot expect the humanities to have any serious effect on our culture when those charged with propagating the study of the humanities are complicit in reducing humanities research to mere line items on a curriculum vitae or objects of exchange.

I can theorize no solution to this problem beyond fomenting a revolution of ideas within the academy, one that fights the now ubiquitous project of bankrupting the study of the arts and humanities–a sordid goal that seems to characterize the age we live in. And I have no idea how to bring about such a revolution. But I do know this: I will return to my own study with the knowledge that even my small, inconsequential, and isolated critical inquiries are minute revolutions in and of themselves. As we say in English studies, it's the journey that's important, not the destination. And in the end, I feel confident that it will take far more than one awkward presentation at a conference to stop me from pursuing my own idiosyncratic path of research and inquiry into the literature I love.

Making Art in Troubled Times

The Alfred Jewel. Image from the webpage of the Ashmolean Museum: http://britisharchaeology.ashmus.ox.ac.uk/highlights/alfred-jewel.html

I will admit it: after the election in November, I succumbed to a sense of defeat. What is the point, I moaned, if autocracy and tyranny are not merely accepted but welcomed by the masses, if the great ideal of a democratic country is systematically dismantled before our eyes? Why bother with anything, much less with the last fifty pages of a novel that no one will ever read?

At the time, I was working through the last part of a story I’d begun a couple of years earlier, and I was ready to give it up, because, well, why would I finish it when the world as I know it is coming to an end? (My feelings arose not only because of the U.S. election results or the ensuing realization that a foreign power had tinkered with our “free elections,” but also because of the global rise of a dangerous populism, coupled with imminent global climate change.)

But a good friend gave me some advice, and I soldiered on and completed the draft. Right now, I am steadily working on it, revision after revision. And I am doing this not because I think my novel can change the world. It certainly won’t; it won’t be read by more than a hundred people, and that’s if I’m lucky.

But this short essay is not about the art of writing without readers; I will deal with that in a future post. For now, all I want to do is to encourage everyone who reads this blog to go on and continue their artistic activities. I say this not as a writer, or even as a reader, but as a scholar. And I have a very simple reason for doing so.

Art is the residue left by human culture. When civilizations disappear, when lives and institutions have crumbled into the dust, what remains is the art they created. Some of this art arises from genius, like the works of Mozart and Shakespeare; some of it comes from normal people, like the rest of us. But we need it all–every last scrap of it, not only the wonderful pieces that make us cry with joy or sadness, but even the average and ungainly works of art, because even bad art is an expression of human experience, and in the end, it is the experience of being human that binds us together on this lonely little planet.

So go ahead with your art. Draw, paint, weave, write, compose or play music. Do not worry that you are fiddling as Rome burns. Rome will, ultimately, burn–history tells us that. But what is left behind are the murals that take your breath away, the mosaics, the epic poems, the statues and monumental structures. Don't worry about whether your art will be appreciated; it is the act of making it that is important, not whether it is celebrated. Think of that lonely monk who produced Beowulf; he was probably scared shitless that his Anglo-Saxon culture would be erased by the next Viking invasion, but he fought off the feeling of futility and kept going, thank goodness. Remember his small act of courage, try to emulate it, and above all, keep going.

Do not be afraid of working in the darkness; you may not be able to dispel it, but your work could provide light for others, not only now, but in the future as well.

Five Fascinating Facts about Charlie Chaplin

Charlie Chaplin holding a Charlie Chaplin doll. Source: Wikipedia

 

Most of us know Charlie Chaplin as a star of silent movies, the iconic Little Tramp–a clown who made millions laugh during some of the hardest years of the early twentieth century. But he was more than a comic genius. I’d argue that even if he’d never been the most successful film comic to date, he’d still be remembered for the following achievements:

1. Chaplin composed the tune that became "Smile" as background music for his masterpiece Modern Times (1936). Nearly two decades later, John Turner and Geoffrey Parsons added the lyrics and the title, producing a major hit for Nat King Cole. It has since been recorded by Tony Bennett, Diana Ross, Michael Jackson, Josh Groban, and Robert Downey, Jr., as well as by the Japanese singer Misia.

2. Chaplin was one of the founders of United Artists, along with film pioneer D.W. Griffith, Hollywood power couple Douglas Fairbanks and Mary Pickford, and lawyer William Gibbs McAdoo, the former Treasury Secretary under President Woodrow Wilson (who was also his father-in-law). Their goal was to preserve artistic independence in creating their own film work.

Original list of stockholders of United Artists. Source: Wikipedia

3. The only competitive Academy Award Chaplin ever won, despite his iconic stature in the film industry, was for the score of his film Limelight (1952). Turner and Parsons again added words to the tune, creating the song "Eternally." Because Chaplin was exiled from the United States when Limelight was released (see below), the film did not screen in Los Angeles until 1972, and the Oscar was not awarded until 1973. The song was recorded by many artists, including Jimmy Young, Petula Clark, Michel Legrand, and Sarah Vaughan. (Yes, that's a very young Claire Bloom below; she was Chaplin's co-star in this fascinating film.)

4. When Chaplin sailed to London in September of 1952 for the premiere of Limelight, he was informed that he could not re-enter the United States without submitting to an interview regarding his political and moral behavior. At the height of McCarthyism, this revocation of Chaplin's re-entry permit was tantamount to political exile. Chaplin, disgusted by what he called the "hate-beleaguered atmosphere" of the U.S., settled in Switzerland, returning only once, in 1972, to the land that had seen his rise to stardom, to receive an honorary Academy Award.

Source: Florida News Grio, http://www.floridanewsgrio.com/news/world/15159-1952-charlie-chaplin-banned-from-the-us.html

 

5. Chaplin was not only a great comedian, but also a philosopher who worked in the medium of film. The Great Dictator (1940)–a powerful political satire that addressed the growing threat posed by Nazism–was made before the United States had declared war on Germany. (Incidentally, Chaplin worked on the score for the film as well, but the music was credited to Meredith Willson, who would go on to write The Music Man.) These days, in the midst of political unrest, Chaplin's final speech in the film is frequently cited as an appeal to logic and sympathy in the face of mechanistic and rote obedience to power.

Film and Classical Music

One of the things I miss about teaching now that I'm retired is the ability to explore ideas and themes with others. I often compared teaching–at least, teaching at its best–to driving a tour bus. Sometimes you stop the bus and point out some interesting things (and in fact that is pretty much your purpose in life as a teacher), but often the people on the bus know some pretty interesting stuff, too, and the best moments come when everyone starts sharing their information and their ideas. That part of my life still exists, but in a much smaller form, and I can no longer rely on a job to make it happen.

Hence this post. If I were still teaching, I'd be formulating a course on Film and Music, much as there are courses on Film and Literature, Film and Shakespeare, or Film and Madness. But since I'm not planning on teaching such a course, I thought it would be fun to make a list–the kind you see on Buzzfeed or, even better, on InterestingLiterature (a great site, and not just because I had a guest post on it once)–highlighting some interesting movies that focus on classical music. Fair warning, however: some of these movies are not readily available, and only one of them is well known.

  • We’ll begin with the most famous movie in the list: Amadeus. Now, don’t get me wrong; I loved the movie when it came out in the 1980s, just as everyone else did. But I found it a bit dull and overacted when I watched it again a few months ago. Certainly it is a long movie, but it is visually spectacular. The music is excellent, too. If you haven’t seen this film, it might be good to start here, if only because everyone will expect you to have seen it.
Image from https://rovingpsyche.wordpress.com/2015/05/13/cinema-review-amadeus/, which provides an excellent review of the film.

 

  • Here's a little gem that fewer people have seen than Amadeus: it's called Nannerl, La Soeur de Mozart, or, in English, Mozart's Sister. I found the portrayal of a very young Mozart and his sister in their family setting both refreshing and appealing. The Mozart in Amadeus can be quite bratty and silly, but in this film such antics are easier to take, because here Mozart is ten or so years old. Of course, the film centers on Nannerl, Mozart's older sister, whose musical gifts are acknowledged but circumscribed by her father. The intrigue portrayed in the film is fanciful yet appealing, and the music is, once again, excellent.
Image from a review by Ty Burr, which appeared on Boston.com
  • Mention Henry Purcell at a cocktail party, and you'll receive blank looks; most Americans don't know that he was the first really great English composer. The film England, My England provides a view of Purcell's life in a creative, time-tripping way: focusing on the attempts of a 1960s playwright to create a drama based on Purcell's life, it spins off into that life itself, returning at times to its trendy 1960s setting. The musical scenes are pleasing, and the portrayal of Purcell (by Michael Ball) is convincing, drawing the viewer into the world of Baroque England.
Image from IMDb.com

 

  • Emily Watson is one of my favorite actresses, so it makes sense that I would put the film Hilary and Jackie on this list, but I need to warn my readers that this film can be deeply troubling. It deals with Hilary and Jacqueline du Pré, musical prodigies who emerged on the classical music scene in the 1960s, Hilary as a flautist and Jacqueline as a cellist. Be warned: Jacqueline died at the age of 42, having suffered from multiple sclerosis, which cut short her career when symptoms arose in 1973. The film is grueling at times, and not just because of the onset of the disease. It is controversial as well, since it presents an unflattering view of du Pré at times. But it provides a fascinating look at this important 20th-century musician, whose work has been described as both ground-breaking and definitive.
Image from IMDb.com

 

  • I've saved the most unusual film for the end. In fact, the only way I was able to watch this film was by streaming the entire thing on YouTube. This film may not be for everyone: it follows the early life and career of Carlo Broschi, who took the name "Farinelli" when he appeared on the early 18th-century stage as a castrato singer. But it is an excellent film about a difficult subject. Like the other films on this list, it is largely fictionalized, but the music is interesting and appealing, and the story itself is so unusual as to be intriguing. To recreate the sound of a castrato's voice, the voices of a female soprano and a male countertenor were digitally merged, resulting in this amazing aural amalgam. It is a unique and gorgeous sound–but you might want to compare it to this rendition of Handel's aria ("Lascia ch'io pianga") by the male soprano Philippe Jaroussky, which relied on no such digital manipulation. Which is better? Add your comment below to join in the conversation. And please let me know if you know of other movies I should add to this list.
Image from YouTube

 

 

On Mondegreens and Willful Misunderstandings

Image from Wikipedia–Lucy the Australopithecus

Once in a while, I hear about a new movie that I really want to see. It doesn’t happen often, because I really prefer old movies to new ones; I’m happiest when watching a movie from the 1930s or ’40s, and it takes a bit of gumption for me to sit down to watch a movie in color–a fact that really throws my students for a loop. Action and superhero movies bore me, and I usually end up falling asleep during them, or checking my wristwatch several times throughout the film.

But once in a rare while, I hear about a movie that really sounds interesting. The operative word here is "hear": what I really do is hear the title, then ignore the movie's description and single-handedly create a movie that I'd want to see. The most recent example is the film Lucy, starring Scarlett Johansson. Now, a very quick internet search brings you to the official site, but that's not the movie I envisioned when I heard the title. Somehow, I decided this movie was going to be about the discovery of Lucy, the hominid remains that shook up the world of anthropology in the 1970s. I created an entire plot in my head, which, while shadowy and only partially formed, revolved around archeologists. It was set in the dry, dusty plains of Africa, where the drama emerged from a slow process of discovery, perhaps involving scholarly rivalry and personal conflicts, and maybe even a love story. This Lucy is, in my warped view, a recipe for a wonderful film, and every time I bump into the real film's advertisements, I find myself quickly dismissing them, overwhelmed by a sense of ineffable disappointment.

Years ago, I did the same thing with the film Glengarry Glen Ross, which I decided was a film about Americans on a fishing trip to Scotland. It was a little like Deliverance, without the violence; or perhaps it would be better to say that my imagined version was like Brigadoon, minus the magic and the music. Apparently, I couldn't have been further off in my characterization of the film.

I think we need a word to describe this type of willful misunderstanding, in which, like Wordsworth in his poem "Tintern Abbey," we encounter films that we "half create" (line 106), making unique alternative-reality films that exist only in our own minds. After all, because these alternative films arise from a misunderstanding, they're not that much different from a mondegreen–a misunderstood song lyric–and there's a whole slew of websites devoted to mondegreens. (You can read about them here and here.) Everyone has a mondegreen story to tell, usually involving a small child. For example, my daughter asked me, when she was five years old, why, in the nursery rhyme "Mary Had a Little Lamb," the lamb's fleas were white as snow, since the fleas on our little Sheltie were definitely black. Was it a different kind of flea? Or were Geordie's fleas simply dirty?

At any rate, this topic has me thinking about the ways we misunderstand the things we hear about and, because I deal with the written word so much, the things we read. Don't worry–I'm not going to launch into a critical essay (though I am tempted to talk about Much Ado About Nothing, a Shakespeare play that focuses on the way we misread people and texts). Instead, I'm going to bring up a memory from my early childhood. My parents, in an apparent effort to provide their three children with an introduction to the great books, bought us a volume of stories entitled something like "Great Stories of the World." In this volume, short synopses of myths and legends were mixed helter-skelter, arranged with no attention to provenance or significance. Thus Beowulf, the first story in the book, was followed by an adventure involving Pecos Bill. I probably never made it past these two stories, which is why, as a graduate student studying Old English, I always felt I was missing something–until I realized that I was waiting for Pecos Bill to come in at the end of Beowulf and slay the dragon, saving Beowulf and giving him a ride back to his mead-hall on a cyclone.

Pecos Bill lassoing a cyclone. Image from http://ed101.bu.edu/StudentDoc/Archives/fall05/dfbrand/david%20brand%20ed%20101%20site%204.htm

I'm not sure whether other readers have had this experience, but I do remember a fellow graduate student explaining that, like most of us Victorianists, she had seen the movie Oliver! well before she ever read the novel. During the movie, she explained, after Bill Sikes beats Nancy so savagely, she watched, transfixed, and noticed that although Nancy's body was obscured behind a wall, she could detect her leg moving–and so, as a small child, she decided that Nancy was not dead. Wounded, perhaps severely, but not dead. That impression, she explained, held sway each time she re-read the novel, and she had trouble convincing herself that Nancy had indeed been killed by her vicious boyfriend.

So I issue a call to readers–two calls, in fact. Have you ever misunderstood something in a film or a book and preferred your misunderstanding to the reality? And, if so, do you have a suggestion for what to call this situation? I look forward to your responses!


 

Confessions of an Amateur Musician

A rare portrait of the author in a preposterous band uniform, ca. 1977

 

At 53, I am ridiculously—to be honest, pathetically—devoted to learning the clarinet. While most people my age are discovering the joys of gardening, genealogy, or golf, I spend too much time and money on a hobby that makes no sense. I haven’t always been this enthusiastic about music, either. Years ago, like many other high school students, I played in my school band–I even have a nice white sweater with a gold “S” (for Spring Woods High School) affixed to the pocket to prove it, a sweater which I’ve saved throughout the years, despite many cross-country moves. Please note, however, that I make no claim to have actually played in the Marching Band. I was a miserable failure at marching, and, consequently, I sat in the stands every Friday night during football season throughout my senior year, an alternate marcher, to be used only in case of dire emergency. Thankfully, it was never necessary for me to go on the field, and so I was allowed to sit there, alone, munching on cookies I had stowed in my tall busby helmet, watching the show. Although I was a horrible marcher, I was an average clarinet player (despite my inability to count), which was why I was allowed to stay in the band and why, ultimately, I scored that lonely varsity letter.

Years passed, and in my mid-forties, I took up the clarinet again. Since then, I’ve been on a quest to be able to play without making people around me cringe. I am improving, but every now and then I stop and ask myself why I’m doing all this–spending an hour or so a day about five days a week (on good weeks) practicing scales, pushing my poor, age-befuddled brain to learn about intervals and minor chords, perfect fourths and blue notes, and playing in a variety of community bands. The $64 question is, why do all this when the return on my investment will be, in practical terms, so small?

I have faced the following facts: I will never be a great, or even a pretty good, clarinet player. I will never zip up and down scales, double-tonguing effortlessly and gliding with ease through cadenzas, pulling emotions out of world-weary listeners. I will never be able to improvise freely with other musicians, transposing on the spot so I can play my clarinet with guitarists and piano players. I know that at this stage in my life, I will never attain more than “decent” amateur status–and I am not being falsely modest when I say that. If there’s one thing I’ve learned over the last few years, it’s what good music actually sounds like, which in turn means that I know what my own limits are.

Yet I continue to push against these limits, worrying at them like a dog chewing on a nylabone.

I’ve been pondering this question the last few months. The closest answer I can come to is this: Music adds beauty to my life. Even if I can’t be the musician I would like to be, the study of music is, for me, a celebration of art, and that is in many ways a celebration of life. It’s a way to stop the busy swirl of work and social obligations, to bask in the glory of one single note at a time, and to concentrate not only on the here and now (the moment of playing), but also on the there and then (the moment of composing). Playing the clarinet allows me to interpret music—limited though my understanding of it may be—and to put my thoughts, emotions, and what skill I have into my playing. It’s also a way for me to devote myself to a discipline I have neglected throughout most of my adult life. Think of it this way: most of us give up music right after high school, selling our dented and scuffed horns and concentrating on our college majors and professions. This means that at the very time when we understand what it means to be disciplined, when we have finally learned how to devote long hours of effort to attain a degree of mastery, we have already given up on an art form that might just afford us exquisite pleasure. In a way, that’s a real shame.

So that's why I keep at it. I no longer expect to be that good, and I try not to compare myself to the people who have studied music professionally. I work hard not to wince when I hit wrong notes or get lost in difficult (or even not-so-difficult) phrasings. And I realize that while there are many people who play much better than I do, that is no reason to give up on the clarinet. I love the sound it makes, even when I play it myself, and I love the fact that slowly and surely, I am improving. If practicing scales and occasionally playing obbligato with YouTube recordings of Coleman Hawkins are my way of paying homage to the art that is music, and if doing so makes me feel alive and appreciative of those who play better—and of those who play worse—than I do, then what other excuse do I need? Music is for everyone, even for those of us who play imperfectly, who squeak when we mean to dance our way through the highest notes, who get lost amid sixteenth-note runs, who are tripped up by our own desire to make beautiful sounds.

And this is why I continue to play. It makes me feel alive; it gives me something to work towards; and, once in a rare while, I hit that one beautiful note, which, whether it is an accident or not, is so fitting, so right, that it brings tears to my own eyes. With that kind of allure, even my own nagging fear of failure isn’t enough to make me give it up.