Journal Entry

Monday, October 8, 2012

Clichés vs. allusions

Over in that tidily walled private enclosure, Facebook, Dora writes that "If you are a writer, you should ruthlessly excise clichés from your language. Not just from your writing, but from all language. You should avoid the expected, the comfortable, that which others will automatically agree with and you can say without truly thinking about... The good writers, the ones I respect, are authentic and original. They do not speak in clichés, even when describing what they had for breakfast."

I don't disagree with this. I am, however, going to disagree with something that I think lurks behind it, in its shadow, something it potentially tangentially evokes or implies, which Dora did not say (in some circles this practice of disagreeing-with-what-you-did-not-say is now officially known as "committing Rosenbaum"... and you probably can't follow that link, thanks to Facebook).

The lurky-behind thing I'm going to disagree with -- which my hero Samuel R. Delany comes closer to suggesting outright in a wonderful post over at the Clarion Foundation -- is that one should eschew received language in general, and that originality of sequences of words is, specifically, a goal.

Is a given stereotyped pattern of speech -- the same words found together in the same sequence, spoken (or written) by many language users -- a cliché, an idiom, a term of art, an instance of jargon, an allusion?

The answer depends a lot on context. The centrality of the notion of "cliché" in our thinking about language, about writing, has everything to do with the post-Enlightenment, Modern enshrinement of originality as the cardinal virtue of art and thought, which is part of Modernism's whiggish neophilia in general.

If your primary model of the Artist is of that lone thinker who rises above that unthinking mass of men enslaved to the conformity of industrial society, pursuing the utterly original dream-vision consigned to her by the Muse, the very attention to which is an act of radical nonconformism and self-owned, self-originating, self-sovereign self-authorship... then saying anything that anyone else has said seems like a betrayal.

But of course, not all cultures have looked at art or thought like that. Classical and medieval discourse and art are densely allusive. If you're hanging out with Talmudic sages in 3rd century Babylon, or educated philosopher-merchants of 12th century Cairo, or qat-chewing intellectual shepherds in the Yemeni highlands today, the deal is that they are going to drop just a few words from the Bible or the Mishnah or the Quran or the Hadith, and those few words, because you're educated in the same deep textual culture as they are, combined with the nuances of your relationship and current situation, are going to carry huge rafts of meaning. Or a medieval painter can stick in the corner of a painting some image from Catholic hagiographic iconography -- a particular bird, or a pair of shoes -- and invoke an entire layered set of narratives, from the original life of the saint, to every political, social, artistic and cultural usage of that original narrative since.

In a context like that, it's the words that have been used before that are the most richly dense with information, the most evocative and powerful. By contrast, to use a totally new sequence of words, that no one has ever thought before, is to say something bare, plain, sterile, empty of nuance, and most likely banal. Of course, it's not that originality has no place in an allusive culture, but it's not a modernist originality of utter newness -- it's the emergent originality that arises from richly evocative standardized expressions in new juxtapositions and a new context.

One thing that's fascinated me about discourse over my lifetime is that I have the feeling we are entering a new age of allusion. Internet memery, and other cultural productions which zoom through subcultures aided by the world's new flatness, seem increasingly to dominate discourse, especially online. And I (as a postmodern, not a modernist) tend to think that they enrich that discourse. And it honestly seems like this was less true thirty years ago, when we lived in a broadcast culture, rather than a web culture.

Then, the allusions tended to be restricted to Saturday Night Live skits ("schwing!", "could it be... SATAN?") and they were reenactments. When you called someone the [noun]ster, you were evoking that particular skit on SNL, and that was about as far as it went. When we do the equivalent thing today -- when we say "X is the new Y" for some (original to us) X and Y, when we say "all your base are belong to us" or use that particular form of "really?" that has only been around for the last five years -- it feels to me like we are doing something more equivalent to the way Colonial-era gentlemen used Homer and Cicero, or 9th-century Geonim used the Talmud. We are using stock phrases, not out of laziness, but because of the layered context they carry with them, and the joy of juxtaposing them in new contexts. It feels very different to follow up someone's odd sequence of words by adding "dot tumblr dot com" -- thereby turning it into a fictional Tumblr site devoted to that sequence of words now transformed into a reified concept with its own fandom, and also alluding to the xkcd comic that created the meme -- than it does to say "don't put the cart before the horse." The latter is safe and complacent, it closes things off; the former is playful and hungry and about aperture.

So yes, you should avoid the lazy and the tame -- at least, you should avoid it when your goal is to awaken and encourage thinking and connection, which it is when you wear your writer hat (I reserve the right to use business and IT jargon and buzzwords when my goal is to lull people in a meeting into a friendly stupor, or to defuse tension). But "authentic and original" should not be defined too narrowly, and "cliché" is often a matter of perspective. Sometimes saying what has been said before is more potent and alive and rich than saying what has never been said before.

Posted by benrosen at October 8, 2012 01:09 PM

What gets up my nose is the sense (from certain people) that the enjoyment of cliché-free prose is morally superior to the enjoyment of referential prose. This is, I think, implied in "The good writers, the ones I respect, are authentic and original." I have no idea what authenticity is, frankly, so I will leave that one alone, but I like and respect lots of writers for whom originality is secondary. Such as, oh, William Shakespeare. Or P.G. Wodehouse. Or John Barth. Or Carol Ann Duffy. Or the committee of the King James Bible. I am also fascinated by the process of adaptation--are the writers who adapted "Cranford" for television to be sneered at for failing to be original? Surely the thing is to judge on the merits, with originality being one criterion, and not among the most heavily weighted.

Now, having said that, novelty is awesome, too. There are plenty of writers I like and admire for their originality. Either for their originality of plot or of phrasing, or sometimes (not often) for both. Often for novelty in world-building, although not often, then, for the other kinds as well.

There's a Santayana line about consistency that says that consistency is a jewel, and that like a jewel, it is surprising how much some people are willing to pay for it. I think that applies to, well, nearly every good thing, but certainly to originality. Because, yes, shiny, great. But also? Have we not all struggled to comprehend the prose of someone too successfully avoiding common turns of phrase?

And yet--if you like that sort of thing, then that's the sort of thing you like. I like a different sort of thing (well-handled plot, mostly, and flamboyant characterization, and evocative language, with lots of predictability to make it all go down smooth), and I try not to allow myself to feel morally superior because I would rather read that someone had ham and eggs for breakfast than eggs and ham.

Hm. That wound up a ranty-rant, didn't it? I suppose as a reader I shouldn't take advice for writers personally, but I do.


Posted by: Vardibidian at October 8, 2012 04:52 PM

I'm glad you brought up the issue of allusion, Ben (even if it wasn't precisely what Dora was talking about ;) because it's something I tend to enjoy/appreciate, especially in songs. Antje Duvekot's "Judas" is a perfect example: by casting a potential school shooter and his victim as Judas and Jesus the song humanizes both versions of Judas and alludes to two important stories (the Gospel stories of Christ's Passion and the American stories of families and schools and school shootings), making for a much richer song. (Plus, it's pretty.)

I think this could also apply to fanvids, though I'm not sure what you think of that. While fanvids are a dime a dozen, there are some that are very specific in which scenes they show at which points in a song so that you have to know the show/movie/whatever really well in order to appreciate the significance. (The end of [link] is a good example of this. [Link] is one of my favorites but may not mean much to someone who's less of a JAG junkie, or less devoted to Webb. No, I don't know why I like him so much either.) Others take a song that isn't a duet but distribute the lines/verses between two different characters in order to comment on them individually as well as on their relationship, thereby also changing the interpretation of the song. ([Link] and [link] are two of my favorites for that.) This latter type might also appeal to a broader audience: I think the Kirk/Spock vid especially requires relatively little background beyond [link] to appreciate, though it's more meaningful if you know which bits take place on Vulcan and who certain side characters are.

Posted by: Emily Gilman at October 8, 2012 10:58 PM

I've come at the idea of Allusion Culture primarily from the direction of reference humor, something I make much use of and have done some philosophising on. One facet of my thinking on this is that we're not really living in the same kind of world as that which made such wide use of Biblical or mythological allusions, because though there are far more potential sources of common references now (The Simpsons, Monty Python, dozens of successful movies/comics/TV shows...), that very fact makes each source less widely known. We're not a culture where you can reasonably expect everyone who hears you to understand the reference you're making, whether it be Loaves and Fishes or "I'm not dead yet!".

Instead, references can be a way of *separating* subcultures, rather than providing a common vocabulary. Spec-Fic folk will be able to pepper their conversations with Lord of the Rings and Firefly references, making some people get much more out of their words... but making them more opaque to those not in that subculture. The same would go for groups revolving around Reddit, or pop-celebrities, or whatever. So I can only partially agree with your notion that we're entering an allusion era equivalent to that of 9th-century Geonim. If anything, we're making that type of common reference-currency harder to come by, except in narrower, siloed, groups.

Posted by: Jim Moskowitz at October 9, 2012 03:06 AM

Jim, I think, though, that to some extent the groups were always siloed, and that that goes hand-in-hand with a culture of allusion. Those Geonim were walking down crowded Babylonian streets amid Eastern Christians, Mithraists, Zoroastrians, and Tatar and Slavic animists who wouldn't have known a passage of Talmud had it bit them on the butt, and even within Judaism there were various strands and mystical traditions in which super-secret allusions would convey piles of meaning inaccessible to outsiders -- Karaites wouldn't get the same gags as Rabbinicals, nor Chariot-mystics as remnant Sadducees. You'd have fractal layers of associations-of-allusion, which feels very much like now.

Posted by: Benjamin Rosenbaum at October 17, 2012 04:22 PM