
Refer Madness

Writing in an Age of Allusion
DISCUSSED
The Room Between Other Rooms, The Remote Possibility of Finishing a Three-Decker Novel, Hulking Selectrics v. Cyclonic Leaf-Blowers, The Failure to Go on Buddhist Retreats, The View of Originality That Is So Unoriginal It Is Boring, To Be Always Elsewhere, The Call for New Clichés, The Library Bulging in Your Pants

Robert Cohen

Here, in its entirety, is a short story by Kafka, called “The Wish to Be a Red Indian”:

If one were only an Indian, instantly alert, and on a racing horse, leaning against the wind, kept on quivering jerkily over the quivering ground, until one shed one’s spurs, for there needed no spurs, threw away the reins, for there needed no reins, and hardly saw that the land before one was smoothly shorn heath when horse’s neck and head would be already gone.

This sentence, a plaintive and mysterious journey into white space, both expresses and embodies a yearning as physical as it is metaphysical, a yearning typical of its writer and indeed all writers: the wish to be a more natural, less self-conscious being. Or to put it another way, to not be a writer at all. Let’s face it, no one’s better at writing about not wanting to be a writer than someone who actually is a writer. And among writers, no one’s better than Kafka, for whom wanting and being are almost never in sync. “The impossibility of not writing, the impossibility of writing German, the impossibility of writing differently,” he laments in one letter. “One might also add a fourth impossibility, the impossibility of writing.” But then he’s hardly alone in this. “Live all you can”—the Master himself says, or rather writes—“it’s a mistake not to.” Who wouldn’t trade in all that fretting irritability, all that envy and eye strain, for the life of an (OK, rather absurdly notional) “Indian,” someone wild and natural and sure, plunging headlong through space with such velocity and purpose that no spurs or reins are necessary? Though maybe headshort would be more accurate in this case, for the head in Kafka’s story, when it finally appears, is something of an afterthought; a head as attenuated from its body as a Giacometti, a head syntactically and existentially, almost Twilight Zone–ishly, “already gone.”

We find this same itch for unfettered animal movement—for existence reduced, or enhanced, to mere body-in-motion, shorn of mental entanglements and obligations, and “quivering with the fever of life”—everywhere in Kafka’s work, to say nothing of his life. Take this entry in The Blue Octavo Notebooks: “He leaves the house, he finds himself in the street, a horse is waiting, a servant is holding the stirrup, the ride takes him through an echoing wilderness.” Or the narrator of “The Departure,” who, having ordered his servant to fetch his horse from the stables, is stopped at the gate and asked where he’s going. “I don’t know,” he says. “Just out of here, just out of here. Out of here, nothing else, it’s the only way I can reach my goal.” “So you know your goal?” the servant asks. “Yes,” he replies, “I’ve just told you. Out of here—that’s my goal.”

But of course one can’t get out of here in Kafka’s world. There may be horses, but they are never ours; there may be Indians, but they are never us; there may be gates that lend access to the Law, but for us they’re forever closed. For us there are always shackles, cages, constrictions. We are not pure beings, not whole selves; not animals, not gods. For all the purity of our aspirations, we live, as Kafka did, in the middle of things, in a room between other rooms, a self among other selves, in what literary types call a “liminal space.” Trapped between two realms, the earthly and the heavenly, we’re unable to fully inhabit, or escape, either one, but can only gesture longingly in both directions, flailing our useless limbs, like an upended beetle trying to get out of bed. “It’s the old joke,” Kafka writes. “We hold the world fast and complain that it is holding us.”

Which brings me to my subject: our ways of holding the world fast, even in fiction—especially in fiction—and how, the faster the great world spins, and the smaller, more involuted, more densely thronged with interconnections it feels, the tighter we clutch onto its fabric. And I mean this literally: with the fingers, the very digital instruments, of our hands.

*

Speaking of digital instruments: it’s been pretty well documented by now that this great web we’re all caught in and borne up by is changing us every which way, not just how we interact (or don’t), but how we read (ditto) and think (ditto). And how do I know this? The same way you do: by power-browsing websites, skimming articles by the likes of Nicholas Carr and Jaron Lanier—or rather skimming articles by other writers who have skimmed articles by the likes of Carr and Lanier—memorizing a fact or two, a quote or two, and interweaving them with my own anecdotal experience and those of friends. This is what passes for “knowledge” now. There’s no need to make a moral judgment about this (OK, maybe there is a need to make a judgment about this), but those of us who read and write for a living can’t help but wonder about the implications—even if those implications, like the technologies that give rise to them, present a moving target, a machine perpetually on fast-forward.

Where reading is concerned, we all know the liturgy by now, and can recite it by heart like our own gloomy Spenglerian Kol Nidre. Our Father, Our King, we have sinned before you: we’ve scanned, we’ve skimmed, we’ve skipped blithely from link to link and retained nothing; and for most of us the prospect of buying, let alone opening, let alone finishing, the sort of fat three-decker novel—Middlemarch, The Magic Mountain, The Man Who Loved Children—we used to devour almost routinely has become as lonely and remote, as haloed by a nimbus of virtuous nostalgia, as a trip to the moon.

On the other hand, how could it be otherwise? “We are not only what we read, we are how we read,” writes the developmental psychologist Maryanne Wolf. Reading is not instinctual; it’s learned behavior, and like all learned behavior it affects and to some extent reshapes the neural circuits in our brains. Studies show that readers of ideogrammatic languages, like Chinese, develop a radically different mental circuitry than do those whose written language employs an alphabet. Given that the average American spends eight and a half hours a day in front of a screen, that’s a lot of rewiring going on. No wonder our heads feel so thronged. We’re like people who spend too much time in the gym: for all our furious running, somehow, at the end of the day, it turns out we haven’t quite gone anywhere. Meanwhile the screens keep blaring. “When things come at you very fast,” Marshall McLuhan once said, “naturally you lose touch with yourself.” And, we might add, with “things,” too.

It stands to reason, then, that if our reading is changing, our thinking must be changing, and if our thinking is changing, then so are both the stuff and the way we write. Or perhaps it’s only the pace of change that’s changing. In the 1930s, Walter Benjamin warned of a menacing new form of communication that was “incompatible with the spirit of storytelling.” Information, he said, was the “rustling in the leaves that drives away… the dream-bird that hatches the egg of experience.” And if the very machine one writes—and, increasingly, reads—one’s dreamy stories upon is also a roaring, cyclonic leaf-blower? What then? According to Nietzsche, “Our writing equipment takes part in the forming of our thoughts.” And he should know: once he switched from the pen to the typewriter, his own prose rhythms, never exactly temperate to begin with, turned jittery and declarative, full of aphorisms, puns, and staccato proclamations. You can see this on the page, or rather feel it on the page: the brutal, clattering force of a new technology seeking and finding expression in thought, rather than the other way around.

Writers of my generation, many of whom wrote their first books on hulking Selectrics, understand this all too well. The days when you’d have no words in your head and yet go on typing anyway, as if riding the wave of some vast, furiously thrumming current. But then came the giddy liberation of those first excursions into “word processing”—to be relieved of all that laborious retyping! that interminable cutting and pasting!—and then, as the futzing and tinkering, the finding and replacing, the control-c’s and control-v’s, grew more and more habitual, the dawning recognition that the great liberator might in fact be only another occupier, with its own ruthless and quietly insidious laws. That the bars of the cell are plastic rather than iron does not make them, in the end, any less confining. Whether it makes them more confining is a question well worth asking, or googling, or typing into one’s smartphone, provided of course that one has a smartphone, which I personally do not.

No, my own phone, I’m afraid, is a dumb phone—a phone so remedial, so not-gifted and not-talented, so slablike and lethargic and inexpressive, it can barely get it up to ring. My phone has no idea what an app is. It sends no email, it can’t read a blog, it plays no music, it has no GPS, and if I hold it up to one of those little squares of frozen static that sit like Rorschach blots beside virtually every material item in America, my phone just stares at it dumbly, breathing through whatever the equivalent of its mouth is. To be fair, at some point no doubt my phone was the very newest, smartest thing, sleek and powerful as a phaser. But I didn’t buy it at that point. No, by the time I bought it, the heyday of your basic Samsung clamshell flip-style had long since passed into history, like the rotary dial, the telegraph, and the carrier pigeon before it, which was why I was able to get it so cheap (the primary, in fact the only, factor in my decision), and why the moment they saw it my children laughed and rolled their eyes, as if this were only the latest, most damning evidence of cluelessness in that losing trial, my life.

But never mind. I don’t own a smartphone (yet) for a very simple reason: because I know I would abuse it. Like most people who cultivate an air of brooding intensity, I am in fact a rather shallow person with lousy concentration and dilettantish tendencies. Already I don’t so much live my life as refer to it. Already each day for me is a Talmudic text, a dense, ever-evolving symbolic narrative that can be comprehended, or for that matter endured, only by a prodigious amount of intertextual commentary. Nothing for me is ever simply itself, solo and unencumbered; there’s always an entourage of references loitering around, emptying the minibar and stealing the robes. A raging old man is always a Lear; a big talker always straight out of Bellow; every unhappy family Tolstoyan in its own way. To be sleepy is Oblomovian; to be restless Chatwinian; to be hungry Hamsunian; and the sight of my enormous honker in the mirror, it goes without saying, is positively Gogolesque. In short, there is virtually no element of my conscious life (about my unconscious life it’s harder to say; I am unconscious of my unconscious life) that is not contaminated by reference, and I don’t even own a smartphone yet.

My wife finds all this objectionable not just on the obvious gender grounds (i.e., that it makes me tiresome, boring, and pedantic, like a clerk in a used-record store, or a guy who writes in to sports blogs), but on a deeper spiritual level as well, in the sense that if I am always comparing, always referring, I am never just being. And the person I am never just being, it goes without saying, is myself. No wonder I hate yoga, she thinks. No wonder I don’t meditate, or go on Buddhist retreats, or write poetry. Everything that goes on outside my head has a set of corresponding and often much more pressing references inside it, and everything inside it has a set of corresponding and much more pressing references outside it, and so much of the time I am neither here nor there but rather, as Milan Kundera might say, elsewhere.

Case in point: in an ideal world I would not have made reference to Kundera just now, because it sounds so pretentious, especially as I’m only name-checking the title of his novel and not even delving into the substance. Then too in an ideal world I would not go on to compare this condition of feeling like one slim, heavily annotated volume in a vast universal library to something out of Borges or Jung, though in truth these do seem like fair comparisons, or in any case inevitable ones. But there you go: there’s no way to diagnose this affliction that doesn’t succumb to it that much further. The cure is indistinguishable from the disease. No matter how much one struggles to wriggle free, in life and on the page, from the headlock of reference, there’s no way out, only further in, submitting to its grip even further, quoting Bellow, for instance (“Perhaps, being lost, one should get loster”), and just generally ransacking the library for references to others who have grappled with this same problem of making reference to others, as if we are all condemned to flop and thrash forever in the same infinitely pliable and inclusive net.

Well, you’re thinking, so what? It’s the postmodern condition. Our heads are full of secondhand data, viral videos, forwarded twitters and tweets. Nothing to be done. Anyway, it’s not such a terrible affliction, in the scheme of things, to have a library vaster than Alexandria’s bulging in the pockets of your pants. Access to knowledge, or information anyway, is no longer the preserve of the culturati, the expensively overeducated 1 percent. Occupy Positively Fourth Street! How can we not get down with that? And if the cost of all this access is that every bullshit remark one advances over dinner gets shot down by some wise guy googling under the table; or that watching a movie with your teenage son means being peppered by factoids from IMDb, not just the gross receipts but the AD, the filmography of the second lead, and the name of the catering company that supplied the bagels; or the unsettling suspicion, as you flit from link to link, that someone has a financial stake in the crumbs of data you leave behind, that what you assume to be voluntary behavior is in fact highly manipulated, that, as Carr puts it, “it’s in their economic interest to drive us to distraction”… well, so be it. A small price to pay. And if it’s maybe a little harder than it used to be to concentrate on, well, anything—to locate, amid so much refracted noise, the timid sonar-beam of your own personal signal, your own original and essential truth—so be that, too. Every revolution has its winners and losers. Besides, notions like “personal” and “original,” absent the usual ironic air quotes, haven’t aged very well, have they? They’re faded and baggy as old T-shirts: you can hardly go out in them anymore. No, even irony is looking a bit shopworn these days. Irony, after all, is only a gloss on something, not the thing itself; like Google, it blurs the line between knowingness and knowledge, feeds upon preexisting sources that it vaguely cheapens and diminishes along the way.

And if nothing is just itself, but only a link that points us somewhere else, where does it end? Or should it end? After all, what’s so great about originality, anyway? The energies of evolution, in art as in life, are cyclical, interconnecting, codependent; we progress, if at all, not by leaps and bounds but by tiny incremental variations. “Nothing goes away,” Thomas Pynchon once wrote. “It only changes form.”

Pynchon himself was never shy, early on, about name-checking his influences, be they Henry Adams, Dashiell Hammett, the Beats, or, in one jauntily anachronistic hat-tip midway through Gravity’s Rainbow, Ishmael Reed. Like V.’s Herbert Stencil, he’ll strew discrete bits of referential data around like so many clues at a crime scene, then make us do the detective work of connecting the dots—all the while hinting that the joke is on us, that no one stands behind the curtain, no one’s working the levers; that, confronted with the blank screen of the case that is the world, we are forever projecting onto it our own plots and conspiracies, our own constellated reference points, our own fiercely homespun webs. Better an imagined coherence, even a paranoid one, than no coherence at all. Hence it’s no surprise to discover the tracks of other writers all over V.’s shifting sands, like that massive bigfoot, William Gaddis’s The Recognitions—a novel all about the virtues, or necessities rather, of copying the old masters. For all the farcical camouflage of its thousand pages, the book delivers its theme brazenly in a droll bit of shorthand: “Orignlty not inventn bt snse of recall, recgntion, pttrns alrdy thr.”

Half a century later, from the far side of the postmodern age, this view of originality as more discovery than invention, more systemic wave than idiosyncratic particle, seems so ubiquitous, so culturally assimilated—so unoriginal, in short—it’s almost boring to talk about. “I wonder if this thing we call originality,” David Mitchell posits in a recent interview, “isn’t an electric motor powered by the two poles of the already done and the new twist, or the familiar and the far-out.” If that sounds a bit mechanistic and dream-factoryish, Mitchell sees no need to apologize: like Pynchon, or his more obvious influence, Murakami, he delights in conflating high and low cultural forms, as happy to plunder from Captain Jack Sparrow or The Matrix as from Nabokov, Borges, or Calvino. Indeed, reading Cloud Atlas, with its zigzagging narratives, its gorgeously overstuffed setpieces, and its declamatory rhetorical flourishes, is like being locked in the wardrobe closet of some insanely well-funded amateur provincial theater. Let’s see, here’s a sea captain’s hat, here’s a tweedy English jacket, a pair of leather pants from San Francisco, a silken waistcoat in the Ottoman style, some sort of fusiony, futuristic kimono… The presence of so many florid costumes, so many disparate narrative forms in a single work (sea story, dystopian sci-fi, ’70s political thriller, and others) creates its own dizzying centrifugal effect(s). It’s as if Mitchell has set out to trump Samuel Goldwyn’s maxim, cited approvingly in the same interview: “Let’s have some new clichés.”

Given the breadth of his historical reach, the acuity and supple facility of his prose, the boldness of his refusal to cover his own source-tracks—for Cloud Atlas is riddled with attention-seizing references to Melville, Defoe, Huxley, Orwell, Thornton Wilder, and many other writers and novels, including other David Mitchell novels—and the giddy relish with which he spins his genre-hopping, borderline-pulpy yarns, Mitchell may turn out to be the closest thing we have to a representative writer for this globalized age, in that he makes this vast, unsettling interconnectivity of ours both subject and method, both engine and fuel. Or perhaps it’s that he portrays this as a good thing. (“Separateness, that’s what went wrong… Everything withholding itself from everything else… Everything vain, asserting itself”: Gaddis again.) Characters in Mitchell tend to move like clouds across spatial and temporal and personal lines, drifting along preexisting songlines toward memory and myth, deeply inlaid species codes. For all the violence and melodrama, he offers an affirmative vision, the bright side of the Pynchonian moon. Here, too, is a flickering projection of coherence. And hasn’t it always been our lot, as writers, readers, and human selves, to warm our hands by the same fires, sharing the same source materials, passing along the same tribal myths and narrative codes, all of us living, for better and for worse, in the same tent city, the same far-flung, shimmering web?

If so, then artists are only embroidering a fabric that already exists. As Emerson puts it: “Old and new make the warp and woof of every moment. There is no thread that is not a twist of these two strands. By necessity, by proclivity, and by delight, we all quote.”

Upon examination, of course, every strand of the old turns out to be more of a thickened braid, a double or triple helix in its own right—as I have just demonstrated, for example, by quoting Emerson in a way that implies I have a volume of his essays open beside me, when in truth I have copied this quote not from Emerson but from Jonathan Lethem’s ingenious Harper’s piece “The Ecstasy of Influence”—an essay that at once documents, investigates, and (because it comprises, almost entirely, quotes from other writers) formally embodies an “open source” approach to art. And I’d be willing to bet that Lethem, as he wrote, didn’t have Emerson open beside him either.

It’s all fair use, in other words—in fact in those words—and that’s fine. Same as it ever was. Plagiarism, as plagiarists are always eager to remind us, is a fairly recent concept anyway. The ancient poets copied freely from any text they deemed useful, often verbatim and without citation, recasting existing works to suit their purpose and taste. The Hebrews ripped off the Canaanites. Virgil ripped off Homer, Dante ripped off Virgil, Matthew and Luke ripped off Mark; Shakespeare ripped off Plutarch, Eliot ripped off Shakespeare, Dylan rips off everybody, and—my sweet lord—George Harrison ripped off the Chiffons. And so it goes. It’s an eternal, self-renewing process, the great daisy chain of literary influence. Or, to phrase it less romantically, language is the common whore whom each writer tries to make his own bitch.

That’s a direct quote from Auden, by the way, though it pains me to cite him in this context, as I haven’t read Auden in years, and so consequently have no idea how and where I came by this quote, which is therefore, come to think of it, not direct at all. Maybe it’s not even from Auden. It sounds more like Mickey Spillane. If only I owned a smartphone I could tell you for sure. Though surely by now you have long since gotten your own smartphones out and could just as easily tell me.

Still, whatever we want to call this tradition—fair use, open source, public commons, digital sampling, everything up for grabs and equally viable—how such referential sampling plays out over time for the working artist, let alone that beleaguered, almost comically antiquated species, the individual in society; whether it opens outward to a rich world of dizzying possibilities or flattens into a shallow, enervating plane of the already-done, remains, to me, anyway, yet another open question.

When I talk about flattening I have in mind someone like John Leonard, the late book critic, whose gift for referentiality, for going culturally broad as a way of going artistically deep, for turning every review into a swinging cocktail party with himself in the role of maniacally chatty, thoroughly indiscriminate host, is both virtuosic and exhausting. Take, for example, his Times review of Harry Potter and the Order of the Phoenix. First he ushers in the distinguished early arrivals like Tolkien, Joseph Campbell, and C. S. Lewis; then he proceeds down the hall—by way of Wonderland, Camelot, Brigadoon, Macondo, and Oz—to the library, where William Blake, George Orwell, Jacques Lacan, Lewis Carroll, Cyrus of Herodotus, Saint John of the Cross, Saint Teresa of Ávila, the Old Testament, the New Testament, the Hindu Krishna, the Epic of Gilgamesh, and “The Song of Roland” await. Along the way he stops to chat with some boisterous, colorfully dressed old friends: Mary Poppins, the Brothers Grimm, Scheherazade, Hercules, Godzilla, Sinbad the Sailor, the Flying Dutchman, Luke Skywalker, T. H. White, Snow White, Peter Pan, Caliban, Superman, and Doctor Who. And, wait, did we leave out Judith Krantz? Because you can be sure Leonard didn’t. How could we leave out Judith freaking Krantz?

While it’s easy, and probably advisable, to make fun of Leonard’s promiscuous way with the caps key, he’s hardly unique. The entire oeuvre of George Steiner, as John Simon once said, “reads like a university library card catalogue hit by a tornado.” This is more or less what critics do with books: they fling other books at the page like so much spaghetti and see what sticks. Never mind that half or more of such references, as any honest writer will concede, are unintentional; the reader gets the final say. “Intertextuality, like all aspects of literary reception, is ultimately located in reading practice,” writes the theorist Don Fowler. “Meaning is realized at the point of reception, and what counts as an intertext and what one does with it depends on the reader.” The very form of such criticism as Leonard’s reminds us that connectivity is the reader’s province, that no work is ever truly singular but demands to be read in the constellated lights of every other work, lights it both absorbs and then atomizes in myriad directions toward works already written and those far off in the dark matter, waiting to emerge.

Borges, in his marvelous essay “Kafka and His Precursors,” takes this idea, as he takes everything, one step further in the direction of the cosmic, unfurling a multidimensional map of literary influence where the streets are not one-way or two-way but rather comprise an infinite series of forking paths. “The fact is that each writer creates his precursors. His work modifies our conception of the past, as it will modify the future. In this correlation, the identity or plurality of men doesn’t matter.” Like Gaddis and Mitchell, Borges, the blind librarian, seeks to undermine, if not negate, our conventional notions of originality—even as he also seeks to rescue them through the even more original, more creative act of referential reading.

Let’s pause here to concede the obvious: novels have always depended upon reference, both intertextual, to the novels that precede them, and extra-, to the great, teeming world beyond their bindings. Think of Stendhal’s Paris, Dickens’s (or Zadie Smith’s) London, Dostoyevsky’s St. Petersburg, Joyce’s Dublin, Paley’s New York, Bellow’s Chicago… these are not strictly inventions but loose ones, their atmospheres dense with soot particles of the “real,” references to the preexisting, the unalterable and uninventable, toward which they are often deferential to the point of fetishism. Illusion and allusion go hand in hand. Even a quaint old convention like the name-dash (as in “He wrote to Madame G—” or “That day she made the long train journey to S—”) manages to conflate realism and artifice in such a way as to heighten and subvert any distinction, suggesting at once the inside authority of personal discretion (you and I both know who I’m really talking about here, so why spell it out?) and the impersonal mystery of art (why not spell it out?). “I hate things all fiction,” Byron wrote. “There should always be some foundation of fact for the most airy fabric—and pure invention is but the talent of a liar.” Generally the line between what goes on inside the margins and what goes on outside is porous at best; at worst it’s so thin as to barely register. To straddle this line is to generate either a peculiarly artful form of narrative tension (as in Georges Perec’s stunning Holocaust work, W, or the Memory of Childhood, with its alternating planes of poorly remembered facts and allegorical invention), or a kind of narcissistic blur. Or, in the case of someone like Marguerite Duras—whose narrators, even as they fumble in a memory-fog toward the window of public realities, keep bumping into the mirror of their own artifices—both. “Yes, it’s the big funereal car that’s in my books… It’s a Morris Léon-Bollée. The black Lancia at the French embassy in Calcutta hasn’t yet made its entrance on the literary scene.”

It seems unlikely that this was what Byron had in mind. Nor would he approve, in all likelihood, of fiction like Kafka’s or Beckett’s, which is almost mystically purged of fact, shorn of anything that resembles a foundation other than the grave. And yet it’s doubtful that writers on the other, more material end of the spectrum, who litter their work with the proper nouns of the day, would charm him much either. How much purity is too much, in short, and how much not enough? How much of our reading of Don Quixote, say, depends at least in part upon some acquaintance with the chivalric romances it’s lampooning? What about Madame Bovary? What about Virginia Woolf? If the moderns require us to revisit the very premoderns from which they’re radically departing, and high-modernist strategies like parody and myth (think of Ulysses, of Lolita) draw their fullest power and extension from systematic reference to bodies of extratextual material, and then postmodernism comes along and gleefully rides that horse into the sunset, or rather toward the sunset… for somehow we never quite make it to the terminal point, the point beyond which one cannot turn back, but invariably wind up stranded in that noisy, brightly lit cul-de-sac we call pastiche… how will we ever get free of the library, out into the muck and mess of the day?

Pastiche, of course, gets a bad rap these days—“parody without any of parody’s ulterior motives,” Fredric Jameson calls it, “amputated of the satiric impulse, devoid of laughter”—as Las Vegas gets a bad rap, and yet both have their appeal, as well as a certain cultural inevitability. “Ironically,” Kurt Andersen remarks,

new technology has reinforced the nostalgic cultural gaze: now that we have instant universal access to every old image and recorded sound, the future has arrived and it’s all about dreaming of the past. Our culture’s primary M.O. now consists of promiscuously and sometimes compulsively reviving and rejiggering old forms. It’s the rare “new” cultural artifact that doesn’t seem a lot like a cover version of something we’ve seen or heard before. Which means the very idea of datedness has lost the power it possessed during most of our lifetimes.

Still, even as we decry the practice of cannibalizing the museum, it’s hard to resist the opportunities it offers for serious play, for trying on masks and appropriating genres in a way that both transgresses and transforms them. “Writers signify upon each other’s texts,” says Henry Louis Gates, “by rewriting the received textual tradition.” We have become so accustomed to this signifying in the arts—in music, in painting and photography, in such cinematic pastiche-meisters as Todd Haynes, the Coen Brothers, and Quentin Tarantino—we may no longer even be conscious of it as such, which may be why recent novels have had to work extra hard to call attention to it. I’m thinking of novels like Percival Everett’s Erasure, which works off Ralph Ellison; Mat Johnson’s Pym, which riffs on Poe; Arthur Phillips’s The Tragedy of Arthur, with its gleefully extended nonparody of Shakespeare; Benjamin Markovits’s Byron-soaked Childish Loves; and many others. Sometimes it seems like everything is pastiche, that nothing in our culture is immune to infection by that corrosive germ. These are the moments that make us almost literally sick. When no equal-and-opposite force is deployed against it, referentiality, for all its aesthetic justifications, comes off as mere glorified cleverness, so much smart-alecky dick-swinging: a low-impact sport where the brainiacs and bookworms finally get picked first. Of course, sometimes it’s nice just to be on a team. The anxiety and loneliness of creation make us long for sociability, for commonality, for historical continuity, some wider safety net for the self as it plunges toward the void of the page.

Anyway, Benjamin knew what he was talking about: there’s no going back. Like it or not, the tide of information isn’t likely to recede. One must find a way not to drown but to swim. Our grand allusions should be not destination but departure point, not discovery but “recgntion.” What forms this will or should take is impossible to predict in anything but their multiplicity, their flexibility, their spirit of “impure invention.” Not the lie of pure originality, but not the echo chamber of mere secondhand reference, either. As Valéry puts it: “Nothing more original, nothing more unique than to feed off others. But they must be digested—the lion achieves his form by assimilating sheep.”

*

All of which takes us a long way from Kafka’s Red Indian, that pure, untrammeled being. Or does it? Maybe in the end Kafka is just too severe, too much the perfectionist, too willing to spurn the communal enterprise, with its contaminations and compromises, in art as in life. None of us, in the end, actually are Red Indians. We’re more like the mouse folk who listen to Josephine, or the crowd that files by the Hunger Artist’s cage—transfixed, for a while anyway, by admiration for that very purity we can neither achieve nor maintain—before we return to the stream of human traffic he calls the Verkehr. But then even Kafka’s story is not entirely reference-free. It, too, directs us outward, first to whatever a “Red Indian” might be, and then to whatever historical context might help us comprehend how and why a finicky, neurotic German-writing Jew in the middle of the Austro-Hungarian Empire might refer to such a figure. If ours is not an age of pure expression, neither was Kafka’s. No age is conducive to purity, which is probably why we’re always longing for it. (Didn’t minimalism flourish in the 1960s?) So even if it’s true that accelerated access to data of every kind will continue to tip the balance even further from invention to fact, or to the kind of heavily googled works of social realist fiction that depend upon technology to provide the weapons they will use (feebly, for the most part) to attack it; and even if it’s true that as a consequence we are now exploring the outer reaches or lower depths of referentiality, wondering how far the rubber band might stretch before it snaps; we are still impelled onward, I would argue, by the same old longings, the same old methods. Still using language like feelers, inching our way toward the edge beyond which language can’t go, the happiness (as they say) that writes white.
