About Writing, by Samuel R. Delany


An Introduction

Emblems of Talent

I

In July of 1967 I waited in a ground-floor room, yellow, with dark wainscoting and wide windows giving onto Pennsylvania greenery. First, four holding notebooks, followed by three in sneakers, two more with briefcases, another six in sandals and Bermudas, then another three laughing loudly at a joke whose punch line must have come just outside the double doors, followed by two more in the denim wraparound skirts that had first appeared that decade, then still another two with foolscap legal pads, who looked as nervous as I felt, most with long hair except an older man and a middle-aged woman, both gray (and one woman, also in Bermudas and sandals with black hair helmet-short), some twenty-five students wandered in to sit on the couches circling the blue carpet. Behind a coffee table, I coughed, sat forward, said hello and introduced myself—and began to teach my first creative writing workshop. Repeatedly, in the thirty-five years since, I’ve been surprised how far and fast that July has fallen away.

For more than a decade (1988–99), at the University of Massachusetts at Amherst, now in a pale orange space—the well of a hall called Hasbrouck—now on the stage of the Herter auditorium above the university museum, I taught an introductory lecture course in the reading of science fiction, with, each term, ninety to a hundred-fifty students. Sometime during my first three lectures, I would step from behind the podium, look out over the space that, in the early 1960s, some architect had thought “the future” ought to look like, and ask for a show of hands from students interested in writing the kinds of stories we were reading. Perhaps five to ten people scattered throughout those hundred-plus would fail to raise a hand.

The rest were eager to write.

As well, for over half those years (usually in that same hall, but once in a cement cellar room with nozzles for Bunsen burners on the worn demonstration desk), to the semicircles on semicircles of students with their notebooks ranged around me, I delivered another lecture course on the reading of general short fiction. At the start of the term I would ask the same question to a similar number. Here, perhaps fifteen or twenty out of my hundred, my hundred-fifty students would admit to not wanting to be writers.

Among a notable sector of the country’s college students oriented toward the humanities, the desire to write is probably larger than the desire to excel at sports.

Over the thirty-five years since I began to teach creative writing (with almost as many years writing about and teaching literature), I have asked hundreds of students why each wanted to write. By far the most common answer was, “I don’t want to do what my parents do. I don’t think they’re happy in their work.” Readily I identify with those feelings. Neither of my parents finished college—not uncommon during the Depression of the 1930s. For different reasons, both might have been happier if they had. Both were energetic and creative. Too much of that creativity was drained off in anxieties. Most worries are a matter of telling oneself more or less upsetting stories of greater or lesser complexity about one’s own life. Turning that ability outward to entertain others, rather than inward to distress oneself, has to have some therapeutic value.

Even in a good university creative writing program, however, the number of graduates who go on to publish fiction regularly in any venue that might qualify as professional is below ten percent—often below five. Were we talking about medical school, law school, engineering, or any other sort of professional training, that would be an appalling statistic. Art schools fare better in turning out professional artists of one sort or another than creative writing programs do in turning out professional writers. But even with such distressing results, writing programs are currently one of the great growth areas of the modern university.

Though vast numbers of people want to write fiction, the educational machinery set in place to teach people how doesn’t work very well.

While this book puts forth no strategies for correcting the situation, it discusses some reasons why this is the case—and why it might be the case necessarily. As well, it deals with three other topics and the relations between them. One—which it shares with most books on writing—is, yes, the art of writing fiction. The other two are far less often discussed in classes and rarely figure in such “how-to” books. First, how is the world structured—specifically the socio-aesthetic world—in which the writer works? Having said that, I should add that this is not a book about selling, marketing, or promoting your manuscripts. Rather, it is a book about the writer’s world and how that world differs from the world of other people—as well as how that world is organized today differently from the way it was organized twenty-five, thirty-five, and seventy-five years ago, when most of the tales about writing that we still read today, the mythology of Pound (1885–1972), Joyce (1882–1941), Gertrude Stein (1874–1946), F. Scott Fitzgerald (1896–1940), T. S. Eliot (1888–1965), Ernest Hemingway (1899–1961), and the other high modernists, first sedimented. Second (and finally), this book discusses the way literary reputations grow—and how, today, they don’t grow. (Clearly, they must grow in a very different way from the way they grew a hundred or two hundred years back, because the field in which they grow is so much larger and structured so differently.) In the letters and interviews here, I consider these last two questions—the writer’s world and the writer’s reputation—from the point of view of the writer who strives after high quality and who wants to be known for what she or he actually accomplishes. (I am not interested in reputations that develop when publishers decide they’ve found a “money-making” idea that they can flog into profit through advertising and publicity.)
Frankly, I know of no other book on writing that treats all three (the art of fiction, how that art fits into the world today, and the nature of the writer’s reputation), or shows the ways they interrelate. But I have tried to write one—because they do.

II

The essays here were written to stand alone. More or less, they introduce themselves. The letters and interviews following them—if only because it’s somewhat unusual to include such documents in such a book—may need some intellectual context.

The first letter makes a point only in passing that is nevertheless fundamental. So I stress it now.

Though they have things in common, good writing and talented writing are not the same.

The principles of good writing can be listed. Many people learn them:

(1) Use simple words with clear meanings whenever possible. (Despite the way it sounds, this is a call for clarity, not a bid for simplicity.)

(2) Use the precise word. Don’t say “gaze” when you mean “look.” Don’t say “ambled” or “sauntered” or “stalked” when you mean “walked.” (And don’t say “walked” when you mean one of the others.) As far as the creative writer goes, the concept of synonyms should be a fiction for high school and first- and second-year college students to encourage them to improve their vocabularies. The fact is (as writers from Georg Christoph Lichtenberg [1742–99] in the eighteenth century to Alfred Bester [1913–87] in the twentieth have written), “There are no synonyms.”

(3) Whenever reasonable, avoid the passive voice.

(4) Omit unnecessary modifiers. As a rule of thumb, nouns can stand up to one modifier each; thus, if you use two—or more!—have a good reason.

(5) For strong sentences, put your subject directly against the verb. Preferably, when possible, move adverbial baggage to the beginning of the sentence—or to the end, less preferably. Don’t let it fall between subject and verb. Except for very special cases (usually having to do with the intent to sound old-fashioned), do not write “He then sat,” “She suddenly stood,” or “He at once rose.” Write “Then he sat,” “Suddenly she stood,” or “He rose at once.”

(6) Omit unnecessary chunks of received language: “From our discussion so far it is clearly evident that …” If it’s that evident, you needn’t tell us. “Surely we can all understand that if …” If we can, ditto. “In the course of our considerations up till now clearly we can all see that …” If it follows that clearly and we can all see it, we’ll get the connection without your telling us we’ll get it. If the connection is obscure, explain it. “It goes without saying that …” If it does, don’t. “Almost without exception …” If the exceptions are important enough to mention, say what they are; if they’re not, skip them and omit the phrase mentioning them. Make your statements clearly and simply. If you need to include qualifications of any complexity, don’t put them in awkward clauses. Give them separate sentences.

(7) Avoid stock expressions such as “the rolling hills,” “a flash of lightning,” “the raging sea.” “Hills,” “lightning,” and “sea” are perfectly good words by themselves. Good writers don’t use such phrases. Talented writers find new ways to say them that have never been said before, ways that highlight aspects we have all seen but have rarely noted.

(8) Good writing rarely uses “be” or “being” as a separate verb. Don’t use “be” or “being” when you mean either “becoming” (not “It had started to be stormy,” but “A storm had started”) or “acting” (not “She was being very unpleasant,” but “She was unpleasant”), except in dialogue or in very colloquial English. By the same token, avoid “There are” and “There were” whenever possible. Except in colloquial situations, don’t write “There were five kids standing in line at the counter.” Write “At the counter five kids stood in line.”

(9) Don’t weigh down the end of clauses or sentences with terminal prepositional phrases reiterating information the beginning already implies.

Here’s an example of that last: “I turned from my keyboard to stack the papers on the desk.” Since the vast majority of keyboards sit on desks, you don’t need that terminal prepositional phrase “on the desk.” If you turned from the keyboard to stack some papers “on the floor” or even “on the kitchen table,” that “on the floor” or “on the kitchen table” would add meaningful information to the visualization. But, in the context of the last three hundred years of office work, “on the desk” is superfluous.

You can consider this next a tenth rule, or just a general principle for good style: use a variety of sentence forms. Try to avoid strings of three or more sentences with the same subject—especially “I.” While you want to avoid clutter, you also want to avoid thinness. Variety and specificity are the ways to achieve this. The rules for good writing are largely a set of things not to do. Basically good writing is a matter of avoiding unnecessary clutter. (Again, this is not the same as avoiding complexity.)

You can program many of these rules into a computer. Applied to pretty much any first draft, these rules will point to where you’re slipping. If you revise accordingly, clarity, readability, and liveliness will improve.
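As a small illustration of that claim, here is a minimal sketch, in Python, of how a few of the rules above might be mechanized. This is my example, not anything from the book: the phrase lists are just samples drawn from rules (6) and (7), and the pattern for rule (8) catches only sentence-opening “There are / There were.” A serious checker would need far larger lists and real grammatical analysis.

```python
import re

# Sample phrases only; a real checker would carry much longer lists.
RECEIVED_PHRASES = [          # rule 6: received language
    "it goes without saying",
    "almost without exception",
    "it is clearly evident",
]
STOCK_PHRASES = [             # rule 7: stock expressions
    "the rolling hills",
    "a flash of lightning",
    "the raging sea",
]

def check_draft(text):
    """Scan a draft line by line; return (line_no, rule, phrase) flags."""
    flags = []
    for n, line in enumerate(text.splitlines(), start=1):
        lower = line.lower()
        for phrase in RECEIVED_PHRASES:
            if phrase in lower:
                flags.append((n, "rule 6: received language", phrase))
        for phrase in STOCK_PHRASES:
            if phrase in lower:
                flags.append((n, "rule 7: stock expression", phrase))
        # rule 8: clauses opening with "There are" / "There were"
        if re.match(r"\s*there (are|were)\b", lower):
            flags.append((n, "rule 8: 'there are/were' opener", line.strip()))
    return flags

draft = """There were five kids standing in line at the counter.
It goes without saying that the rolling hills were lovely."""
for flag in check_draft(draft):
    print(flag)
```

Run on the two-line draft above, the sketch flags the rule 8 opener on the first line and both the received phrase and the stock expression on the second. Such a program can only point; deciding what to do at each flag remains the writer's work.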

Here again we come up with an unhappy truth about those various creative writing and MFA programs. If you start with a confused, unclear, and badly written story, and apply the rules of good writing to it, you can probably turn it into a simple, logical, clearly written story. It will still not be a good one. The major fault of eighty-five to ninety-five percent of all fiction is that it is banal and dull.

Now old stories can always be told with new language. You can even add new characters to them; you can use them to dramatize new ideas. But eventually even the new language, characters, and ideas lose their ability to invigorate.

Either in content or in style, in subject matter or in rhetorical approach, fiction that is too much like other fiction is bad by definition. However paradoxical it sounds, good writing as a set of strictures (that is, when the writing is good and nothing more) produces most bad fiction. On one level or another, the realization of this is finally what turns most writers away from writing.

Talented writing is, however, something else. You need talent to write fiction.

Good writing is clear. Talented writing is energetic.

Good writing avoids errors. Talented writing makes things happen in the reader’s mind—vividly, forcefully—that good writing, which stops with clarity and logic, doesn’t.

Talent appears in many forms. Some forms are diametric to each other, even mutually exclusive. (In The Dyer’s Hand, W. H. Auden [1907–73] says most successful writers overestimate their intelligence and underestimate their talent. Often they have to do this to preserve sanity; still they do it.) The talented writer often uses specifics and avoids generalities—generalities that his or her specifics suggest. Because they are suggested, rather than stated, they may register with the reader far more forcefully than if they were articulated. Using specifics to imply generalities—whether they are general emotions we all know or ideas we have all vaguely sensed—is dramatic writing. A trickier proposition that takes just as much talent requires the writer carefully to arrange generalities for a page or five pages, followed by a specific that makes the generalities open up and take on new resonance. Henry James (1843–1916) calls the use of such specifics “the revelatory gesture,” but it is just as great a part of Marcel Proust’s (1871–1922) art. Indeed, it might be called the opposite of “dramatic” writing, but it can be just as strong—if not, sometimes, stronger.

Here are other emblems that can designate talent:

The talented writer often uses rhetorically interesting, musical, or lyrical phrases that are briefer than the pedestrian way of saying “the same thing.”

The talented writer can explode, as with a verbal microscope, some fleeting sensation or action, tease out insights, and describe subsensations that we all recognize, even if we have rarely considered them before; that is, he or she describes them at greater length and tells more about them than other writers.

In complex sentences with multiple clauses that relate in complex ways, the talented writer will organize those clauses in the chronological order in which the referents occur, despite the logical relation grammar imposes.

Here is a badly organized narrative sentence of the sort I’ve read in dozens of student manuscripts handed in by writers who want to write, say, traditional commercial fantasy:

(A) Jenny took a cold drink from the steel dipper chained to the stone wall of the corner well, where, amidst the market’s morning bustle, the women had finished setting up their counters and laying out their tools, implements, and produce minutes after the sun had risen; she had left the sandal stall to amble over here.

The good writer would immediately want to break the above up into smaller sentences and clarify some antecedents:

(B) Jenny took a cold drink from the steel dipper chained to the stone wall. With the market’s morning bustle, the women had finished setting up their counters and laying out their tools, implements, and produce. Only minutes after the sun had risen, Jenny had left the sandal stall and ambled over to the corner well.

Certainly that’s an improvement; and it hides some of the illogic in the narrative itself. But a writer who has a better sense of narrative would start by rearranging the whole passage chronologically:

(C) Minutes after the sun had risen above the wall, amidst the market’s morning bustle, the women finished setting up their counters and laying out their tools, implements, and produce. Jenny left the sandal stall to amble over to the corner well, where, from the steel dipper chained to the stone wall, she took a cold drink.

At this point, the two sentences still need to be broken up. But at least the various clauses now come in something like chronological order. This allows us to see that each fragment can have far more heft and vividness:

(D) Minutes after the sun cleared the market wall, footprints roughened the dust. Tent posts swung up; canvas slid down. Along the counters women laid out trowels and tomato rakes, pumpkins and pecan pickers. Jenny ambled from under the sandal stall awning. At the corner well she picked up a steel dipper chained to the mossy stones for a cold drink. As it chilled her teeth and throat, water dripped on her toes.

Talented writing tends to contain more information, sentence for sentence, clause for clause, than merely good writing. Example D exhibits a variety of sentence lengths. Yes, the images arrive in chronological order. But more than that, the passage paints its picture through specifics. It also employs rhetorical parallels and differences. (“Tent posts swung up; canvas slid down.”) It pays attention to the sounds and rhythms of its sentences (“trowels and tomato rakes, pumpkins and pecan pickers”). It uses detailed sensory observation (the drink chills “her teeth and throat”). Much of the information it proffers is implied. (In D that includes both the bustle and the fact that we are in a market!) These are among the things that indicate talent.

I do not hold up D as a particularly good (or particularly talented!) piece of writing, but it shows a rhetorical awareness, a balance, a velocity, a particularity, and a liveliness that puts it way ahead of the others. Above and beyond the fact that they are logically or illogically organized, versions A through C are, by comparison, bland, formulaic, and dull. What distinguishes the writers of A, B, and C is, in fact, how good each is. But D alone shows a scrap of talent—and only a scrap.

Good writing avoids stock phrases and received language. Talented writing actively laughs at such phrases, such language. When talented writing and good writing support one another, we have the verbal glories of the ages—the work of Shakespeare, Thomas Browne, Joyce, and Nabokov.

Talented writing and good writing sometimes fight. The revisions necessary to organize the writing and unclutter it can pare away the passages or phrases that give the writing its life. As often, what the writer believes is new and vivid is just cliché confusion. From within the precincts of good writing, it’s easy to mistake talent’s complexity for clutter. From within the precincts of talent, it’s easy to mistake the clarity of good writing for simplicity—even simple-mindedness. Critics or editors can point the problems out. The way to solve them, however, is a matter of taste. And that lies in the precincts of talent.

III

The early German Romantics—Schiller (1759–1805), the Schlegel brothers, Wilhelm (1767–1845) and Friedrich (1772–1829), and Ludwig Tieck (1773–1853), that is, the smart Romantics—believed something they called Begeisterung was the most important element among the processes that constituted the creative personality.

I think they were right.

Begeisterung is usually translated as “inspiration.” Geist is the German word for “spirit,” and “Be-geist-erung” means literally “be-spirited-ness,” which is certainly close to “inspiration.” As the word is traditionally used in ordinary German, though, it is even closer to “enthusiasm”—“spirited” in the sense of a “spirited” horse or a “spirited” prizefighter. For the Romantics, Begeisterung was not just the initial idea or the talent one had to realize it. Begeisterung was both intellectual and bodily. A form of spirit, it was also a mode of will. To the Romantics, this enthusiasm/Begeisterung carried the artist through the work’s creation. If there were things you didn’t know that you needed in order to write your story, your novel, your play, with enough Begeisterung you could always go out and learn them. If your imagination wasn’t throwing out the brilliant scenes and moments to make the material dramatic, with Begeisterung you could arrive at such effective material through dogged intelligence, though it might take longer and require more energy. If you lacked the verbal talent that produced vivid descriptive writing, well, there were hard analytic styles that were also impressive, which you could craft through intellectual effort—though you would have to attack the work sentence by sentence. But however you employed it, Begeisterung is what carried you through the job. Begeisterung could make up for failures on other creative fronts.

Begeisterung is what artists share over their otherwise endless differences: enthusiasm for a task clearly perceived.

Over the range of our society the artist’s position is rarely a prosperous one—certainly not in the beginning stages and often never. The increased size of the new, democratic field that today produces both readers and writers, the increase in competition for fame and attention—not to mention the increased effort necessary to make a reasonable living from one’s work—all transform a situation that was always risky into one that today often looks lunatic. Begeisterung/enthusiasm is about the only thing that can get the artist through such a situation.

The decision to be a writer is the decision to enter a field where most of the news—most of the time—is bad. The best way to negotiate this situation is to have (first) a realistic view of what that situation is and (second) considerable Begeisterung. As Freud knew, Begeisterung is fundamentally neurotic. The critic Harold Bloom has suggested that what makes artists create is rank terror before the failure to create, a failure that somehow equates with death. When the artist discovers creation can, indeed, allay that fear, it produces the situation and form of desire that manifests itself as Begeisterung. When, from time to time throughout the artist’s life, Begeisterung fails, often terror lies beneath.

Having mentioned the basic importance of Begeisterung, I’ll go on to outline another use.

Let me describe two students in a midwestern graduate creative writing workshop I taught once. One was a young man of twenty-six from a solidly middle-class background, who had entered the university writing program with extremely good marks and high scores on his GREs (Graduate Record Exams). From the general discussion of the student stories we analyzed in the workshop, clearly he was an intelligent and sensitive critic. Certainly he was among the smartest and the most articulate of the students in the group. He was not particularly interested in publishing, however, and in a discussion during which I asked students what they wanted to do with their writing and where they saw themselves going, he explained that he wanted to improve his writing and eventually publish a collection of stories in a university series that was committed to doing graduate student and junior faculty work. He had no particular series in mind but was sure one such existed, which would accept his work, preferably without reading it, purely because some other writer—perhaps a workshop teacher—had judged him personally “ready for publication.” When I told him I knew of no such series, nor had I any personal criteria for “publishability” other than finding a given story a rewarding and pleasurable read, he was not at all bothered. If such a series did not exist now, he was sure that in four or five years it would—because that was the right and proper way the world should work. Through continuing in workshops, he would eventually get his chance. If he didn’t, finally it didn’t matter. He felt no desire to have his work appear from a large commercial press, however, or from a small press only interested in supporting work it judged of the highest quality. As soon as any sort of competitive situation arose, he felt there must go along with it some bias based on nonesthetic aspects—actually an interesting theory, I thought. 
Though he sincerely wanted to improve his work for its own sake, he felt, when and if his work was published, it should be published because there was a place that published work such as his and it would simply be, so to speak, his turn. Competition, he believed and believed deeply, was not what art was about. He articulated this position well. The other students in the class were all impressed with his commitment to it—as, in fact, was I.

Myself, I had seen no evidence of what I could recognize as talent in his writing, however, and his stories struck me as a series of banal romances in which the hero either discovered his girlfriend was cheating on him and left her sadly, or another girl began an affair with a hero recently cheated on and stuck to him despite his gloom. They were well written, in precisely the sense I describe above, but they were without color or life—and always in the present tense. That made them, he explained, sound more literary. And that’s the effect he wanted.

If I had seen what I could recognize as talent, I might have been even more interested in his aesthetic position. But personally I could not distinguish his stories from many, many others I had read in many other workshops. Nor did the fact that they seemed so similar to so many others bother him at all, since—as he claimed—competition was not the point.

Once he used the term “classical” about what he wanted to achieve in his own stories. “But,” I said, “your model doesn’t seem to be the great classical stories of the past, but rather the averaged banality of the present.”

“Well,” he said, “perhaps that is today’s classical.”

I couldn’t take the argument much beyond that point. As a teacher, for me to say too much more would have been unnecessarily insulting—and I felt I had already come close to crossing a line I didn’t feel was good for purely pedagogical reasons. I decided to let him have the last word.

In the same class was a young woman of twenty-nine, from a working-class background. She was no slouch either as a practical critic, but she had nowhere near the self-confidence of the first student. Her GREs were eccentric: high in math, low in English. Her written grammar was occasionally faulty. Often she seemed at sea when critical discussions moved into the abstract. Several times, in several descriptions in her stories, however, she had struck me as talented—that is, she had made me see things and understand things that I had not seen or understood before. Interesting incidents were juxtaposed in interesting ways in her stories. Her characters often showed unusual and idiosyncratic combinations of traits. Words were put together in interesting ways in her sentences. But it was also clear that her stories were pretty much an attempt to write the same sort as most of the other students in the class, which tended to be modeled on those of the first young man—in her case with the sexes more or less reversed. When I asked her what she wanted to do with her writing, she said she’d like to go on and “be a writer” and “publish books,” but she offered it with all the hesitation of someone confessing to a history of prostitution.

At one point, after we had just read the story for class, I mentioned that Joyce had written “The Dead” in 1907, when he was twenty-five, though it was not published till 1914. I told them I would like to see their stories aspire to a similar level of structural richness and a similar richness of description of the various interiors, exteriors, and characters.

Immediately the young man objected: “You can’t tell us that! That just paralyzes us and makes us incapable of writing anything.”

But three weeks later, the young woman handed in a story she had gone home and begun that same night. It was far more ambitious than anything she’d done previously: incidents in the story had thematic and structural resonances with one another, and the physical description of the places and characters was twenty-five to thirty-five percent richer than anything she’d previously handed in. When I mentioned this to her after the workshop, she said, “I guess I went back to my math—that’s what my degree is in. I made a little geometric picture of how I wanted the parts of the story to relate to each other. Has anybody ever done that before?” I said I often did it myself, but that it seemed too idiosyncratic to talk about in a general workshop. (Her comment is one of the things, however, that convinced me to write about it in the essay “Of Doubts and Dreams.”) As well, it was the first piece I’d seen from her that was not basically a disappointed romance about a graduate student with no mention of how she supported herself. (The woman was happily married to a very successful and supportive engineer.) Instead, she had taken another hint from Joyce and mined her own childhood material for her tale, in her case the Pittsburgh foundry where her Hungarian father had worked and the men and women who’d worked with him (I’d never known women worked at foundries before), whom she used to know when she was a child. She’d based her main character on a young woman, a few years older than herself, who’d had a job there, something that seemed to my student exciting and romantic. When she’d been twelve she’d desperately envied this young working woman of seventeen. A few years later, however, when she herself had reached twenty, she now realized this wonderfully alive young woman was actually trapped in a dead-end job by family and social forces, which led nowhere. 
Speaking to me privately after class, she said, “When you pointed out how old Joyce was when he’d done it, I realized there was no reason I couldn’t do it, too.” Though the fact was, her story shared nothing with Joyce’s save the jump in descriptive and structural richness.

I have said most of the news about writing is bad. But much of the news—such as the age writers were when they wrote this or that work—is neutral. However neurotic its basis, Begeisterung or its lack is what turns these neutral facts into good news (Hey, I can do that!) or bad news (Nobody can do that!). Perhaps you can see from these last examples why the usual translation of Begeisterung—“inspiration,” without the added energy of enthusiasm—doesn’t quite cover the topic.

The idea that everyone can have a turn at publication is unrealistic—nor, outside a carefully delineated student context, do I think it’s desirable. When our current-day democratizing urge works to render the competition fairer, I’m for it. But that is not the same thing as art without competition. Today many young writers see self-publication as a way to sidestep what they also see as the first round of unfair competition. Marcel Proust, Gertrude Stein, Stephen Crane (1871–1900), and Raymond Roussel (1877–1933) all self-published notable works—just as Edgar Rice Burroughs (1875–1950) self-published some commercially successful ones (the Tarzan series, for example). But that is only to say that, for them, the competition began after publication, not before.

The other important fact—important enough that I would call it the second pole of my personal aesthetic, as Begeisterung is the first—is that literary competition is not a zero-sum game with a single winner, or even a ranked list of winners—that all-too-naive image of the canon in which, say, Shakespeare has first place and the gold cup, followed by Chaucer (c. 1343–1400) with the silver, in second place, Milton (1608–74) with the bronze, in third, with Spenser (c. 1552–99) and Joyce competing for who gets fourth and who gets fifth … The concept of literary quality is an outgrowth of a conflictual process, not a consensual one. In the enlarged democratic field, the nature of the conflict simply becomes more complex. Even among the most serious pursuers of the aesthetic, there is more than one goal; there is more than one winner. Multiple qualities and multiple achievements are valued—and have been valued throughout the history of the conflicting practices of writing making up the larger field called the literary. That multiplicity of achievement can value Vladimir Nabokov (1899–1977), and Samuel Beckett (1906–89), G. K. Chesterton (1874–1936), and Virginia Woolf (1882–1941), Seamus Heaney (b. 1939), and Meridel Le Sueur (1900–1996), George Orwell (1903–50), and Joanna Russ (b. 1937), Nathanael West (1903–40), Robert Louis Stevenson (1850–94), and Nella Larsen (1891–1964), Edmund White (b. 1940), and Grace Paley (b. 1922), Junot Díaz (b. 1968), Vincent Czyz (b. 1963), John Berger (b. 1926), and Willa Cather (1876–1947), Susan Sontag (1933–2004), J. M. Coetzee (b. 1940), Dennis Cooper (b. 1953), Amy Hempel (b. 1951), Michael Chabon (b. 1963), Anna Kavan (1901–68), Sarah Schulman (b. 1958), and Kit Reed (b. 1932), Josephine Saxton (b. 1935), Erin McGraw (b. 1957), Harlan Ellison (b. 1934), Luisa Valenzuela (b. 1938), Mary Gentle (b. 1956), Shirley Jackson (1916–65), JT LeRoy (b. 1980), Thomas Pynchon (b. 1937), Linda Shore (b. 1937), Amy Bloom (b. 1953), Ursula K.
Le Guin (b. 1929), Vonda N. McIntyre (b. 1948), Carol Emshwiller (b. 1921), Leonora Carrington (b. 1917), Lynn Tillman (b. 1947), L. Timmel Duchamp (b. 1950), Richard Yates (1926–92), Andrea Barrett (b. 1954), David Foster Wallace (b. 1962), Heidi Julavitz (b. 1968), Ben Marcus (b. 1967), Michael Martone (b. 1955), Hilary Bailey (b. 1936), Christine Brooke-Rose (b. 1923), Octavia E. Butler (b. 1947), Adam Haslett (b. 1970), Anita Desai (b. 1937), Zora Neale Hurston (1891–1960), and Raymond Carver (1938–88), Malcolm Lowry (1926–66), and Wyndham Lewis (1882–1957), Tillie Olsen (b. 1912), and Raymond Chandler (1888–1959), Robert Glück (b. 1945?), and André Gide (1869–1951), Chris Offut (b. 1958), and Denis Johnson (b. 1959), James Joyce, Henry David Thoreau (1817–62), Lewis Carroll (1832–98), and Chester Himes (1909–84), Ralph Waldo Emerson (1803–82), and Wilson Harris (b. 1921), and Jean Rhys (c. 1890–1970), and John Crowley (b. 1943), Rikki Ducornet (b. 1943), Richard Wright (1908–60), and Djuna Barnes (1892–1982), Walter Pater (1839–94), Olive Shreiner (1855–1920), Thomas M. Disch (b. 1939), and Paul Goodman (1911–72), Oscar Wilde (1854–1900), Honoré de Balzac (1799–1850), Francis Bacon (1561–1626), and Rebecca Brown (b. 1955), Charles Baudelaire (1821–67), and Georg Büchner (1813–37), Michel de Montaigne (1533–92), and Naguib Mahfouz (b. 1911), Mary Caponegro (b. 1975), Marianne Moore (1887–1972), and Henri de Mantherlant (1895–1972), Melvin Dixon (1950–92), Daryl Pinckney (b. 1948), Roger Zelazny (1937–95), Randall Kenan (b. 1963), and Don Belton (b. 1956), Guy Davenport (1927–2005), and D. H. Lawrence (1888–1930), Hart Crane (1899–1931), and Jean Toomer (1897–1968), Ethan Canin (b. 1960), William Gass (b. 1924), Bruce Benderson (b. 1955), Ursule Molinaro (1923–2000), Paul West (b. 1930), Alan Singer (b. 1948), James Alan McPherson (b. 1943), Sandra Cisneros (b. 1954), Breece D’J Pancake (1879–1952), Michael Moorcock (b. 1939), and R. M. Berry (b. 
1947), Edward Gibbon (1737–94), Richard Powers (b. 1957), John Galsworthy (1867–1933), and James Gould Cozzens (1903–78), Steve Erickson (b. 1950), Brian Evenson (b. 1966), Knut Hamsun (1859–1952), and Victor Hugo (1802–85), and any of the three hundred or five hundred or fifteen hundred others any literate reader would have to add to such a list. The greater their literacy, the more names they will add—and the more they will disagree over. Indeed, such a list only becomes useful as we read its biases and blindnesses, its gaps, its errors, its incompletenesses. The diversity and difference among such lists make the literary field rich and meaningful—not some hierarchical order that might initially generate one such list or another. Difference and diversity as much as education and idiosyncrasy will always defeat and shatter such a hierarchy after more than six or seven names are forced into it.

And that’s a good thing, too.

When you have read widely among these indubitably good writers, you must make an average image for yourself of their inarguably talented work—and realize that that is what your own work must be better than. And you must realize as well, one way or another, that this is what all of them, living and dead, have been doing.

IV

Begeisterung was formulated and written about by a group of Germans some two hundred years ago. But the nature of the literary world has changed mightily since.

I’ve already used the phrase “the enlarged democratic field.” But what exactly are we talking about? Today the functionally literate population is more than fifty times the size it was 190 years ago in 1814, which is to say just at the time when ideas from Germany such as Begeisterung were first making their way through England and France both. It was after the Napoleonic wars but before the mid-nineteenth-century republican revolutions in Europe and the Civil War in the United States.

The Revolution of 1848 in France and the other uprisings within a few years of it throughout the continent (and, a dozen years later, the American Civil War) were armed battles between wealth concentrated in the old-style widespread agricultural capitalist system and wealth concentrated in the new-style widespread industrial capitalist system. Both in Europe and the United States, these conflicts were brought on by rising populations and changing technologies. First in Europe, then in the United States, new-style industrialism won.

Today a far higher percentage of the world’s population lives in cities than ever before. Public education has made advances that would have been inconceivable a century ago, much less two centuries. In England toward the end of the first two decades of the nineteenth century, the major poets of that time numbered six: Wordsworth (1770–1850), Coleridge (1772–1834), Blake (1757–1827), Byron (1788–1824), Keats (1795–1821), and Shelley (1792–1822), all of whom were writing in the year 1814.

By general consensus some fourteen poets of considerable, if minor, interest were also writing then: Robert Southey (1774–1843), poet laureate in his day but known now only as the poet Lewis Carroll parodied in some of his Alice in Wonderland poems; Thomas Moore (1779–1852), in his day as famous as his close friend Lord Byron—it was Moore who allowed Byron’s journals to be burned, and he is himself now unread, although his Irish Melodies receives a passing mention in Joyce’s “The Dead”; Sir Walter Scott (1771–1832), during his lifetime far better known as a poet than as a novelist—his novels all appeared anonymously, and he did not acknowledge their authorship till 1827; and Leigh Hunt (1784–1859), mostly of interest because he figures so importantly in Keats’s biography, though during their lifetimes he was far better known than Keats. As well as his famous statement of radical reformist religious thought, “Abou Ben Adhem,” Hunt wrote a charming poem to Thomas Carlyle’s wife, Jane, that sticks in the mind, “Jenny Kissed Me”:

Jenny kiss’d me when we met,
Jumping from the chair she sat in;
Time, you thief, who love to get
Sweets into your list, put that in!
Say I’m weary, say I’m sad,
Say that health and wealth have miss’d me,
Say I’m growing old, but add,
Jenny kiss’d me.

Poets practically unknown in their time whom scholars have since rediscovered and found interesting include George Darley (1795–1846), Winthrop Mackworth Praed (1802–39), and John Clare (1793–1864). I’ve now named seven minor romantics. Someone might add another seven, to make, perhaps, fourteen. But today all fourteen concern a small group of professors and graduate students, enough to fuel the odd doctoral thesis and interest readers particularly focused on the period. At the time, however, when those six major and fourteen minor poets were writing, there were considerably fewer than 2.5 million people in the British Isles who could read and write well enough to be poets of such ranking.

Today the current literate field (in American English, say)—at least twenty-five times the size of the field of 1814—might be expected to hold twenty-five times six major poets producing poems of an interest comparable to those of Blake, Coleridge, Wordsworth, Byron, Keats, and Shelley (that is to say, 150 major poets), and twenty-five times fourteen minor poets (or 350) of considerable interest.

That’s about what the statistics are.

The fundamental difference between the world of 1814 and the world of the present day is that six major and fourteen minor poets is a knowable field. Arthur Symons’s The Romantic Movement in English Poetry (New York: Dutton, 1909; Symons: 1865–1945) gives essays on 87 poets born after 1722 and dead by 1868—his own cut-off point for the romantics—arbitrary as all such dates must be, but still eminently sensible. In a final chapter, “Minors,” he mentions another 52 poets who fall within the same period. Though it may take a decade or more of reading, a single reader can be familiar with the totality of that field. No single reader can be thoroughly familiar with the works of 150 major poets and 350 minor poets. Symons’s 340-page book could not cover the major English-language poets alive today—much less give a comprehensive survey of both the major and the minor poets whose births and deaths were contained within the last 140 years. Thus, the doling out by the literate readership of fame, merit, or even simple attention is an entirely different process from what it once was.

When Lord Byron’s poem The Corsair was published in 1814 (that year when all the romantic poets we’ve mentioned were writing), queues began to form outside London bookstores four and five hours before they opened, and, by the time the doors unlocked, those queues stretched around the block. The Corsair sold ten thousand copies on the first day of publication, and three hundred thousand in the next year. (That is to say, by the year’s end, a copy was owned by just over a quarter of the people in the British Isles who could have actually read it.) And when the poet, novelist, and playwright Victor Hugo (b. 1802) died in Paris in 1885, his funeral was a four-day state affair, notably longer and finally grander than, say, the funeral of President Kennedy (b. 1917) on his assassination in 1963. Two years before, in 1883, when opera composer Richard Wagner (b. 1813) died in Venice, his funeral was not much smaller.

Today the deaths of artists simply do not constitute such national events. A far greater percentage of the society has seen the works of Steven Spielberg or George Lucas than ever saw Wagner’s operas—or saw Hugo’s plays or read his poems and novels. But though Spielberg’s and Lucas’s works cost more to make and make more when they appear, when at last these film directors go, neither is likely to have the same sort of final send-off as Wagner or Hugo—which is another way of saying that today even the most popular arts fit into the society very differently from the way they once did, a century or two centuries back.

From time to time all the major Romantic poets—and probably most of the minor ones as well—gave readings of an afternoon or evening in their homes or at the homes of their friends. Throughout the nineteenth century, people writing their recollections of the French poets Rimbaud (1854–91) and Baudelaire described such occasions. The practice continued up through World War I and over the period between the world wars. William Murrell Fisher (1889–1969) writes about one such poetry reading “at home” in New York City, which the young Austrian-born American poet Samuel Greenberg (1893–1917) attended. From the sixties, I recall a woman who lived on Greenwich Village’s Patchin Place telling about such a gathering when she was a student at Smith College in the late thirties, at which Edna St. Vincent Millay (1892–1950) read from her recent work.

I was born in the opening year of America’s involvement with World War II. Only twice in my life have I been to such an “at home” reading. Once was with an elderly German woman in Vermont, when I was eighteen (I am now sixty-three). The second time was about a half-dozen years ago, when a bunch of graduate students at the University of Michigan dressed up in eighteenth-century costumes and gave a “tea,” at which a few people read their poems. That is to say, it was in imitation of a discontinued practice.

Consider, though: while all of them gave readings at people’s houses, neither Byron, Shelley, nor Keats ever gave a public reading of his poems during his brief life. (Prose writers such as Dickens and Wilde went on lecture tours, which even brought them to America, where they often did readings. But they did not come as poets.) The public poetry reading started at the 92nd Street Y in New York City in the years between the two world wars; largely a development of the last sixty years, it was almost certainly encouraged by the large number of soldiers returning to universities after World War II and was given a large boost by the popularity of the poet Dylan Thomas’s public readings. The idea spread to San Francisco art galleries and Greenwich Village coffee shops through the forties and fifties. Today readings are a staple of college campuses and of bookstores with any literary leanings whatsoever, so that even the likes of Barnes & Noble sponsor them. “Open mike” readings and poetry slams are a regular part of contemporary urban culture. Only a few nights ago, with my life-partner, Dennis, I went to see Def Poetry Jam, a staged reading on Broadway, at the Longacre Theater, by nine urban poets (one of whom I’d taught with, out at Naropa University in Boulder, Colorado, the previous summer); the event grew out of the Home Box Office television series Def Poetry Jam. Writers such as Maxwell Anderson (1888–1959), Robinson Jeffers (1887–1962), Ira Gershwin (1896–1983) and Dorothy and DuBose Heyward, Gertrude Stein, Langston Hughes (1902–67), and Dylan Thomas (1914–53) have all had verse plays (or operas) on Broadway. But despite Fiona Shaw’s one-woman presentation of T. S. Eliot’s The Waste Land in 1996, this is the first time, I suspect, that contemporary poetry, read by the living poets themselves, has hit the Broadway stage. Def Poetry Jam received a wildly enthusiastic standing ovation. Once again, art and the artist—specifically the literary arts and the literary artist—fit into the society in a very different way today from the way they did in previous epochs.

When the change is this great, a phrase such as “the position of the artist has changed” no longer covers the case. Rather, such positions (where, as W. H. Auden once put it, “the artist is considered the most important of the state’s civil servants”) are no longer there for artists to fill—while other positions, however less exalted, are. This is tantamount to acknowledging that art—specifically poetry and prose fiction—has become a different sort of social object from what it once was. English-language literature itself became a different social object when, shortly after World War I, it first became a topic of university study: it became the object we know today and more and more ceased to be the study of the philology of the language from Old English through Chaucer through Elizabethan English to the present, which is what “English Literature” had mostly meant when it was taught at London and Edinburgh Universities in the 1880s and 1890s.

The things we look for “literature” to do in our lives, how we expect it to do them, and the structures of the social net in which it functions have changed. This is not even to broach the displacements, transformations, and borrowings effected by movies, television, or, most recently, the internet.

While these changes are very real, sometimes we can make too much of them. Art is a tradition-bound, tradition-stabilized enterprise. Often those folks newly alerted to the changes want to see a total erasure of the slate, allowing us to do anything and everything in completely new ways. But it is the traditions—especially (and paradoxically) the traditions of experimentation, originality, and newness—that make it so difficult for so many to see the changes in the actual object itself. Those same traditions cause those caught up in the rush for originality to end up repeating, often to the letter, the experiments of the past and—save those among them familiar with more of the workings of art’s history—producing works that are just not very original. Often they find the common audience bored or uninterested by their efforts—and the more sophisticated, unimpressed. No more than any other enquiry into aesthetics can this book solve such problems. But it does not ignore them either.

In his manifesto “No More Masterpieces” that forms the centerpiece for his influential collection of essays on art, The Theater and Its Double (1938), the French actor, writer, and director Antonin Artaud (1896–1948) wrote:

One of the reasons for the asphyxiating atmosphere in which we all live without possible escape or remedy—and in which we all share—is our respect for what has been written, formulated, or painted, what has been given form, as if all expression were not at last exhausted, were not at the point where things must break apart if they are to start anew and begin afresh.

We must have done with this idea of masterpieces reserved for a self-styled elite and not understood by a general public …

Masterpieces are good for the past. They are not good for us. We have the right to say what has been said and even what has not been said in a way that belongs to us, a way that is immediate and direct, corresponding to present modes of feeling, and understandable to everyone.

It is idiotic to reproach the masses for having no sense of the sublime, when the sublime is confused with one or another of its formal manifestations, which are moreover always defunct manifestations.

How could one not agree? Or not applaud? Or not run off to spread the news? But as the little history I have already given suggests, in the seventy years since Artaud wrote his manifesto, the “masterpiece” as it was conceived of in the nineteenth century, the conception Artaud is polemicizing against, has by and large ceased to be part of the active aesthetic landscape. (We read Ulysses for pleasure and are even awed by it. But nobody would try to write out another one, full-scale, any more than someone would try to write another Hamlet in verse.) As well, no one has been able to get around the fact that the “masses” really require education.

Since Artaud wrote his manifesto, it has become widely evident that when the “masses” are left to themselves, the artists who are trying to “say what has been said and even what has not been said in a way that belongs to us, a way that is immediate and direct,” are precisely the artists who most bore and bewilder the masses, who flock instead to the old, tried, tired, and true—not in terms of the classic sublime, but in terms of the formulaic, the violent, and the kitschy. Despite the fears of the moralists, the masses don’t seem to retain any lasting interest even in works of pornography.

V

My education as a writer has been a diverse one. In just over forty years of publishing, I’ve read a handful of books about writing that I felt have saved me some time. (I’ve read many others that gave me little or nothing.) Among the ones I found useful were:

ABC of Reading (1934), by Ezra Pound: Pound’s cranky, cantankerous, wildly opinionated, and wholly individual notions of literature can come as a vivifying breath to those who have endured the teaching of literature as an authoritarian enterprise done this way and not that way. He is almost always right and almost always interesting. Those whose literary educations took a more laid-back form sometimes have difficulty, however, conceiving of whom he could be polemicizing against. Indeed, Pound is a good example of what rebels sound like seventy or eighty years after they have been almost entirely successful.

The Autobiography of Alice B. Toklas (1933) and Lectures in America (1935) are both by Gertrude Stein. I devoured the first in a Vintage paperback shortly after I turned seventeen. In that book, Gertrude Stein tells the nineteen-year-old composer and writer Paul Bowles, “If you don’t work hard when you’re twenty, Paul, no one will love you when you’re thirty.” It’s the first piece of literary advice I ever remember conscientiously deciding to take. The book also brought home to me a lesson without which it is almost impossible to become a professional writer: from it I first learned that the writers who wrote books, the writers who created published works, brilliant works, exciting works, were people. They had bodies. They lived in actual houses. They ate meals. They liked certain of their acquaintances and disliked others. They had personalities. They were neither gods nor primal forces—voices alone, moving, bodiless, through space and time. Their day-to-day humanity ceded them the material for their art. This was as true for Shakespeare, Chaucer, and Milton as it is for Jay Wright (b. 1935), Richard Powers (b. 1957), Angela Carter (1940–92), Michael Cunningham (b. 1952), Alice Munro (b. 1931), and Ernest J. Gaines (b. 1933). Had I not suffered this revelation at seventeen, I wouldn’t have published my first novel three years later at twenty.

Having read and so much profited from one of Stein’s books at seventeen, at nineteen I gambled on a second, Lectures in America—and again lucked out. Among these half dozen meditations on English literature, Stein writes, “The paragraph is the emotional unit of the English Language.”

There are myriad technical reasons to begin a new paragraph: another character speaks, the narrative switches focus to what another character is doing, the writer changes to a new rhetorical mode (from external action to internal reverie, from internal reverie to external description), and, of course, the all-purpose change of topic. But all are, finally, one form or another of movement between Stein’s emotional units. In reading over his or her own prose, the writer who can forget the emotions that impelled the writing, and can respond instead to the modulations in the emotions the words on the page actually evoke, will generally be able to solve the problem of when to begin a new paragraph: when the tenor of those emotions shifts, it’s time for a new line and an indentation.

Stein was famous for writing in a kind of baby talk, with many repetitions and what was often taken as a childish disregard for punctuation. In that same collection, in what some critics during her lifetime called “Stein-ese,” she wrote:

The thing that has made the glory of English literature is description simple concentrated description not of what happened nor what is thought or what is dreamed but what exists and so makes the life the island life the daily island life … And in the descriptions the daily, the hourly descriptions of this island life as it exists and it does exist it does really exist English literature has gone on from Chaucer until now … That makes a large one third of English literature. (14–15)

Description (or psychological analysis, or any other rhetorical mode associated with fiction) without story to support it risks becoming interminable. But story without description soon becomes insufferably thin. “Good prose,” Flaubert wrote his mistress, the aspiring writer Louise Colet, “is stuffed with things”—another observation of what I suspect is only a different aspect of Stein’s perception.

The more I read and reread Gertrude Stein, the more I am convinced that, for writers, she is the most important critic-writer between Walter Pater and Antonin Artaud—with both of whom, indeed, she overlaps.

I have always found George Orwell’s (1903–50) essay “Politics and the English Language” (1946) a wonderfully clarifying document. In certain circles, during the 1970s and 1980s, Orwell’s piece was used widely as a writing aid for college freshmen and sophomores, most of whom were neither sophisticated nor widely read enough to take in its points. It has never been a popular text with beginning writers (an audience it was never intended for). As well, if you read it carelessly, it can be taken as attacking some of the cherished pleasures of those of us who enjoy the rarefied heights of literary theory and its attendant complex rhetoric. But Orwell’s essay is for people who seriously want to write—and who have done enough general reading in fiction, journalism, and criticism that they might even succeed; the essay offers little help to writers who still must learn how to put together grammatical or logical sentences or a coherent argument. Orwell’s piece is specifically for people who can write what passes for “competent” prose, but who need someone to point out why their “competent” efforts are so often empty or worthless. The essay has occasioned a barrage of recent attacks. But the best I can say for them is that most of the ones I have read are notably more muddled than Orwell’s essay. What these academics are muddled about is why Orwell’s bit of eminent good sense does not turn students who can’t write into thinkers who can. They fail to see that their students have not read enough, while Orwell’s piece is addressed to people who have read too much—specifically too much of the wrong thing, and in the wrong way.

When the writing of literary theory is bad (and often at the general academic level it is), what usually makes it bad is something Orwell’s essay points to—what Orwell calls “operators” or “verbal false limbs,” which save the writer the trouble of “picking out appropriate verbs and nouns, and at the same time pad the sentence with extra syllables which give it an appearance of symmetry” (130).

Often graduate students find themselves having to write about a difficult passage from, say, Flaubert or Heidegger that develops a point (or, more usually, a portion of a larger point) the student only dimly understands. Again and again I have seen one or another of them break the passage up all but arbitrarily and put “On the one hand he writes” in front of the first part and “But on the other hand, he says” before the second—or, indeed, any other possible verbal limbs that effect the same suggestion of symmetrical contrast—establishing the idea that a single passage outlining a development actually expresses contrasting or contradictory ideas. The rest of the student’s paper—or section of the paper—will cite other examples of one or the other of these “two contradictory ideas,” sometimes through the medium of a shared word or phrase, sometimes just through hazily similar notions.

Only yesterday morning, while marking a Ph.D. preliminary exam, I found one such false contrast—not in the student’s answer but in one of the questions posed by a colleague: “D. H. Lawrence called the novel ‘the bright book of life.’ Contradicting this, however, he also said that the novel was the receptacle of the most subjective responses to the world. Choose three novels written between 1850 and 1950 in which subjectivity is foregrounded and discuss them in terms of the formal techniques the writer employs to present or invent the modern subject.” The fact is, there is no contradiction between the novel’s function as ‘the bright book of life’ and its presentation of subjective responses to the world—since subjective responses to the world are part of life. It is far too limited a reading that would assume “the bright book of life” referred only to the object world around us. The relation is one of “as well as,” not one of “contradicting this.” The point is to understand how B follows from A, not how it contrasts with it. But such careless articulation often suggests to someone whose critical lens is not highly focused that the discernment of such “contrasts” represents “close reading,” or that finding contrasts that aren’t there is the way to trace out some “problematic” or “aporia” (Greek for “impasse”) in the passage, when all it does is sow confusion on top of misunderstanding. One can write clearly about complex notions. Those complexities still require concentration, repeated reading, and careful articulation to get them clear.

Orwell discusses this process in the context of political journalese, in which the commentator will use “the appearance of symmetry” to set up conceptual antitheses where no antitheses exist. But today this is what makes three out of four graduate student papers (not to mention too many “higher thoughts” from the already securely tenured) reaching after the heights of theory flounder off into fogged failures of logic, leaving their works all-but-pointless exercises in verbiage.

Again, it’s professors, journalists, graduate students, and critics who do write for others who need Orwell’s piece—not undergraduate students who don’t.

I am a lover of the verbal sensuality and conceptual richness of Jacques Derrida (1930–2004), Michel Foucault (1926–84), and Jacques Lacan (1901–81), just as I enjoy William Faulkner (1897–1962), John Cowper Powys (1872–1963), and Charles M. Doughty (1843–1926). I delight in reading them and rereading them, in teaching them and teaching them again; and I enjoy equally John Ruskin (1819–1900), Thomas Carlyle (1795–1881), Edgar Allan Poe (1809–49), and Walter Pater (1839–94).

Still I think, basically, Orwell is right.

Another fine and informative book for people who write regularly and understand the mechanics of writing is Jacques Barzun’s Simple and Direct: A Rhetoric for Writers (1975; revised 1985). Rich in the history of words, the book is particularly good at explaining why some mistakes are, indeed, mistakes. Here’s an analysis from Barzun’s book that dramatizes particularly well one of Orwell’s points, using the example “They said they had sought a meaningful dialogue on their demands, which, as they made clear before, are non-negotiable.”

Meaningful is usually quite meaningless. Does the writer mean productive, fruitful, satisfactory, fair-minded? It is hard to say; the word dialogue is too vague to suggest its proper epithet, and taken together with non-negotiable, it lands the writer in self-contradiction; for what is there to discuss if the issues are not subject to negotiation? The only tenable sense is: “They faced their opponents with an ultimatum.” This result is a good example of the way in which the criticism and simplifying of words discloses a hidden meaning.

Barzun’s chapter on frequently confused words is far more thorough than, say, the one in the ever popular Strunk and White (The Elements of Style, 1959), and it lets us know something about the history of those confusions, which are often more complex than they appear. “Restive,” for example, does not mean restless—or at least up until the Second World War, it didn’t. It was the adjective from “rest” and meant fixed, immobile, or stubborn. Now it means almost anything. Barzun points out how the poor use of words by careless writers makes writers who are more sensitive to the language less willing to use those words, for fear of being misunderstood. Barzun’s book is not a remedial text. It’s another grown-up text for grown-up writers.

Other works that I have found useful and stimulating include the essays in W. H. Auden’s The Dyer’s Hand and Forewords and Afterwords; William Gass’s Fiction and the Figures of Life, The World within the Word, and Habitations of the Word; Guy Davenport’s The Geography of the Imagination, Every Force Evolves a Form, and The Hunter Gracchus; and Jorge Luis Borges’s Other Inquisitions and This Craft of Verse; as well as Hugo von Hofmannsthal’s The Lord Chandos Letter (1902). This last is a fictional letter from a young Renaissance writer, presumably to Sir Francis Bacon, explaining why the twenty-eight-year-old writer is giving up literature. If you are feeling discouraged, Hofmannsthal’s text is all but guaranteed to make you want to get back to writing. Also a turn-on, in two very different modes, are G. E. Lessing’s Laocoön and Lajos Egri’s The Art of Dramatic Writing.

As well, I’m a fan of E. M. Forster’s 1927 meditation Aspects of the Novel. We’ll get to that one shortly.

In his astute and useful essay “On Writing,” Raymond Carver says he doesn’t like tricks, cheap or otherwise. Yet the creation of a certain order of particularly vivid description is a trick. (I discuss it in two essays, “Thickening the Plot” and “Of Doubts and Dreams.”) It is one of the many tricks that, in his own writing, Carver generally eschews. While he was an extraordinary creator of moving and poignant miniatures, and while his descriptions are always adequate for his own narrative purposes, few would cite him as a master of description per se.

Yet the “trick” I speak of was used by Flaubert and Chekhov and the great American short-story writer Theodore Sturgeon. Buoyed by a raft of other descriptive planks, Joyce uses it particularly effectively in Ulysses and “The Dead”; all Virginia Woolf’s mature fiction relies on it more or less heavily, as does Richard Hughes’s, Harry Mathews’s, William Golding’s, Vladimir Nabokov’s, John Updike’s, Lawrence Durrell’s, William Van Wert’s, Gene Garbor’s, A. S. Byatt’s, Robert Coover’s (particularly in his early “realist” novel The Origin of the Brunists), William Gass’s, John Gardner’s, Angela Carter’s, Harlan Ellison’s, Luisa Valenzuela’s, Guy Davenport’s, John Crowley’s, Charlotte Bacon’s, and Rikki Ducornet’s—indeed, just about every writer known for both beauty of language and vivid scene painting. The reason to call it a “trick,” rather than a technique, strategy, or method, is that it doesn’t always work in every instance with every reader every time. It rarely works in the same way with the same reader in repeated readings of the same text. Because it’s fundamentally psychological, its success tends toward a statistical existence across a general audience. Yet, statistically, readers find it highly pleasurable, even though three or four readers will often argue over why it works and when, indeed, it doesn’t. This book several times discusses how it’s done. If you can wrap your mind around it, it’s interesting to try.

Before we get on to the “how,” though, let’s talk a bit about the “why” and the “what.”

VI

During a recent conversation I was having with a friend, he picked up his well-read Vintage paperback of Ulysses, opened it to page 36, and said, “Listen to this: ‘On his wise shoulders through the checkerwork of leaves the sun flung spangles, dancing coins.’ Now, I love that sentence. But why is it better to write that than, say, ‘Sunlight fell on him through leaves’? Or even to omit it altogether and get on with the story, our day in Dublin?”

Actually my friend had already given the reason: because he loves it. A possible reason to love it is because it makes two things pop up in the mind more vividly than does the sentence “Sunlight fell on him through leaves.” One is what specifically happened at that particular time when light fell through those particular leaves; it has been described. In some light, in some venues, when someone walks under a tree, the bits of light simply slide over him or her. In others, such as this one, when, yes, a breeze is passing, they dance. The second thing that pops up is your awareness of the possibilities for the person in that space of shadow and light—in Joyce’s case the jocularly anti-Semitic Mr. Deasy, whose know-nothing claim that there are no Jews in Ireland sets up a controlling irony for the novel: Leopold Bloom, who represents Ulysses to Stephen’s Telemachus, is a Dublin Jew. The combination of specific description and strong implication (in this case, the irony in the word “wise”) is one that, to a statistically large sampling of readers, affords a more vivid reading experience than the simple “statement of information.” As well, because the sentence mimes what it describes—that is, it dances—in a manner I discuss in the essay “After Almost No Time at All the String on Which He had Been Pulling and Pulling Came Apart into Two Separate Pieces So Quickly He Hardly Realized It Had Snapped, or: Reflections on ‘The Beach Fire,’” it calls up a chain of further implications about the way perceptions and words dance and are flung about through the day, which the reader can take as far as he or she wishes.

Now, such combinations of presentation and implication are a trick—though it’s one used by the J-Writer who wrote many of the really good parts in the early books of the Bible (the story of Adam and Eve in the Garden of Eden, for instance), by Homer throughout the Iliad and the Odyssey, and by Shakespeare in his plays and sonnets; also by Joyce, Woolf, and Nabokov. (Pater located it as an element in the true genius of Plato, above and beyond any of his specific philosophical arguments.) I persist in calling it a trick because of these, yes, intermittent successes. But it works for enough readers, enough of the time, to keep writers such as G. K. Chesterton and Djuna Barnes in print, when the political (or, in Chesterton’s case, religious) content of their work has become highly out of favor, if not downright repellent. We love a sentence only partially because of what it means, but even more for the manner and intensity through which it makes its meaning vivid.

People with whom the trick tends not to work include those who are just learning the language and/or who have no literary background in their own or any other language before they start. They also tend to include readers who know exactly what they’re reading for, and who are not interested in getting any other pleasure from a book except the one they open the first page expecting.

The vividness comes from a kind of surprise, the surprise of meeting a series of words that, one by one, at first seem to have nothing to do with the topic—striding under a tree on a June day—but words that, at a certain point, astonish us with their economy, accuracy, and playful vitality. Again, it will work on some readers, whereas others will only find it affected. But it’s managed to remain a part of literature for several thousand years.

Now, “Sunlight fell on him through leaves” has a precise economy and its own beauty. We can enjoy that, too. But the other—through that combination of specific statement and implication—puts a higher percentage of readers closer to the pulse and texture of the incident. Rhetorically, it makes a greater number of educated readers feel there’s a shorter distance between words and occurrence. What we are talking about here is the (very real) pleasure of good writing versus the delight of writerly talent.

If an early nineteenth-century essayist had written, “The true and the beautiful are largely the same and inextricably entailed. That is one of the few self-evident facts of the modern world. Indeed, I believe, if you have understood that, you can pretty much negotiate the whole of modern life,” I doubt anyone would remember it today.

But around 1820, at the conclusion of his poem in five ten-line stanzas, “Ode on a Grecian Urn,” Keats wrote:

Beauty is truth, truth beauty,—that is all

Ye know on earth, and all ye need to know.

The economy, symmetry, and specificity here—the performance of its meaning through implication, accuracy, and bodily rhythm (the rhythmic and alliterative emphasis on “all,” “need,” and “know”; its encompassing of both wonder and “on earth” despair)—lift it to a level of immediacy that won’t shake loose from the mind. “Beauty is truth, truth beauty,” is, of course, in the same rhetorical mode as “Sunlight fell through leaves.” But “—that is all / ye know on earth and all ye need to know” is a statement that implies a broad and complex argument. As you unravel those implications, you can find yourself facing a declaration of the tragic limits of what, indeed, can be known: you really don’t know anything else, and the bare sufficiency of that basis for knowledge has been the universe’s great gift to humanity, a gift from which all law and science and art have been constructed. For behind all we presume to be knowledge, whether correct or incorrect, some correspondence between elements in the world must have been noted at some time or other, a correspondence that was once assumed beautiful, fascinating, or at least interesting—before anyone could go on to judge it useful, efficient, or functional. A correspondence must be noticed before it can be evaluated, can be judged. What makes us notice anything is always some aspect of the aesthetic. The three categories—the useful, the efficient, the functional—already must at least begin as aesthetic constructions, which, only after they have been established through aesthetic correspondences, can go on to support usable judgments on what subsequently we can find in them. That is how all knowledge—however useful—has its basis in the apperception of the beautiful—even to the hideously ugly and the painful. When Keats’s words have impelled my thoughts in this direction, his lines have made me weep the way the tragic knowledge we took with us on our expulsion from Eden occasionally does.

To have that response to the Garden of Eden story, I have to read the text very slowly, leaving out the first chapter of Genesis that contains the famous seven days of creation (introduced by the P-Writer—or Priestly Writer—some three hundred years later). I have to follow what the J-Writer in the eighth century BCE alone put down, word by word, phrase by phrase; and I have to follow the Hebrew version beside two or three English translations, as my own Hebrew is simply not good enough to read it in the original unassisted. I have to pay particular attention to the humor of the text (“I bet you thought snakes always crawled on the ground,” the J-Writer, who first wrote her tale in the later years of the Court of David, jests with her audience; “I bet you thought all human beings had been born out of women for all time.” Critic Harold Bloom and biblical scholar Richard E. Friedman both feel that the J-Writer was likely a sophisticated court lady in the late years of King David’s court). I have to pay particular attention to the multiple meanings of the infinitival intensifiers in the Hebrew, sometimes indicated by italics in the King James version, as well as all the specific information we learn from overhearing the words of YHWH, first in his poetic explosion at the serpent, and finally in his anxious mutterings as he sets the angels with their flaming sword to guard the way back into Eden—which mutterings, of course, reveal to us, after the fact, the most important thing that Adam and Eve (she gets her name only after God gives her and Adam clothes of skins) learned when they ate of the tree of the knowledge of what’s good and what’s bad (etz hada’at tov v’ra): “We are doomed fools; we made a tragic choice; we ate from the wrong tree! We ate from the tree of the knowledge of tov (good) and ra (bad), and the ra (the bad thing) we now know in our bones is that we should have eaten from the other tree—the perfectly licensed tree of life! 
We now know our choice was mortally bad—for immediately we had to become too busy with our shame to compensate for our error: the knowledge of our mortality (one with the knowledge of how we missed out on immortality), which we have just gained—the knowledge that shuts us out of the garden.”

Those textual details lead me through the implications that such are the inevitable repercussions of all human learning. To learn anything worth knowing requires that you learn as well how pathetic you were when you were ignorant of it. The knowledge of what you have lost irrevocably because you were in ignorance of it is the knowledge of the worth of what you have learned. A reason knowledge/learning in general is so unpopular with so many people is because very early we all learn there is a phenomenologically unpleasant side to it: to learn anything entails the fact that there is no way to escape learning that you were formerly ignorant, to learn that you were a fool, that you have already lost irretrievable opportunities, that you have made wrong choices, that you were silly and limited. These lessons are not pleasant. The acquisition of knowledge—especially when we are young—again and again includes this experience. Older children tease us for what we don’t know. Teachers condescend to us as they instruct us. (Long ago, they beat us for forgetting.) In the school yard we overhear the third graders talking about how dumb the first graders are. When we reach the third grade, we ourselves contribute to such discussions. Thus most people soon actively desire to stay clear of the whole process, because by the time we are seven or eight we know exactly what the repercussions and reactions will be. One moves toward knowledge through a gauntlet of inescapable insults—the most painful among them often self-tendered. The Enlightenment notion (that, indeed, knowledge also brings “enlightenment”—that there is an “upside” to learning as well: that knowledge itself is both happiness and power) tries to suppress that downside. But few people are fooled. 
Reminders of the downside of the process in stories such as that of Adam and Eve can make us—some of us, some of the time, because we are children of the Enlightenment who have inevitably, successfully, necessarily, been taken in—weep.

We say we are weeping for lost innocence. More truthfully, we are weeping for the lost pleasure of unchallenged ignorance.

Before the Enlightenment stressed the relationship between knowledge and power, there was a much heavier stress on the relationship between knowledge and sex. Freud retrieved some of that relationship in Leonardo da Vinci and a Memory of His Childhood. (The first intellectual problem almost all children take up, Freud pointed out, is where do babies come from, the pursuit of which soon catapults us into the coils and turmoils of sexual reproduction.) It perseveres, of course, in the concept of “knowing” a woman or a man sexually. It is there in the J-Writer’s version of the Adam and Eve story as well: To know that sex leads to procreation is immediately to want to control it (especially among beleaguered primitive peoples), to set up habits (covering the genitals or other body parts) to dampen the sexual urges. But any effort to keep them under control is to instill habits that produce shame and embarrassment when violated, even in pursuit of procreation itself, to say nothing of innocent, guilt-free copulation. As a deeply insightful “pre-Enlightenment” text, the Adam and Eve story figures this aspect of the tale forcefully just as it figures that death will come before we can do anything about it: that knowledge is the burning blade preventing reentry into the garden and a return to the tree of life. The tragic implications repeatedly produce real tears in me—as I suspect they have for many readers over the centuries.

The story of Eden is a short, ironic tale to teach children a religious tradition—that can make an adult (and, in my case, an adult who happens to be an atheist) weep. That’s among the things that, through statement and implication, stories can do. Such implications as nestle in Keats’s ode and the J-Writer’s Eden story are so broad that, today, most of us would probably figure, “Don’t even try it!”

But both work.

When one approaches Keats’s conclusion about truth and beauty through the historical set-up of the previous forty-eight lines of “Ode on a Grecian Urn,” his observation can take the top of your head off. Keats is, after all, the master of accuracy and implication among the English romantic poets, working toward vivid immediacy. Indeed, like Joyce’s story “The Dead” and Lawrence’s tale “Odour of Chrysanthemums,” Keats’s poem is one of the gentlest, one of the most powerful retellings of the tale of the Edenic expulsion implicit in the gaining of any and all knowledge (in Keats’s case, it is the particular knowledge called happiness implicit in domestic social beauty).

Again, not everyone is affected by these texts in this way; nor is each reader affected by them in the same way every time she or he reads them. But enough readers find that they work enough of the time to preserve specific description and withheld implication as valued techniques of the literary, both in prose and poetry. Writings that employ those techniques generously often seem more immediate, more protean, and more vibrant over the long run than works that eschew them for a safer rhetoric and more distanced affect.

I think of myself as a reader with broad, if not actually catholic, tastes. When I have tallied it up, I find that I spend as much on reading matter weekly as I do on food—now that my daughter is grown—for a family of two. That includes a fair amount of eating out. As much as I love to read, however, I enjoy reading far fewer than one out of twenty fiction writers. (That’s currently living and publishing fiction writers.) Certainly I read more books than I actually like. Telling you a bit more about the kind of reader I am will, then, suggest something about the strengths—and the limitations—of the book to come.

VII

My approach to story is conservative—all but identical to the one E. M. Forster (1879–1970) put forward in his 1927 meditation, Aspects of the Novel. (I said we’d return to it.) Because Forster says it well and succinctly, I quote rather than paraphrase. Only then will I point out the few ways in which Forster and I differ.

If you ask one type of man, “What does a novel do?” he will reply placidly: “Well—I don’t know—it seems a funny sort of question to ask—a novel’s a novel—well, I don’t know—I suppose it kind of tells a story, so to speak.” He is quite good-tempered and vague, and probably driving a motor bus at the same time and paying no more attention to literature than it merits. Another man, whom I visualize as on a golf-course, will be aggressive and brisk. He will reply: “What does a novel do? Why, it tells a story of course, and I’ve no use for it if it didn’t. I like a story. Very bad taste on my part, but I like a story. You can take your art, you can take your literature, you can take your music, but give me a good story. And I like a story to be a story, mind, and my wife’s the same way.” And a third man says in a sort of drooping regretful voice, “Yes—oh, dear, yes—the novel tells a story.” I respect and admire the first speaker. I detest and fear the second. And the third is myself. Yes—oh, dear, yes—the novel tells a story … The more we look at the story (the story that is a story, mind), the more we disentangle it from the finer growth it supports, the less we shall find to admire. It runs like a backbone—or may I say a tapeworm, for its beginning and end are arbitrary … It is a narrative of events arranged in their time sequence—dinner coming after breakfast, Tuesday after Monday, decay after death and so on. Qua story, it can only have one merit: that of making the audience wonder what happens next. And conversely it can only have one fault: that of making the audience not want to know what happens next. These are the only two criticisms that can be made on the story that is a story … When we isolate the story like this and hold it out on the forceps—wriggling and interminable, the naked worm of time—it presents an aspect both unlovely and dull. But we have much to learn from it. (25–28)

And we have much to learn from Forster’s description of it, as well. Paradoxically, story itself does not have a beginning, middle, and end (though any particular story must have these in order to be satisfying): story itself, however, is “interminable” and (incidentally) chronological, “the naked worm of time.” The famous “beginning” and “end” (of the “beginning,” “middle,” and “end” triad) are simply narrative strategies for mounting the endless train of narrative and strategies for dismounting. When writers try structurally to harmonize the beginning and ending strategies with what is going on in the mid-game, we approach the problems grouped under the rubric “narrative art.”

Within his ellipses (the parts I have elided with the traditional three dots), Forster talked about the age, strength, and power of story, for which he had much respect. So do I. (Scheherazade of The Thousand Nights and a Night is the heroine of the passages I’ve omitted. Look them up.) But in terms of the problems before us, that is not to the point. Indeed, what is to the point is that, in most of the narratives we are presented with today, be they sitcoms, TV miniseries, movies, or even news accounts, the stories we get are mostly bad. With some extraordinary exceptions throughout the history of all these fields, most comic books, TV series, and action movies don’t have good stories. Neither do most published novels, and for the same reason: the logic that must hold them together and produce the readerly curiosity about what will happen is replaced by “interesting situations” (or an “interesting character”), which don’t relate logically or developmentally to what comes before or after. That is to say, they are wildly illogical. We cannot follow their development, even—or especially—if we try. If we look at them closely, they don’t make much sense. The general population, day in and day out, is not used to getting good stories. This has two social results.

First (on the downside), it probably accounts for why there is so little political sophistication among the general populace. Political awareness requires that people become used to getting rich, full, complex, logical, and causative accounts of what is going on in the world and, when they don’t, regularly demanding them. But with television and most films and books, they get little chance.

Second (on the upside), it produces a relatively small but growing audience interested in and hungry for experimental work. Paradoxically, most experimental work is simpler than the traditional “good story.” As far back as 1935, in his introduction to his selected poems, Robinson Jeffers called the techniques of modernism “originality by amputation.” Formally, it’s still a pretty good characterization—which is probably why normative fiction (and figurative painting) persists. What it leaves out, however, is that the nature of the experiment is rarely a negative one. It’s a positive one. E. E. Cummings (1894–1963) began his lines with lower-case letters throughout his career—as has Lucille Clifton (b. 1936) throughout hers. But the experiment is only secondarily about not beginning your lines with upper-case letters. It’s about the effect gained by beginning your lines with lower-case letters. It is a matter of exercising the attention to focus on smaller elements that, in a “good story,” would only be perceived in concert with many others. The long-term effect of experimental work is the heightening of the microcritical abilities among readers, so that, among other things, we get better at criticizing those “good stories” that turn out to be, in reality, not so good after all.

I do not believe the only purpose of the contemporary, the experimental, or the avant-garde is to increase our appreciation of the traditional. Both have rich and distinct effects, pleasures, and areas of meaning. But as the legacy of high modernism (through which most of us come to the contemporary and the avant-garde) makes clear, the normative and the experimental relate; they nourish each other.

What distinguishes story from a random chain of chronological events that all happen to the same character, or group of characters, is causal and developmental logic. This logic alone is what makes one want to find out what happens next. Most beginning writers are, however, unaware of how fragile the desire to know what comes next actually is—or how easily it’s subverted.

Turning readers’ attention from the future to the past with a flashback will almost always slay that desire, unless that flashback answers a clear question set up in the previous scene—and answers it clearly and quickly.

In my creative writing classes today rarely do I get a short story of more than six, eight, fifteen pages that doesn’t have at least one flashback in it. Rarely does it work. Understand, I have no problem with realistic flashbacks—but in life, flashbacks are just that: flashes. They last between half a second and three seconds, ten at the outside. Thus, in texts, they are covered in a phrase or two, a sentence, three sentences, or five sentences at most.

Try to think about a single past event concertedly for more than ten seconds, without the present intruding strongly. Unless you are talking about a specific past event with another person, who is stabilizing your attention with questions and comments (or, indeed, unless you are writing about it, so that your own recorded language helps stabilize your thought), it’s almost impossible. Indeed, what’s wrong with most flashback scenes in most contemporary fiction is that they are simply unrealistic: by that I mean the scene where, on Friday night, Jenny sits in front of her vanity putting on her makeup, in the course of which she thinks back over the entire progression of her relationship with Steve—for the next six pages!—whom she is going to meet later that evening; or the scene where Alan is walking down the street Monday morning, during which he runs over the last three months’ growing hostility with his foreman, Jeff—for eight pages!—whom, when he arrives at work, he will confront to demand a raise. Nine times out of ten, both these stories simply begin at the wrong place. The first really starts with Jenny’s meeting Steve. The second begins the first time Jeff’s hostility manifests itself to Alan.

The “subjectivity of time” that writers and philosophers have been going on about for the last hundred years or so has to do with whether or not time passes quickly or slowly—not whether it passes chronologically. Of course conscious and unconscious memories constantly bombard our passage through the present. The web of unconscious memories and associations is what makes the present meaningful, decipherable, readable. That web is why a frying pan on a stove, a book on a shelf, and a broom leaning in the corner register as familiar objects and not as strange and menacing pieces of unknown super-science technology from a thousand years in the future. Spend some time observing how these memories arrive, how long they stay, how they add, expand, subvert, or create present meaning before you plunge into another flashback. It may save you time and preserve believability as well as free you from a bunch of stodgy fictive conventions.

I want to be clear—because several readers have misunderstood me in earlier versions of this same argument. What I’m arguing against here is not flashbacks in themselves. Even less am I against a conscientious decision to tell a story in something other than chronological order. (To repeat: I enjoy experimental fiction. For me to come out against nonlinear storytelling would simply be a contradiction.) What I object to is the scene whose only reason is to serve as the frame for an anterior scene because the writer has been too lazy to think through carefully how that anterior scene might begin and end if it were presented on its own, and so borrows the beginning and ending of the frame scene, which—equally—has not been chosen because anything of narrative import actually happens in it. What I’m reminding you is that flashbacks themselves began as a narrative experiment: If you’re going to experiment, one that has a reason will always win out over one without any thought behind it, one we simply indulge because, today, that’s the way everyone else does it.

Here is a rule of thumb that can forestall a lot of temporal clutter in your storytelling. Consider the scene in which the flashback occurs. Ask yourself, “Has anything important happened in the scene before the flashback starts? Has any memorable incident taken place? Have we seen any important change? Has the character done anything more than sit around (or walk around) and think?” If the answer to all these questions is no (and thus the only purpose of the present scene is to allow the character to remember the past incident in the flashback proper), consider omitting the frame and telling the flashback scene (after deciding on its true beginning and a satisfying conclusion) in the order that it occurred (often it’s the first scene—or one of the first—in the story) in terms of the rest of the narrative’s incidents.

The fictive excuse for the flashback is that it is a product of memory. The reason for fiction, however, is that it provides the explanatory force of history. This may seem like an overly grand statement. But give it a little thought.

We live our lives in chronological order.

When we remember them, however, our mental movement is almost entirely associational.

Listen to people who are not trying to solve a particular problem reminisce with one another. One good meal leads to another. One sadness leads to another sadness, till suddenly it becomes too much and the conversation leaps to pleasure or silliness or gossip. It’s only when human beings want to solve a problem or figure out the causality behind something that they carefully try to reconstruct chronological order.

If you’ve ever done it with someone else, you know how hard it can be.

Why did we lose the war? Because before we marched off to fight we didn’t start out with good weapons and well-trained men. Why was last year’s crop so good when the crop before that was so poor? Because the river flooded and left a deposit of silt over the land that promoted rich growth—while just before the year of the poor crop no flooding occurred at all. Chronological causality is how history begins, and that can only be supplied by chronological order. Only the concomitant cross-checking and stabilizing by notation, and the pressure to be accurate and exact when two or more people remember in dialogue with one another, creates history. What one person remembers by himself, while it may be a contributing element to history, is precisely not-history until it enters into such a dialogue. Chronology is our first historical mode.

Fiction is an intellectually imaginative act committed on the materials of memory that tries for the form of history.

That’s why a political climate pushing the individual to see her- or himself as autonomous and self-sufficient is, by definition, a climate unsupportive of rich and satisfying fiction. (This is not the same as an individual writer in her or his work pushing against a climate of conformity and security to assert her or his individuality.) A climate that discourages research and open discussion is usually pretty distrustful of good fiction as well.

However much, as readers, we lose ourselves in a novel or a story, fiction itself is an experience on the order of memory—not on the order of actual occurrence. (History is an even higher level of abstraction.) It looks like the writer is telling you a story. What the writer is actually doing, however, is using words to evoke a series of micromemories from your own experience that inmix,* join, and connect in your mind in an order the writer controls, so that, in effect, you have a sustained memory of something that never happened to you.

That false memory is what a story is.

Among other things, the writer’s art comprises various techniques to make that unreal memory as clear and vivid as possible. That clarity, that vividness, is entirely dependent on the order and selection of her or his words. Again, one might say that the fiction writer is trying to create a false memory with the force of history. The problem with the flashback is, again, that we don’t have too many memories of memories that we recognize as such—or, when we have them, rarely are they the most vivid among our memories. Thus, the flashback is a tricky technique. Think about its problems if you’re going to use it. When you find yourself telling your story out of chronological order, ask yourself if it adds anything truly necessary or important to the telling, or is it just laziness or bad habit, a failure to think through the tale logically to (and from) a beginning.

Here, now, are the places I differ from Forster. First, I, too, am the reader who says, “Yes—oh dear, yes—the novel tells a story.” I too very much fear the second reader. (“I like a story. Very bad taste on my part, but I like a story. You can take your art, you can take your literature, you can take your music, but give me a good story. And I like a story to be a story, mind, and my wife’s the same way.”) But while I fear him, unlike Forster, I don’t detest him. For while I believe (one) that the second reader is profoundly mistaken and needs to be the focus of most of today’s educational energy and (two) that he is the audience that most corrupts both critical and commercial approaches to the popular arts, I also feel—as Forster probably did not—that such audience members are educable. Today, this reader’s haunts are not golf courses but rather the active fandoms of TV, comic books, science fiction, and other venues particularly appreciative of paraliterature or popular culture. The fact is (which puts me close to Forster once more), I recognize that without some story—temporal, developmental, logical—most writing is simply not recognizable as fiction. But having said that (and this moves me away from Forster, even as it sets me in antagonistic opposition to Forster’s golfer), I see no particular reason why all writing, even if it begins by appropriating the name “novel,” “story,” or, indeed, “poem” or the name of any other genre, needs to be immediately recognizable as belonging to the genre label it carries. I have gotten great pleasure from “short stories” that were nothing but sequences of numbers, random words, or abstract pictures, not to mention comic books—a medium I love. I’ve gotten pleasure from J. G. Ballard’s “condensed novels,” which are collections of impressionistic fragments running only seven or eight pages each (collected in The Atrocity Exhibition, 1970).
I have gotten pleasure from poems where the words were chosen by any number of games or operationalized systems or semantic or aesthetic tasks or within any of a variety of constraints. But all this will be discussed in its place. Genres are ways of reading, ways of understanding, complex moods, modes, and chains of expectations—discourses, if you will—and as such there is as much aesthetic pleasure (and use!) to be found in opposing those expectations as in acquiescing to them.

More than sixty years ago, that witty and sensible critic L. C. Knights (“How Many Children Had Lady Macbeth?,” 1933) noted: “Only as precipitates from memory are plot and character tangible; yet only in solution has either any emotive valency.” This is what Forster’s dull, ugly worm is all about. Plot, character, and the structure that constrains and embodies them are the solutes that effloresce into emotive force within the solution of those “finer growths.” Those “finer growths” through which the plot and characters achieve their emotive fullness are, themselves, controlled by structure. Most of the interminable discussions of plot in writing texts are useless because finally plot has no existence by itself; it is only a single aspect of a more complex process (which I call structure); and if the writer tries to deal with only the plot by itself, he or she ends up twisting at that dried-up little worm, which, when it effloresces, may or may not swell to proper shape and effect, depending entirely on the solution—the finer growths—it arrives in.

This book teases apart how writing works: what the process of its making consists of; and how its making is made by and remakes the world.

These are huge topics.

As the reader can see, this is not a large book.

My comments about them are suggestive rather than definitive. Still, with what notions we can harvest here—the ones we can speak of intelligibly—I hope my readers can begin to figure out how to do what they want on their own.

VIII

What sorts of stories do I enjoy?

What do I read for?

I read for information. Clearly, forcefully, and economically given, information constitutes my greatest reading pleasure. I cotton to Ezra Pound’s oft-quoted dictum, “Fundamental accuracy of statement is the ONE sole morality of writing.” (Raymond Carver quotes it in his fine essay “On Writing.”) Notice, however, Pound says “morality,” not value. The first information I read for, at least in fiction, is usually visual and generally sensual. I want to know where I am, and in particular what that place looks like, smells like, sounds like, and feels like. If the writer can make me sensorily aware of his or her setting—trick me into seeing/hearing/smelling it vividly (again, vivid description is a trick, and a more complex trick than simply laying out what’s there), so much the better. Throughout my fictions I want Stein’s one-third “description simple concentrated description not of what happened nor what is thought or what is dreamed but what exists and so makes the life the island life the daily island life”—or, if not, then something that I will find equally interesting.

The next kind of information I read for is any tone of voice in the writing that is informative itself about the story, about how the story is getting told. Just who is the narrator? Should I trust that narrator? Should the narrator awake my suspicions? Should I like the narrator or not like the narrator? Should I look up to the narrator? Or should I assume the narrator is my equal? What is the narrator’s attitude toward the characters who occupy the foreground of the fictive field? And toward those in the background? And to the other characters? And the situation and the setting itself? If the writer keeps giving me those shots of vocal and sensory information, forcefully and with skill, I can be happy with any one of the narrative stances above, because I am disposed to trust the writer creating that voice and painting the pictures—whether I “like” a character or not. Even if the narrator gives me mostly vocally modulated analysis (Proust, James, Musil …), I can be happy with the tale—though probably a reader other than I will have to discover that book and alert me to its excellences before I read it. (Proust, James, and Musil are not writers I’d have been likely to pick up on my own and stick to without some critical preparation. Joyce or Nabokov I might well have.) Those fictive works that make their initial appeal through tone of voice—often a tone solidly bourgeois, educated, ironic—can take on more complex concepts and explore them through a level of formal recomplication that is often richer than the relatively direct fiction writer can achieve. But the greatest failures in this mode occur when the voice runs on and on without ever managing to erect the narrative structures that create beauty, resonance, and finally meaning itself. 
These failures usually hinge on a misunderstanding we have already seen: the confusion of “the literary effect” with an effect of tone rather than an effect of form that can even contour the tone (a confusion I would say my very smart twenty-six-year-old male creative writing student had fallen into).

Writers working in this mode, however, should avoid creative writing workshops. Little or nothing in such works can be criticized on the workshop level. Often the resonating structures take 60, 130, 300 pages to construct. By the same token the most successful works in this mode (Proust, late James, Dorothy Richardson’s Pilgrimage, Anthony Powell’s A Dance to the Music of Time, Joyce of Finnegans Wake, Gertrude Stein of The Making of Americans and Lucy Church Amiably, Marguerite Young’s Miss MacIntosh, My Darling, Joseph McElroy’s Plus and Women and Men, William Gaddis’s The Recognitions …) do not find their audience quickly. (In his Alexandria Quartet, Lawrence Durrell tried to have it both ways and was, I feel, remarkably successful—though the reader has to commit himself to the whole thing. Moreover, most of Durrell’s theoretical folderol about axes and so forth is simply distracting nonsense.) Those structures have to be built just as clearly—in their own larger, more generous terms—and the writing must eventually seem just as economical, if such works are to garner a readership.

When I read, I am also aware of tone (apart from tone of voice) and mood, and often a quality that can only be called beauty. Still, a writer who tries to go for them directly without giving me a hefty handful of writerly stuff on the way is usually not going to make it.

He walked into the room and saw Karola sitting there. She was beautiful. He thought of flowers. He thought of butterflies. He thought of water running in the forest.

The writer who begins a story with these sentences is probably very aware of tone—but is not really giving me, as a reader, much else. (I would be getting even less, if it were in the present tense—“He walks into the room and sees Karola sitting there. She is beautiful. He thinks of flowers. He thinks of butterflies. He thinks of water running in the forest”—more “tone” and even less voice.) It is much easier for me to be interested in a story that begins:

He walked into the little room with the white plaster ceiling and the wooden two-by-fours making rough lintels above its three windows. Karola sat at a small table, her forearm in the sunlight. When he looked at her ear, he remembered the pink and white flowers in his aunt’s kitchen garden back in New Zealand. By her tanned cheek, some of her white-blond hair lifted and shook in the breeze, and he remembered the flaxen butterflies flicking in and out of the sunlight and shadow of the big Catalpa outside in the green and gray Bordeaux landscape they’d been staying in three summer months now. Just standing there, just looking at her, he felt the same surge of pleasure he’d felt, a year before, when he’d come around the rocks in the twelve acres of forest his aunt had purchased for the farm in that last, sweltering New Zealand winter, and he’d seen the falling water for the first time, how high it was, how it filled his head with the sound of itself, how cool it looked in the winter heat. Karola did that to him.

Although I can’t be sure, I suspect the first writer wanted to describe something as interesting and richly detailed as the second writer did, but was afraid to, or was just imaginatively incapable of it—or, perhaps, had gotten distracted by thinking only about tone. But as a reader, I find the second more interesting.

As I said, the vocal approach I can also find interesting:

He stepped into the room—Jesus, it was so white—but Karola was sitting there. If you’d asked him, later, what he’d been thinking right then, he would have answered, “I don’t know what to tell you. I thought she was beautiful. I did, really. It’s stupid, yeah. But I thought about flowers. You think about flowers, you think about butterflies. That’s just what’s going to happen with some guys. And waterfalls in the forests, that kind of thing—I thought about them, too.” But then—right then—standing just inside the door, a dozen memories flickering in and out of his consciousness, he thought only: “She’s beautiful.”

Here, in terms of direct information about the scene described, this third writer is giving no more than the first one. But what it lacks in specific detail and associative richness, it starts to compensate for by giving a sense of a person, with a voice, that lets us know a fair amount about the character, either as it infects the narrative voice (“Jesus, it was so white!”) or directly (“I don’t know what to tell you. I thought she was beautiful. I did, really”).

Personally I find the tone and the mood of the second and third examples much more interesting than the tone and mood of the first. In all three cases, tone and mood would be things not to violate, as the story—or at least the scene—progresses. With number 3 (“He stepped into the room—Jesus, it was so white—”), I’d probably want something to start happening on Forster’s “pure story” level faster than I would with number 2. (In number 1, I’d want something to happen almost by the next sentence, or the tale would lose me.) Too much of number 3’s foot shuffling and embarrassment grows quickly tiresome though. Soon I’d want some proof that this personality, this sensibility, this observer was worth my time to stay with. He’s got perhaps another three sentences in which to observe something interesting and tell it to me in an interesting way. Almost certainly I’d have more patience with the second narrator—because what he gives me is informatively richer. I’m more willing to let the second narrator take time to build up my picture of where these people are, who they are, what their relationship is, and suggest how, in the course of the tale, it’s going to develop. So the second narrator has about five more sentences in which to let me know a lot more about the woman at the table (or let me know why the narrator doesn’t know it). Wouldn’t it be interesting if, say, in either example 2 or 3, Karola turned out to be a Palestinian and six or seven years older than our narrator? As soon as her hair began to turn white, she bleached it platinum. There, in France, with her current young New Zealander, who finds her so fascinating, she’s working on a book about her country’s archaeology …

Still, with examples 2 and 3 I have more trust in the writer than I do with example 1—a trust that, in terms ranging from mood to plot, either writer 2 or 3 may still betray with the next sentence. However promising I find their openings, both tales could dissolve, equally and easily, into clutter. Unless the writer is really setting us up for a very conscious effect, number 1 telegraphs a general thinness that is the hallmark of contemporary dullness. And if the narrator doesn’t win my trust soon, I’m likely to enjoy only a narrator whose tone and character I personally like. And if the narrator never gains my trust, however much I personally like the narrator or sympathize with his or her politics or recognize the situation, for me the work remains—if I keep reading, and most of the time I don’t—an entertainment, rather than a work of art. Finally, I want all this information—whether sensual or tonal—given me economically. If, after even three, five, seven sentences, I have not gotten one or the other of these orders of information, and I find myself spotting extraneous words and phrases that tell me nothing of interest, phrases that withhold information rather than present it, expressive clumsinesses and general lack of writerly skill, then I am disgruntled. (Vast amounts of fine literature wait to be read. Many more skilled writers exist than I can read in a lifetime. Unskilled writers don’t hold much interest for me. Bad writing makes me angry.) If the elements of the sentence could be better arranged so as to give the information more swiftly, logically, forcefully, I am equally unhappy. (I don’t particularly enjoy having to rewrite the writers I read, sentence by sentence. I want the writer to have done that work for me.) In my experience, three such clumsy sentences in a row usually indicate that the text will be littered with them.
Despite whatever talent is manifested, they signal that the imaginative force needed to develop an idea clearly and explore it richly is likely lacking. In turn this means that even should I enjoy the story, I am not likely to point it out as an exemplum of one idea or another (unless it’s an example of what not to do); nor am I likely to sketch out the development of its idea as praiseworthy in any of my own critical writing. (As Emily Dickinson wrote, “Nothing survives except fine execution.”) While any of the information I enjoy might be worked up to form what might easily be called a good story, if I don’t enjoy the economy and force of the presentation (the word for this level of presentation is “style”), from experience I know the tale will simply not be worth the time and energy I must put into reading it. These are the books—the nineteen out of twenty—I put down and rarely come back to.

Fortunately there are other readers who read—no less critically than I—for a different order of writerly and readerly priorities and pleasures. In their critical writing, such readers are always guiding me to things I might have missed, as I hope, in turn, now and again I can guide them to something interesting. Of course what is likely the case is not absolutely the case. Three dull, bland, or clumsy sentences don’t always mean an impoverished work. I would have missed out entirely on the considerable pleasures of Leonid Tsypkin or W. G. Sebald had I only read the opening page or pages of either, under my own critical regime—not to mention Theodore Dreiser, a great novelist (for many readers, including me) despite his style.

Nevertheless, the above represents my own priorities. It outlines my own aesthetic gamble, if you will, in the greater process of working to sediment the new or revised discourses that stabilize the systems of the world and make them better. (The purpose of fiction in particular and art in general is not to make the world better, directly and per se. But, despite the protests of all the apolitical critics, they [art and fiction] still help, if only because, as critics from Pater to Foucault have acknowledged, they do make life more enjoyable—specifically the time we spend reading them. If they didn’t, we wouldn’t bother.) In pursuit of such ends, the above gives the parameters around which my own set of dos and don’ts for fiction are organized—and thus suggests where their limits lie. Unless another critic has alerted me to pleasures that will only come after 50, 75, 150 pages, these are the texts I’m likely to abandon after a few thousand words or so—if not a few hundred.

Although I believe thirty-five years of teaching creative writing have helped me become more articulate about my readerly responses than I might have been without them, and while there are many other good readers of types different from mine, I do not think I am all that uncommon. I believe the kind of reader I am has a contribution to make in the contestatory wrangle producing that social construct, literary quality. But because human beings are a multiplicity, there can be no fixed and final canon, despite whatever appearance of stability any given view of the canon suggests. This is why no single book can tell folks how to write fiction that will join the canon. Having seen the canon change as much as it has in the years between my adolescence and the (I hope) forward edge of my dotage, I’m content with the forces that retard that change as much as they do.

Balzac, Dreiser, and Sebald; Lawrence, Barthelme, and Bukowski are all extraordinary writers, for extraordinarily different reasons. All are writers who at one time or another I’ve gorged on; but all are writers about whom I end up feeling, finally, that a little goes a long way. To enjoy any and all of them requires a fertile and lively mind; fertile and lively minds find things of interest, and thus may also find greater or lesser amounts of what’s in this book interesting. They may also find some things here painful, if not crashingly irrelevant, even as they marvel that someone could go on at such lengths as I do about fiction while spending so little time on fiction’s oh-so-necessary social content.

Because in the realm of art all absolute statements are suspect, the most I can say is that I am still willing to gamble on the fact that, by and large, most of the writers whose works I would lay down and not return to are ones who don’t contribute very much (except by their all-important negative examples), and the exceptions are precisely those glorious ones that prove, in the sense of test, the rules and principles on which my overarching aesthetic rests.

The less interested either we or our characters are in their jobs, incomes, families, social class, landlords, friends, neighbors, and landscapes (i.e., how they are connected to the material world around them), the less we have to write about. This may be why the highly individualistic but highly isolated heroes of genre fiction—from Conan the Conqueror to James Bond—often seem so thin in relation to those of literary fiction. This is why the strength of the stories that feature them tends to lie on an allegorical—i.e., poetic—level, rather than on the level of psychological (not to mention sociological) veracity.

IX

I’ve already suggested that the desire to hear our stories in chronological order may begin with the desire to have our fictions take on the image of history. Eighteenth-century novels such as Fielding’s Tom Jones (1749) were often called histories (the novel’s full title is The History of Tom Jones, a Foundling). Readers of the thirty-years-earlier Robinson Crusoe (1719) initially flocked to the book because they thought they were getting the thinly fictionalized “history” of the actual adventures of a sailor named Alexander Selkirk, who had famously spent time on a desert island, as had Crusoe; Defoe even encouraged the rumor that he had interviewed Selkirk in order to write his book, though almost certainly that was untrue and just a publicity move.

Supporting him through fourteen meaty novels, Dickens’s great discovery in the nineteenth century was that what happens to us as children directly influences the adults we turn out to be, both in terms of our strengths and in terms of the shortcomings we must overcome. Thus plot in the novel in particular—and in fiction in general—became, for Dickens, part of a structure of incidents that not only tell the story but also move us among the kinds of incidents that explain what happens in terms of certain kinds of causes as well as the given moments of history needed to understand them: in Dickens’s case, particular childhood incidents and (later in his novels) the adult happenings particularly affected by them.

The fictive discovery of the eighteenth century was that the forces of history were themselves large determinants of our interests, wants, and desires. Nor was that lesson forgotten during the nineteenth. In Stendhal’s The Red and the Black (1830), Julien Sorel’s life is entirely determined by his having to live in the social retrenchments following the expansions of the Napoleonic wars. In Les Misérables of 1862, Victor Hugo shows how the social advancement of the working-class criminal Jean Valjean is as dependent on the turmoil accompanying the early years of the age of republican revolutions in Europe in general and France in particular as it is on Valjean’s own character—while seven years later, in 1869, almost as if posed as a counterargument, Flaubert’s Sentimental Education details how its middle-class hero, Frédéric Moreau, through his own romantic daydreaming coupled with his personal inhibitions, misses out on opportunity after opportunity to behave as a moral hero over the period that includes the revolution of 1848.

The nineteenth century’s particular addition to the novel might be seen as a realization that the conflicts between social classes and the desires that cross class lines—along with the aforementioned Dickensian discovery of family and childhood as a complex force in the creation of character—propel the machinery of the world.

Beyond the one-third that is “description of the daily island life,” the glory of the nineteenth-century novel was its ability to present dramatically, in logical if not chronological order, the complex of reasons that cause things to work out as they do: What elements in his own miserly character interact with his disappointments in the world to make Ralph Nickleby hang himself? Bitter and rigid police inspector Javert is obsessed with his belief in Jean Valjean’s subhumanity and fundamental evil. How and why, then, after Valjean saves his life when they meet just outside the Paris sewers during the insurrection of 1832, does Javert go to pieces, finally allowing himself to fall from a bridge into the Seine and drown? How does the brilliant provincial inventor David Séchard end up a happy man, even though most of the profits from his discovery of the way to make paper from artichoke fiber have been stolen from him? How does David’s childhood friend, the aspiring poet Lucien Chardon, end up a miserable suicide in a Paris jail cell, where he has been imprisoned for murder?

The dramatic richness and resonance with which these questions are answered contribute to making Nicholas Nickleby (1838–39), Les Misérables (1862), and Lost Illusions (1837–43) great novels.

Drama suggests that if we simply hear what Ralph, Javert, David, and Lucien say to other people and watch what they do, we shall understand their fates. The novel adds: For full understanding, we must also know how they think and feel, as well as how they are enmeshed in “the daily island life.” In short, it adds the most productive parts of psychoanalysis and Marxism to the historical mix.

The twentieth century’s particular refinement on these exploratory and explanatory novelistic structures, from Proust and James to Joyce and Woolf, was that, in the lives of real people, all these elements were now further granulated across the individual play of swirling subjectivity, either dramatically, through artfully rendered stream-of-consciousness techniques (as in Woolf and Joyce), or through precise analysis (as in James and even more so in Proust), on an even more nuanced, more complex level. By adding the focus on the subjective, however, such writers do not forget the social.

If one or more (or indeed all) of the characters in a story are unaware of the sociohistorical levels that contour where they are and the choices they have open to them in the world, it doesn’t particularly matter. But, as the writer is less and less aware of these sociohistorical levels in the course of structuring her or his tale (that is, when the structure of the story does not carry us through a set of incidents, places, and descriptions that, apart from or in conjunction with the “plot,” help explain those positions and those choices), the tale seems thinner and thinner, regardless of its subjective density.

To generalize all this and say that fiction that is unaware of the historical dimensions, both of the genre and of the aspects of life it chooses to portray, tends to be thin and relatively uninteresting sounds hopelessly high-falutin’, even arrogant. But there it is. Certainly this is the failing of the “sin and sex in the suburbs” genre, which over the sixties, seventies, and eighties produced such a memorable amount of unmemorable writing. Its plots so rarely moved the characters through any situations that allowed the characters (or the readers) to see what had stalled these characters in that landscape, or what was preventing them from leaving it, or why they could not transform it into something more humanly satisfactory. Similarly it is the major failing of the genre that has come largely to replace it through the culture of university creative writing and MFA programs: “sin and sex in graduate school,” where, in story after story, the characters never consider the absurdly low exploitative salaries they are actually teaching for, how they supplement those salaries into the possibility of living, what they hope to achieve through the sacrifice, and what in all likelihood the overwhelming majority will actually achieve—and the discrepancies between vision and actuality.

The non-high-falutin’ way to say it is to point out that from the beginning of fiction as we know it, the basic way to produce a richly interesting fictive situation is to take a person from one social stratum and carefully observe him or her having to learn to deal with folks from another, either up or down the social ladder: the bourgeois young man who must learn how to live and work among sailors (Kipling’s Captains Courageous, 1897) or the poor working-class fellow who must learn to negotiate society (Jack London’s Martin Eden, 1909); Becky Sharp’s social rise from impoverished orphan to society’s heights in Vanity Fair (1848) or Odette de Crécy’s rise from demi-mondaine, through a stint as the cultured Swann’s mistress, till finally, in old age, she becomes the mistress of the Duc de Guermantes, which provides the running story thread through the grand tapestry of Remembrance of Things Past* (1913–27). Fiction feels most like fiction when it cleaves most closely to such situations—and, as its stories stray further and further from such interclass encounters, it feels thinner and thinner.

Another way to sum up much of what we have said above is another unhappy truth:

One way or the other, directly or indirectly, good fiction tends to be about money.

Whether directly or indirectly, most fiction is about the effects of having it or of not having it, the tensions caused between people used to having more of it or less of it, or even, sometimes, the money it takes to write the fiction itself, if not to live it. Supremely, it’s about the delusions the having of it or the not having of it force us to assume in order to go on. Like Robert Graves’s famous and equally true statement about poetry, however (“All true poetry is about love, death, or the changing of the seasons”), the generality ends up undercutting its interest. Like Graves’s statement, one either recognizes its truth or one doesn’t. Both need to be acknowledged. Neither needs to be dwelled on.

Probably I am drawn to such overgeneralizations—“All true poetry is about love, death, and changing of the seasons,” “All good fiction is about money”—because I am not a poet, and not (primarily) a writer of realistic fiction. Thus I like statements that do a lot of critical housekeeping for me—possibly, certain poets or fiction writers might argue, too much to be useful.

“All good fiction is about money” probably appeals to me because, while I acknowledge the necessity of the economic register in the rich presentation of social life (like Forster’s necessity for some story if we are to recognize the text as fiction at all), the economic is, nevertheless, not the most interesting thing to me as a reader personally (in the same way that story is not the most interesting thing either to Forster or to me). But stories that never address money or the process by which we acquire it—if not directly then indirectly—are usually stillborn.

As far as all fiction’s being about money goes, the good news is that over the last three hundred years so many indirect strategies have been developed to indicate the money that controls the fiction that often the reader—sometimes even the writer—is not aware of the way the monetary grounding that functions to elicit the fictive “truth effect” is actually present in her or his tales. Still, I think it’s better to know than to gamble on its happening. When the writer doesn’t know, and can’t provide such information directly or indirectly, allowing the reader to sense the economic underpinning of the tale through the representation of work or otherwise, the fiction usually registers on the reader as thin or lacking in staying power.

The better news is that, regardless of guidelines people writing about it lay down—guidelines that I or my students have from time to time found useful—they are only guidelines. There are no rules. The truth is, fiction can be about anything. I don’t believe the best of it changes the world directly—though many people felt that works such as Uncle Tom’s Cabin (1852) and Les Misérables (1862) were pretty effective in their day. (When President Lincoln was introduced to Harriet Beecher Stowe, he reputedly met her with the words, “So this is the little lady who made this big war!” And the popularity of Hugo’s novel is often counted as influencing many people to support late-nineteenth-century welfare reforms.) One of New York’s historical public catastrophes, resulting in twenty-three deaths and over a hundred wounded, the Astor Place Riot of 1849 was sparked by two rival productions of Macbeth, playing in New York on the same night, in theaters half a dozen blocks apart, one starring the American Edwin Forrest and the other featuring the Englishman William Macready. Again, art no longer functions in society the way it once did: it functions in different ways. And it can help people understand how those who live and think in ways different from their own can manage to make sense of the world. The pleasures from writing fiction—and even more, the pleasures from reading it—easily become addictions. Some of the guidelines above may, I believe, have something to do with why our society continues to organize itself so that such addictions are not only common and continuous but often flower in such wonderful ways, ways that manifest themselves in provocative and satisfying stories and novels across the range of genres, literary and paraliterary.

—Buffalo, Philadelphia,

Boulder, and New York

July 2000–April 2005

Partial List of Works Cited

Artaud, Antonin. “No More Masterpieces.” In The Theater and Its Double. New York: Grove Press, 1958.

Barzun, Jacques. Simple and Direct: A Rhetoric for Writers. Rev. ed. Chicago: University of Chicago Press, 1994.

Borges, Jorge Luis. This Craft of Verse. Cambridge: Harvard University Press, 1998.

Carver, Raymond. “On Writing.” In Fires. New York: Vintage Books, 1983.

Egri, Lajos. The Art of Dramatic Writing. New York: Simon & Schuster, 1972.

Forster, E. M. Aspects of the Novel. New York: Harcourt Brace Jovanovich, 1927.

Hofmannsthal, Hugo von. The Lord Chandos Letter. Translated by Russell Stockman. 1902. Marlboro, Vt.: Marlboro Press, 1986.

Lacan, Jacques. “Of Structure as an Inmixing of an Otherness Prerequisite to Any Subject Whatsoever.” In The Structuralist Controversy: The Languages of Criticism and the Sciences of Man. Edited by Richard Macksey and Eugenio Donato. Baltimore: Johns Hopkins University Press, 1972.

Orwell, George. “Politics and the English Language.” In In Front of Your Nose (1945–1950): The Collected Journalism, Essays, and Letters of George Orwell, vol. 4, edited by Sonia Orwell and Ian Angus, 127–40. New York: Harcourt Brace Jovanovich, 1968.

Pound, Ezra. ABC of Reading. New York: New Directions, 1934.

Stein, Gertrude. The Autobiography of Alice B. Toklas. In Selected Writings of Gertrude Stein. 1933. New York: Vintage Books, 1975.

———. Lectures in America. 1935. New York: Vintage Books, 1975.

_________

* At Johns Hopkins University, in October 1966, the French psychiatrist Jacques Lacan discussed this all-important signification process in a paper entitled “Of Structure as an Inmixing of an Otherness Prerequisite to Any Subject Whatsoever” (reprinted in The Structuralist Controversy). This “inmixing,” this “intertrusion,” this “conjoining,” is what allows scenes, sentences, and even words to signify.

* More recently translated as In Search of Lost Time.
