READING SUBTLY

This was the domain of my Blogger site from 2009 to 2018, when I moved to this domain and started The Storytelling Ape. The search option should help you find any of the old posts you're looking for.

Dennis Junk

Too Psyched for Sherlock: A Review of Maria Konnikova’s “Mastermind: How to Think like Sherlock Holmes”—with Some Thoughts on Science Education

Maria Konnikova’s book “Mastermind: How to Think Like Sherlock Holmes” got me really excited because when psychology is brought up in discussions of literature at all, it’s usually the pseudoscience of Sigmund Freud. Konnikova, whose blog went a long way toward remedying that tragedy, set out to offer an alternative approach. Though the book shows great promise, however, it’s ultimately disappointing.

Whenever he gets really drunk, my brother has the peculiar habit of reciting the plot of one or another of his favorite shows or books. His friends and I like to tease him about it—“Watch out, Dan’s drunk, nobody mention The Wire!”—and the quirk can certainly be annoying, especially if you’ve yet to experience the story first-hand. But I have to admit, given how blotto he usually is when he first sets out on one of his grand retellings, his ability to recall intricate plotlines right down to their minutest shifts and turns is extraordinary. One recent night, during a timeout in an epic shellacking of Notre Dame’s football team, he took up the tale of Django Unchained, which incidentally I’d sat next to him watching just the week before. Tuning him out, I let my thoughts shift to a post I’d read on The New Yorker’s cinema blog The Front Row.

            In “The Riddle of Tarantino,” film critic Richard Brody analyzes the director-screenwriter’s latest work in an attempt to tease out the secrets behind the popular appeal of his creations and to derive insights into the inner workings of his mind. The post is agonizingly—though also at points, I must admit, exquisitely—overwritten, almost a parody of the grandiose type of writing one expects to find within the pages of the august weekly. Bemused by the lavish application of psychoanalytic jargon, I finished the essay pitying Brody for, in all his writerly panache, having nothing of real substance to say about the movie or the mind behind it. I wondered if he knows the scientific consensus on Freud is that his influence is less in the line of, say, a Darwin or an Einstein than of an L. Ron Hubbard.

            What Brody and my brother have in common is that they were both moved enough by their cinematic experience to feel an urge to share their enthusiasm, complicated though that enthusiasm may have been. Yet they both ended up doing the story a disservice, succeeding less in celebrating the work than in blunting its impact. Listening to my brother’s rehearsal of the plot with Brody’s essay in mind, I wondered what better field there could be than psychology for affording enthusiasts discussion-worthy insights to help them move beyond simple plot references. How tragic, then, that the only versions of psychology on offer in educational institutions catering to those who would be custodians of art, whether in academia or on the mastheads of magazines like The New Yorker, are those in thrall to Freud’s cultish legacy.

There’s just something irresistibly seductive about the promise of a scientific paradigm that allows us to know more about another person than he knows about himself. In this spirit of privileged knowingness, Brody faults Django for its lack of moral complexity before going on to make a silly accusation. Watching the movie, you know who the good guys are, who the bad guys are, and who you want to see prevail in the inevitably epic climax. “And yet,” Brody writes,

the cinematic unconscious shines through in moments where Tarantino just can’t help letting loose his own pleasure in filming pain. In such moments, he never seems to be forcing himself to look or to film, but, rather, forcing himself not to keep going. He’s not troubled by representation but by a visual superego that restrains it. The catharsis he provides in the final conflagration is that of purging the world of miscreants; it’s also a refining fire that blasts away suspicion of any peeping pleasure at misdeeds and fuses aesthetic, moral, and political exultation in a single apotheosis.

The strained stateliness of the prose provides a ready distraction from the stark implausibility of the assessment. Applying Occam’s Razor rather than Freud’s at once insanely elaborate and absurdly reductionist ideology, we might guess that what prompted Tarantino to let the camera linger discomfortingly long on the violent misdeeds of the black hats is that he knew we in the audience would be anticipating that “final conflagration.”

The more outrageous the offense, the more pleasurable the anticipation of comeuppance—but the experimental findings that support this view aren’t covered in film or literary criticism curricula, mired as they are in century-old pseudoscience.

I’ve been eagerly awaiting the day when scientific psychology supplants psychoanalysis (as well as other equally, if not more, absurd ideologies) in academic and popular literary discussions. Coming across the blog Literally Psyched on Scientific American’s website about a year ago gave me a great sense of hope. The tagline, “Conceived in literature, tested in psychology,” as well as the credibility conferred by the host site, promised that the most fitting approach to exploring the resonance and beauty of stories might be undergoing a long overdue renaissance, liberated at last from the dominion of crackpot theorists. So when the author, Maria Konnikova, a doctoral candidate at Columbia, released her first book, I made a point to have Amazon deliver it as early as possible.

Mastermind: How to Think Like Sherlock Holmes does indeed follow the conceived-in-literature-tested-in-psychology formula, taking the principles of sound reasoning expounded by what may be the most recognizable fictional character in history and attempting to show how modern psychology proves their soundness. In what she calls a “Prelude” to her book, Konnikova explains that she’s been a Holmes fan since her father read Conan Doyle’s stories to her and her siblings as children.

The one demonstration of the detective’s abilities that stuck with Konnikova the most comes when he explains to his companion and chronicler Dr. Watson the difference between seeing and observing, using as an example the number of stairs leading up to their famous flat at 221B Baker Street. Watson, naturally, has no idea how many stairs there are because he isn’t in the habit of observing. Holmes, preternaturally, knows there are seventeen steps. Ever since being made aware of Watson’s—and her own—cognitive limitations through this vivid illustration (which had a similar effect on me when I first read “A Scandal in Bohemia” as a teenager), Konnikova has been trying to find the secret to becoming a Holmesian observer as opposed to a mere Watsonian seer. Already in these earliest pages, we encounter some of the principal shortcomings of the strategy behind the book. Konnikova wastes no time on the question of whether a mindset oriented toward things like the number of stairs in your building has any actual advantages—with regard to solving crimes or to anything else—but rather assumes old Sherlock is saying something instructive and profound.

Mastermind is, for the most part, an entertaining read. Its worst fault in the realm of simple page-by-page enjoyment is that Konnikova often belabors points that upon reflection expose themselves as mere platitudes. The overall theme is the importance of mindfulness—an important message, to be sure, in this age of rampant multitasking. But readers get more endorsement than practical instruction. You can only be exhorted to pay attention to what you’re doing so many times before you stop paying attention to the exhortations. The book’s problems in both the literary and psychological domains, however, are much more serious. I came to the book hoping it would hold some promise for opening the way to more scientific literary discussions by offering at least a glimpse of what they might look like, but while reading I came to realize there’s yet another obstacle to any substantive analysis of stories. Call it the TED effect. For anything to be read today, or for anything to get published for that matter, it has to promise to uplift readers, reveal to them some secret about how to improve their lives, help them celebrate the horizonless expanse of human potential.

Naturally enough, with the cacophony of competing information outlets, we all focus on the ones most likely to offer us something personally useful. Though self-improvement is a worthy endeavor, the overlooked corollary to this trend is that the worthiness intrinsic to enterprises and ideas is overshadowed and diminished. People ask what’s in literature for me, or what can science do for me, instead of considering them valuable in their own right—and instead of thinking, heaven forbid, we may have a duty to literature and science as institutions serving as essential parts of the foundation of civilized society.

In trying to conceive of a book that would operate as a vehicle for her two passions, psychology and Sherlock Holmes, while at the same time catering to readers’ appetite for life-enhancement strategies and spiritual uplift, Konnikova has produced a work in the grip of a bewildering and self-undermining identity crisis. The organizing conceit of Mastermind is that, just as Sherlock explains to Watson in the second chapter of A Study in Scarlet, the brain is like an attic. For Konnikova, this means the mind is in constant danger of becoming cluttered and disorganized through carelessness and neglect. That this interpretation wasn’t what Conan Doyle had in mind when he put the words into Sherlock’s mouth—and that the meaning he actually had in mind has proven to be completely wrong—doesn’t stop her from making her version of the idea the centerpiece of her argument. “We can,” she writes,

learn to master many aspects of our attic’s structure, throwing out junk that got in by mistake (as Holmes promises to forget Copernicus at the earliest opportunity), prioritizing those things we want to and pushing back those that we don’t, learning how to take the contours of our unique attic into account so that they don’t unduly influence us as they otherwise might. (27)

This all sounds great—a little too great—from a self-improvement perspective, but the attic metaphor is Sherlock’s explanation for why he doesn’t know the earth revolves around the sun and not the other way around. He states quite explicitly that he believes the important point of similarity between attics and brains is their limited capacity. “Depend upon it,” he insists, “there comes a time when for every addition of knowledge you forget something that you knew before.” Note here his topic is knowledge, not attention.

It is possible that a human mind could reach and exceed its storage capacity, but the way we usually avoid this eventuality is that memories that are seldom referenced are forgotten. Learning new facts may of course exhaust our resources of time and attention. But the usual effect of acquiring knowledge is quite the opposite of what Sherlock suggests. In the early 1990s, a research team led by Patricia Alexander demonstrated that having background knowledge in a subject area actually increased participants’ interest in and recall for details in an unfamiliar text. One of the most widely known demonstrations of this effect is the finding that chess experts have much better recall for the positions of pieces on a board than novices. However, Sherlock was worried about information outside of his area of expertise. Might he have a point there?

The problem is that Sherlock’s vocation demands a great deal of creativity, and it’s never certain at the outset of a case what type of knowledge may be useful in solving it. In the story “The Lion’s Mane,” he relies on obscure information about a rare species of jellyfish to wrap up the mystery. Konnikova cites this as an example of “The Importance of Curiosity and Play.” She goes on to quote Sherlock’s endorsement for curiosity in The Valley of Fear: “Breadth of view, my dear Mr. Mac, is one of the essentials of our profession. The interplay of ideas and the oblique uses of knowledge are often of extraordinary interest” (151). How does she account for the discrepancy? Could Conan Doyle’s conception of the character have undergone some sort of evolution? Alas, Konnikova isn’t interested in questions like that. “As with most things,” she writes about the earlier reference to the attic theory, “it is safe to assume that Holmes was exaggerating for effect” (150). I’m not sure what other instances she may have in mind—it seems to me that the character seldom exaggerates for effect. In any case, he was certainly not exaggerating his ignorance of Copernican theory in the earlier story.

If Konnikova were simply privileging the science at the expense of the literature, the measure of Mastermind’s success would be in how clearly the psychological theories and findings are laid out. Unfortunately, her attempt to stitch science together with pronouncements from the great detective often leads to confusing tangles of ideas. Following her formula, she prefaces one of the few example exercises from cognitive research provided in the book with a quote from “The Crooked Man.” After outlining the main points of the case, she writes,

How to make sense of these multiple elements? “Having gathered these facts, Watson,” Holmes tells the doctor, “I smoked several pipes over them, trying to separate those which were crucial from others which were merely incidental.” And that, in one sentence, is the first step toward successful deduction: the separation of those factors that are crucial to your judgment from those that are just incidental, to make sure that only the truly central elements affect your decision. (169)

So far she hasn’t gone beyond the obvious. But she does go on to cite a truly remarkable finding that emerged from research by Amos Tversky and Daniel Kahneman in the early 1980s. People who read a description of a man named Bill suggesting he lacks imagination tended to feel it was less likely that Bill was an accountant than that he was an accountant who plays jazz for a hobby—even though the two points of information in that second description make it inherently less likely than the one point of information in the first. The same result came when people were asked whether it was more likely that a woman named Linda was a bank teller or both a bank teller and an active feminist. People mistook the two-item choice as more likely. Now, is this experimental finding an example of how people fail to sift crucial from incidental facts?
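To spell out the arithmetic the subjects were getting wrong: the probability of two attributes occurring together can never exceed the probability of either one alone, since for any attributes A and B,

P(A and B) = P(A) × P(B | A) ≤ P(A).

So if, to use purely illustrative numbers, there were a 5 percent chance that Linda is a bank teller and a 20 percent chance that a given bank teller is also an active feminist, the chance that she’s both would be 0.05 × 0.2 = 0.01—necessarily no greater than the chance that she’s a bank teller, full stop. Adding a detail to a description can only leave its probability unchanged or shrink it.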

The findings of this study are now used as evidence of a general cognitive tendency known as the conjunction fallacy. In his book Thinking, Fast and Slow, Kahneman explains how more detailed descriptions can seem more likely than shorter ones, despite the actual probabilities (in the book he discusses a student named Tom W rather than Bill). He writes,

The judgments of probability that our respondents offered, both in the Tom W and Linda problems, corresponded precisely to judgments of representativeness (similarity to stereotypes). Representativeness belongs to a cluster of closely related basic assessments that are likely to be generated together. The most representative outcomes combine with the personality description to produce the most coherent stories. The most coherent stories are not necessarily the most probable, but they are plausible, and the notions of coherence, plausibility, and probability are easily confused by the unwary. (159)

So people are confused because the less probable version is actually easier to imagine. But here’s how Konnikova tries to explain the point by weaving it together with Sherlock’s ideas:

Holmes puts it this way: “The difficulty is to detach the framework of fact—of absolute undeniable fact—from the embellishments of theorists and reporters. Then, having established ourselves upon this sound basis, it is our duty to see what inferences may be drawn and what are the special points upon which the whole mystery turns.” In other words, in sorting through the morass of Bill and Linda, we would have done well to set clearly in our minds what were the actual facts, and what were the embellishments or stories in our minds. (173)

But Sherlock is not referring to our minds’ tendency to mistake coherence for probability, the tendency that has us seeing more detailed and hence less probable stories as more likely. How could he have been? Instead, he’s talking about the importance of independently assessing the facts instead of passively accepting the assessments of others. Konnikova is fudging, and in doing so she’s shortchanging the story and obfuscating the science.

As the subtitle implies, though, Mastermind is about how to think; it is intended as a self-improvement guide. The book should therefore be judged on the likelihood that readers will come away with a greater ability to recognize and avoid cognitive biases, along with the motivation and alertness to keep applying that ability. Konnikova emphasizes throughout that becoming a better thinker is a matter of determinedly forming better habits of thought. And she helpfully provides countless illustrative examples from the Holmes canon, though some of these precepts and examples may not be as apt as she’d like. You must have clear goals, she stresses, to help you focus your attention. But the overall purpose of her book provides a great example of a vague and unrealistic end-point. Think better? In what domain? She covers examples from countless areas, from buying cars and phones to sizing up strangers we meet at a party. Sherlock, of course, is a detective, so he focuses his attention on solving crimes. As Konnikova dutifully points out, in domains other than his specialty, he’s not such a mastermind.

Mastermind works best as a fun introduction to modern psychology. But it has several major shortcomings in that domain, and these same shortcomings diminish the likelihood that reading the book will lead to any lasting changes in thought habits. Concepts are covered too quickly, organized too haphazardly, and no conceptual scaffold is provided to help readers weigh or remember the principles in context. Konnikova’s strategy is to take a passage from Conan Doyle’s stories that seems to bear on noteworthy findings in modern research, discuss that research with sprinkled references back to the stories, and wrap up with a didactic and sententious paragraph or two. Usually, the discussion begins with one of Watson’s errors, moves on to research showing we all tend to make similar errors, and then ends by admonishing us not to be like Watson. Following Kahneman’s division of cognition into two systems—one fast and intuitive, the other slower and demanding of effort—Konnikova urges us to get out of our “System Watson” and rely instead on our “System Holmes.” “But how do we do this in practice?” she asks near the end of the book,

How do we go beyond theoretically understanding this need for balance and open-mindedness and applying it practically, in the moment, in situations where we might not have as much time to contemplate our judgments as we do in the leisure of our reading?

The answer she provides: “It all goes back to the very beginning: the habitual mindset that we cultivate, the structure that we try to maintain for our brain attic no matter what” (240). Unfortunately, nowhere in her discussion of built-in biases and the correlates of creativity did she offer any step-by-step instruction on how to acquire new habits. Konnikova is running us around in circles to hide the fact that her book makes an empty promise.

Tellingly, Kahneman, whose work on biases Konnikova cites on several occasions, is much more pessimistic about our prospects for achieving Holmesian thought habits. In the introduction to Thinking, Fast and Slow, he says his goal is merely to provide terms and labels for the regular pitfalls of thinking to facilitate more precise gossiping. He writes,

Why be concerned with gossip? Because it is much easier, as well as far more enjoyable, to identify and label the mistakes of others than to recognize our own. Questioning what we believe and want is difficult at the best of times, and especially difficult when we most need to do it, but we can benefit from the informed opinions of others. Many of us spontaneously anticipate how friends and colleagues will evaluate our choices; the quality and content of these anticipated judgments therefore matters. The expectation of intelligent gossip is a powerful motive for serious self-criticism, more powerful than New Year resolutions to improve one’s decision making at work and home. (3)

The worshipful attitude toward Sherlock in Mastermind is designed to pander to our vanity, and so the suggestion that we need to rely on others to help us think is too mature to appear in its pages. The closest Konnikova comes to allowing for the importance of input and criticism from other people is when she suggests that Watson is an indispensable facilitator of Sherlock’s process because he “serves as a constant reminder of what errors are possible” (195), and because in walking him through his reasoning Sherlock is forced to be more mindful. “It may be that you are not yourself luminous,” Konnikova quotes from The Hound of the Baskervilles, “but you are a conductor of light. Some people without possessing genius have a remarkable power of stimulating it. I confess, my dear fellow, that I am very much in your debt” (196).

That quote shows one of the limits of Sherlock’s mindfulness that Konnikova never bothers to address. At times throughout Mastermind, it’s easy to forget that we probably wouldn’t want to live the way Sherlock is described as living. Want to be a great detective? Abandon your spouse and your kids, move into a cheap flat, work full-time reviewing case histories of past crimes, inject some cocaine, shoot holes in the wall of your flat where you’ve drawn a smiley face, smoke a pipe until the air is unbreathable, and treat everyone, including your best (only?) friend, with casual contempt. Conan Doyle made sure his character casts a shadow. The ideal character Konnikova holds up, with all his determined mindfulness, often bears more resemblance to Kwai Chang Caine from Kung Fu. This isn’t to say that Sherlock lacks moral complexity; readers love him because, as selfish and eccentric as he may be, he’s still so clearly a good guy. Konnikova cites an instance in which he holds off on letting the police know who committed a crime. She quotes:

Once that warrant was made out, nothing on earth would save him. Once or twice in my career I feel that I have done more real harm by my discovery of the criminal than ever he had done by his crime. I have learned caution now, and I had rather play tricks with the law of England than with my own conscience. Let us know more before we act.

But Konnikova isn’t interested in morality, complex or otherwise, no matter how central moral intuitions are to our enjoyment of fiction. The lesson she draws from this passage shows her at her most sententious and platitudinous:

You don’t mindlessly follow the same preplanned set of actions that you had determined early on. Circumstances change, and with them so does the approach. You have to think before you leap to act, or judge someone, as the case may be. Everyone makes mistakes, but some may not be mistakes as such, when taken in the context of the time and the situation. (243)

Hard to disagree, isn’t it?

To be fair, Konnikova does mention some of Sherlock’s peccadilloes in passing. And she includes a penultimate chapter titled “We’re Only Human,” in which she tells the story of how Conan Doyle was duped by a couple of young girls into believing they had photographed some real fairies. She doesn’t, however, take the opportunity afforded by this episode in the author’s life to explore the relationship between the man and his creation. She effectively says he got tricked because he didn’t do what he knew how to do—it can happen to any of us, so be careful you don’t let it happen to you. Aren’t you glad that’s cleared up? She goes on to end the chapter with an incongruous lesson about how you should think like a hunter. Maybe we should, but how exactly, and when, and at what expense, we’re never told.

Konnikova clearly has a great deal of genuine enthusiasm for both literature and science, and despite my disappointment with her first book I plan to keep following her blog. I’m even looking forward to her next book—confident she’ll learn from the negative reviews she’s bound to get on this one. The tragic blunder she made—eschewing nuanced examinations of how stories work, how people relate to characters, and how authors create them, in favor of a shallow and one-dimensional attempt at suggesting a 100-year-old fictional character somehow divined groundbreaking research findings from the end of the twentieth century and the beginning of the twenty-first—calls to mind an exchange you can watch on YouTube between Neil deGrasse Tyson and Richard Dawkins. Tyson, after hearing Dawkins speak in the way he’s known to, tries to explain why many scientists feel he’s not making the most of his opportunities to reach out to the public.

You’re professor of the public understanding of science, not the professor of delivering truth to the public. And these are two different exercises. One of them is putting the truth out there and they either buy your book or they don’t. That’s not being an educator; that’s just putting it out there. Being an educator is not only getting the truth right; there’s got to be an act of persuasion in there as well. Persuasion isn’t “Here’s the facts—you’re either an idiot or you’re not.” It’s “Here are the facts—and here is a sensitivity to your state of mind.” And it’s the facts and the sensitivity when convolved together that creates impact. And I worry that your methods, and how articulately barbed you can be, ends up being simply ineffective when you have much more power of influence than is currently reflected in your output.

Dawkins begins his response with an anecdote that shows that he’s not the worst offender when it comes to simple and direct presentations of the facts.

A former and highly successful editor of New Scientist Magazine, who actually built up New Scientist to great new heights, was asked “What is your philosophy at New Scientist?” And he said, “Our philosophy at New Scientist is this: science is interesting, and if you don’t agree you can fuck off.”

I know the issue is a complicated one, but I can’t help thinking Tyson-style persuasion too often has the opposite of its intended impact, conveying as it does the implicit message that science has to somehow be sold to the masses, that it isn’t intrinsically interesting. At any rate, I wish that Konnikova hadn’t dressed up her book with false promises and what she thought would be cool cross-references. Sherlock Holmes is interesting. Psychology is interesting. If you don’t agree, you can fuck off.

Also read

FROM DARWIN TO DR. SEUSS: DOUBLING DOWN ON THE DUMBEST APPROACH TO COMBATTING RACISM

And

THE STORYTELLING ANIMAL: A LIGHT READ WITH WEIGHTY IMPLICATIONS

And

LAB FLIES: JOSHUA GREENE’S MORAL TRIBES AND THE CONTAMINATION OF WALTER WHITE

Also apropos is

HOW VIOLENT FICTION WORKS: ROHAN WILSON’S “THE ROVING PARTY” AND JAMES WOOD’S SANGUINARY SUBLIME FROM CONRAD TO MCCARTHY

Dennis Junk

What's the Point of Difficult Reading?

For every John Galt, Tony Robbins, and Scheherazade, we may need at least half a Proust. We are still, however, left with quite a dilemma. Some authors really are just assholes who write worthless tomes designed to trick you into wasting your time. But some books that seem impenetrable on the first attempt will reward your efforts to decipher them.

You sit reading the first dozen or so pages of some celebrated classic and gradually realize that having to sort out how the ends of the long sentences fix to their beginnings is taking just enough effort to distract you entirely from the setting or character you’re supposed to be getting to know. After a handful of words you swear are made up and a few tangled metaphors you find yourself riddling over with nary a resolution, the dread sinks in. Is the whole book going to be like this? Is it going to be one of those deals where you get to what’s clearly meant to be a crucial turning point in the plot but for you is just another riddle without a solution, sending you paging back through the forest of verbiage in search of some key succession of paragraphs you spaced out while reading the first time through? Then you wonder if you’re missing some other kind of key, like maybe the story’s an allegory, a reference to some historical event like World War II or some Revolution you once had to learn about but have since lost all recollection of. Maybe the insoluble similes are allusions to some other work you haven’t read or can’t recall. In any case, you’re not getting anything out of this celebrated classic but frustration leading to the dual suspicion that you’re too ignorant or stupid to enjoy great literature and that the whole “great literature” thing is just a conspiracy to trick us into feeling dumb so we’ll defer to the pseudo-wisdom of Ivory Tower elites.

If enough people of sufficient status get together and agree to extol a work of fiction, they can get almost everyone else to agree. The readers who get nothing out of it but frustration and boredom assume that since their professors or some critic in a fancy-pants magazine or the judges of some literary award committee think it’s great they must simply be missing something. They dutifully continue reading it, parrot a few points from a review that sound clever, and afterward toe the line by agreeing that it is indeed a great work of literature, clearly, even if it doesn’t speak to them personally. For instance, James Joyce’s Ulysses, utterly nonsensical to anyone without at least a master’s degree, tops the Modern Library’s list of 100 best novels in the English language. Responding to the urging of his friends to write out an explanation of the novel, Joyce scoffed, boasting,

I’ve put in so many enigmas and puzzles that it will keep the professors busy for centuries arguing over what I meant, and that’s the only way of ensuring one’s immortality.

He was right. To this day, professors continue to love him even as Ulysses and the even greater monstrosity Finnegans Wake do nothing but bore and befuddle everyone else—or else, more fittingly, sit inert or unchecked-out on the shelf, gathering well-deserved dust.

Joyce’s later novels are not literature; they are lengthy collections of loosely connected literary puzzles. But at least his puzzles have actual solutions—or so I’m told. Ulysses represents the apotheosis of the tradition in literature called modernism. What came next, postmodernism, is even more disconnected from the universal human passion for narrative. Even professors aren’t sure what to do with it, so they simply throw their hands up, say it’s great, and explain that the source of its greatness is its very resistance to explanation. Jonathan Franzen, whose 2001 novel The Corrections represented a major departure from the postmodernism he began his career experimenting with, explained the following year in The New Yorker how he’d turned away from the tradition. He’d been reading the work of William Gaddis “as a kind of penance” (101) and not getting any meaning out of it. Of the final piece in the celebrated author’s oeuvre, Franzen writes,

The novel is an example of the particular corrosiveness of literary postmodernism. Gaddis began his career with a Modernist epic about the forgery of masterpieces. He ended it with a pomo romp that superficially resembles a masterpiece but punishes the reader who tries to stay with it and follow its logic. When the reader finally says, Hey, wait a minute, this is a mess, not a masterpiece, the book instantly morphs into a performance-art prop: its fraudulence is the whole point! And the reader is out twenty hours of good-faith effort. (111)

In other words, reading postmodern fiction means not only forgoing the rewards of narrative and taking on the more taxing endeavor of solving multiple riddles in succession; it means taking on riddles that don’t even have answers. What’s the point of reading this crap? Exactly. Get it?

You can dig deeper into the meaningless meanderings of pomos and discover there is in fact an ideology inspiring all the infuriating inanity. The super smart people who write and read this stuff point to the willing, eager complicity of the common reader in the propagation of all the lies that sustain our atrociously unjust society (but atrociously unjust compared to what?). Franzen refers to this as the Fallacy of the Stupid Reader,

wherein difficulty is a “strategy” to protect art from cooptation and the purpose of art is to “upset” or “compel” or “challenge” or “subvert” or “scar” the unsuspecting reader; as if the writer’s audience somehow consisted, again and again, of Charlie Browns running to kick Lucy’s football; as if it were a virtue in a novelist to be the kind of boor who propagandizes at friendly social gatherings. (109)

But if the author is worried about art becoming a commodity, does making the art shitty really amount to a solution? And if the goal is to make readers rethink something they take for granted, why not bring the matter up directly, or have a character wrestle with it, or have a character argue with another character about it? The sad fact is that these authors probably just suck—that, as Franzen suspects, “literary difficulty can operate as a smoke screen for an author who has nothing interesting, wise, or entertaining to say” (111).

Not all difficulty in fiction is a smoke screen though. Not all the literary emperors are naked. Franzen writes that “there is no headache like the headache you get from working harder on deciphering a text than the author, by all appearances, has worked on assembling it.” But the essay, titled “Mr. Difficult,” begins with a reader complaint sent not to Gaddis but to Franzen himself. And the reader, a Mrs. M. from Maryland, really gives him the business:

Who is it that you are writing for? It surely could not be the average person who enjoys a good read… The elite of New York, the elite who are beautiful, thin, anorexic, neurotic, sophisticated, don’t smoke, have abortions tri-yearly, are antiseptic, live in penthouses, this superior species of humanity who read Harper’s and The New Yorker. (100)

In this first part of the essay, Franzen introduces a dilemma that sets up his explanation of why he turned away from postmodernism—he’s an adherent of the “Contract model” of literature, whereby the author agrees to share, on equal footing, an entertaining or in some other way gratifying experience, as opposed to the “Status model,” whereby the author demonstrates his or her genius and if you don’t get it, tough. But his coming to a supposed agreement with Mrs. M. about writers like Gaddis doesn’t really resolve Mrs. M.’s conflict with him.

The Corrections, after all, the novel she was responding to, represents his turning away from the tradition Gaddis wrote in. (It must be said, though, that Freedom, Franzen’s next novel, is written in a still more accessible style.)

The first thing we must do to respond properly to Mrs. M. is break down each of Franzen’s models into two categories. The status model includes writers like Gaddis whose difficulty serves no purpose but to frustrate and alienate readers. But Franzen’s own type specimen for this model is Flaubert, much of whose writing, though difficult at first, rewards any effort to re-read and further comprehend with a more profound connection. So it is for countless other writers—Fitzgerald, for instance, whose Gatsby sits at number two on the Modern Library’s ranking. As for the contract model, Franzen admits,

Taken to its free-market extreme, Contract stipulates that if a product is disagreeable to you the fault must be the product’s. If you crack a tooth on a hard word in a novel, you sue the author. If your professor puts Dreiser on your reading list, you write a harsh student evaluation… You’re the consumer; you rule. (100)

Franzen, in declaring himself a “Contract kind of person,” assumes that the free-market extreme can be dismissed for its extremity. But Mrs. M. would probably challenge him on that. For many, particularly right-leaning readers, the market not only can but should be relied on to determine which books are good and which ones belong in some tiny niche. When the Modern Library conducted a readers' poll to create a popular ranking to balance the one made by experts, the ballot was stuffed by Ayn Rand acolytes and Scientologists. Mrs. M. herself leaves little doubt as to her political sympathies. For her and her fellow travelers, things like literature departments, National Book Awards—like the one The Corrections won—Nobels, and Pulitzers are all an evil form of intervention into the sacred workings of the divine free market: un-American, sacrilegious, communist. According to this line of thinking, authors aren’t much different from whores—except of course literal whoring is condemned in the Bible (except when it isn’t).

A contract with readers who score high on the personality dimension of openness to new ideas and experiences (who tend to be liberal), those who have spent a lot of time in the past reading books like The Great Gatsby or Heart of Darkness or Lolita (the horror!), those who read enough to have developed finely honed comprehension skills—that contract is going to look quite a bit different from one with readers who attend Beck University, those for whom Atlas Shrugged is the height of literary excellence. At the same time, though, the cult of self-esteem is poisoning schools and homes with the idea that suggesting that a student or son or daughter is anything other than a budding genius is a form of abuse. Heaven forbid a young person feel judged or criticized while speaking or writing. And if an author makes you feel the least bit dumb or ignorant, well, it’s an outrage—heroes like Mrs. M. to the rescue.

One of the problems with the cult of self-esteem is that anticipating criticism tends to make people more, not less, creative. And the link between low self-esteem and mental disorders is almost purely mythical. High self-esteem is correlated with school performance, but as far as researchers can tell it’s the performance causing the esteem, not the other way around. More invidious, though, is the tendency to view anything that takes a great deal of education or intelligence to accomplish as an affront to everyone less educated or intelligent. Conservatives complain endlessly about class warfare and envy of the rich—the financially elite—but they have no qualms about decrying intellectual elites and condemning them for flaunting their superior literary achievements. They see the elitist mote in the eye of Nobel laureates without noticing the beam in their own.

What’s the point of difficult reading? Well, what’s the point of running five or ten miles? What’s the point of eating vegetables as opposed to ice cream or Doritos? Difficulty need not preclude enjoyment. And discipline in the present is often rewarded in the future. It very well may be that the complexity of the ideas you’re capable of understanding is influenced by how many complex ideas you attempt to understand. No matter how vehemently true believers in the magic of markets insist otherwise, markets don’t have minds. And though an individual’s intelligence need not be fixed, a good way to ensure children never get any smarter than they already are is to make them feel fantastically wonderful about their mediocrity. We just have to hope that despite these ideological traps there are enough people out there determined to wrap their minds around complex situations depicted in complex narratives about complex people told in complex language, people who will in the process develop the types of minds and intelligence necessary to lead the rest of our lazy asses into a future that’s livable and enjoyable. For every John Galt, Tony Robbins, and Scheherazade, we may need at least half a Proust. We are still, however, left with quite a dilemma. Some authors really are just assholes who write worthless tomes designed to trick you into wasting your time. But some books that seem impenetrable on the first attempt will reward your efforts to decipher them. How do we get the rewards without wasting our time?

Dennis Junk

Can’t Win for Losing: Why There Are So Many Losers in Literature and Why It Has to Change

David Lurie’s position in Disgrace is similar to that of John Proctor in The Crucible (although this doesn’t come out nearly as much in the movie version). And it’s hard not to see feminism in its current manifestations—along with Marxism and postcolonialism—as a pernicious new breed of McCarthyism infecting academia and wreaking havoc with men and literature alike.

Ironically, the author of The Golden Notebook, celebrating its 50th anniversary this year, and considered by many a “feminist bible,” happens to be an outspoken critic of feminism. When asked in a 1982 interview with Lesley Hazelton about her response to readers who felt some of her later works were betrayals of the women whose cause she once championed, Doris Lessing replied,

What the feminists want of me is something they haven't examined because it comes from religion. They want me to bear witness. What they would really like me to say is, ‘Ha, sisters, I stand with you side by side in your struggle toward the golden dawn where all those beastly men are no more.’ Do they really want people to make oversimplified statements about men and women? In fact, they do. I've come with great regret to this conclusion.

Lessing has also been accused of being overly harsh—“castrating”—to men, too many of whom she believes roll over a bit too easily when challenged by women aspiring to empowerment. As a famous novelist, however, who would go on to win the Nobel prize in literature in 2007, she got to visit a lot of schools, and it gradually dawned on her that it wasn’t so much that men were rolling over but rather that they were being trained from childhood to be ashamed of their maleness. In a lecture she gave to the Edinburgh book festival in 2001, she said,

Great things have been achieved through feminism. We now have pretty much equality at least on the pay and opportunities front, though almost nothing has been done on child care, the real liberation. We have many wonderful, clever, powerful women everywhere, but what is happening to men? Why did this have to be at the cost of men? I was in a class of nine- and 10-year-olds, girls and boys, and this young woman was telling these kids that the reason for wars was the innately violent nature of men. You could see the little girls, fat with complacency and conceit while the little boys sat there crumpled, apologising for their existence, thinking this was going to be the pattern of their lives.

Lessing describes how the teacher kept casting glances expectant of her approval as she excoriated these impressionable children. 

Elaine Blair, in “Great American Losers,” an essay that’s equal parts trenchant and infuriatingly obtuse, describes a dynamic in contemporary fiction that’s similar to the one Lessing saw playing out in the classroom.

The man who feels himself unloved and unlovable—this is a character that we know well from the latest generation or two of American novels. His trials are often played for sympathetic laughs. His loserdom is total: it extends to his stunted career, his squalid living quarters, his deep unease in the world.

At the heart of this loserdom is his self-fulfilling conviction that women don’t like him. As opposed to men of earlier generations who felt entitled to a woman’s respect and admiration, Blair sees this modern male character as being “the opposite of entitled: he approaches women cringingly, bracing for a slap.” This desperation on the part of male characters to avoid offending women, to prove themselves capable of sublimating their own masculinity so they can be worthy of them, finds its source in the authors themselves. Blair writes,

Our American male novelists, I suspect, are worried about being unloved as writers—specifically by the female reader. This is the larger humiliation looming behind the many smaller fictional humiliations of their heroes, and we can see it in the way the characters’ rituals of self-loathing are tacitly performed for the benefit of an imagined female audience.

Blair quotes a review David Foster Wallace wrote of a John Updike novel to illustrate how conscious males writing literature today are of their female readers’ hostility toward men who write about sex and women without apologizing for liking sex and women—sometimes even outside the bounds of caring, committed relationships. Labeling Updike as a “Great Male Narcissist,” a distinction he shares with writers like Philip Roth and Norman Mailer, Wallace writes,

Most of the literary readers I know personally are under forty, and a fair number are female, and none of them are big admirers of the postwar GMNs. But it’s John Updike in particular that a lot of them seem to hate. And not merely his books, for some reason—mention the poor man himself and you have to jump back:

“Just a penis with a thesaurus.”

“Has the son of a bitch ever had one unpublished thought?”

“Makes misogyny seem literary the same way Rush [Limbaugh] makes fascism seem funny.”

And trust me: these are actual quotations, and I’ve heard even worse ones, and they’re all usually accompanied by the sort of facial expressions where you can tell there’s not going to be any profit in appealing to the intentional fallacy or talking about the sheer aesthetic pleasure of Updike’s prose.

Since Wallace is ready to “jump back” at the mere mention of Updike’s name, it’s no wonder he’s given to writing about characters who approach women “cringingly, bracing for a slap.”

Blair goes on to quote from Jonathan Franzen’s novel The Corrections, painting a plausible picture of male writers who fear not only that their books will be condemned if too misogynistic—a relative term which has come to mean "not as radically feminist as me"—but that they themselves will be rejected. In Franzen’s novel, Chip Lambert has written a screenplay and asked his girlfriend Julia to give him her opinion. She holds off doing so, however, until after she breaks up with him and is on her way out the door. “For a woman reading it,” she says, “it’s sort of like the poultry department. Breast, breast, breast, thigh, leg” (26). Franzen describes his character’s response to the critique:

It seemed to Chip that Julia was leaving him because “The Academy Purple” had too many breast references and a draggy opening, and that if he could correct these few obvious problems, both on Julia’s copy of the script and, more important, on the copy he’d specially laser-printed on 24-pound ivory bond paper for [the film producer] Eden Procuro, there might be hope not only for his finances but also for his chances of ever again unfettering and fondling Julia’s own guileless, milk-white breasts. Which by this point in the day, as by late morning of almost every day in recent months, was one of the last activities on earth in which he could still reasonably expect to take solace for his failures. (28)

If you’re reading a literary work like The Corrections, chances are you’ve at some point sat in a literature class—or even a sociology or cultural studies class—and been instructed that the proper way to fulfill your function as a reader is to critically assess the work in terms of how women (or minorities) are portrayed. Both Chip and Julia have sat through such classes. And you’re encouraged to express disapproval, even outrage if something like a traditional role is enacted—or, gasp, objectification occurs. Blair explains how this affects male novelists:

When you see the loser-figure in a novel, what you are seeing is a complicated bargain that goes something like this: yes, it is kind of immature and boorish to be thinking about sex all the time and ogling and objectifying women, but this is what we men sometimes do and we have to write about it. We fervently promise, however, to avoid the mistake of the late Updike novels: we will always, always, call our characters out when they’re being self-absorbed jerks and louts. We will make them comically pathetic, and punish them for their infractions a priori by making them undesirable to women, thus anticipating what we imagine will be your judgments, female reader. Then you and I, female reader, can share a laugh at the characters’ expense, and this will bring us closer together and forestall the dreaded possibility of your leaving me.

In other words, these male authors are the grownup versions of those poor school boys Lessing saw forced to apologize for their own existence. Indeed, you can feel this dynamic, this bargain, playing out when you’re reading these guys’ books. Blair’s description of the problem is spot on. Her theory of what caused it, however, is laughable.

Because of the GMNs, these two tendencies—heroic virility and sexist condescension—have lingered in our minds as somehow yoked together, and the succeeding generations of American male novelists have to some degree accepted the dyad as truth. Behind their skittishness is a fearful suspicion that if a man gets what he wants, sexually speaking, he is probably exploiting someone.

The dread of slipping down the slope from attraction to exploitation has nothing to do with John Updike. Rather, it is embedded in terms at the very core of feminist ideology. Misogyny, for instance, is frequently deemed an appropriate label for men who indulge in lustful gazing, even in private. And the term objectification implies that the female whose subjectivity isn’t being properly revered is the victim of oppression. The main problem with this idea—and there are several—is that the term objectification is synonymous with attraction. The deluge of details about the female body in fiction by male authors can just as easily be seen as a type of confession, an unburdening of guilt by the offering up of sins. The female readers respond by assigning the writers some form of penance—promising never to write, never even to think, like that again without flagellating themselves.

The conflict between healthy male desire and disapproving feminist prudery doesn’t just play out in the tortured psyches of geeky American male novelists. A.S. Byatt, in her Booker prize-winning novel Possession, satirizes scholars steeped in literary theory as “papery” and sterile. But the novel ends with a male scholar named Roland overcoming his theory-induced self-consciousness to initiate sex with another scholar named Maud. Byatt describes the encounter:

And very slowly and with infinite gentle delays and delicate diversions and variations of indirect assault Roland finally, to use an outdated phrase, entered and took possession of all her white coolness that grew warm against him, so that there seemed to be no boundaries, and he heard, towards dawn, from a long way off, her clear voice crying out, uninhibited, unashamed, in pleasure and triumph. (551)

The literary critic Monica Flegel cites this passage as an example of how Byatt’s old-fashioned novel features “such negative qualities of the form as its misogyny and its omission of the lower class.” Flegel is particularly appalled by how “stereotypical gender roles are reaffirmed” in the sex scene. “Maud is reduced in the end,” Flegel alleges, “to being taken possession of by her lover…and assured that Roland will ‘take care of her.’” How, we may wonder, did a man assuring a woman he would take care of her become an act of misogyny?

Perhaps critics like Flegel occupy some radical fringe; Byatt’s book was after all a huge success with audiences and critics alike, and it did win her the Booker. The novelist Martin Amis, however, isn’t one to describe his assaults as indirect. He routinely dares to feature men who actually do treat women poorly in his novels—without any authorial condemnation.

Martin Goff, the non-intervening director of the Booker Prize committee, tells the story of the 1989 controversy over whether or not Amis’s London Fields should be on the shortlist. Maggie Gee, a novelist, and Helen McNeil, a professor, simply couldn’t abide Amis’s treatment of his women characters. “It was an incredible row,” says Goff.

Maggie and Helen felt that Amis treated women appallingly in the book. That is not to say they thought books which treated women badly couldn't be good, they simply felt that the author should make it clear he didn't favour or bless that sort of treatment. Really, there was only two of them and they should have been outnumbered as the other three were in agreement, but such was the sheer force of their argument and passion that they won. David [Lodge] has told me he regrets it to this day, he feels he failed somehow by not saying, “It's two against three, Martin's on the list”.

In 2010, Amis explained his career-spanning failure to win a major literary award, despite enjoying robust book sales, thus:

There was a great fashion in the last century, and it's still with us, of the unenjoyable novel. And these are the novels which win prizes, because the committee thinks, “Well it's not at all enjoyable, and it isn't funny, therefore it must be very serious.”

Brits like Hilary Mantel and especially Ian McEwan are working to turn this dreadful trend around. But when McEwan dared to write a novel about a neurosurgeon who prevails in the end over an afflicted, less privileged tormenter, he was condemned by critic Jennifer Szalai in the pages of Harper’s Magazine for his “blithe, bourgeois sentiments.” If you’ve read Saturday, you know the sentiments are anything but blithe, and if you read Szalai’s review you’ll be taken aback by her articulate blindness.

Amis is probably right in suggesting that critics and award committees have a tendency to mistake misery for profundity. But his own case, along with several others like it, hints at something even more disturbing: a shift in the very idea of what role fictional narratives play in our lives.

The sad new reality is that, owing to the growing influence of ideologically extreme and idiotically self-righteous activist professors, literature is no longer read for pleasure and enrichment—it’s no longer even read as a challenging exercise in outgroup empathy. Instead, reading literature is supposed by many to be a ritual of male western penance. Prior to taking an interest in literary fiction, you must first be converted to the proper ideologies, made to feel sufficiently undeserving yet privileged, the beneficiary of a long history of theft and population displacement, the scion and gene-carrier of rapists and genocidaires—the horror, the horror. And you must be taught to systematically overlook and remain woefully oblivious of all the evidence that the Enlightenment was the best fucking thing that ever happened to the human species. Once you’re brainwashed into believing that so-called western culture is evil and that you’ve committed the original sin of having been born into it, you’re ready to perform your acts of contrition by reading horrendously boring fiction that forces you to acknowledge and reflect upon your own fallen state.

Fittingly, the apotheosis of this new literary tradition won the Booker in 1999, and its author, like Lessing, is a Nobel laureate. J.M. Coetzee’s Disgrace chronicles in exquisite free indirect discourse the degradation of David Lurie, a white professor in Cape Town, South Africa, beginning with his somewhat pathetic seduction of a black student, a crime for which he pays with the loss of his job, his pension, and his reputation, and moving on to the aftermath of his daughter’s rape at the hands of three black men who proceed to rob her, steal his car, douse him with spirits, and light him on fire. What’s unsettling about the novel—and it is a profoundly unsettling novel—is that its structure implies that everything that David and Lucy suffer flows from his original offense of lusting after a young black woman. This woman, Melanie, is twenty years old, and though she is clearly reluctant at first to have sex with her teacher, there’s never any force involved. At one point, she shows up at David’s house and asks to stay with him. It turns out she has a boyfriend who is refusing to let her leave him without a fight. It’s only after David unheroically tries to wash his hands of the affair to avoid further harassment from this boyfriend—while stooping so low as to insist that Melanie make up a test she missed in his class—that she files a complaint against him.

David immediately comes clean to university officials and admits to taking advantage of his position of authority. But he stalwartly refuses to apologize for his lust, or even for his seduction of the young woman. This refusal makes him complicit, the novel suggests, in all the atrocities of colonialism. As he’s awaiting a hearing to address Melanie’s complaint, David gets a message:

On campus it is Rape Awareness Week. Women Against Rape, WAR, announces a twenty-four-hour vigil in solidarity with “recent victims”. A pamphlet is slipped under his door: ‘WOMEN SPEAK OUT.’ Scrawled in pencil at the bottom is a message: ‘YOUR DAYS ARE OVER, CASANOVA.’ (43)

During the hearing, David confesses to doctoring the attendance ledgers and entering a false grade for Melanie. As the attendees become increasingly frustrated with what they take to be evasions, he goes on to confess to becoming “a servant of Eros” (52). But this confession only enrages the social sciences professor Farodia Rassool:

Yes, he says, he is guilty; but when we try to get specificity, all of a sudden it is not abuse of a young woman he is confessing to, just an impulse he could not resist, with no mention of the pain he has caused, no mention of the long history of exploitation of which this is part. (53)

There’s also no mention, of course, of the fact that David has already gone through more suffering than Melanie has, or that her boyfriend deserves a great deal of the blame, or that David is an individual, not a representative of his entire race who should be made to answer for the sins of his forefathers.

After resigning from his position in disgrace, David moves out to the country to live with his daughter on a small plot of land. The attack occurs only days after he’s arrived. David wants Lucy to pursue some sort of justice, but she refuses. He wants her to move away because she’s clearly not safe, but she refuses. She even goes so far as to accuse him of being in the wrong for believing he has any right to pronounce what happened an injustice—and for thinking it is his place to protect his daughter. And if there’s any doubt about the implication of David’s complicity, she clears it up. As he’s pleading with her to move away, they begin talking about the rapists’ motivation. Lucy says to her father,

When it comes to men and sex, David, nothing surprises me anymore. Maybe, for men, hating the woman makes sex more exciting. You are a man, you ought to know. When you have sex with someone strange—when you trap her, hold her down, get her under you, put all your weight on her—isn’t it a bit like killing? Pushing the knife in; exiting afterwards, leaving the body behind covered in blood—doesn’t it feel like murder, like getting away with murder? (158)

The novel is so engrossing and so disturbing that it’s difficult to tell what the author’s position is vis-à-vis his protagonist’s degradation or complicity. You can’t help sympathizing with David and feeling his treatment at the hands of Melanie, Farodia, and Lucy is an injustice. But are you supposed to question that feeling in light of the violence Melanie is threatened with and Lucy is subjected to? Are you supposed to reappraise altogether your thinking about the very concept of justice in light of the atrocities of history? Are we to see David Lurie as an individual or as a representative of western male colonialism, deserving of whatever he’s made to suffer and more?

Personally, I think David Lurie’s position in Disgrace is similar to that of John Proctor in The Crucible (although this doesn’t come out nearly as much in the movie version). And it’s hard not to see feminism in its current manifestations—along with Marxism and postcolonialism—as a pernicious new breed of McCarthyism infecting academia and wreaking havoc with men and literature alike. It’s really no surprise that the most significant developments in the realm of narratives lately haven’t occurred in novels at all. Insofar as the cable series contributing to the new golden age of television can be said to adhere to a formula, it’s this: begin with a badass male lead who doesn’t apologize for his own existence and has no qualms about expressing his feelings toward women. As far as I know, these shows are just as popular with women viewers as they are with the guys.

When David first arrives at Lucy’s house, they take a walk and he tells her a story about a dog he remembers from a time when they lived in a neighborhood called Kenilworth.

It was a male. Whenever there was a bitch in the vicinity it would get excited and unmanageable, and with Pavlovian regularity the owners would beat it. This went on until the poor dog didn’t know what to do. At the smell of a bitch it would chase around the garden with its ears flat and its tail between its legs, whining, trying to hide…There was something so ignoble in the spectacle that I despaired. One can punish a dog, it seems to me, for an offence like chewing a slipper. A dog will accept the justice of that: a beating for a chewing. But desire is another story. No animal will accept the justice of being punished for following its instincts.

Lucy breaks in, “So males must be allowed to follow their instincts unchecked? Is that the moral?” David answers,

No, that is not the moral. What was ignoble about the Kenilworth spectacle was that the poor dog had begun to hate its own nature. It no longer needed to be beaten. It was ready to punish itself. At that point it would be better to shoot it.

“Or have it fixed,” Lucy offers. (90)

Also Read:

SABBATH SAYS: PHILIP ROTH AND THE DILEMMAS OF IDEOLOGICAL CASTRATION

And:
FROM DARWIN TO DR. SEUSS: DOUBLING DOWN ON THE DUMBEST APPROACH TO COMBATTING RACISM


Taking the GRE again after 10 Years

I aced the verbal reasoning section of the GRE the first time I took it. That ended up being not the worst thing that ever happened to me, but… distracting. Eleven years later, I had to take the test again to start trying to make my way back into school. How could I compete with my earlier perfection?

            I had it timed: if I went to the bathroom at 8:25, I’d be finishing up the essay portion of the test about ten minutes after my bladder was full again. Caffeine being essential for me to get into the proper state of mind for writing, I’d woken up to three cans of Diet Mountain Dew and two and a half rather large cups of coffee. I knew I might not get called in to take the test precisely at 8:30, but I figured I could handle the pressure, as it were. The clock in the office of the test center read 8:45 when I walked in. Paperwork, signatures, getting a picture taken, turning out all my pockets (where I managed to keep my three talismans concealed)—by the time I was sitting down in the carrel—in a room that might serve as a meeting place for prisoners and their lawyers—it was after 9:00. And there were still more preliminaries to go through.

            Test takers are allotted 45 minutes for an essay on the “Issue Topic” prompted by a short quote. The “Analysis of an Argument” essay takes a half hour. The need to piss got urgent with about ten minutes left on the clock for the issue essay. By the end of the second essay, I was squirming and dancing and pretty desperate. Of course, I had to wait for our warden to let me out of the testing room. And then I had to halt midway through the office to come back and sign myself out. Standing at the urinal—and standing and standing—I had plenty of time to consider how poorly designed my strategy had been. I won’t find out my scores for the essay portion for ten or so days.

**********************************

            I’ve been searching my apartment for the letter with my official scores from the first time I took the GRE about ten years ago. I’d taken it near the end of the summer, at one of those times in life of great intellectual awakening. With bachelor’s degrees in both anthropology and psychology, and with only the most inchoate glimmerings of a few possible plans for the future, I lived in my dad’s enormous house with some roommates and my oldest brother, who had returned after graduating from Notre Dame and was now taking graduate courses at IPFW, my alma mater. I delivered pizzas in the convertible Mustang I bought as a sort of hand-me-down from that same brother. And I spent hours every day reading.

            I’m curious about the specific date of the test because it would allow me to place it in the context of what I was reading. It would also help me ascertain the amount of time I spent preparing. If memory serves, I was doing things like poring over various books by Stephen Jay Gould and Richard Dawkins, trying to decide which one of them knew the real skinny on how evolution works. I think by then I’d read Frank Sulloway’s Born to Rebel, in which he applied complex statistics to data culled from historical samples and concluded that later-born siblings tend to be less conscientious but more open to new ideas and experiences. I was delighted to hear that the former president had read Jared Diamond’s Guns, Germs, and Steel, and thought it tragically unimaginable that the current president would ever read anything like that. At some point, I began circling words I didn’t recognize or couldn’t define so that when I was finished with the chapter I could look them up and make a few flashcards.

            I’m not even sure the flashcards were in anticipation of the GRE. Several of my classmates in both the anthropology and psychology departments had spoken to me by then of their dejection upon receiving their scores. I was scared to take it. The trend seemed to be that everyone was getting about a hundred points less on this test than they did on the SAT. I decided I only really cared about the verbal reasoning section, and a 620 on that really wasn’t acceptable. Beyond the flashcards, I got my hands on a Kaplan CD-ROM from a guy at school and started doing all the practice tests on it. The scores it gave me hovered in the mid-600s. It also gave me scads of unfamiliar words (like scad) to put in my stack of flashcards, which grew, ridiculously, to the height of about a foot.

            I don’t remember much about the test itself. It was at a Sylvan Learning Center that closed a while back. One of the reading comprehension excerpts was on chimpanzees, which I saw as a good sign. When I was done, there was a screen giving me a chance to admit I cheated. It struck me as odd. Then came the screen with my scores—800 verbal reasoning. I looked around the room and saw nothing but the backs of silent test-takers. Could this be right? I never ace anything. It sank in when I was sitting down in the Mustang. Driving home on I-69, I sang along to “The Crush” by Dave Matthews, elated.

            I got accepted into MIT’s program in science writing based on that score and a writing sample in which I defended Frank Sulloway’s birth order theory against Judith Rich Harris, the author of The Nurture Assumption, another great book. But Harris’s arguments struck me as petty and somewhat disgraceful. She was engaging in something akin to a political campaign against a competing theory, rather than making a good faith effort to discover the truth. Anyway, the article I wrote got long and unwieldy. Michael Shermer considered it for publication in Skeptic but ultimately declined because I just didn’t have my chops up when it came to writing about science. By then, I was a writer of fiction.

            That’s why, upon discovering how expensive a year in Cambridge would be and how little financial aid I’d be getting, I declined MIT's invitation to attend their program. If being a science writer was my dream, I’d have gone. But I decided to hold out for an acceptance to an MFA program in creative writing. I’d already applied two years in a row before stretching my net to include science writing. But the year I got accepted at MIT ended up being the third year of summary rejection on the fiction front. I had one more year before that perfect GRE score expired.

**********

            Year four went the same way all the other years had gone. I was in my late twenties now and had the feeling whatever opportunities were once open to me had slipped away. Next came a crazy job at a restaurant—Lucky’s—and a tumultuous relationship with the kitchen manager. After I had to move out of the apartment I shared with her in the wake of our second breakup (there would be a third), I was in a pretty bad place. But I made the smartest decision I’d made in a while and went back to school to get my master’s in English at IPFW.

            The plan was to improve my qualifications for creative writing programs. And now that I’m nearly finished with the program, I put re-taking the GRE at the top of my list of things to do this summer. In the middle of May, I registered to take it on June 22nd. I’d been dreading it ever since my original score expired, but now I was really worried. What would it mean if I didn’t get an 800 again? What if I got significantly lower than that? The MFA programs I’ll be applying to are insanely competitive: between five hundred and a thousand applicants for less than a dozen spaces. At the same time, though, there was a sense that a lower score would serve as the perfect symbol for just how far I’d let my life go off-track.

            Without much conscious awareness of what I was doing, I started playing out a Rocky narrative, or some story like Muhammad Ali making his comeback after losing his boxing license for refusing to serve in Vietnam. I would prove I wasn’t a has-been, that whatever meager accomplishments I had under my belt weren’t flukes. Last semester I wrote a paper on how to practice to be creative, and one of the books I read for it was K. Anders Ericsson’s The Road to Excellence. So, after signing up for the test, I created a regimen of what Ericsson calls “deliberate practice,” based on anticipation and immediate feedback. I got my hands on as many sample items and sample tests as I could find. I made little flashcards with the correct answers on them so the feedback would follow as closely as possible on each hazarded answer. I put hours and hours into it. And I came up with a strategy for each section, and for every possible contingency I could think of. I was going to beat the GRE, again, through sheer force of will.
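For anyone curious about the mechanics of that regimen, the loop is simple enough to sketch in a few lines of Python. The sketch below is only a toy illustration, with an invented word list and prompts rather than anything taken from actual prep software, but it shows the two ingredients Ericsson emphasizes: a hazarded answer followed by immediate feedback, with the misses shuffled back in for another pass.

# A toy sketch of an immediate-feedback vocabulary drill; the words,
# definitions, and prompts are all hypothetical, invented for illustration.
import random

cards = {
    "laconic": "using few words; terse",
    "obviate": "to make unnecessary; to anticipate and prevent",
    "scad": "a large number or quantity",
}

def drill(cards):
    """Quiz each word in random order, revealing the answer immediately."""
    missed = []
    items = list(cards.items())
    random.shuffle(items)
    for word, definition in items:
        input("Define '{}' (press Enter to check): ".format(word))
        print("  -> " + definition)  # feedback follows the hazarded answer at once
        if input("  Did you have it? (y/n): ").strip().lower() != "y":
            missed.append(word)  # misses go back in the stack, like re-shuffled flashcards
    return missed

if __name__ == "__main__":
    still_missing = drill(cards)
    if still_missing:
        print("Shuffle these back in:", ", ".join(still_missing))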

***********

            The order of the sections is variable. Ideally, the verbal section would have come right after the essays so I wouldn’t have to budget my stores of concentration. But sitting down again after relieving my bladder, I saw the quantitative section appear before me on the screen. Oh well, I planned for this too, I thought. I adhered pretty well to my strategy of working for a certain length of time to see if I could get the answer and then guessing if it didn’t look promising. And I achieved my goal for this section by not embarrassing myself. I got a 650.

            The trouble began almost immediately when the verbal questions started coming. The strategy for doing analogies, the questions I most often missed in practice, was to work out the connection between the top words, “the bridge,” before considering the five word pairs below to see which one has the same bridge. But because the screen was so large, and because I was still jittery from the caffeine, I couldn’t read the first word pair without seeing all the others. I abandoned the strategy with the first question.

            Then disaster struck. I’d anticipated only two sets of reading comprehension questions, but then, with the five-minute warning already having passed, another impossibly long blurb appeared. I resigned myself at that point to having to give up my perfect score. I said to myself, “Just read it quick and give the best answers you can.” I finished the section with about twenty seconds left. At least all the antonyms had been easy. Next came an experimental section I agreed to take since I didn’t need to worry about flagging concentration anymore. For the entire eighteen minutes it took, I sat there feeling completely defeated. I doubt my answers for that section will be of much use.

            Finally, I was asked if I wanted to abandon my scores—a ploy, I’m sure, to get skittish people to pay to take the test twice. I said no, and clicked to see and record my scores. There it was at the top of the screen, my 800. I’d visualized the moment several times. I was to raise one arm in victory—but I couldn’t because the warden would just think I was raising my hand to signal I needed something. I also couldn’t because I didn’t feel victorious. I still felt defeated. I was sure all the preparation I’d done had been completely pointless. I hadn’t boxed. I’d clenched my jaw, bunched up my fist, and brawled.

            I listened to “The Crush” on the way home again, but as I detoured around all the construction downtown I wasn’t in a celebratory mood. I wasn’t elated. I was disturbed. The experience hadn’t been at all like a Rocky movie. It was a lot more like Gattaca. I’d come in, had my finger pricked so they could read my DNA, and had the verdict delivered to me. Any score could have come up on the screen. I had no control over it. That it turned out to be the one I was after was just an accident. A fluke.

**************

            The week before I took the test, I’d met a woman at Columbia Street who used to teach seventh graders. After telling her I taught Intro Comp at IPFW, we discussed how teaching is a process of translation from how you understand something into a language that will allow others who lack your experience and knowledge to understand it. Then you have to add some element of entertainment so you don’t lose their attention. The younger the students, the more patience it takes to teach them. Beginning when I was an undergrad working in the Writing Center, and picking up pace as I got more and more experience as a TA, the delight I used to take in my own cleverness was gradually superseded by a nagging doubt that I could ever pass along the method behind it to anyone.

            When you’re young (or conservative), it’s easy to look at people who don’t do as well as you with disdain, as if it’s a moral failing on their part. You hold the conviction deep in your gut that if they merely did what you’ve done they’d have what you have or know what you know. Teaching disabuses you of this conviction (which might be why so many teachers are liberal). How many times did I sit with a sharp kid in the writing center trying to explain some element of college writing to him or her, trying to think back to how I had figured it out, and realizing either that I’d simply understood it without much effort or arrived at an understanding through a process that had already failed this kid? You might expect such a realization would make someone feel really brilliant. But in fact it’s humbling. You wonder how many things there are, fascinating things, important things, that despite your own best effort you’ll never really get. Someone, for instance, probably “just gets” how to relay complex information to freshman writers—just gets teaching.

            And if, despite your efforts, you’re simply accorded a faculty for perceiving this or understanding that, then should you ever lose it, your prospects for recreating the same magic are dismal. What can be given can be taken away. Finally, there’s the question of desert. That I can score an 800 on the verbal reasoning section of the GRE is not tied to my effort or to my will. I like to read, always have. It’s not work to me. My proficiency is morally arbitrary. And yet everyone will say about my accomplishments and accolades, “You deserve it.”

            Really, though, this unsettled feeling notwithstanding, this is some stupid shit to complain about. I aced the GRE—again. It’s time to celebrate.

Also read:

GRACIE - INVISIBLE FENCES

SECRET DANCERS

THE GHOST HAUNTING 710 CROWDER COURT

KAYAKING ON A WORMHOLE


How to Read Stories--You're probably doing it wrong

Your efforts to place each part into the context of the whole will, over time, as you read more stories, give you a finer appreciation for the strategies writers use to construct their work, one scene or one section at a time. And as you try to anticipate the parts to come from the parts you’ve read you will be training your mind to notice patterns, laying down templates for how to accomplish the types of effects—surprise, emotional resonance, lyricism, profundity—the author has accomplished.

There are whole books out there about how to read like a professor or a writer, or how to speed-read and still remember every word. For the most part, you can discard all of them. Studies have shown speed readers are frauds—the faster they read the less they comprehend and remember. The professors suggest applying the wacky theories they use to write their scholarly articles, theories which serve to cast readers out of the story into some abstract realm of symbols, psychological forces, or politics. I find the endeavor offensive.

Writers writing about how to read like a writer are operating in good faith. They just tend to be a bit deluded. Literature is very much like a magic trick, but of course it’s not real magic. They like to encourage people to stand in awe of great works and great passages—something I frankly don’t need any encouragement to do (what is it about the end of “Mr. Sammler’s Planet”?). But to get to those mystical passages you have to read a lot of workaday prose, even in the work of the most lyrical and crafty writers. Awe simply can’t be used as a reading strategy.

Good fiction is like a magic trick because it’s constructed of small parts that our minds can’t help responding to holistically. We read a few lines and all of a sudden we have a person in mind; after a few pages we find ourselves caring about what happens to this person. Writers often avoid talking about the trick and the methods and strategies that go into it because they’re afraid that once the mystery is gone the trick will cease to convince. But even good magicians will tell you that well-performed routines frequently astonish even the one performing them. Focusing on the parts does not diminish appreciation for the whole.

The way to read a piece of fiction is to use the information you've already read to anticipate what will happen next. Most contemporary stories are divided into several sections, which offer readers the opportunity to pause after each one and reflect on how it may fit into the whole of the work. The author had a purpose in including each section: furthering the plot, revealing a character’s personality, developing a theme, or playing with perspective. Practice posing these questions to yourself at the end of each section: what has the author just done, and what does it suggest she’ll likely do in the sections to come?

In the early sections, your questions will probably be general: What type of story is this? What type of characters are these? But by the time you reach about the two-thirds point they will be much more specific: What’s the author going to do with this character? How is this tension going to be resolved? Efforts to classify and anticipate the elements of the story will, if nothing else, lead to greater engagement with it. Every new character should be memorized—even if doing so requires a mnemonic (practice coming up with one on the fly).

The larger goal, though, is a better understanding of how the type of fiction you read works. Your efforts to place each part into the context of the whole will, over time, as you read more stories, give you a finer appreciation for the strategies writers use to construct their work, one scene or one section at a time. And as you try to anticipate the parts to come from the parts you’ve read you will be training your mind to notice patterns, laying down templates for how to accomplish the types of effects—surprise, emotional resonance, lyricism, profundity—the author has accomplished.

By trying to get ahead of the author, as it were, you won’t be learning to simply reproduce the same effects. By internalizing the strategies, making them automatic, you’ll be freeing up your conscious mind for new flights of creative re-working. You’ll be using the more skilled author’s work to bootstrap your own skill level. But once you’ve accomplished this there’ll be nothing stopping you from taking your own writing to the next level. Anticipation makes reading a challenge in real time—like a video game. And games can be conquered.

Finally, if a story moves you strongly, re-read it immediately. And then put it in a stack for future re-reading.

Also read:

PUTTING DOWN THE PEN: HOW SCHOOL TEACHES US THE WORST POSSIBLE WAY TO READ LITERATURE


Productivity as Practice: An Expert Performance Approach to Creative Writing Pedagogy Part 3

Psychologist K. Anders Ericsson’s central finding in his research on expert achievement is that what separates those who attain a merely sufficient level of proficiency in a performance domain from those who reach higher levels of excellence is the amount of time devoted over the course of training to deliberate practice. But, in a domain with criteria for success that can only be abstractly defined, like creative writing, what would constitute deliberate practice is difficult to define.

Start reading at part one.

            But the question of what standards of success the instructor is to apply to students’ work, as well as the ones instructors will encourage the students to apply to each other’s work, has yet to be addressed. The skills students develop through practicing evaluating their own work will both be based on their evaluations of the works of others and be applied to them. The first step toward becoming a creative writer is recognizing how much one likes the writing of another. The work the student is initially exposed to will almost certainly have gone through a complex series of assessments, beginning with the author’s assessment of his own work, moving on to commenters and editors working on behalf of the author, then to editors working on behalf of publishers, and finally to the publishers themselves. Even upon publication, any given work is unlikely to be read by a majority of readers who appreciate the type of writing it represents until a critical threshold is reached, beyond which its likelihood of becoming recommended reading increases. At some point in the process it may even reach the attention of critics and reviewers, who will themselves evaluate the work either positively or negatively. (This is leaving out the roles of branding and author reputation because they probably aren’t practicable skills.) Since feedback cannot be grounded in any absolute or easily measurable criteria, Ericsson advocates a “socially based definition of creativity” (330). And, since students develop their evaluative skills through internalizing and anticipating the evaluations of others, the choice of which workshop to attend is paramount. The student should seek out those most versed in and most appreciative of the type of writing he aspires to master.

            Simply reading theoretical essays on poetry or storytelling, as Vakil has his students do, is probably far less effective than sampling a theorist’s or critic’s work and then trying to anticipate that evaluator’s response to a work he or she has written about. Some critics’ work lends itself to this type of exercise more readily than others; those who focus on literary as opposed to political elements, and those who put more effort into using sound methods to ensure the validity of their psychological or sociological theories—if they must theorize—will be much more helpful than those who see each new work as an opportunity to reiterate their favorite ideas in a fresh context. It may be advisable, in other words, to concentrate on reviewers rather than critics and theorists. After having learned to anticipate the responses of a few reviewers whose work is influential, the student will be better equipped to evaluate his or her own work in terms of how it will be received in the social context that will be the final arbiter of success or failure.

            Anticipation, as it allows for feedback, could form the basis for several types of practice exercises. Ericsson cites his own and others’ research demonstrating that chess players improve not as a function of how much time they spend playing chess but through studying past games between chess masters. “By trying to select the best move for each position of the game,” Ericsson writes, “and comparing their selected move to the actual move of the game, the players identify discrepancies where they must study the chess position more deeply to uncover the reason for the master’s move” (37). In a similar way, pausing in the process of reading to anticipate a successful author’s next move in a story or novel should offer an opportunity for creative writing students to compare their ideas with the author’s. Of course, areas of divergence between the reader’s ideas for a next move and the one the author actually made need not be interpreted as a mistake on the part of the reader—the reader’s idea may even be better. However, in anticipating what will happen next in a story, the student is generating ideas and therefore getting practice in the area of productivity. And, whether or not the author’s ideas are better, the student will develop greater familiarity with her methods through such active engagement with them. Finally, the students will be getting practice evaluating ideas as they compare their own to those of the author.
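The chess players’ routine translates readily into the reading exercise. The sketch below, a hypothetical few lines of Python with invented placeholder sections and “moves,” is offered only as one way of picturing the loop, not as anything Ericsson prescribes: read a passage, hazard a prediction about the author’s next move, compare it with the move the author actually made, and set the divergences aside for closer study.

# A hypothetical rendering of the anticipate-and-compare exercise;
# the section summaries and authors' "moves" are invented placeholders.
sections = [
    ("Section one: the narrator meets a rival at a conference...",
     "The author makes the rival a mirror of the narrator's own ambitions."),
    ("Section two: the rival proposes a collaboration...",
     "The author has the narrator quietly sabotage the collaboration."),
]

notebook = []
for text, authors_move in sections:
    print(text)
    prediction = input("What will the author do next? ")
    notebook.append((prediction, authors_move))

# Review the log; entries where your move diverged from the author's are
# the ones worth studying more deeply, as the chess players do with positions.
for prediction, actual in notebook:
    print("You anticipated: {} / The author did: {}".format(prediction, actual))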

            A possible objection to implementing this anticipatory reading method in a creative writing curriculum is that a student learning to anticipate an author’s moves would simply be learning to make moves like the ones that author makes—which amounts to reproduction, not creativity. Indeed, one of the theories Ericsson has explored to explain how expertise develops posits a sort of rote memorization of strategies and their proper application to a limited set of situations. “For a long time it was believed that experts acquired a large repertoire of patterns,” he explains, “and their superior performance could be attributed to simple pattern matching and recall of previously stored actions from memory in an effortless and automatic manner” (331). If expertise relies on memory and pattern recognition, though, then experts would fare no better in novel situations than non-experts. Ericsson has found just the opposite to be the case.

Superior expert performers in domains such as music, chess, and medicine can generate better actions than their less skilled peers even in situations they have never directly experienced. Expert performers have acquired refined mental representations that maintain access to relevant information about the situation and support more extensive, flexible reasoning to determine the appropriate actions demanded by the encountered situation. (331)

What creative writers would be developing through techniques for practice such as anticipation-based reading likely goes beyond a simple accumulation of fixed strategies—a bigger bag of tricks appropriated from other authors. They would instead be developing a complex working model of storytelling, as well as a greater capacity for representing and manipulating the various aspects of their own stories in working memory.

            Skepticism about whether literary writing of any sort can be taught—or learned in any mundane or systematic way—derives from a real and important insight: authors are judged not by how well they reproduce the formulas of poetry and storytelling but by how successful they are in reformulating the conventional techniques of the previous generation of writers. No one taught Cervantes his epic-absurd form of parody. No one taught Shakespeare how to explore the inner workings of his characters’ minds through monologues. No one taught Virginia Woolf how to shun external trappings and delve so exquisitely into the consciousness of her characters. Yet observations of where authors came to reside in relation to prevailing literary cultures don’t always offer clues to the mode of transportation. Woolf, for instance, wrote a great deal about the fashion for representing characters through references to their physical appearances and lists of their possessions in her reviews for the Times Literary Supplement. She didn’t develop her own approach oblivious of what she called “materialism”; fully understanding the method, she found it insufficient for what she hoped to accomplish with her own work. And she’d spent a lot of time in her youth reading Shakespeare, with those long, eminently revealing monologues (Wood 110). Supposing creative genius is born of mastery of conventions and techniques and not ignorance of or antipathy toward them, the emphasis on the works of established authors in creative writing pedagogy ceases to savor of hidebound conservatism.

            The general pedagogical outline focusing on practice devoted to productivity, as well as the general approach to reading based on anticipation, can be refined to accommodate any student’s proclivities or concerns. A student who wants to develop skill in describing characters’ physical appearances in a way that captures something of the essence of their personalities may begin by studying the work of authors from Charles Dickens to Saul Bellow. Though it’s difficult to imagine how such descriptions might be anticipated, the characters’ later development over the course of the plot does offer opportunities to test predictions. Coming away from studies of past works, the student need not be limited to exercises on blank screens or sheets of paper; practice might entail generating multiple ideas for describing some interesting individual he knows in real life, or describing multiple individuals he knows. He may set himself the task of coming up with a good description for everyone interviewed during the course of a television news program. He can practice describing random people who pass on a campus sidewalk, imagining details of their lives and personalities, or characters in shows and movies. By the time the aspiring author is sitting down to write about her own character in a story or novel, she will all but automatically produce a number of possible strategies for making that character come alive through words, increasing the likelihood that she’ll light on one that resonates strongly, first with her own memories and emotions and then with those of her readers. And, if Simonton’s theory has any validity, the works produced according to this strategy need not resemble each other any more than one species resembles another.

            All of the conventional elements of craft—character, plot, theme, dialogue, point of view, and even higher-order dimensions like voice—readily lend themselves to this qualitative approach to practice. A creative writing instructor may coach a student who wants to be better able to devise compelling plots to read stories recognized as excelling in that dimension, encouraging her to pause along the way to write a list of possible complications, twists, and resolutions to compare with the ones she’ll eventually discover in the actual text. If the student fails to anticipate the author’s moves, she can then compare her ideas with the author’s, giving her a deeper sense of why one works better than the others. She may even practice anticipating the plots of television shows and movies, or trying to conceive of how stories in the news might be rendered as fictional plots. To practice describing settings, students could be encouraged to come up with multiple metaphors and similes based on one set and then another of the physical features they observe in real places. How many ways, a student may be prompted, can you have characters exchange the same basic information in a dialogue? Which ones reveal more of the characters’ personalities? Which ones most effectively reprise and develop the themes you’re working with? Any single idea generated in these practice sessions is unlikely to represent a significant breakthrough. But the more ideas one has, the more likely she is to discover one capable of garnering wider recognition for its superior quality. The productivity approach can also be applied to revision and would consist of the writer identifying weak passages or scenes in an early draft and generating several new versions of each one so that a single, best version can be chosen for later drafts.

            What I’ve attempted here is a sketch of one possible approach to teaching. It seems that, because many worry about the future of literature, fearing that the growing influence of workshops will lead to insularity and standardization, too few teachers are coming forward with ideas on how to help their students improve, as if whatever methods they admit to using would inevitably lend credence to the image of workshops as assembly lines for the production of mediocre and tragically uninspired poems and short stories. But, if creative writing is in danger of being standardized into obsolescence, the publishing industry is the more likely culprit, as every starry-eyed would-be author knows full well that publication is the one irreducible factor underlying professional legitimacy. And research has pretty thoroughly ruled out the notion that familiarity with the techniques of the masters in any domain inevitably precludes original, truly creative thinking. The general outline for practice based on productivity and evaluation can be personalized and refined in countless ways, and students can be counted on to bring an endless variety of experiences and perspectives to workshops, variety that would be difficult, to say the least, to completely eradicate in the span of the two or three years allotted to MFA programs.

            The productivity and evaluation model for creative writing pedagogy also holds a great deal of potential for further development. For instance, a survey of successful poets and fiction writers asking them how they practice—after providing them a précis of Ericsson’s and Simonton’s findings on what constitutes practice—may lead to the development of an enormously useful and surprising repertoire of training techniques. How many authors engage in activities they think of as simple games or distractions but that in fact contribute to their ability to write engaging and moving stories or poems? Ultimately, though, the discovery of increasingly effective methods will rely on rigorously designed research comparing approaches to each other. The popular rankings for MFA programs based on the professional success of students who graduate from them are a step in the direction of this type of research, but they have the rather serious flaw of sampling bias owing to higher-ranking schools having the advantage of larger applicant pools. At this stage, though, even the subjective ratings of individuals experimenting with several practice techniques would be a useful guide for adjusting and refining teaching methods.

            Applying the expert performance framework developed by Ericsson, Simonton, Csikszentmihalyi, and their colleagues to creative writing pedagogy would probably not drastically revolutionize teaching and writing practices. It would rather represent a shift in focus from the evaluation-heavy workshop model onto methods for generating ideas. And of course activities like brainstorming and free writing are as old as the hills. What may be new is the conception of these invention strategies as a form of practice to be engaged in for the purpose of developing skills, and the idea that this practice can and should be engaged in independent of any given writing project. Even if a writing student isn’t working on a story or novel, even if he doesn’t have an idea for one yet, he should still be practicing to be a better storyteller or novelist. It’s probably the case, too, that many or most professional writers already habitually engage in activities fitting the parameters of practice laid out by the expert performance model. Such activities probably already play at least some role in classrooms. Since the basic framework can be tailored to any individual’s interests, passions, weaknesses, and strengths, and since it stresses the importance of the quantity of new ideas, it’s not inconceivable that more of this type of practice will lead to greater, as opposed to less, originality.

Also read:

WHAT IS A STORY? AND WHAT ARE YOU SUPPOSED TO DO WITH ONE?

HOW TO GET KIDS TO READ LITERATURE WITHOUT MAKING THEM HATE IT

PUTTING DOWN THE PEN: HOW SCHOOL TEACHES US THE WORST POSSIBLE WAY TO READ LITERATURE


Productivity as Practice: An Expert Performance Approach to Creative Writing Pedagogy Part 2

Psychologist K. Anders Ericsson’s central finding in his research on expert achievement is that what separates those who attain a merely sufficient level of proficiency in a performance domain from those who reach higher levels of excellence is the amount of time devoted over the course of training to deliberate practice. But, in a domain with criteria for success that can only be abstractly defined, like creative writing, what would constitute deliberate practice is difficult to define.

            Of course, productivity alone cannot account for impactful ideas and works; at some point the most promising ideas must be culled from among the multitude. Since foresight seems to play little if any role in the process, Simonton, following D.T. Campbell, describes it as one of “blind variation and selective retention” (310). Simonton thus theorizes that creativity is Darwinian. Innovative and valuable ideas are often born of non-linear or “divergent” thinking, which means their future use may not be at all apparent when they are originally conceived. So, Csikszentmihalyi follows his advice to produce multiple ideas with the suggestion, “Try to produce unlikely ideas” (369). Ignoring future utility, then, seems to be important for the creative process, at least until the stage is reached when one should “Shift from openness to closure.” Csikszentmihalyi explains:

Good scientists, like good artists, must let their minds roam playfully or they will not discover new facts, new patterns, new relationships. At the same time, they must also be able to evaluate critically every novelty they encounter, forget immediately the spurious ones, and then concentrate their minds on developing and realizing the few that are promising. (361)

So, two sets of skills appear to lie at the heart of creative endeavors, and they suggest themselves as focal areas for those hoping to build on their talents. In the domain of creative writing, it would seem the most important things to practice are producing multiple and unlikely ideas, and evaluating those ideas to see which are the most viable.

            The workshop method prevalent in graduate writing programs probably involves at least some degree of practice in both of these skills. Novelist and teacher Ardashir Vakil, in his thoughtful and candid essay, “Teaching Creative Writing,” outlines what happens in his classrooms.

In the first part of the workshop students are given writing exercises. These vary from the most basic—write about a painful experience from your childhood—to more elaborate games in which you get pairs to tell stories to each other and then write the other person’s story with some bits invented. Then we look at texts by established writers and try to analyse what makes them work—what has the writer done?—in terms of character, language, voice and structure to capture our attention and how have they brought forth a visceral emotional response from the reader. (158)

            These exercises amount to efforts to generate ideas by taking inspiration from life, the writer’s or someone else’s, or from the work of successful writers. The analysis of noteworthy texts also shades into the practice of evaluating ideas and methods. Already, though, it seems the focus leans more toward evaluation, the ability to recognize useful ideas, than toward productivity. And this emphasis becomes even more pronounced as the course progresses.

Along the way, we read essays, interviews and extracts by established writers, reflecting on their practice. Sometimes I use an essay by Freud, Bakhtin or Benjamin on theories of storytelling. Finally, there is a group workshop in which we read and discuss each others’ writing. Each week, someone offers up a story or a few poems or an extract to the group, who go away and read, annotate and comment on it. (158)

Though the writers’ reflections on their practices may describe methods for generating ideas, those methods don’t seem to comprise an integral part of the class. Vakil reports that “with minor variations” this approach is common to creative writing programs all over England and America.

            Before dismissing any of these practices, though, it is important to note that creative writing must rely on skills beyond the production and assessment of random ideas. One could have a wonderful idea for a story but not have the language or storytelling skills necessary to convey it clearly and movingly. Or one may have a great idea for how to string words together into a beautiful sentence but lack any sense of how to fit it into a larger plot or poem. In a critique of the blind-variation and selective-retention model, Ericsson points out that productivity in terms of published works, which is what Simonton used to arrive at his equal odds rule, takes a certain level of expertise for granted. Whether students learn to develop multiple new ideas by writing down each other’s stories or not, it is likely important that they get practice with the basic skills of stringing words together into narratives. As Ericsson explains, “Unless the individual has the technical mastery to develop his or her ideas or products fully, it is unlikely that judges will be able to recognize their value and potential” (330). Though mere productivity may be what separates the professional from the game-changer, to get to the level of expertise required to reliably produce professional-grade work of any quality takes more than a bunch of blindly conceived ideas. As if defending Vakil’s assigned reading of “established writers,” Ericsson argues that “anyone interested in being able to anticipate better what is valued by experts in a domain should study the teachings and recognized masterpieces of master teachers in that domain” (330).

            Part of the disagreement between Simonton and Ericsson stems from their focusing on different levels of the creative process. In the domain of creative writing, even the skills underlying “technical mastery” are open to revision. Writers can—and certainly do—experiment with every aspect of storytelling, from word choice and syntax at the fundamental level to perspective and voice at higher-order levels. The same is true for poetry. Assuming the equal odds principle can be extrapolated to each of these levels, teachers of creative writing might view their role as centering on the assessment of their students’ strengths and weaknesses, and thenceforth encouraging students to practice in the areas where their skills are poorest. “Develop what you lack” (360) is another of Csikszentmihalyi’s prescriptions. The teacher also has at her disposal the evaluations of each student’s classmates, which she might collate into a more unified assessment. Rather than focusing solely on a student’s text, then, the teacher could ask for the class’s impressions of the writer’s skills as evidenced by the text in relation to others offered by that student in the past. Once a few weaknesses have been agreed upon, the student can then devote practice sessions to generating multiple ideas in those areas and subsequently evaluating them with reference to the works of successful authors.

            The general outline for creative writing workshops based on the expert performance framework might entail the following: Get the workshop students to provide assessments, based on each of the texts submitted for the workshop, of their colleagues’ strengths and weaknesses to supplement those provided by the instructor. If a student learns from such assessments that, say, his plots are engaging but his characters are eminently forgettable, he can then devote practice sessions to characterization. These sessions should consist of 1) studying the work of authors particularly strong in this area, 2) brainstorming exercises in which the student generates a large number of ideas in the area, and 3) an exercise at a later time involving a comparative assessment of the ideas produced in the prior stage. This general formula can be applied to developing skills in any aspect of creative writing, from word choice and syntax to plot and perspective. As the workshop progresses and the student receives more feedback from the evaluations, he will get better at anticipating the responses of the instructor and his classmates, thus honing the evaluative skills necessary for the third practice phase.

            The precedence of quantity over quality of ideas may be something of a dirty little secret for those with romantic conceptions of creative writing. Probably owing to these romantic or mystical notions about creativity, workshops focus on assessment and evaluation to the exclusion of productivity. One objection to applying the productivity approach within the expert performance framework, likely to be leveled by those with romantic leanings, is that it ignores the emotional aspects of creative writing. Where in the process of developing a slew of random words and images and characters does the heart come in? Many writers report that they are routinely struck with ideas and characters—some of whom are inspired by real people—that they simply have to write about. And these ideas that come equipped with preformed emotional resonance are the same ones that end up striking a chord with readers. Applying a formula to the process of conceiving ideas, even assuming it doesn’t necessarily lead to formulaic works, might simply crowd out or somehow dissipate the emotional pull of these moments of true inspiration.

            This account may, however, go further toward supporting the expert performance methods than casting doubt on them. For one, it leaves unanswered the question of how many ideas the writer was developing when the big inspiration struck. How many other real people had the author considered as possible characters before lighting on the one deemed perfect? Like the dreamer who becomes convinced of her dreams’ prophetic powers by dint of forgetting the much greater proportion of dreams that don’t come true, these writers are likely discounting a large number of ideas they generated before settling on the one with the greatest potential. Far from being completely left out of the training methods, emotional resonance can just as easily take ascendant priority as an evaluative criterion. It can even play a central role in the other two phases of practice.

            If the student reads a passage from another author’s work that evokes a strong emotion, she can then analyze the writing to see what made it so powerful. Also, since the emotion the passage evoked will likely prime the reader’s mind to recall experiences that aroused similar feelings—memories which resonate with the passage—it offers an opportunity for brainstorming ideas linked with those feelings, which will in turn have greater potential for evoking them in other readers. And of course the student need not limit herself to memories of actual events; she can elaborate on those events or extemporize to come up with completely new scenes. The fear that a cognitive exercise must preclude any emotional component is based on a false dichotomy between feeling and thought and the assumption that emotions are locked off from thinking and thus not amenable to deliberate training. Any good actor can attest that this assumption is wrong.

Part 3 of this essay.


Productivity as Practice: An Expert Performance Approach to Creative Writing Pedagogy Part 1

Psychologist K. Anders Ericsson’s central finding in his research on expert achievement is that what separates those who attain a merely sufficient level of proficiency in a performance domain from those who reach higher levels of excellence is the amount of time devoted over the course of training to deliberate practice. But, in a domain with criteria for success that can only be abstractly defined, like creative writing, what would constitute deliberate practice is difficult to define.

            Much of the pedagogy in creative writing workshops derives solely from tradition and rests on the assumption that the mind of the talented writer will adopt its own learned practices in the process of writing. The difficult question of whether mastery, or even expertise, can be inculcated through any process of instruction, and the long-standing tradition of assuming the answer is an only somewhat qualified “no”, comprise just one of several impediments to developing an empirically supported set of teaching methods for aspiring writers. Even the phrase, “empirically supported,” conjures for many the specter of formula, which they fear students will be encouraged to apply to their writing, robbing the products of some mysterious and ineffable quality of freshness and spontaneity. Since the criterion of originality is only one of several that are much easier to recognize than they are to define, the biggest hindrance to moving traditional workshop pedagogy onto firmer empirical ground may be the intractability of the question of what evaluative standards should be applied to student writing. Psychologist K. Anders Ericsson’s central finding in his research on expert achievement is that what separates those who attain a merely sufficient level of proficiency in a performance domain from those who reach higher levels of excellence is the amount of time devoted over the course of training to deliberate practice. But, in a domain with criteria for success that can only be abstractly defined, like creative writing, what would constitute deliberate practice is as difficult to describe in any detail as the standards by which work in that domain are evaluated.

            Paul Kezle, in a review article whose title, “What Creative Writing Pedagogy Might Be,” promises more than the conclusions deliver, writes, “The Iowa Workshop model originally laid out by Paul Engle stands as the pillar of origination for all debate about creative writing pedagogy” (127). This model, which Kezle describes as one of “top-down apprenticeship,” involves a published author who’s achieved some level of acclaim—usually commensurate with the prestige of the school housing the program—and whose teaching method consists of little more than moderating evaluative class discussions on each student’s work in turn. The appeal of this method is two-fold. As Shirley Geok-lin Lim explains, it “reliev[es] the teacher of the necessity to offer teacher feedback to students’ writing, through editing, commentary, and other one-to-one, labor intensive, authority-based evaluation” (81), leaving the teacher more time to write his or her own work as the students essentially teach each other and, hopefully, themselves. This aspect of self-teaching is the second main appeal of the workshop method—it bypasses the pesky issue of whether creative writing can be taught, letting the gates of the sacred citadel of creative talent remain closed. Furthermore, as is made inescapably clear in Mark McGurl’s book The Program Era, which tracks the burgeoning of creative writing programs as their numbers go from fewer than eighty in 1975 to nearly nine hundred today, the method works, at least in terms of its own proliferation.

            But what, beyond enrolling in a workshop, can a writer do to get better at writing? The answer to this question, assuming it can be reliably applied to other writers, holds the key to answering the question of what creative writing teachers can do to help their students improve. Lim, along with many other scholars and teachers with backgrounds in composition, suggests that pedagogy needs to get beyond “lore,” by which she means “the ad hoc strategies composing what today is widely accepted as standard workshop technique” (79). Unfortunately, the direction these theorists take is forbiddingly abstruse, focusing on issues of gender and ethnic identity in the classroom, or the negotiation of power roles (see Russel 109 for a review). Their prescription for creative writing pedagogy boils down to an injunction to introduce students to poststructuralist ways of thinking and writing. An example sentence from Lim will suffice to show why implementing this approach would be impractical:

As Kalamaras has argued, however, collective identities, socially constructed, historically circumscribed, uniquely experienced, call for a “socially responsible” engagement, not only on the level of theme and content but particularly on that of language awareness, whether of oral or dialectic-orthographic “voice,” lexical choice, particular idiolect features, linguistic registers, and what Mikhail Bakhtin called heteroglossic characteristics. (86)

Assuming the goal is not to help marginalized individuals find a voice and communicate effectively and expressively in society, but rather to help students who have demonstrated some degree of both talent and passion for creative writing to reach the highest levels of success possible (or even simply to find a way to get paid for doing what they love), arcane linguistic theories are unlikely to be of much use. (Whether they’re of any real use even for the former goal is debatable.)

            Conceiving of creative writing as a type of performance demanding several discrete skills, at least some of which can be improved through training, brings it into a realm that psychologists have explored with increasing comprehensiveness and ever more refined methods. University of Chicago professor Mihaly Csikszentmihalyi writes about the highly successful people in creative fields interviewed for his book Creativity: Flow and the Psychology of Discovery and Invention as if they were a breed apart, even devoting an entire chapter to “The Creative Personality” and thereby reinforcing the idea that creative talent is something one is simply born with. Yet he does manage to provide several potentially useful strategies for “Enhancing Personal Creativity” in a chapter by that name. “Just as a physician may look at the physical habits of the most healthy individuals,” Csikszentmihalyi writes, “to find in them a prescription that will help everyone else to be more healthy, so we may extract some useful ideas from the lives of a few creative persons about how to enrich the lives of everyone else” (343). The aspiring creative writer must understand, though, that “to move from personal to cultural creativity one needs talent, training, and an enormous dose of good luck” (344). Since this equation includes only one variable amenable to deliberate effort, it refines the question of what an effective creative writing pedagogy might entail: how does one train to be a better writer? The importance of training, as opposed to mere experience, is underscored by Ericsson’s finding that “amount of experience in a domain is often a weak predictor of performance” (20). Simply writing poems and stories may not be enough to ensure success, especially given the intense competition evidenced by those nearly nine hundred MFA programs.

            Because writing stories and poems seldom entails real-time performance, but instead allows multiple opportunities for inspiration and revision, the distinction Ericsson found between simply engaging in an activity and training for it may be less stark for creative writing: writing and training can overlap whenever the tasks involved in writing meet the requirements for effective training. Having identified deliberate practice as the most important predictor of expert performance, Ericsson breaks the concept down into three elements: “a well-defined task with an appropriate level of difficulty for the particular individual, informative feedback, and opportunities for repetition and corrections of errors” (21). Deliberate practice thus requires immediate feedback on performance; since every identified error is an opportunity for correction, success can in a sense be said to multiply in direct proportion to the accumulation of past failures. But how is a poet to know if the line she’s just written constitutes a success or a failure? How does a novelist know whether a scene or a chapter bears comparison to the greats of literature?

            One possible way around the problem of indefinable evaluative standards is to focus on quantity instead of quality. Ericsson’s colleague Dean Simonton studies fields in which innovation is highly valued, attempting to discover what separates those who exhibit “received expertise,” mastering and carrying on dominant traditions in the arts or sciences, from those who show “creative expertise” (228) by transforming or advancing those traditions. Contrary to the conventional view that some individuals possess a finely attuned sense of how to produce a successful creative work, Simonton finds that what he calls “the equal odds rule” holds in every creative field he’s studied. The rule suggests “that quality correlates positively with quantity, so that creativity becomes a linear statistical function of productivity” (235). Individuals working in creative fields can never be sure which of their works will have an impact, so the creators with the greatest impact tend to be those who produce the greatest number of works. Simonton has found that this rule holds at every stage of an individual’s lifespan, leading him to conclude that success derives more from productivity and playing the odds than from sure-footed, far-seeing genius. “The odds of hitting a bull’s eye,” he writes, “is a probabilistic function of the number of shots” (234). Csikszentmihalyi discovered a similar quantitative principle among the creative people he surveyed: part of creativity, he suggests, is having multiple ideas where only one seems necessary, leading him to the prescription for enhancing personal creativity, “Produce as many ideas as possible” (368).
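            To see the arithmetic behind the equal odds rule, consider a back-of-the-envelope illustration of my own (the figures here are hypothetical, not Simonton’s): suppose each finished work has a fixed, independent chance p of becoming a hit. The expected number of hits after n works is then simply n × p, a straight line in productivity, while the chance of scoring at least one hit is

            P(at least one hit) = 1 − (1 − p)^n

            If p were, say, two percent, ten works would give roughly an 18 percent chance of a hit, fifty works about 64 percent, and a hundred works about 87 percent. Nothing in the calculation requires the prolific writer to aim any better than anyone else; volume alone drives the odds.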

Part 2 of this essay.
