READING SUBTLY

This was the domain of my Blogger site from 2009 to 2018, when I moved to this domain and started The Storytelling Ape. The search option should help you find any of the old posts you're looking for.
 

Dennis Junk

How Violent Fiction Works: Rohan Wilson’s “The Roving Party” and James Wood’s Sanguinary Sublime from Conrad to McCarthy

James Wood criticized Cormac McCarthy’s “No Country for Old Men” for being too trapped by its own genre tropes. Wood has a strikingly keen eye for literary registers, but he’s missing something crucial in his analysis of McCarthy’s work. Rohan Wilson’s “The Roving Party” works on some of the same principles as McCarthy’s fiction, and it shows that the authors’ visions extend far beyond the pages of any book.

            Any acclaimed novel of violence must be cause for alarm to anyone who believes stories encourage the behaviors depicted in them or contaminate the minds of readers with the attitudes of the characters. “I always read the book as an allegory, as a disguised philosophical argument,” writes David Shields in his widely celebrated manifesto Reality Hunger. Suspicious of any such disguised effort at persuasion, Shields bemoans the continuing popularity of traditional novels and agitates on behalf of a revolutionary new form of writing, a type of collage that is neither marked as fiction nor claimed as truth but functions rather as a happy hybrid—or, depending on your tastes, a careless mess—and is in any case completely lacking in narrative structure. This is because, to him, giving narrative structure to a piece of writing is itself a rhetorical move. “I always try to read form as content, style as meaning,” Shields writes. “The book is always, in some sense, stutteringly, about its own language” (197).

As arcane as Shields’s approach to reading may sound, his attempt to find some underlying message in every novel resonates with the preoccupations popular among academic literary critics. But what would it mean if novels really were primarily concerned with their own language, as so many students in college literature courses are taught? What if there really were some higher-order meaning we absorbed unconsciously through reading, even as we went about distracting ourselves with the details of description, character, and plot? Might a novel like Heart of Darkness, instead of being about Marlow’s growing awareness of Kurtz’s descent into inhuman barbarity, really be about something that at first seems merely contextual and incidental, like the darkness—the evil—of sub-Saharan Africa and its inhabitants? Might there be a subtle prompt to regard Kurtz’s transformation as some breed of enlightenment, a fatal lesson encapsulated and propagated by Conrad’s fussy and beautifully tantalizing prose, as if the author were wielding the English language like the fastenings of a yoke over the entire continent?

Novels like Cormac McCarthy’s Blood Meridian and, more recently, Rohan Wilson’s The Roving Party take place amid a transition from tribal societies to industrial civilization similar to the one occurring in Conrad’s Congo. Is it in this seeming backdrop that we should seek the true meaning of these tales of violence? Both McCarthy’s and Wilson’s novels, it must be noted, represent conspicuous efforts at undermining the sanitized and Manichean myths that arose to justify the displacement and mass killing of indigenous peoples by Europeans as they spread over the far-flung regions of the globe. The white men hunting “Indians” for the bounties on their scalps in Blood Meridian are as beastly and bloodthirsty as the savages peopling the most lurid colonial propaganda, just as the Europeans making up Wilson’s roving party are only distinguishable by the relative degrees of their moral degradation, all of them, including the protagonist, moving in the shadow of their principal quarry, a native Tasmanian chief.

If these novels are about their own language, their form comprising their true content, all in the service of some allegory or argument, then what pleasure would anyone get from them, suggesting as they do that to partake of the fruit of civilization is to become complicit in the original sin of the massacre that made way for it? “There is no document of civilization,” Walter Benjamin wrote, “that is not at the same time a document of barbarism.” It could be that to read these novels is to undergo a sort of rite of expiation, similar to the ritual reenactment of the crucifixion performed by Christians in the lead-up to Easter. Alternatively, the real argument hidden in these stories may be still more insidious; what if they’re making the case that violence is both eternal and unavoidable, that it is in our nature to relish it, so there’s no more point in resisting the urge personally than in trying to bring about reform politically?

            Shields intimates that the reason we enjoy stories is that they warrant our complacency when he writes, “To ‘tell a story well’ is to make what one writes resemble the schemes people are used to—in other words, their ready-made idea of reality” (200). Just as we take pleasure in arguments for what we already believe, Shields maintains (explicitly) that we delight in stories that depict familiar scenes and resolve in ways compatible with our convictions. And this equating of the pleasure we take in reading with the pleasure we take in having our beliefs reaffirmed is another practice nearly universal among literary critics. Sophisticated readers know better than to conflate the ideas professed by villainous characters like the judge in Blood Meridian with those of the author, but, as one prominent critic complains,

there is often the disquieting sense that McCarthy’s fiction puts certain fond American myths under pressure merely to replace them with one vaster myth—eternal violence, or [Harold] Bloom’s “universal tragedy of blood.” McCarthy’s fiction seems to say, repeatedly, that this is how it has been and how it always will be.

What’s interesting about this interpretation is that it doesn’t come from anyone normally associated with Shields’s school of thought on literature. Indeed, its author, James Wood, is something of a scourge to postmodern scholars of Shields’s ilk.

Wood takes McCarthy to task for his alleged narrative dissemination of the myth of eternal violence in a 2005 New Yorker piece, “Red Planet: The Sanguinary Sublime of Cormac McCarthy,” a review of McCarthy’s then-latest novel No Country for Old Men. Wood too, it turns out, hungers for reality in his novels, and he faults McCarthy’s book for substituting the pabulum of standard plot devices for psychological profundity. He insists that

the book gestures not toward any recognizable reality but merely toward the narrative codes already established by pulp thrillers and action films. The story is itself cinematically familiar. It is 1980, and a young man, Llewelyn Moss, is out antelope hunting in the Texas desert. He stumbles upon several bodies, three trucks, and a case full of money. He takes the money. We know that he is now a marked man; indeed, a killer named Anton Chigurh—it is he who opens the book by strangling the deputy—is on his trail.

Because McCarthy relies on the tropes of a familiar genre to convey his meaning, Wood suggests, that meaning can only apply to the hermetic universe imagined by that genre. In other words, any meaning conveyed in No Country for Old Men is rendered null in transit to the real world.

When Chigurh tells the blameless Carla Jean that “the shape of your path was visible from the beginning,” most readers, tutored in the rhetoric of pulp, will write it off as so much genre guff. But there is a way in which Chigurh is right: the thriller form knew all along that this was her end.

The acuity of Wood’s perception when it comes to the intricacies of literary language is often staggering, and his grasp of how diction and vocabulary provide clues to the narrator’s character and state of mind is equally prodigious. But, in this dismissal of Chigurh as a mere plot contrivance, as in his estimation of No Country for Old Men in general as a “morally empty book,” Wood is quite simply, quite startlingly, mistaken. And we might even say that the critical form knew all along that he would make this mistake.

           When Chigurh tells Carla Jean her path was visible, he’s not voicing any hardboiled fatalism, as Wood assumes; he’s pointing out that her predicament came about as a result of a decision her husband Llewelyn Moss made with full knowledge of the promised consequences. And we have to ask, could Wood really have known, before Chigurh showed up at the Moss residence, that Carla Jean would be made to pay for her husband’s defiance? It’s easy enough to point out superficial similarities to genre conventions in the novel (many of which it turns inside out), but it doesn’t follow that anyone who notices them will be able to foretell how the book will end. Wood, despite his reservations, admits that No Country for Old Men is “very gripping.” But how could it be if the end were so predictable? And, if it were truly so morally empty, why would Wood care enough about how it was going to end to be gripped? Indeed, it is in the realm of the characters’ moral natures that Wood is the most blinded by his reliance on critical convention. He argues,

Llewelyn Moss, the hunted, ought not to resemble Anton Chigurh, the hunter, but the flattening effect of the plot makes them essentially indistinguishable. The reader, of course, sides with the hunted. But both have been made unfree by the fake determinism of the thriller.

How could the two men’s fates be determined by the genre if in a good many thrillers the good guy, the hunted, prevails?

One glaring omission in Wood’s analysis is that Moss initially escapes undetected with the drug money he discovers at the scene of the shootout he happens upon while hunting, but he is then tormented by his conscience until he decides to return to the trucks with a jug of water for a dying man who begged him for a drink. “I’m fixin to go do somethin dumbern hell but I’m goin anyways,” he says to Carla Jean when she asks what he’s doing. “If I don’t come back tell Mother I love her” (24). Llewelyn, throughout the ensuing chase, is thus being punished for doing the right thing, an injustice that unsettles readers to the point where we can’t look away—we’re gripped—until we’re assured that he ultimately defeats the agents of that injustice. While Moss risks his life to give a man a drink, Chigurh, as Wood points out, is first seen killing a cop. Moreover, it’s hard to imagine Moss showing up to murder an innocent woman to make good on an ultimatum he’d presented to a man who had already been killed in the interim—as Chigurh does in the scene when he explains to Carla Jean that she’s to be killed because Llewelyn made the wrong choice.

Chigurh is in fact strangely principled, in a morally inverted sort of way, but the claim that he’s indistinguishable from Moss bespeaks a failure of attention completely at odds with the uncannily keen-eyed reading we’ve come to expect from Wood. When he revisits McCarthy’s writing in a review of the 2006 post-apocalyptic novel The Road, collected in the book The Fun Stuff, Wood is once again impressed by McCarthy’s “remarkable effects” but thoroughly baffled by “the matter of his meanings” (61). The novel takes us on a journey south to the sea with a father and his son as they scrounge desperately for food in abandoned houses along the way. Wood credits McCarthy for not substituting allegory for the answer to “a simpler question, more taxing to the imagination and far closer to the primary business of fiction making: what would this world without people look like, feel like?” But then he unaccountably struggles to sift out the novel’s hidden philosophical message. He writes,

A post-apocalyptic vision cannot but provoke dilemmas of theodicy, of the justice of fate; and a lament for the Deus absconditus is both implicit in McCarthy’s imagery—the fine simile of the sun that circles the earth “like a grieving mother with a lamp”—and explicit in his dialogue. Early in the book, the father looks at his son and thinks: “If he is not the word of God God never spoke.” There are thieves and murderers and even cannibals on the loose, and the father and son encounter these fearsome envoys of evil every so often. The son needs to think of himself as “one of the good guys,” and his father assures him that this is the side they are indeed on. (62)

We’re left wondering, is there any way to answer the question of what a post-apocalypse would be like in a story that features starving people reduced to cannibalism without providing fodder for genre-leery critics on the lookout for characters they can reduce to mere “envoys of evil”?

As trenchant as Wood is regarding literary narration, and as erudite—or pedantic, depending on your tastes—as he is regarding theology, the author of the excellent book How Fiction Works can’t help falling afoul of his own, and his discipline’s, thoroughgoing ignorance when it comes to how plots work, what keeps the moral heart of a story beating. The way Wood misses the forest while taking such thorough inventory of its trees calls to mind a line of his own from a chapter in The Fun Stuff about Edmund Wilson, describing an uncharacteristic failure on the part of this other preeminent critic:

Yet the lack of attention to detail, in a writer whose greatness rests supremely on his use of detail, the unwillingness to talk of fiction as if narrative were a special kind of aesthetic experience and not a reducible proposition… is rather scandalous. (72)

To his credit, though, Wood never writes about novels as if they were completely reducible to their propositions; he doesn’t share David Shields’s conviction that stories are nothing but allegories or disguised philosophical arguments. Indeed, few critics are as eloquent as Wood on the capacity of good narration to communicate the texture of experience in a way all literate people can recognize from their own lived existences.

            But Wood isn’t interested in plots. He just doesn’t seem to like them. (There’s no mention of plot in either the table of contents or the index of How Fiction Works.) Worse, he shares Shields’s and other postmodern critics’ impulse to decode plots and their resolutions—though he also searches for ways to reconcile whatever moral he manages to pry from the story with its other elements. This is in fact one of the habits that tend to derail his reviews. Even after lauding The Road’s eschewal of easy allegory in favor of the hard work of ground-up realism, Wood can’t help trying to decipher the end of the novel in the context of the religious struggle he sees taking place in it. He writes of the son’s survival,

The boy is indeed a kind of last God, who is “carrying the fire” of belief (the father and son used to speak of themselves, in a kind of familial shorthand, as people who were carrying the fire: it seems to be a version of being “the good guys”.) Since the breath of God passes from man to man, and God cannot die, this boy represents what will survive of humanity, and also points to how life will be rebuilt. (64)

This interpretation underlies Wood’s contemptuous attitude toward other reviewers who found the story uplifting, including Oprah, who used The Road as one of her book club selections. To Wood, the message rings false. He complains that

a paragraph of religious consolation at the end of such a novel is striking, and it throws the book off balance a little, precisely because theology has not seemed exactly central to the book’s inquiry. One has a persistent, uneasy sense that theodicy and the absent God have been merely exploited by the book, engaged with only lightly, without much pressure of interrogation. (64)

Inquiry? Interrogation? Whatever happened to “special kind of aesthetic experience”? Wood first places seemingly inconsequential aspects of the novel at the center of his efforts to read meaning into it, and then he faults the novel for not exploring these aspects at greater length. The more likely conclusion we might draw here is that Wood is simply and woefully mistaken in his interpretation of the book’s meaning. Indeed, Wood’s jump to theology, despite his insistence on its inescapability, is really quite arbitrary, one of countless themes a reader might possibly point to as indicative of the novel’s one true meaning.

Perhaps the problem here is the assumption that a story must have a meaning, some point that can be summed up in a single statement, for it to grip us. Getting beyond the issue of what statement the story is trying to make, we can ask what it is about the aesthetic experience of reading a novel that we find so compelling. For Wood, it’s clear the enjoyment comes from a sort of communion with the narrator, a felt connection forged by language, which effects an estrangement from the reader’s own mundane experiences by passing them through the lens of the character’s idiosyncratic vocabulary, phrasings, and metaphors. The sun dimly burning through an overcast sky looks much different after you’ve heard it compared to “a grieving mother with a lamp.” This pleasure in authorial communion and narrative immersion is commonly felt by the more sophisticated of literary readers. But what about less sophisticated readers? Many people who have a hard enough time simply understanding complex sentences, never mind discovering in them clues to the speaker’s personality, nevertheless become absorbed in narratives.

Developmental psychologists Karen Wynn and Paul Bloom, along with then graduate student Kiley Hamlin, serendipitously discovered a major clue to the mystery of why fictional stories engage humans’ intellectual and emotional faculties so powerfully while trying to determine at what age children begin to develop a moral sense. In a series of experiments conducted at the Yale Infant Cognition Center, Wynn and her team found that babies under a year old, even as young as three months, are easily induced to attribute agency to inanimate objects with nothing but a pair of crude eyes to suggest personhood. And, astonishingly, once agency is presumed, these infants begin attending to the behavior of the agents for evidence of their propensities toward being either helpfully social or selfishly aggressive—even when they themselves aren’t the ones to whom the behaviors are directed. 

            In one of the team’s most dramatic demonstrations, the infants watch puppet shows featuring what Bloom, in Just Babies, his book about the research program, refers to as “morality plays” (30). Two rabbits respond to a tiger’s overture of rolling a ball toward them in different ways, one by rolling it back playfully, the other by snatching it up and running away with it. When the babies are offered a choice between the two rabbits after the play, they nearly always reach for the “good guy.” However, other versions of the experiment show that babies do favor aggressive rabbits over nice ones—provided that the victim is itself guilty of some previously witnessed act of selfishness or aggression. So the infants prefer cooperation over selfishness and punishment over complacency.

            Wynn and Hamlin didn’t intend to explore the nature of our fascination with fiction, but even the most casual assessment of our most popular stories suggests their appeal to audiences depends on a distinction similar to the one made by the infants in these studies. Indeed, the most basic formula for storytelling could be stated: good guy struggles against bad guy. Our interest is automatically piqued once such a struggle is convincingly presented, and it doesn’t depend on any proposition that can be gleaned from the outcome.

We favor the good guy because his (or her) altruism triggers an emotional response—we like him. And our interest in the ongoing developments of the story—the plot—is both emotional and dynamic. This is what the aesthetic experience of narrative consists of.

            The beauty in stories comes from the elevation we get from the experience of witnessing altruism, and the higher the cost to the altruist the more elevating the story. The symmetry of plots is the balance of justice. Stories meant to disturb readers disrupt that balance. The crudest stories pit good guys against bad guys. The more sophisticated stories feature what we hope are good characters struggling against temptations or circumstances that make being good difficult, or downright dangerous. In other words, at the heart of any story is a moral dilemma, a situation in which characters must decide who deserves what fate and what they’re willing to pay to ensure they get it. The specific details of that dilemma are what we recognize as the plot.

The most basic moral, lesson, proposition, or philosophical argument inherent in the experience of attending to a story derives then not from some arbitrary decision on the part of the storyteller but from an injunction encoded in our genome. At some point in human evolution, our ancestors’ survival began to depend on mutual cooperation among all the members of the tribe, and so to this day, and from a startlingly young age, we’re on the lookout for anyone who might be given to exploiting the cooperative norms of our group. Literary critics could charge that the appeal of the altruist is merely another theme we might at this particular moment in history want to elevate to the status of most fundamental aspect of narrative. But I would challenge anyone who believes some other theme, message, or dynamic is more crucial to our engagement with stories to subject their theory to the kind of tests the interplay of selfish and altruistic impulses routinely passes in the Yale studies. Do babies care about theodicy? Are Wynn et al.’s morality plays about their own language?

This isn’t to say that other themes or allegories never play a role in our appreciation of novels. But whatever role they do play is in every case ancillary to the emotional involvement we have with the moral dilemmas of the plot. 1984 and Animal Farm are clear examples of allegories—but their greatness as stories is attributable to the appeal of their characters and the convincing difficulty of their dilemmas. Without a good plot, no one would stick around for the lesson. If we didn’t first believe Winston Smith deserved to escape Room 101 and that Boxer deserved a better fate than the knackery, we’d never subsequently be moved to contemplate the evils of totalitarianism. What makes these such powerful allegories is that, if you subtracted the political message, they’d still be great stories because they engage our moral emotions.

            What makes violence so compelling in fiction then is probably not that it sublimates our own violent urges, or that it justifies any civilization’s past crimes; violence simply ups the stakes for the moral dilemmas faced by the characters. The moment-by-moment drama in The Road, for instance, has nothing to do with whether anyone continues to believe in God. The drama comes from the father and son’s struggles to resist succumbing to theft and cannibalism to survive. That’s the most obvious theme recurring throughout the novel. And you get the sense that were it not for the boy’s constant pleas for reassurance that they would never kill and eat anyone—the ultimate act of selfish aggression—and that they would never resort to bullying and stealing, the father quite likely would have made use of such expedients. The fire that they’re carrying is not the light of God; it’s the spark of humanity, the refusal to forfeit their human decency. (Wood doesn’t catch that the fire was handed off from Sheriff Bell’s father at the end of No Country.) The boy may very well be a redeemer, in that he helps his father make it to the end of his life with a clear conscience, but unless you believe morality is exclusively the bailiwick of religion, God’s role in the story is marginal at best.

            What the critics given to dismissing plots as pointless fabrications fail to consider is that just as idiosyncratic language and simile estrange readers from their mundane existence, so too the high-stakes dilemmas that make up plots can make us see our own choices in a different light, effecting their own breed of estrangement with regard to our moral perceptions and habits. In The Roving Party, set in the early nineteenth century, Black Bill, a native Tasmanian raised by a white family, joins a group of men led by a farmer named John Batman to hunt and kill other native Tasmanians and secure the territory for the colonials. The dilemmas Bill faces are like nothing most readers will ever encounter. But their difficulty is nonetheless universally understandable. In the following scene, Bill, who is also called the Vandemonian, along with a young boy and two native scouts, watches as Batman steps up to a wounded clansman in the aftermath of a raid on his people.

Batman considered the silent man secreted there in the hollow and thumbed back the hammers. He put one foot either side of the clansman’s outstretched legs and showed him the long void of those bores, standing thus prepared through a few creakings of the trees. The warrior was wide-eyed, looking to Bill and to the Dharugs.

The eruption raised the birds squealing from the branches. As the gunsmoke cleared the fellow slumped forward and spilled upon the soil a stream of arterial blood. The hollow behind was peppered with pieces of skull and other matter. John Batman snapped open the locks, cleaned out the pans with his cloth and mopped the blood off the barrels. He looked around at the rovers.

The boy was openmouthed, pale, and he stared at the ruination laid out there at his feet and stepped back as the blood ran near his rags. The Dharugs had by now turned away and did not look back. They began retracing their track through the rainforest, picking among the fallen trunks. But Black Bill alone among that party met Batman’s eye. He resettled his fowling piece across his back and spat on the ferns, watching Batman. Batman pulled out his rum, popped loose the cork, and drank. He held out the vessel to Bill. The Vandemonian looked at him. Then he turned to follow the Parramatta men out among the lemon myrtles and antique pines. (92)

If Rohan Wilson had wanted to expound on the evils of colonialism in Tasmania, he might have written about how Batman, a real figure from history, murdered several men he could easily have taken prisoner. But Wilson wanted to tell a story, and he knew that dilemmas like this one would grip our emotions. He likewise knew he didn’t have to explain that Bill, however much he disapproves of the murder, can’t afford to challenge his white benefactor in any less subtle manner than meeting his eyes and refusing his rum.

            Unfortunately, Batman registers the subtle rebuke all too readily. Instead of himself killing a native lawman wounded in a later raid, Batman leaves the task to Bill, who this time isn’t allowed the option of silently disapproving. But the way Wilson describes Bill’s actions leaves no doubt in the reader’s mind about his feelings, and those feelings have important consequences for how we feel about the character.

Black Bill removed his hat. He worked back the heavy cocks of both barrels and they settled with a dull clunk. Taralta clutched at his swaddled chest and looked Bill in the eyes, as wordless as ground stone. Bill brought up the massive gun and steadied the barrels across his forearm as his broken fingers could not take the weight. The sight of those octagonal bores levelled on him caused the lawman to huddle down behind his hands and cry out, and Bill steadied the gun but there was no clear shot he might take. He waited.

                        See now, he said. Move your hands.

            The lawman crabbed away over the dirt, still with his arms upraised, and Bill followed him and kicked him in the bandaged ribs and kicked at his arms.

                        menenger, Bill said, menenger.

The lawman curled up more tightly. Bill brought the heel of his boot to bear on the wounded man but he kicked in vain while Taralta folded his arms ever tighter around his head.

Black Bill lowered the gun. Wattlebirds made their yac-a-yac coughs in the bush behind and he gazed at the blue hills to the south and the snow clouds forming above them. When Bill looked again at the lawman he was watching through his hands, dirt and ash stuck in the cords of his ochred hair. Bill brought the gun up, balanced it across his arm again and tucked the butt into his shoulder. Then he fired into the lawman’s head.

The almighty concussion rattled the wind in his chest and the gun bucked from his grip and fell. He turned away, holding his shoulder. Blood had spattered his face, his arms, the front of his shirt. For a time he would not look at the body of the lawman where it lay near the fire. He rubbed at the bruising on his shoulder; watched storms amass around the southern peaks. After a while he turned to survey the slaughter he had wrought.

One of the lawman’s arms was gone at the elbow and the teeth seated in the jawbone could be seen through the cheek. There was flesh blown every place. He picked up the Manton gun. The locks were soiled and he fingered out the grime, and then with the corner of his coat cleaned the pan and blew into the latchworks. He brought the weapon up to eye level and peered along its sights for barrel warps or any misalignment then, content, slung the leather on his shoulder. Without a rearward glance he stalked off, his hat replaced, his boots slipping in the blood. Smoke from the fire blew around him in a snarl raised on the wind and dispersed again on the same. (102-4)

Depending on their particular ideological bent, critics may charge that a scene like this simply promotes the type of violence it depicts, or that it encourages a negative view of native Tasmanians—or indigenous peoples generally—as of such weak moral fiber that they can be made to turn against their own countrymen. And pointing out that the aspect of the scene that captures our attention is the process, the experience, of witnessing Bill’s struggle to resolve his dilemma would do little to ease their worries; after all, even if the message is ancillary, its influence could still be pernicious.

            The reason that critics applying their favored political theories to their analyses of fiction so often stray into the realm of the absurd is that the only readers who experience stories the same way as they do will be the ones who share the same ideological preoccupations. You can turn any novel into a Rorschach, pulling out disparate shapes and elements to blur into some devious message. But any reader approaching the writing without your political theories or your critical approach will likely come away with a much more basic and obvious lesson. Black Bill’s dilemma is that he has to kill many of his fellow Tasmanians if he wants to continue living as part of a community of whites. If readers take on his attitude toward killing as it’s demonstrated in the scene when he kills Taralta, they’ll be more reluctant to do it, not less. Bill clearly loathes what he’s forced to do. And if any race comes out looking bad it’s the whites, since they’re the ones whose culture forces Bill to choose between his family’s well-being and the dictates of his conscience.

Readers likely have little awareness of being influenced by the overarching themes in their favorite stories, but upon reflection the meaning of those themes is usually pretty obvious. Recent research into how reading the Harry Potter books has impacted young people’s political views, for instance, shows that fans of the series are more accepting of out-groups, more tolerant, less predisposed to authoritarianism, more supportive of equality, and more opposed to violence and torture. Anthony Gierzynski, the author of the study, points out, “As Harry Potter fans will have noted, these are major themes repeated throughout the series.” The messages that reach readers are the conspicuous ones, not the supposedly hidden ones critics pride themselves on being able to suss out.

            It’s an interesting question just how wicked stories could persuade us to be, relying as they do on our instinctual moral sense. Fans could perhaps be biased toward evil by themes about the threat posed by some out-group, or the debased nature of the lower orders, or nonbelievers in the accepted deities—since the salience of these concepts likewise seems to be inborn. But stories told from the perspective of someone belonging to the persecuted group could provide an antidote. At any rate, there’s a solid case to be made that novels have helped the moral arc of history bend toward greater justice and compassion.

            Even a novel with violence as pervasive and chaotic as it is in Blood Meridian sets up a moral gradient for the characters to occupy—though finding where the judge fits is a quite complicated endeavor—and the one with the most qualms about killing happens to be the protagonist, referred to simply as the kid. “You alone were mutinous,” the judge says to him. “You alone reserved in your soul some corner of clemency for the heathen” (299). The kid’s character is revealed much the way Black Bill’s is in The Roving Party, as readers witness him working through high-stakes dilemmas. After drawing arrows to determine who in the band of scalp hunters will stay behind to kill some of their wounded (to prevent a worse fate at the hands of the men pursuing them), the kid finds himself tasked with euthanizing a man who would otherwise survive.

                        You wont thank me if I let you off, he said.

                        Do it then you son of a bitch.

            The kid sat. A light wind was blowing out of the north and some doves had begun to call in the thicket of greasewood behind them.

                        If you want me just to leave you I will.

                        Shelby didnt answer.

                        He pushed a furrow in the sand with the heel of his boot. You’ll have to say.

                        Will you leave me a gun?

                        You know I cant leave you no gun.

                        You’re no better than him. Are you?

                        The kid didnt answer. (208)

That “him” is ambiguous; it could either be Glanton, the leader of the gang whose orders the kid is ignoring, or the judge, who engages him throughout the later parts of the novel in a debate about the necessity of violence in history. We know by now that the kid really is better than the judge—at least in the sense that Shelby means. And the kid handles the dilemma, as best he can, by hiding Shelby in some bushes and leaving him with a canteen of water.

These three passages from The Roving Party and Blood Meridian reveal as well something about the language commonly used by authors of violent novels going back to Conrad (perhaps as far back as Tolstoy). Faced with the choice of killing a man—or of standing idly by and allowing him to be killed—the characters hesitate, and the space of their hesitation is filled with details like the type of birdsong that can be heard. This style of “dirty realism,” a turning away from abstraction, away even from thought, to focus intensely on physical objects and the natural world, frustrates critics like James Wood because they prefer their prose to register the characters’ meanderings of mind in the way that only written language can. Writing about No Country for Old Men, Wood complains about all the labeling and descriptions of weapons and vehicles to the exclusion of thought and emotion.

Here is Hemingway’s influence, so popular in male American fiction, of both the pulpy and the highbrow kind. It recalls the language of “A Farewell to Arms”: “He looked very dead. It was raining. I had liked him as well as anyone I ever knew.” What appears to be thought is in fact suppressed thought, the mere ratification of male taciturnity. The attempt to stifle sentimentality—“He looked very dead”—itself comes to seem a sentimental mannerism. McCarthy has never been much interested in consciousness and once declared that as far as he was concerned Henry James wasn’t literature. Alas, his new book, with its gleaming equipment of death, its mindless men and absent (but appropriately sentimentalized) women, its rigid, impacted prose, and its meaningless story, is perhaps the logical result of a literary hostility to Mind.

Here again Wood is relaxing his otherwise razor-keen capacity for gleaning insights from language and relying instead on the anemic conventions of literary criticism—a discipline obsessed with the enactment of gender roles. (I’m sure Suzanne Collins would be amused by this idea of masculine taciturnity.) But Wood is right to recognize the natural tension between a literature of action and a literature of mind. Imagine how much the impact of Black Bill’s struggle with the necessity of killing Taralta would be blunted if we were privy to his thoughts, all of which are implicit in the scene as Wilson has rendered it anyway.

            Fascinatingly, though, it seems that Wood eventually realized the actual purpose of this kind of evasive prose—and it was Cormac McCarthy he learned it from. As much as Wood lusts after some leap into theological lucubration as characters reflect on the lessons of the post-apocalypse or the meanings of violence, the psychological reality is that it is often in the midst of violence or when confronted with imminent death that people are least given to introspection. As Wood explains in writing about the prose style of The Road,

McCarthy writes at one point that the father could not tell the son about life before the apocalypse: “He could not construct for the child’s pleasure the world he’d lost without constructing the loss as well and thought perhaps the child had known this better than he.” It is the same for the book’s prose style: just as the father cannot construct a story for the boy without also constructing the loss, so the novelist cannot construct the loss without the ghost of the departed fullness, the world as it once was. (55)

The rituals of weapon reloading, car repair, and wound wrapping that Wood finds so off-puttingly affected in No Country for Old Men are precisely the kind of practicalities people would try to engage their minds with in the aftermath of violence to avoid facing the reality. But this linguistic and attentional coping strategy is not without moral implications of its own.

            In the opening of The Roving Party, Black Bill receives a visit from some of the very clansmen he’s been asked by John Batman to hunt. The headman of the group is a formidable warrior named Manalargena (another real historical figure), who is said to have magical powers. He has come to recruit Bill to help in fighting against the whites, unaware of Bill’s already settled loyalties. When Bill refuses to come fight with Manalargena, the headman’s response is to tell a story about two brothers who live near a river where they catch plenty of crayfish, make fires, and sing songs. Then someone new arrives on the scene:

Hunter come to the river. He is hungry hunter you see. He want crayfish. He see them brother eating crayfish, singing song. He want crayfish too. He bring up spear. Here the headman made as if to raise something. He bring up that spear and he call out: I hungry, you give me that crayfish. He hold that spear and he call out. But them brother they scared you see. They scared and they run. They run and they change. They change to wallaby and they jump. Now they jump and jump and the hunter he follow them.

So hunter he change too. He run and he change to that wallaby and he jump. Now three wallaby jump near river. They eat grass. They forget crayfish. They eat grass and they drink water and they forget crayfish. Three wallaby near the river. Very big river. (7-8)

Bill initially dismisses the story, saying it makes no sense. Indeed, as a story, it’s terrible. The characters have no substance and the transformation seems morally irrelevant. The story is pure allegory. Interestingly, though, by the end of the novel, its meaning is perfectly clear to Bill. Taking on the roles of hunter and hunted leaves no room for songs, no place for what began the hunt in the first place, creating a life closer to that of animals than of humans. There are no more fires.

            Wood counts three registers authors like Conrad and McCarthy—and we can add Wilson—use in their writing. The first is the dirty realism that conveys the characters’ unwillingness to reflect on their circumstances or on the state of their souls. The third is the lofty but oblique discourse on God’s presence or absence in a world of tragedy and carnage Wood finds so ineffectual. For most readers, though, it’s the second register that stands out. Here’s how Wood describes it:

Hard detail and a fine eye is combined with exquisite, gnarled, slightly antique (and even slightly clumsy or heavy) lyricism. It ought not to work, and sometimes it does not. But many of its effects are beautiful—and not only beautiful, but powerfully efficient as poetry. (59)

This description captures what’s both great and frustrating about the best and worst lines in these authors’ novels. But Wood takes the tradition for granted without asking why this haltingly graceful and heavy-handedly subtle language is so well-suited to these violent stories. The writers are compelled to use this kind of language by the very effects of the plot and setting that critics today so often fail to appreciate—though Wood does gesture toward it in the title of his essay on No Country for Old Men. The dream logic of song and simile that goes into the aesthetic experience of bearing witness to the characters sparsely peopling the starkly barren and darkly ominous landscapes of these novels carries within it the breath of the sublime.

            In coming to care about characters whose fates unfold in the aftermath of civilization, or in regions where civilization has yet to take hold, places where bloody aggression and violent death are daily concerns and witnessed realities, we’re forced to adjust emotionally to the worlds they inhabit. Experiencing a single death brings a sense of tragedy, but coming to grips with a thousand deaths has a more curious effect. And it is this effect that the strange tangles of metaphorical prose both gesture toward and help to induce. The sheer immensity of the loss, the casual brushing away of so many bodies and the blotting out of so much unique consciousness, overstresses the capacity of any individual to comprehend it. The result is paradoxical, a fixation on the material objects still remaining, and a sliding off of one’s mind onto a plane of mental existence where the material has scarcely any reality at all because it has scarcely any significance at all. The move toward the sublime is a lifting up toward infinite abstraction, the most distant perspective ever possible on the universe, where every image is a symbol for some essence, where every embrace is a symbol for human connectedness, where every individual human is a symbol for humanity. This isn’t the abstraction of logic, the working out of implications about God or cosmic origins. It’s the abstraction of the dream or the religious experience, an encounter with the sacred and the eternal, a falling and fading away of the world of the material and the particular and the mundane.

            The prevailing assumption among critics and readers alike is that fiction, especially literary fiction, attempts to represent some facet of life, so the nature of a given representation can be interpreted as a comment on whatever is being represented. But what if the representations, the correspondences between the fictional world and the nonfictional one, merely serve to make the story more convincing, more worthy of our precious attention? What if fiction isn’t meant to represent reality so much as to alter our perceptions of it? Critics can fault plots like the one in No Country for Old Men, and characters like Anton Chigurh, for having no counterparts outside the world of the story, mooting any comment about the real world the book may be trying to make. But what if the purpose of drawing readers into fictional worlds is to help them see their own worlds anew by giving them a taste of what it would be like to live a much different existence? Even the novels that hew more closely to the mundane, the unremarkable passage of time, are condensed versions of the characters’ lives, encouraging readers to take a broader perspective on their own. The criteria we should apply to our assessments of novels then would not be how well they represent reality and how accurate or laudable their commentaries are. We should instead judge novels by how effectively they pull us into the worlds they create for themselves and how differently we look at our own world in the wake of the experience. And since high-stakes moral dilemmas are the heart of stories we might wonder what effect the experience of witnessing them will have on our own lower-stakes lives.

Also read:

HUNGER GAME THEORY: Post-Apocalyptic Fiction and the Rebirth of Humanity

Life's White Machine: James Wood and What Doesn't Happen in Fiction

LET'S PLAY KILL YOUR BROTHER: FICTION AS A MORAL DILEMMA GAME

Dennis Junk

Let's Play Kill Your Brother: Fiction as a Moral Dilemma Game

Anthropologist Jean Briggs discovered one of the keys to Inuit peacekeeping in the style of play adults use to engage children. She describes the games in her famous essay, “‘Why Don’t You Kill Your Baby Brother?’ The Dynamics of Peace in Canadian Inuit Camps,” and in so doing, probably unknowingly, lays the groundwork for an understanding of how our love of fiction evolved, along with our moral sensibilities.

            Season 3 of Breaking Bad opens with two expressionless Mexican men in expensive suits stepping out of a Mercedes, taking a look around the peasant village they’ve just arrived in, and then dropping to the ground to crawl on their knees and elbows to a candlelit shrine where they leave an offering to Santa Muerte, along with a crude drawing of the meth cook known as Heisenberg, marking him for execution. We later learn that the two men, Leonel and Marco, who look almost identical, are in fact twins (played by Daniel and Luis Moncada), and that they are the cousins of Tuco Salamanca, a meth dealer and cartel affiliate they believe Heisenberg betrayed and killed. We also learn that they kill people themselves as a matter of course, without registering the slightest emotion and without uttering a word to each other to mark the occasion. An episode later in the season, after we’ve been made amply aware of how coldblooded these men are, begins with a flashback to a time when they were just boys fighting over an action figure as their uncle talks cartel business on the phone nearby. After Marco gets tired of playing keep-away, he tries to provoke Leonel further by pulling off the doll’s head, at which point Leonel runs to his Uncle Hector, crying, “He broke my toy!”

“He’s just having fun,” Hector says, trying to calm him. “You’ll get over it.”

“No! I hate him!” Leonel replies. “I wish he was dead!”

Hector’s expression turns grave. After a moment, he calls Marco over and tells him to reach into the tub of melting ice beside his chair to get him a beer. When the boy leans over the tub, Hector shoves his head into the water and holds it there. “This is what you wanted,” he says to Leonel. “Your brother dead, right?” As the boy frantically pulls on his uncle’s arm trying to free his brother, Hector taunts him: “How much longer do you think he has down there? One minute? Maybe more? Maybe less? You’re going to have to try harder than that if you want to save him.” Leonel starts punching his uncle’s arm but to no avail. Finally, he rears back and punches Hector in the face, prompting him to release Marco and rise from his chair to stand over the two boys, who are now kneeling beside each other. Looking down at them, he says, “Family is all.”

The scene serves several dramatic functions. By showing the ruthless and violent nature of the boys’ upbringing, it intensifies our fear on behalf of Heisenberg, who we know is actually Walter White, a former chemistry teacher and family man from a New Mexico suburb who only turned to crime to make some money for his family before his lung cancer kills him. It also goes some distance toward humanizing the brothers by giving us insight into how they became the mute, mechanical murderers they are when we’re first introduced to them. The bond between the two men and their uncle will be important in upcoming episodes as well. But the most interesting thing about the scene is that it represents in microcosm the single most important moral dilemma of the whole series.

Marco and Leonel are taught to do violence if need be to protect their family. Walter, the show’s central character, gets involved in the meth business for the sake of his own family, and as he continues getting more deeply enmeshed in the world of crime, he justifies his decisions at each juncture by saying he’s providing for his wife and kids. But how much violence can really be justified, we’re forced to wonder, with the claim that you’re simply protecting or providing for your family? The entire show we know as Breaking Bad can actually be conceived of as a type of moral exercise like the one Hector puts his nephews through, designed to impart or reinforce a lesson, though the lesson of the show is much more complicated. It may even be the case that our fondness for fictional narratives more generally, like the ones we encounter in novels and movies and TV shows, originated in our need as a species to develop and hone complex social skills involving powerful emotions and difficult cognitive calculations.

Most of us watching Breaking Bad probably feel Hector went way too far with his little lesson, and indeed I’d like to think not too many parents or aunts and uncles would be willing to risk drowning a kid to reinforce the bond between him and his brother. But presenting children with frightening and stressful moral dilemmas to guide them through major lifecycle transitions—weaning, the birth of siblings, adoptions—which tend to arouse severe ambivalence, can be an effective way to encourage moral development and instill traditional values. The ethnographer Jean Briggs has found that among the Inuit peoples whose cultures she studies, adults frequently engage children in what she calls “playful dramas” (173), which entail hypothetical moral dilemmas that put the children on the hot seat as they struggle to come up with a solution. She writes about these lessons, which strike many outsiders as a cruel form of teasing by the adults, in “‘Why Don’t You Kill Your Baby Brother?’ The Dynamics of Peace in Canadian Inuit Camps,” a chapter she contributed to a 1994 anthology of anthropological essays on peace and conflict. In one example Briggs recounts,

A mother put a strange baby to her breast and said to her own nursling: “Shall I nurse him instead of you?” The mother of the other baby offered her breast to the rejected child and said: “Do you want to nurse from me? Shall I be your mother?” The child shrieked a protest shriek. Both mothers laughed. (176)

This may seem like sadism on the part of the mothers, but it probably functioned to soothe the bitterness arising from the child’s jealousy of a younger nursling. It would also help to settle some of the ambivalence toward the child’s mother, which comes about inevitably as a response to disciplining and other unavoidable frustrations.

Another example Briggs describes seems even more pointlessly sadistic at first glance. A little girl’s aunt takes her hand and puts it on a little boy’s head, saying, “Pull his hair.” The girl doesn’t respond, so her aunt yanks on the boy’s hair herself, making him think the girl had done it. They quickly become embroiled in a “battle royal,” urged on by several adults who find it uproarious. These adults do, however, end up stopping the fight before any serious harm can be done. As horrible as this trick may seem, Briggs believes it serves to instill in the children a strong distaste for fighting because the experience is so unpleasant for them. They also learn “that it is better not to be noticed than to be playfully made the center of attention and laughed at” (177). What became clear to Briggs over time was that the teasing she kept witnessing wasn’t just designed to teach specific lessons but that it was also tailored to the child’s specific stage of development. She writes,

Indeed, since the games were consciously conceived of partly as tests of a child’s ability to cope with his or her situation, the tendency was to focus on a child’s known or expected difficulties. If a child had just acquired a sibling, the game might revolve around the question: “Do you love your new baby sibling? Why don’t you kill him or her?” If it was a new piece of clothing that the child had acquired, the question might be: “Why don’t you die so I can have it?” And if the child had been recently adopted, the question might be: “Who’s your daddy?” (172)

As unpleasant as these tests can be for the children, they never entail any actual danger—Inuit adults would probably agree Hector Salamanca went a bit too far—and they always take place in circumstances and settings where the only threats and anxieties come from the hypothetical, playful dilemmas and conflicts. Briggs explains,

A central idea of Inuit socialization is to “cause thought”: isumaqsayuq. According to [Arlene] Stairs, isumaqsayuq, in North Baffin, characterizes Inuit-style education as opposed to the Western variety. Warm and tender interactions with children help create an atmosphere in which thought can be safely caused, and the questions and dramas are well designed to elicit it. More than that, and as an integral part of thought, the dramas stimulate emotion. (173)

Part of the exercise then seems to be to introduce the children to their own feelings. Prior to having their sibling’s life threatened, the children may not have any idea how they’d feel in the event of that sibling’s death. After the test, however, it becomes much more difficult for them to entertain thoughts of harming their brother or sister—the thought alone will probably be unpleasant.

Briggs also points out that the games send the implicit message to the children that they can be trusted to arrive at the moral solution. Hector knows Leonel won’t let his brother drown—and Leonel learns that his uncle knows this about him. The Inuit adults who tease and tempt children are letting them know they have faith in the children’s ability to resist their selfish or aggressive impulses. Discussing Briggs’s work in his book Moral Origins: The Evolution of Virtue, Altruism, and Shame, anthropologist Christopher Boehm suggests that evolution has endowed children with the social and moral emotions we refer to collectively as consciences, but these inborn moral sentiments need to be activated and shaped through socialization. He writes,

On the one side there will always be our usefully egoistic selfish tendencies, and on the other there will be our altruistic or generous impulses, which also can advance our fitness because altruism and sympathy are valued by our peers. The conscience helps us to resolve such dilemmas in ways that are socially acceptable, and these Inuit parents seem to be deliberately “exercising” the consciences of their children to make morally socialized adults out of them. (226)

The Inuit-style moral dilemma games seem strange, even shocking, to people from industrialized societies, and so it’s clear they’re not a normal part of children’s upbringing in every culture. They don’t even seem to be all that common among hunter-gatherers outside the region of the Arctic. Boehm writes, however,

Deliberately and stressfully subjecting children to nasty hypothetical dilemmas is not universal among foraging nomads, but as we’ll see with Nisa, everyday life also creates real moral dilemmas that can involve Kalahari children similarly. (226)

Boehm goes on to recount an episode from anthropologist Marjorie Shostak’s famous biography Nisa: The Life and Words of a !Kung Woman to show that parents all the way on the opposite side of the world from where Briggs did her fieldwork sometimes light on similar methods for stimulating their children’s moral development.

Nisa seems to have been a greedy and impulsive child. When her pregnant mother tried to wean her, she would have none of it. At one point, she even went so far as to sneak into the hut while her mother was asleep and try to suckle without waking her up. Throughout the pregnancy, Nisa continually expressed ambivalence toward the upcoming birth of her sibling, so much so that her parents anticipated there might be some problems. The !Kung resort to infanticide in certain dire circumstances, and Nisa’s parents probably reasoned she was at least somewhat familiar with the coping mechanism many other parents used when killing a newborn was necessary. What they’d do is treat the baby as an object, not naming it or in any other way recognizing its identity as a family member. Nisa explained to Shostak how her parents used this knowledge to impart a lesson about her baby brother.

After he was born, he lay there, crying. I greeted him, “Ho, ho, my baby brother! Ho, ho, I have a little brother! Some day we’ll play together.” But my mother said, “What do you think this thing is? Why are you talking to it like that? Now, get up and go back to the village and bring me my digging stick.” I said, “What are you going to dig?” She said, “A hole. I’m going to dig a hole so I can bury the baby. Then you, Nisa, will be able to nurse again.” I refused. “My baby brother? My little brother? Mommy, he’s my brother! Pick him up and carry him back to the village. I don’t want to nurse!” Then I said, “I’ll tell Daddy when he comes home!” She said, “You won’t tell him. Now, run back and bring me my digging stick. I’ll bury him so you can nurse again. You’re much too thin.” I didn’t want to go and started to cry. I sat there, my tears falling, crying and crying. But she told me to go, saying she wanted my bones to be strong. So, I left and went back to the village, crying as I walked. (The weaning episode occurs on pgs. 46-57)

Again, this may strike us as cruel, but by threatening her brother’s life, Nisa’s mother succeeded in triggering her natural affection for him, thus tipping the scales of her ambivalence to ensure the protective and loving feelings won out over the bitter and jealous ones. This example was extreme enough that Nisa remembered it well into adulthood, but Boehm sees it as evidence that real life reliably offers up dilemmas parents all over the world can use to instill morals in their children. He writes,

I believe that all hunter-gatherer societies offer such learning experiences, not only in the real-life situations children are involved with, but also in those they merely observe. What the Inuit whom Briggs studied in Cumberland Sound have done is to not leave this up to chance. And the practice would appear to be widespread in the Arctic. Children are systematically exposed to life’s typical stressful moral dilemmas, and often hypothetically, as a training ground that helps to turn them into adults who have internalized the values of their groups. (234)

One of the reasons such dilemmas, whether real or hypothetical or merely observed, are effective as teaching tools is that they bypass the threat to personal autonomy that tends to accompany direct instruction. Imagine Tío Salamanca simply scolding Leonel for wishing his brother dead—it would have only aggravated his resentment and sparked defiance. Leonel would probably also harbor some bitterness toward his uncle for unjustly defending Marco. In any case, he would have been stubbornly resistant to the lesson.

Winston Churchill nailed the sentiment when he said, “Personally, I am always ready to learn, although I don’t always like being taught.” The Inuit-style moral dilemmas force the children to come up with the right answer on their own, a task that requires the integration and balancing of short- and long-term desires, individual and group interests, and powerful albeit contradictory emotions. The skills that go into solving such dilemmas are indistinguishable from the qualities we recognize as maturity, self-knowledge, generosity, poise, and wisdom.

For the children Briggs witnessed being subjected to these moral tests, the understanding that the dilemmas were in fact only hypothetical developed gradually as they matured. For the youngest ones, the stakes were real and the solutions were never clear at the outset. Briggs explains that

while the interaction between small children and adults was consistently good-humored, benign, and playful on the part of the adults, it taxed the children to—or beyond—the limits of their ability to understand, pushing them to expand their horizons, and testing them to see how much they had grown since the last encounter. (173)

What this suggests is that there isn’t always a simple declarative lesson—a moral to the story, as it were—imparted in these games. Instead, the solutions to the dilemmas can often be open-ended, and the skills the children practice can thus be more general and abstract than some basic law or principle. Briggs goes on,

Adult players did not make it easy for children to thread their way through the labyrinth of tricky proposals, questions, and actions, and they did not give answers to the children or directly confirm the conclusions the children came to. On the contrary, questioning a child’s first facile answers, they turned situations round and round, presenting first one aspect then another, to view. They made children realize their emotional investment in all possible outcomes, and then allowed them to find their own way out of the dilemmas that had been created—or perhaps, to find ways of living with unresolved dilemmas. Since children were unaware that the adults were “only playing,” they could believe that their own decisions would determine their fate. And since the emotions aroused in them might be highly conflicted and contradictory—love as well as jealousy, attraction as well as fear—they did not always know what they wanted to decide. (174-5)

As the children mature, they become more adept at distinguishing between real and hypothetical problems. Indeed, Briggs suggests one of the ways adults recognize children’s budding maturity is that they begin to treat the dilemmas as a game, ceasing to take them seriously, and ceasing to take themselves as seriously as they did when they were younger.

In his book On the Origin of Stories: Evolution, Cognition, and Fiction, literary scholar Brian Boyd theorizes that the fictional narratives humans in every culture engage one another with, be they religious myths, folklore, or plays and novels, can be thought of as a type of cognitive play similar to the hypothetical moral dilemmas of the Inuit. He sees storytelling as an adaptation that encourages us to train the mental faculties we need to function in complex societies. The idea is that evolution ensures that adaptive behaviors tend to be pleasurable, and thus many animals playfully and joyously engage, in low-stakes and relatively safe circumstances, in activities that will prepare them for similar activities with much higher stakes and much greater danger. Boyd explains,

The more pleasure that creatures have in play in safe contexts, the more they will happily expend energy in mastering skills needed in urgent or volatile situations, in attack, defense, and social competition and cooperation. This explains why in the human case we particularly enjoy play that develops skills needed in flight (chase, tag, running) and fight (rough-and-tumble, throwing as a form of attack at a distance), in recovery of balance (skiing, surfing, skateboarding), and individual and team games. (92)

The skills most necessary to survive and thrive in human societies are the same ones Inuit adults help children develop with the hypothetical dilemmas Briggs describes. We should expect fiction, then, to feature similar types of moral dilemmas. Some stories may be designed to convey simple messages—“Don’t hurt your brother,” “Don’t stray from the path”—but others might be much more complicated; they may not even have any viable solutions at all. “Art prepares minds for open-ended learning and creativity,” Boyd writes; “fiction specifically improves our social cognition and our thinking beyond the here and now” (209).

One of the ways the cognitive play we call novels or TV shows differs from Inuit dilemma games is that the fictional characters take over center stage from the individual audience members. Instead of being forced to decide on a course of action ourselves, we watch characters we’ve become emotionally invested in try to come up with solutions to the dilemmas. When these characters are first introduced to us, our feelings toward them will be based on the same criteria we’d apply to real people who could potentially become a part of our social circles. Boyd explains,

Even more than other social species, we depend on information about others’ capacities, dispositions, intentions, actions, and reactions. Such “strategic information” catches our attention so forcefully that fiction can hold our interest, unlike almost anything else, for hours at a stretch. (130)

We favor characters who are good team players—who communicate honestly, who show concern for others, and who direct aggression toward enemies and cheats—for obvious reasons, but we also assess them in terms of what they might contribute to the group. Characters with exceptional strength, beauty, intelligence, or artistic ability are always especially attention-worthy. Of course, characters with qualities that make them sometimes an asset and sometimes a liability represent a moral dilemma all on their own—it’s no wonder such characters tend to be so compelling.

The most common fictional dilemma pits a character we like against one or more characters we hate—the good team player versus the power- or money-hungry egoist. We can think of the most straightforward plot as an encroachment of chaos on the providential moral order we might otherwise take for granted. When the bad guy is finally defeated, it’s like a toy that was snatched away from us has just been returned. We embrace the moral order all the more vigorously. But of course our stories aren’t limited to this one basic formula. Around the turn of the last century, the French writer Georges Polti, following up on the work of Italian playwright Carlo Gozzi, tried to write a comprehensive list of all the basic plots in plays and novels, and flipping through his book The Thirty-Six Dramatic Situations, you find that with few exceptions (“Daring Enterprise,” “The Enigma,” “Recovery of a Lost One”) the situations aren’t simply encounters between characters with conflicting goals, or characters who run into obstacles in chasing after their desires. The conflicts are nearly all moral, either between a virtuous character and a less virtuous one or between selfish or greedy impulses and more altruistic ones. Polti’s book could be called The Thirty-Odd Moral Dilemmas in Fiction. Hector Salamanca would be happy (not really) to see the thirteenth situation: “Enmity of Kinsmen,” the first example of which is “Hatred of Brothers” (49).

One type of fictional dilemma that seems to be particularly salient in American society today pits our impulse to punish wrongdoers against our admiration for people with exceptional abilities. Characters like Walter White in Breaking Bad win us over with qualities like altruism, resourcefulness, and ingenuity—but then they go on to behave in strikingly, though somehow not obviously, immoral ways. Variations on Conan Doyle’s Sherlock Holmes abound; he’s the supergenius who’s also a dick (get the double-entendre?): the BBC’s Sherlock (by far the best), the movies starring Robert Downey Jr., the upcoming series featuring an Asian female Watson (Lucy Liu)—plus all the minor variations like The Mentalist and House.

Though the idea that fiction is a type of low-stakes training simulation to prepare people cognitively and emotionally to take on difficult social problems in real life may not seem all that earth-shattering, conceiving of stories as analogous to Inuit moral dilemmas designed to exercise children’s moral reasoning faculties can nonetheless help us understand why worries about the examples set by fictional characters are so often misguided. Many parents and teachers noisily complain about sex or violence or drug use in media. Academic literary critics condemn the way this or that author portrays women or minorities. Underlying these concerns is the crude assumption that stories simply encourage audiences to imitate the characters, that those audiences are passive receptacles for the messages—implicit or explicit—conveyed through the narrative. To be fair, these worries may be well placed when it comes to children so young they lack the cognitive sophistication necessary for separating their thoughts and feelings about protagonists from those they have about themselves, and are thus prone to take the hero for a simple model of emulation-worthy behavior. But, while Inuit adults communicate to children that they can be trusted to arrive at a right or moral solution, the moralizers in our culture betray their utter lack of faith in the intelligence and conscience of the people they try to protect from the corrupting influence of stories with imperfect or unsavory characters.

           This type of self-righteous and overbearing attitude toward readers and viewers strikes me as orders of magnitude more likely to provoke defiant resistance to moral lessons than the North Baffin’s isumaqsayuq approach. In other words, a good story is worth a thousand sermons. But if the moral dilemma at the core of the plot has an easy solution—if you can say precisely what the moral of the story is—it’s probably not a very good story.

Also read:

The Criminal Sublime: Walter White's Brutally Plausible Journey to the Heart of Darkness in Breaking Bad

And:

SYMPATHIZING WITH PSYCHOS: WHY WE WANT TO SEE ALEX ESCAPE HIS FATE AS A CLOCKWORK ORANGE

And:

SABBATH SAYS: PHILIP ROTH AND THE DILEMMAS OF IDEOLOGICAL CASTRATION


The Self-Transcendence Price Tag: A Review of Alex Stone's Fooling Houdini

If you can convince people you know how to broaden the contours of selfhood and show them the world as they’ve never experienced it before, if you can give them the sense that their world is expanding, they’ll at the very least want to keep talking to you so they can keep the feeling going and maybe learn what your secret is. Much of this desire to better ourselves is focused on our social lives, and that’s why duping delight is so seductive—it gives us a taste of what it’s like to be the charismatic and irresistible characters we always expected ourselves to become.

Psychologist Paul Ekman is renowned for his research on facial expressions, and he frequently consults with law enforcement agencies, legal scholars, and gamblers on the topic of reading people who don’t want to be read. In his book Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage, Ekman focuses on three emotions whose subtle expressions would-be lie detectors should watch for. The first two—detection apprehension and deception guilt—are pretty obvious. But the third is more interesting. Many people actually enjoy deceiving others because, for one thing, the threat of detection is more thrilling to them than terrifying, and, for another, being able to pull off the deception successfully can give them a sense of “pride in the achievement, or feelings of smug contempt toward the target” (76). Ekman calls this “Duping Delight,” and he suggests it leads many liars to brag about their crimes, which in turn leads to their being caught.

The takeaway insight is that knowing something others don’t, or having the skill to trick or deceive others, can give us an inherently rewarding feeling of empowerment.

Alex Stone, in his new book Fooling Houdini: Magicians, Mentalists, Math Geeks & the Hidden Powers of the Mind, suggests that duping delight is what drives the continuing development of the magician’s trade. The title refers to a bit of lore that has reached the status of founding myth among aficionados of legerdemain. Houdini used to boast that he could figure out the secret behind any magic trick if he saw it performed three times. Time and again, he backed up his claim, sending defeated tricksters away to nurse their wounded pride. But then came Dai Vernon, who performed for Houdini what he called the Ambitious Card, a routine in which a card signed by a volunteer and then placed in the middle of a deck mysteriously appears at the top. After watching Vernon go through the routine seven times, Houdini turned around and walked away in a huff. Vernon went on to become a leading figure in twentieth-century magic, and every magician today has his own version of the Ambitious Card (they’re almost all male), which serves as a type of signature.

In Fooling Houdini, Stone explains that for practiced magicians, tricking the uninitiated loses its thrill over time. So they end up having to up the ante, and in the process novitiates find themselves getting deeper and deeper into the practice, tradition, culture, and society of magic and magicians. He writes,

Sure, it’s fun to fool laypeople, but they’re easy prey. It’s far more thrilling to hunt your own kind. As a result, magicians are constantly engineering new ways to dupe one another. A hierarchy of foolmanship, a who-fooled-whom pecking order, rules the conjuror’s domain. This gladiatorial spirit in turn drives considerable evolution in the art. (173)

Stone’s own story begins with a trip to Belgium to compete in the 2006 Magic Olympics. His interest in magic was, at the time, little more than an outgrowth of his interest in science. He’d been an editor at Discover magazine and had since gone on to graduate school in physics at Columbia University. But after the Magic Olympics, where he performed dismally and was left completely humiliated and averse to the thought of ever doing magic again, he gradually came to realize that one way or another he would have to face his demons by mastering the art he’d only so far dabbled in.

Fooling Houdini chronicles how Stone became obsessed with developing his own personalized act and tweaking it to perfection, and how he went from being a pathetic amateur to a respectable semi-professional. The progress of a magician, Stone learns from Jeff McBride, follows “four cardinal stations of magic: Trickster, Sorcerer, Oracle, and Sage” (41). And the resultant story as told in Stone’s book follows an eminently recognizable narrative course, from humiliation and defeat to ever-increasing mastery and self-discovery.

Fooling Houdini will likely appeal to the same audience as did Moonwalking with Einstein, Joshua Foer’s book about how he ended up winning the U.S. Memory Championships. Foer, in fact, makes a guest appearance in Fooling Houdini when Stone seeks out his expertise to help him memorize a deck of cards for an original routine of his own devising. (He also gave the book a nice plug for the back cover.) The appeal of both books comes not just from the conventional narrative arc but also from the promise of untapped potential, a sense that greater mastery, and even a better life, lie just beyond reach, accessible to anyone willing to put in those enchanted ten thousand hours of training made famous by Malcolm Gladwell. It’s the same thing people seem to love about TED lectures, the idea that ideas will almost inevitably change our lives. Nathan Heller, in a recent New Yorker article, attempts to describe the appeal of TED conferences and lectures in terms that apply uncannily well to books like Foer’s and Stone’s. Heller writes,

Debby Ruth, a Long Beach attendee, told me that she started going to TED after reaching a point in her life when “nothing excited me anymore”; she returns now for a yearly fix. TED may present itself as an ideas conference, but most people seem to watch the lectures not so much for the information as for how they make them feel. (73)

The way they make us feel is similar to the way a good magic show can make us feel—like anything is possible, like on the other side of this great idea that breaks down the walls of our own limitations is a better, fuller, more just, and happier life. “Should we be grateful to TED for providing this form of transcendence—and on the Internet, of all places?” Heller asks.

Or should we feel manipulated by one more product attempting to play on our emotions? It’s tricky, because the ideas undergirding a TED talk are both the point and, for viewers seeking a generic TED-type thrill, essentially beside it: the appeal of TED comes as much from its presentation as from its substance. (73-4)

At their core, Fooling Houdini and Moonwalking with Einstein—and pretty much every TED lecture—are about transforming yourself, and to a somewhat lesser degree the world, either with new takes on deep-rooted traditions, reconnection with ancient wisdom, or revolutionary science.

Foer presumably funded the epic journey recounted in Moonwalking with his freelance articles and maybe with expense accounts from the magazines he wrote for. Still, it seems you could train to become a serviceable “mental athlete” without spending all that much money. Not so with magic. Stone’s prose is quirkier and slightly more self-deprecating than Foer’s, and in one of his funniest and most revealing chapters he discusses some of the personal and financial costs associated with his obsession. The title, “It’s Annoying and I Asked You to Stop,” is a quote from a girlfriend who was about to dump him. The chapter begins,

One of my biggest fears is that someday I’ll be audited. Not because my taxes aren’t in perfect order—I’m very OCD about saving receipts and keeping track of my expenses, a habit I learned from my father—but because it would bring me face-to-face with a very difficult and decidedly lose-lose dilemma in which I’d have to choose between going to jail for tax fraud and disclosing to another adult, in naked detail, just how much money I’ve spent on magic over the years. (That, and I’d have to fess up to eating at Arby’s multiple times while traveling to magic conventions.) (159)

Having originally found magic fun and mentally stimulating, Stone ends up being seduced into spending astronomical sums, first by the terrible slight he received from the magic community and then by a regimen of Pavlovian conditioning based on duping delight. Both Foer’s and Stone’s stories are essentially about moderately insecure guys who try to improve themselves by learning a new skill set.

The market for a renewed sense of limitless self-potential is booming. As children, we feel that every future we can imagine for ourselves is achievable—that we can inhabit them all simultaneously—so whatever singular life we find ourselves living as adults inevitably falls short of our dreams. We may have good jobs, good relationships, good health, but we can’t help sometimes feeling like we’re missing out on something, like we’re trapped in overscheduled, rutted routines of workaday compromise. After a while, it becomes more and more difficult to muster any enthusiasm for much of anything beyond the laziest indulgences, like the cruises we save up for and plan months or years in advance, the three-day weekend at the lake cottage, a shopping date with an old friend, going out to eat with the gang. By modern, middle-class standards, this is the good life. What more can we ask for?

What if I told you, though, that there’s a training regimen that will make you so much more creative and intelligent that you’ll wonder after a few months how you ever managed to get by with a mind as dull as yours is now? What if I told you there’s a revolutionary diet and exercise program that is almost guaranteed to make you so much more attractive that even your friends won’t recognize you? What if I told you there’s a secret set of psychological principles that will allow you to seduce almost any member of the opposite sex, or prevail in any business negotiation you ever engage in? What if I told you you’ve been living in a small dark corner of the world, and that I know the way to a boundless life of splendor?

If you can convince people you know how to broaden the contours of selfhood and show them the world as they’ve never experienced it before, if you can give them the sense that their world is expanding, they’ll at the very least want to keep talking to you so they can keep the feeling going and maybe learn what your secret is. Much of this desire to better ourselves is focused on our social lives, and that’s why duping delight is so seductive—it gives us a taste of what it’s like to be the charismatic and irresistible characters we always expected ourselves to become. This is how Foer writes about his mindset at the outset of his memory training, after he’d read about the mythic feats of memory champion Ben Pridmore:

What would it mean to have all that otherwise-lost knowledge at my fingertips? I couldn’t help but think that it would make me more persuasive, more confident and, in some fundamental sense, smarter. Certainly I’d be a better journalist, friend, and boyfriend. But more than that, I imagined that having a memory like Ben Pridmore’s would make me an altogether more attentive, perhaps even wiser, person. To the extent that experience is the sum of our memories and wisdom the sum of experience, having a better memory would mean knowing not only more about the world, but also more about myself. (7)

Stone strikes a similar chord when he’s describing what originally attracted him to magic back when he was an awkward teenager. He writes,

In my mind, magic was also a disarming social tool, a universal language that transcended age and gender and culture. Magic would be a way out of my nerdiness. That’s right, I thought magic would make me less nerdy. Or at the very least it would allow me to become a different, more interesting nerd. Through magic, I would be able to navigate awkward social situations, escape the boundaries of culture and class, connect with people from all walks of life, and seduce beautiful women. In reality, I ended up spending most of my free time with pasty male virgins. (5)

This last line echoes Foer’s observation that the people you find at memory competitions are “indistinguishable from those” you’d find at a “‘Weird Al’ Yankovic (five of spades) concert” (189).

Though Stone’s openness about his nerdiness at times shades into some obnoxious playing up of his nerdy credentials, it does lend itself to some incisive observations. One of the lessons he has to learn to become a better magician is that his performances have to be more about the audiences than they are about the tricks—less about duping and more about connecting. What this means is that magic isn’t the key to greater confidence that he hoped it would be; instead, he has to become more confident before he can be good at magic. For Stone, this means embracing, not trying to overcome, his nerdish tendencies. He writes,

Magicians like to pretend that they’re cool and mysterious, cultivating the image of the smooth operator, the suave seducer. Their stage names are always things like the Great Tomsoni or the Amazing Randi or the International Man of Mystery—never Alex the Magical Superdoofus or the Incredible Nerdini. But does all this posturing really make them look cooler? Or just more ridiculous for trying to hide their true stripes? Why couldn’t more magicians own up to their own nerdiness? Magic was geeky. And that was okay. (243)

Stone reaches this epiphany largely through the inspiration of a clown who, in a surprising twist, ends up stealing the show from many of the larger-than-life characters featured in Fooling Houdini. In an effort to improve his comfort while performing before crowds and thus to increase his stage presence, Stone works with his actress girlfriend, takes improv classes, and attends a “clown workshop” led by “a handsome man in his early forties named Christopher Bayes,” who begins by telling the class that “The clown is the physical manifestation of the unsocialized self… It’s the essence of the playful spirit before you were defeated by society, by the world” (235). Stone immediately recognizes the connection with magic. Here you have that spark of joyful spontaneity, that childish enthusiasm before a world where everything is new and anything is possible.

“The main trigger for laughter is surprise,” Bayes told us, speaking of how the clown gets his laughs. “There’s lots of ways to find that trigger. Some of them are tricks. Some of them are math. And some of them come from building something with integrity and then smashing it. So you smash the expectation of what you think is going to happen.” (239)

In smashing something you’ve devoted considerable energy to creating, you’re also asserting your freedom to walk away from what you’ve invested yourself in, to reevaluate your idea of what’s really important, to change your life on a whim. And surprise, as Bayes describes it, isn’t just the essential tool for comedians. Stone explains,

The same goes for the magician. Magic transports us to an absurd universe, parodying the laws of physics in a whimsical toying with cause and effect. “Magic creates tension,” says Juan Tamariz, “a logical conflict that it does not resolve. That’s why people often laugh after a trick, even when we haven’t done anything funny.” Tamariz is also fond of saying that magic holds a mirror up to our impossible desires. We all would like to fly, see the future, know another’s thoughts, mend what has been broken. Great magic is anchored to a symbolic logic that transcends its status as an illusion. (239)

Stone’s efforts to become a Sage magician have up till this point in the story entailed little more than a desperate stockpiling of tricks. But he comes to realize that not all tricks jibe with his personality, and if he tries to go too far outside of character his performances come across as strained and false. This is the stock ironic trope that this type of story turns on—he starts off trying to become something great only to discover that he first has to accept who he is. He goes on, relaying the lessons of the clown workshop,

“Try to proceed with a kind of playful integrity,” Chris Bayes told us. “Because in that integrity we actually find more possibility of surprise than we do in an idea of how to trick us into laughing. You bring it from yourself. And we see this little gift that you brought for us, which is the gift of your truth. Not an idea of your truth, but the gift of your real truth. And you can play forever with that, because it’s infinite.” (244)

What’s most charming about the principle of proceeding with playful integrity is that it applies not just to clowning and magic, but to almost every artistic and dramatic endeavor—and even to science. “Every truly great idea, be it in art or science,” Stone writes, “is a kind of magic” (289). Aspirants may initially be lured into any of these creative domains by the promise of greater mastery over other people, but the true sages end up being the ones who are the most appreciative and the most susceptible to the power of the art to produce in them that sense of playfulness and boundless exuberance. As Stone puts it,

Being fooled is fun, too, because it’s a controlled way of experiencing a loss of control. Much like a roller coaster or a scary movie, it lets you loosen your grip on reality without actually losing your mind. This is strangely cathartic, and when it’s over you feel more in control, less afraid. For magicians, watching magic is about chasing this feeling—call it duped delight, the maddening ecstasy of being a layperson again, a novice, if only for a moment. (291)

“Just before Vernon,” the Man Who Fooled Houdini, “died,” Stone writes, “comedian and amateur magician Dick Cavett asked him if there was anything he wished for.” Vernon answered, “I wish somebody could fool me one more time” (291). Stone uses this line to wrap up his book, neatly bringing the story full-circle.

Fooling Houdini is unexpectedly engrossing. It reads quite differently from the 2010 book on magic by neuroscientists Stephen Macknik and Susana Martinez-Conde, which they wrote with Sandra Blakeslee, Sleights of Mind: What the Neuroscience of Magic Reveals about Our Everyday Deceptions. For one thing, Stone focuses much more on the people he comes to know on his journey and less on the underlying principles. And, though Macknik and Martinez-Conde also use their own education in the traditions and techniques of magic as a narrative frame, Stone gets much more personal. One underlying message of both books is that our sense of being aware of what’s going on around us is illusory, and that illusion makes us ripe for the duping. But Stone conveys more of the childlike wonder of magic, despite his efforts at coming across as a stylish hipster geek. Unfortunately, when I got to the end I was reminded of the title of a TED lecture that’s perennially on the most-watched list, Brené Brown’s “The Power of Vulnerability,” a talk I came away from scratching my head because it seemed utterly nonsensical.

It’s interesting that duping delight is a feeling few anticipate and many fail to recognize even as they’re experiencing it. It is the trick that ends up being played on the trickster. Like most magic, it takes advantage of a motivational system that most of us are only marginally aware of, if at all. But there’s another motivational system and another magic trick that makes things like TED lectures so thrilling. It’s also the trick that makes books like Moonwalking with Einstein and Fooling Houdini so engrossing. Arthur and Elaine Aron use what’s called “The Self-Expansion Model” to explain what attracts us to other people, and even what attracts us to groups of people we end up wanting to identify with. The basic idea is that we’re motivated to increase our efficacy, not with regard to achieving any specific goals but in terms of our general potential. One of the main ways we seek to augment our potential is by establishing close relationships with other people who have knowledge or skills or resources that would contribute to our efficacy. Foer learns countless mnemonic techniques from guys like Ed Cooke. Stone learns magic from guys like Wes James. Meanwhile, we readers are getting a glimpse of all of it through our connection with Foer and Stone.

Self-expansion theory is actually somewhat uplifting in its own right because it offers a more romantic perspective on our human desire to associate with high-status individuals and groups. But the triggers for a sense of self-expansion are probably pretty easy to mimic, even for those who lack the wherewithal or the intention to truly increase our potential or genuinely broaden our horizons. Indeed, it seems as though self-expansion has become the main psychological principle exploited by politicians, marketers, and P.R. agents. This isn’t to say we should discount every book or lecture that we find uplifting, but we should keep in mind that there are genuine experiences of self-expansion and counterfeit ones. Heller’s observation about how TED lectures are more about presentation than substance, for instance, calls to mind an experiment done in the early 1970s in which Dr. Myron Fox gave a lecture titled “Mathematical Game Theory as Applied to Physician Education.” His audience included psychologists, psychiatrists, educators, and graduate students, virtually all of whom rated his ideas and his presentation highly. But Dr. Fox wasn’t actually a doctor; he was the actor Michael Fox. And his lecture was designed to be completely devoid of meaningful content but high on expressiveness and audience connection.

The Dr. Fox Effect is the term now used to describe our striking inability to recognize nonsense coming from the mouths of charismatic speakers.

         And if keeping our foolability in mind somehow makes that sense of self-transcendence elude us today, "tomorrow we will run faster, stretch out our arms farther....

And one fine morning—"

Also read:

PERCY FAWCETT’S 2 LOST CITIES


Stories, Social Proof, & Our Two Selves

Robert Cialdini describes the phenomenon of social proof, whereby we look to how others are responding to something before we form an opinion ourselves. What are the implications of social proof for our assessments of literature? Daniel Kahneman describes two competing “selves,” the experiencing self and the remembering self. Which one should we trust to let us know how we truly feel about a story?

            You’ll quickly come up with a justification for denying it, but your response to a story is influenced far more by other people’s responses to it than by your moment-to-moment experience of reading or watching it. The impression that we either enjoy an experience or we don’t, that our enjoyment or disappointment emerges directly from the scenes, sensations, and emotions of the production itself, results from our cognitive blindness to several simultaneously active processes that go into our final verdict. We’re only ever aware of the output of the various algorithms, never the individual functions.

            None of us, for instance, directly experiences the operation of what psychologist and marketing expert Robert Cialdini calls social proof, but its effects on us are embarrassingly easy to measure. Even the way we experience pain depends largely on how we perceive others to be experiencing it. Subjects receiving mild shocks not only report them to be more painful when they witness others responding to them more dramatically, but they also show physiological signs of being in greater distress.

            Cialdini opens the chapter on social proof in his classic book Influence: Science and Practice by pointing to the bizarre practice of setting television comedies to laugh tracks. Most people you talk to will say canned laughter is annoying—and they’ll emphatically deny the mechanically fake chuckles and guffaws have any impact on how funny the jokes seem to them. The writers behind those jokes, for their part, probably aren’t happy about the implicit suggestion that their audiences need to be prompted to laugh at the proper times. So why do laugh tracks accompany so many shows? “What can it be about canned laughter that is so attractive to television executives?” Cialdini asks.

Why are these shrewd and tested people championing a practice that their potential watchers find disagreeable and their most creative talents find personally insulting? The answer is both simple and intriguing: They know what the research says. (98)

As with all the other “weapons of influence” Cialdini writes about in the book, social proof seems as obvious to people as it is dismissible. “I understand how it’s supposed to work,” we all proclaim, “but you’d have to be pretty stupid to fall for it.” And yet it still works—and it works on pretty much every last one of us. Cialdini goes on to discuss the finding that even suicide rates increase after a highly publicized story of someone killing themselves. The simple, inescapable reality is that when we see someone else doing something, we become much more likely to do it ourselves, whether it be writhing in genuine pain, laughing in genuine hilarity, or finding life genuinely intolerable.

            Another factor that complicates our responses to stories is that, unlike momentary shocks or the telling of jokes, they usually last long enough to place substantial demands on working memory. Movies last a couple hours. Novels can take weeks. What this means is that when we try to relate to someone else what we thought of a movie or a book we’re relying on a remembered abstraction as opposed to a real-time recording of how much we enjoyed the experience. In his book Thinking, Fast and Slow, Daniel Kahneman suggests that our memories of experiences can diverge so much from our feelings at any given instant while actually having those experiences that we effectively have two selves: the experiencing self and the remembering self. To illustrate, he offers the example of a man who complains that a scratch at the end of a disc of his favorite symphony ruined the listening experience for him. “But the experience was not actually ruined, only the memory of it,” Kahneman points out. “The experiencing self had had an experience that was almost entirely good, and the bad end could not undo it, because it had already happened” (381). But the distinction usually only becomes apparent when the two selves disagree—and such disagreements usually require some type of objective recording to discover. Kahneman explains,

Confusing experience with the memory of it is a compelling cognitive illusion—and it is the substitution that makes us believe a past experience can be ruined. The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living, and it is the one that makes decisions. What we learn from the past is to maximize the qualities of our future memories, not necessarily of our future experiences. This is the tyranny of the remembering self. (381)

Kahneman suggests the priority we can’t help but give to the remembering self explains why tourists spend so much time taking pictures. The real objective of a vacation is not to have a pleasurable or fun experience; it’s to return home with good vacation stories.

            Kahneman reports the results of a landmark study he designed with Don Redelmeier that compared moment-to-moment pain recordings of men undergoing colonoscopies to global pain assessments given by the patients after the procedure. The outcome demonstrated that the remembering self was remarkably unaffected by the duration of the procedure or the total sum of pain experienced, as gauged by adding up the scores given moment-to-moment during the procedure. Men who actually experienced more pain nevertheless rated the procedure as less painful when the discomfort tapered off gradually as opposed to dropping off precipitously after reaching a peak. The remembering self is reliably guilty of what Kahneman calls “duration neglect,” and it assesses experiences based on a “peak-end rule,” whereby the “global retrospective rating” will be “well predicted by the average level of pain reported at the worst moment of the experience and at its end” (380). Duration neglect and the peak-end rule probably account for the greater risk of addiction for users of certain drugs like heroin or crystal meth, which result in rapid, intense highs and precipitous drop-offs, as opposed to drugs like marijuana, whose effects are longer-lasting but less intense.
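For readers who like to see the arithmetic, here is a minimal sketch of the peak-end rule, assuming hypothetical pain scores recorded minute by minute on a 0-10 scale (the numbers and function names are invented purely for illustration, not drawn from the actual Redelmeier-Kahneman data):

```python
# Toy illustration of Kahneman's peak-end rule versus total experienced pain.
# All recordings below are hypothetical, invented for illustration only.

def peak_end_rating(pain_by_minute):
    """Predicted retrospective rating: average of the worst moment and the last one."""
    return (max(pain_by_minute) + pain_by_minute[-1]) / 2

def total_pain(pain_by_minute):
    """What the experiencing self actually accumulated, minute by minute."""
    return sum(pain_by_minute)

short_procedure = [2, 5, 8, 8]             # brief, but ends at its painful peak
long_procedure = [2, 5, 8, 8, 6, 4, 2, 1]  # more total pain, tapering off gently

print(total_pain(short_procedure), peak_end_rating(short_procedure))  # 23 8.0
print(total_pain(long_procedure), peak_end_rating(long_procedure))    # 36 4.5
```

The longer procedure racks up more total pain for the experiencing self, yet the peak-end prediction is that the remembering self will rate it the milder experience: duration neglect in miniature.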

            We’ve already seen that pain in real time can be influenced by how other people are responding to it, and we can probably extrapolate and assume that the principle applies to pleasurable experiences as well. How does the divergence between experience and memory factor into our response to stories as expressed by our decisions about further reading or viewing, or in things like reviews or personal recommendations? For one thing, we can see that most good stories are structured in a way that serves not so much as a Jamesian “direct impression of life,” i.e., as reports from the experiencing self, but much more like the tamed abstractions Stevenson described in his “Humble Remonstrance” to James. As Kahneman explains,

A story is about significant events and memorable moments, not about time passing. Duration neglect is normal in a story, and the ending often defines its character. The same core features appear in the rules of narratives and in the memories of colonoscopies, vacations, and films. This is how the remembering self works: it composes stories and keeps them for future reference. (387)

            Now imagine that you’re watching a movie in a crowded theater. Are you influenced by the responses of your fellow audience members? Are you more likely to laugh if everyone else is laughing, wince if everyone else is wincing, cheer if everyone else is cheering? These are the effects on your experiencing self. What happens, though, in the hours and days and weeks after the movie is over—or after you’re done reading the book? Does your response to the story start to become intertwined with and indistinguishable from the cognitive schema you had in place before ever watching or reading it? Are your impressions influenced by the opinions of critics or friends whose opinions you respect? Do you give a celebrated classic the benefit of the doubt, assuming it has some merit even if you enjoyed it much less than some less celebrated work? Do you read into it virtues whose real source may be external to the story itself? Do you miss virtues that actually are present in less celebrated stories?

             Taken to its extreme, this focus on social proof leads to what’s known as social constructivism. In the realm of stories, this would be the idea that there are no objective criteria at all with which to assess merit; it’s all based on popular opinion or the dictates of authorities. Much of the dissatisfaction with the so-called canon is based on this type of thinking. If we collectively decide some work of literature is really good and worth celebrating, the reasoning goes, then it magically becomes really good and worth celebrating. There’s an undeniable kernel of truth to this—and there’s really no reason to object to the idea that one of the things that makes a work of art attention-worthy is that a lot of people are attending to it. Art serves a social function after all; part of the fun comes from sharing the experience and having conversations about it. But I personally can’t credit the absolutist version of social constructivism. I don’t think you’re anything but a literary tourist until you can make a convincing case for why a few classics don’t deserve the distinction—even though I acknowledge that any such case will probably be based largely on the ideas of other people.

            The research on the experiencing versus the remembering self also suggests a couple of criteria we can apply to our assessments of stories so that they’re more meaningful to people who haven’t been initiated into the society and culture of highbrow literature. Too often, the classics are dismissed as works only English majors can appreciate. And too often, they’re written in a way that justifies that dismissal. One criterion should be based on how well the book satisfies the experiencing self: I propose that a story should be considered good insofar as it induces a state of absorption. You forget yourself and become completely immersed in the plot. Mihaly Csikszentmihalyi calls this state flow, and has found that the more time people spend in it the happier and more fulfilled they tend to be. But the total time a reader or viewer spends in a state of flow will likely be neglected if the plot never reaches a peak of intensity, or if it ends on a note of tedium. So the second criterion should be how memorable the story is. Assessments based on either of these criteria are of course inevitably vulnerable to social proof and idiosyncratic factors of the individual audience member (whether I find Swann’s Way tedious or absorbing depends on how much sleep and caffeine I’ve had). And yet knowing what the effects are that make for a good aesthetic experience, in real time and in our memories, can help us avoid the trap of merely academic considerations. And knowing that our opinions will always be somewhat contaminated by outside influences shouldn’t keep us from trying to be objective any more than knowing that surgical theaters can never be perfectly sanitized should keep doctors from insisting they be as well scrubbed and scoured as possible.

Also read:

LET'S PLAY KILL YOUR BROTHER: FICTION AS A MORAL DILEMMA GAME

And:

TOO PSYCHED FOR SHERLOCK: A REVIEW OF MARIA KONNIKOVA’S “MASTERMIND: HOW TO THINK LIKE SHERLOCK HOLMES”—WITH SOME THOUGHTS ON SCIENCE EDUCATION

And:

HOW VIOLENT FICTION WORKS: ROHAN WILSON’S “THE ROVING PARTY” AND JAMES WOOD’S SANGUINARY SUBLIME FROM CONRAD TO MCCARTHY


Why Shakespeare Nauseated Darwin: A Review of Keith Oatley's "Such Stuff as Dreams"

Does practicing science rob one of humanity? Why is it that, if reading fiction trains us to take the perspective of others, English departments are rife with pettiness and selfishness? Keith Oatley is trying to make the study of literature more scientific, and he provides hints to these riddles and many others in his book “Such Stuff as Dreams.”

Late in his life, Charles Darwin lost his taste for music and poetry. “My mind seems to have become a kind of machine for grinding general laws out of large collections of facts,” he laments in his autobiography, and for many of us the temptation to place all men and women of science into a category of individuals whose minds resemble machines more than living and emotionally attuned organs of feeling and perceiving is overwhelming. In the 21st century, we even have a convenient psychiatric diagnosis for people of this sort. Don’t we just assume Sheldon in The Big Bang Theory has autism, or at least the milder version of it known as Asperger’s? It’s probably even safe to assume the show’s writers had the diagnostic criteria for the disorder in mind when they first developed his character. Likewise, Dr. Watson in the BBC’s new and obscenely entertaining Sherlock series can’t resist a reference to the quintessential evidence-crunching genius’s own supposed Asperger’s.

In Darwin’s case, however, the move away from the arts couldn’t have been due to any congenital deficiency in his finer human sentiments because it occurred only in adulthood. He writes,

I have said that in one respect my mind has changed during the last twenty or thirty years. Up to the age of thirty, or beyond it, poetry of many kinds, such as the works of Milton, Gray, Byron, Wordsworth, Coleridge, and Shelley, gave me great pleasure, and even as a schoolboy I took intense delight in Shakespeare, especially in the historical plays. I have also said that formerly pictures gave me considerable, and music very great delight. But now for many years I cannot endure to read a line of poetry: I have tried lately to read Shakespeare, and found it so intolerably dull that it nauseated me. I have also almost lost my taste for pictures or music. Music generally sets me thinking too energetically on what I have been at work on, instead of giving me pleasure.

We could interpret Darwin here as suggesting that casting his mind too doggedly into his scientific work somehow ruined his capacity to appreciate Shakespeare. But, like all thinkers and writers of great nuance and sophistication, his ideas are easy to mischaracterize through selective quotation (or, if you’re Ben Stein or any of the other unscrupulous writers behind creationist propaganda like the pseudo-documentary Expelled, you can just lie about what he actually wrote).

One of the most charming things about Darwin is that his writing is often more exploratory than merely informative. He writes in search of answers he has yet to discover. In a wider context, the quote about his mind becoming a machine, for instance, reads,

This curious and lamentable loss of the higher aesthetic tastes is all the odder, as books on history, biographies, and travels (independently of any scientific facts which they may contain), and essays on all sorts of subjects interest me as much as ever they did. My mind seems to have become a kind of machine for grinding general laws out of large collections of facts, but why this should have caused the atrophy of that part of the brain alone, on which the higher tastes depend, I cannot conceive. A man with a mind more highly organised or better constituted than mine, would not, I suppose, have thus suffered; and if I had to live my life again, I would have made a rule to read some poetry and listen to some music at least once every week; for perhaps the parts of my brain now atrophied would thus have been kept active through use. The loss of these tastes is a loss of happiness, and may possibly be injurious to the intellect, and more probably to the moral character, by enfeebling the emotional part of our nature.

His concern for his lost aestheticism notwithstanding, Darwin’s humanism, his humanity, radiates in his writing with a warmth that belies any claim about thinking like a machine, just as the intelligence that shows through it gainsays his humble deprecations about the organization of his mind.

           In this excerpt, Darwin, perhaps inadvertently, even manages to put forth a theory of the function of art. Somehow, poetry and music not only give us pleasure and make us happy—enjoying them actually constitutes a type of mental exercise that strengthens our intellect, our emotional awareness, and even our moral character. Novelist and cognitive psychologist Keith Oatley explores this idea of human betterment through aesthetic experience in his book Such Stuff as Dreams: The Psychology of Fiction. This subtitle is notably underwhelming given the long history of psychoanalytic theorizing about the meaning and role of literature. However, whereas psychoanalysis has fallen into disrepute among scientists because of its multiple empirical failures and a general methodological hubris common among its practitioners, the work of Oatley and his team at the University of Toronto relies on much more modest, and at the same time much more sophisticated, scientific protocols. One of the tools these researchers use, The Reading the Mind in the Eyes Test, was in fact first developed to research our new category of people with machine-like minds. What the researchers find bolsters Darwin’s impression that art, at least literary art, functions as a kind of exercise for our faculty of understanding and relating to others.

           Reasoning that “fiction is a kind of simulation of selves and their vicissitudes in the social world” (159), Oatley and his colleague Raymond Mar hypothesized that people who spent more time trying to understand fictional characters would be better at recognizing and reasoning about other, real-world people’s states of mind. So they devised a test to assess how much fiction participants in their study read based on how well they could categorize a long list of names according to which ones belonged to authors of fiction, which to authors of nonfiction, and which to non-authors. They then had participants take the Mind-in-the-Eyes Test, which consists of matching close-up pictures of peoples’ eyes with terms describing their emotional state at the time they were taken. The researchers also had participants take the Interpersonal Perception Test, which has them answer questions about the relationships of people in short video clips featuring social interactions. An example question might be “Which of the two children, or both, or neither, are offspring of the two adults in the clip?”  (Imagine Sherlock Holmes taking this test.) As hypothesized, Oatley writes, “We found that the more fiction people read, the better they were at the Mind-in-the-Eyes Test. A similar relationship held, though less strongly, for reading fiction and the Interpersonal Perception Test” (159).

            One major shortcoming of this study is that it fails to establish causality; people who are naturally better at reading emotions and making sound inferences about social interactions may simply gravitate to fiction in the first place. So Mar set up an experiment in which he had participants read either a nonfiction article from an issue of the New Yorker or a work of short fiction chosen to be the same length and to require the same level of reading skill. When the two groups then took a test of social reasoning, the ones who had read the short story outperformed the control group. Both groups also took a test of analytic reasoning as a further control; on this variable there was no difference in performance between the groups. The outcome of this experiment, Oatley stresses, shouldn’t be interpreted as evidence that reading one story will increase your social skills in any meaningful and lasting way. But reading habits established over long periods likely explain the more significant differences between individuals found in the earlier study. As Oatley explains,

Readers of fiction tend to become more expert at making models of others and themselves, and at navigating the social world, and readers of non-fiction are likely to become more expert at genetics, or cookery, or environmental studies, or whatever they spend their time reading. Raymond Mar’s experimental study on reading pieces from the New Yorker is probably best explained by priming. Reading a fictional piece puts people into a frame of mind of thinking about the social world, and this is probably why they did better at the test of social reasoning. (160)

Connecting these findings to real-world outcomes, Oatley and his team also found that “reading fiction was not associated with loneliness,” as the stereotype suggests, “but was associated with what psychologists call high social support, being in a circle of people whom participants saw a lot, and who were available to them practically and emotionally” (160).

            These studies by the University of Toronto team have received wide publicity, but the people who should be most interested in them have little or no idea how to go about making sense of them. Most people simply either read fiction or they don’t. If you happen to be of the tribe that studies fiction, then you were probably educated in a way that engendered mixed feelings—profound confusion, really—about science and how it works. In his review of The Storytelling Animal, a book in which Jonathan Gottschall incorporates the Toronto team’s findings into the theory that narrative serves the adaptive function of making human social groups more cooperative and cohesive, Adam Gopnik sneers,

Surely if there were any truth in the notion that reading fiction greatly increased our capacity for empathy then college English departments, which have by far the densest concentration of fiction readers in human history, would be legendary for their absence of back-stabbing, competitive ill-will, factional rage, and egocentric self-promoters; they’d be the one place where disputes are most often quickly and amiably resolved by mutual empathetic engagement. It is rare to see a thesis actually falsified as it is being articulated.

Oatley himself is well aware of the strange case of university English departments. He cites a report by Willie van Peer on a small study comparing students in the natural sciences to students in the humanities. Oatley explains,

There was considerable scatter, but on average the science students had higher emotional intelligence than the humanities students, the opposite of what was expected; van Peer indicts teaching in the humanities for often turning people away from human understanding towards technical analyses of details. (160)

Oatley suggests in a footnote that an earlier study corroborates van Peer’s indictment. It found that high school students who showed more emotional involvement with short stories—the type of connection that would engender greater empathy—did proportionally worse on standard academic assessments of English proficiency. The clear implication of these findings is that the way literature is taught in universities and high schools is long overdue for an in-depth critical analysis.

            The idea that literature has the power to make us better people is not new; indeed, it was the very idea on which the humanities were originally founded. We have to wonder what people like Gopnik believe the point of celebrating literature is if not to foster greater understanding and empathy. If you either enjoy it or you don’t, and it has no beneficial effects on individuals or on society in general, why bother encouraging anyone to read? Why bother writing essays about it in the New Yorker? Tellingly, many scholars in the humanities began doubting the power of art to inspire greater humanity around the same time they began questioning the value and promise of scientific progress. Oatley writes,

Part of the devastation of World War II was the failure of German citizens, one of the world’s most highly educated populations, to prevent their nation’s slide into Nazism. George Steiner has famously asserted: “We know that a man can read Goethe or Rilke in the evening, that he can play Bach and Schubert, and go to his day’s work at Auschwitz in the morning.” (164)

Postwar literary theory and criticism have, perversely, tended toward the view that literature and language in general serve as vessels for passing on all the evils inherent in our western, patriarchal, racist, imperialist culture. The purpose of literary analysis then becomes to sift out these elements and resist them. Unfortunately, such accusatory theories leave unanswered the question of why, if literature inculcates oppressive ideologies, we should bother reading it at all. As van Peer muses in the report Oatley cites, “The Inhumanity of the Humanities,”

Consider the ills flowing from postmodern approaches, the “posthuman”: this usually involves the hegemony of “race/class/gender” in which literary texts are treated with suspicion. Here is a major source of that loss of emotional connection between student and literature. How can one expect a certain humanity to grow in students if they are continuously instructed to distrust authors and texts? (8)

           Oatley and van Peer point out, moreover, that the evidence for concentration camp workers having any degree of literary or aesthetic sophistication is nonexistent. According to the best available evidence, most of the greatest atrocities were committed by soldiers who had never finished high school. The suggestion that some type of cozy relationship existed between Nazism and an enthusiasm for Goethe runs afoul of recorded history. As Oatley points out,

Apart from propensity to violence, nationalism, and anti-Semitism, Nazism was marked by hostility to humanitarian values in education. From 1933 onwards, the Nazis replaced the idea of self-betterment through education and reading by practices designed to induce as many as possible into willing conformity, and to coerce the unwilling remainder by justified fear. (165)

Oatley also cites the work of historian Lynn Hunt, whose book Inventing Human Rights traces the original social movement for the recognition of universal human rights to the mid-1700s, when what we recognize today as novels were first being written. Other scholars, like Steven Pinker, have pointed out too that, while it’s hard not to dwell on tragedies like the Holocaust, even atrocities of that magnitude are resoundingly overmatched by the much larger post-Enlightenment trend toward peace, freedom, and the wider recognition of human rights. It’s sad that one of the lasting legacies of the great catastrophes of the twentieth century is a tradition of humanities scholarship in which the people who are supposed to be the custodians of our literary heritage are hell-bent on teaching us all the ways that literature makes us evil.

             Because Oatley is a central figure in what we can only hope is a movement to end the current reign of self-righteous insanity in literary studies, it pains me not to be able to recommend Such Stuff as Dreams to anyone but dedicated specialists. Oatley writes in the preface that he has “imagined the book as having some of the qualities of fiction. That is to say I have designed it to have a narrative flow” (x), and it may simply be that this suggestion set my expectations too high. But the book is poorly edited, the prose is bland and often rolls over itself into graceless tangles, and a couple of the chapters seem like little more than haphazardly collated reports of studies and theories, none exactly off-topic, none completely without interest, but all lacking any central progression or theme. The book often reads more like an annotated bibliography than a story. Oatley’s scholarly range is impressive, however, bearing not just on cognitive science and literature through the centuries but extending as well to the work of important literary theorists. The book is never unreadable, never opaque, but it’s not exactly a work of art in its own right.

             Insofar as Such Stuff as Dreams is organized around a central idea, it is that fiction ought to be thought of not as “a direct impression of life,” as Henry James suggests in his famous essay “The Art of Fiction,” and as many contemporary critics—notably James Wood—seem to think of it. Rather, Oatley agrees with Robert Louis Stevenson’s response to James’s essay, “A Humble Remonstrance,” in which he writes that

Life is monstrous, infinite, illogical, abrupt and poignant; a work of art in comparison is neat, finite, self-contained, rational, flowing, and emasculate. Life imposes by brute energy, like inarticulate thunder; art catches the ear, among the far louder noises of experience, like an air artificially made by a discreet musician. (qtd. in Oatley 8)

Oatley theorizes that stories are simulations, much like dreams, that go beyond mere reflections of life to highlight through defamiliarization particular aspects of life, to cast them in a new light so as to deepen our understanding and experience of them. He writes,

Every true artistic expression, I think, is not just about the surface of things. It always has some aspect of the abstract. The issue is whether, by a change of perspective or by a making the familiar strange, by means of an artistically depicted world, we can see our everyday world in a deeper way. (15)

Critics of high-brow literature like Wood appreciate defamiliarization at the level of description; Oatley is suggesting here though that the story as a whole functions as a “metaphor-in-the-large” (17), a way of not just making us experience as strange some object or isolated feeling, but of reconceptualizing entire relationships, careers, encounters, biographies—what we recognize in fiction as plots. This is an important insight, and it topples verisimilitude from its ascendant position atop the hierarchy of literary values while rendering complaints about clichéd plots potentially moot. Didn’t Shakespeare recycle plots after all?

            The theory of fiction as a type of simulation to improve social skills and possibly to facilitate group cooperation is emerging as the frontrunner in attempts to explain narrative interest in the context of human evolution. It is to date, however, impossible to rule out the possibility that our interest in stories is not directly adaptive but instead emerges as a byproduct of other traits that confer more immediate biological advantages. The finding that readers track actions in stories with the same brain regions that activate when they witness similar actions in reality, or when they engage in them themselves, is important support for the simulation theory. But the function of mirror neurons isn’t well enough understood yet for us to determine from this study how much engagement with fictional stories depends on the reader's identifying with the protagonist. Oatley’s theory is more consonant with direct and straightforward identification. He writes,

A very basic emotional process engages the reader with plans and fortunes of a protagonist. This is what often drives the plot and, perhaps, keeps us turning the pages, or keeps us in our seat at the movies or at the theater. It can be enjoyable. In art we experience the emotion, but with it the possibility of something else, too. The way we see the world can change, and we ourselves can change. Art is not simply taking a ride on preoccupations and prejudices, using a schema that runs as usual. Art enables us to experience some emotions in contexts that we would not ordinarily encounter, and to think of ourselves in ways that usually we do not. (118)

Much of this change, Oatley suggests, comes from realizing that we too are capable of behaving in ways that we might not like. “I am capable of this too: selfishness, lack of sympathy” (193), is what he believes we think in response to witnessing good characters behave badly.

             Oatley’s theory has a lot to recommend it, but William Flesch’s theory of narrative interest, which suggests we don’t identify with fictional characters directly but rather track them and anxiously hope for them to get whatever we feel they deserve, seems much more plausible in the context of our response to protagonists behaving in surprisingly selfish or antisocial ways. When I see Ed Norton as Tyler Durden beating Angel Face half to death in Fight Club, for instance, I don’t think, hey, that’s me smashing that poor guy’s face with my fists. Instead, I think, what the hell are you doing? I had you pegged as a good guy. I know you’re trying not to be as much of a pushover as you used to be, but this is getting scary. I’m anxious that Angel Face not be too badly hurt—partly because I imagine that would be devastating to Tyler. And I’m anxious lest this incident be a harbinger of worse behavior to come.

             The issue of identification is just one of several interesting questions that lend themselves to further research. Oatley and Mar’s studies are not enormous in terms of sample size, and their subjects were mostly young college students. What types of fiction work best to foster empathy? What types of reading strategies might we encourage students to apply to literature—apart from trying to remove obstacles to emotional connections with characters? But, aside from the Big-Bad-Western-Empire myth that currently has humanities scholars grooming successive generations of deluded ideologues to be little more than culture vultures presiding over the creation and celebration of Loser Lit, the other main challenge to transplanting literary theory onto firmer empirical ground is the assumption that the arts in general, and literature in particular, demand a wholly different type of thinking to create and appreciate than the type that goes into the intricate mechanics and intensely disciplined practices of science.

As Oatley and the Toronto team have shown, people who enjoy fiction tend to have the opposite of autism. And people who do science are, well, Sheldon. Interestingly, though, the writers of The Big Bang Theory, for whatever reason, included some contraindications for a diagnosis of autism or Asperger’s in Sheldon’s character. Like the other scientists in the show, he’s obsessed with comic books, which require at least some understanding of facial expression and body language to follow. As Simon Baron-Cohen, the autism researcher who designed the Mind-in-the-Eyes test, explains, “Autism is an empathy disorder: those with autism have major difficulties in 'mindreading' or putting themselves into someone else’s shoes, imagining the world through someone else’s feelings” (137). Baron-Cohen has coined the term “mindblindness” to describe the central feature of the disorder, and many have posited that the underlying cause is abnormal development of the brain regions devoted to perspective taking and understanding others, what cognitive psychologists refer to as our Theory of Mind.

            To follow comic book plotlines, Sheldon would have to make ample use of his own Theory of Mind. He’s also given to absorption in various science fiction shows on TV. If he were only interested in futuristic gadgets, as an autistic would be, he could just as easily get more scientifically plausible versions of them in any number of nonfiction venues. By Baron-Cohen’s definition, Sherlock Holmes can’t possibly have Asperger’s either because his ability to get into other people’s heads is vastly superior to pretty much everyone else’s. As he explains in “The Musgrave Ritual,”

You know my methods in such cases, Watson: I put myself in the man’s place, and having first gauged his intelligence, I try to imagine how I should myself have proceeded under the same circumstances.

             What about Darwin, though, that demigod of science who openly professed to being nauseated by Shakespeare? Isn’t he a prime candidate for entry into the surprisingly unpopulated ranks of heartless, data-crunching scientists whose thinking lends itself so conveniently to cooptation by oppressors and committers of wartime atrocities? It turns out that though Darwin held many of the same racist views as nearly all educated men of his time, his ability to empathize across racial and class divides was extraordinary. Darwin was himself no Social Darwinist—that doctrine was devised by Herbert Spencer to justify inequality, and it has currency still today among political conservatives. And Darwin was also a passionate abolitionist, as is clear in the following excerpts from The Voyage of the Beagle:

On the 19th of August we finally left the shores of Brazil. I thank God, I shall never again visit a slave-country. To this day, if I hear a distant scream, it recalls with painful vividness my feelings, when passing a house near Pernambuco, I heard the most pitiable moans, and could not but suspect that some poor slave was being tortured, yet knew that I was as powerless as a child even to remonstrate.

Darwin is responding to cruelty in a way no one around him at the time would have. And note how deeply it pains him, how profound and keenly felt his sympathy is.

I was present when a kind-hearted man was on the point of separating forever the men, women, and little children of a large number of families who had long lived together. I will not even allude to the many heart-sickening atrocities which I authentically heard of;—nor would I have mentioned the above revolting details, had I not met with several people, so blinded by the constitutional gaiety of the negro as to speak of slavery as a tolerable evil.

            The question arises, not whether Darwin had sacrificed his humanity to science, but why he had so much more humanity than many other intellectuals of his day.

It is often attempted to palliate slavery by comparing the state of slaves with our poorer countrymen: if the misery of our poor be caused not by the laws of nature, but by our institutions, great is our sin; but how this bears on slavery, I cannot see; as well might the use of the thumb-screw be defended in one land, by showing that men in another land suffered from some dreadful disease.

And finally we come to the matter of Darwin’s Theory of Mind, which was quite clearly in no way deficient.

Those who look tenderly at the slave owner, and with a cold heart at the slave, never seem to put themselves into the position of the latter;—what a cheerless prospect, with not even a hope of change! picture to yourself the chance, ever hanging over you, of your wife and your little children—those objects which nature urges even the slave to call his own—being torn from you and sold like beasts to the first bidder! And these deeds are done and palliated by men who profess to love their neighbours as themselves, who believe in God, and pray that His Will be done on earth! It makes one's blood boil, yet heart tremble, to think that we Englishmen and our American descendants, with their boastful cry of liberty, have been and are so guilty; but it is a consolation to reflect, that we at least have made a greater sacrifice than ever made by any nation, to expiate our sin. (530-31)

             I suspect that Darwin’s distaste for Shakespeare was born of oversensitivity. He doesn’t say music failed to move him; he stopped enjoying it because it set him thinking “too energetically.” And as aesthetically pleasing as Shakespeare is, existentially speaking his plays tend to be pretty harsh, even the comedies. When Prospero says, “We are such stuff / as dreams are made on” in Act 4 of The Tempest, he’s actually talking not about characters in stories, but about how ephemeral and insignificant real human lives are. But why, beyond some likely nudge from his inherited temperament, was Darwin so sensitive? Why was he so empathetic even to those so vastly different from him? After admitting he’d lost his taste for Shakespeare, paintings, and music, he goes on to say,

On the other hand, novels which are works of the imagination, though not of a very high order, have been for years a wonderful relief and pleasure to me, and I often bless all novelists. A surprising number have been read aloud to me, and I like all if moderately good, and if they do not end unhappily—against which a law ought to be passed. A novel, according to my taste, does not come into the first class unless it contains some person whom one can thoroughly love, and if a pretty woman all the better.

Also read:

STORIES, SOCIAL PROOF, & OUR TWO SELVES

LET'S PLAY KILL YOUR BROTHER: FICTION AS A MORAL DILEMMA GAME

HOW VIOLENT FICTION WORKS: ROHAN WILSON’S “THE ROVING PARTY” AND JAMES WOOD’S SANGUINARY SUBLIME FROM CONRAD TO MCCARTHY

[Check out the Toronto group's blog at onfiction.ca]

Dennis Junk

How to be Interesting: Dead Poets and Causal Inferences

Henry James famously wrote, “The only obligation to which in advance we may hold a novel without incurring the accusation of being arbitrary, is that it be interesting.” But how does one be interesting? The answer probably lies in staying one step ahead of your readers, but every reader moves at their own pace.

No one writes a novel without ever having read one. Though storytelling comes naturally to us as humans, our appreciation of the lengthy, intricately rendered narratives we find spanning the hundreds of pages between book covers is contingent on a long history of crucial developments—literacy, for instance. In the case of an individual reader, the faithfulness with which ontogeny recapitulates phylogeny will largely determine the level of interest taken in any given work of fiction. In other words, to appreciate a work, it is necessary to have some knowledge of the literary tradition to which it belongs. T.S. Eliot’s famous 1919 essay “Tradition and the Individual Talent” eulogizes great writers as breathing embodiments of the entire history of their art. “The poet must be very conscious of the main current,” Eliot writes,

which does not at all flow invariably through the most distinguished reputations. He must be quite aware of the obvious fact that art never improves, but that the material of art is never quite the same. He must be aware that the mind of Europe—the mind of his own country—a mind which he learns in time to be much more important than his own private mind—is a mind which changes, and that this change is a development which abandons nothing en route, which does not superannuate either Shakespeare, or Homer, or the rock of the Magdalenian draughtsmen.

Though Eliot probably didn’t mean to suggest that to write a good poem or novel you have to have thoroughly mastered every word of world literature, a condition that would’ve excluded most efforts even at the time he wrote the essay, he did believe that to fully understand a work you have to be able to place it in its proper historical context. “No poet,” he wrote,

no artist of any art, has his complete meaning alone. His significance, his appreciation is the appreciation of his relation to the dead poets and artists. You cannot value him alone; you must set him, for contrast and comparison, among the dead.

If this formulation of what goes into the appreciation of art is valid, then as time passes and historical precedents accumulate, the burden of knowledge that must be shouldered to sustain adequate interest in works in the tradition grows constantly heavier. Accordingly, the number of people who can manage it grows constantly smaller.

But what if there is something like a threshold awareness of literary tradition—or even of current literary convention—beyond which the past ceases to be the most important factor influencing your appreciation for a particular work? Once your reading comprehension is up to snuff and you’ve learned how to deal with some basic strategies of perspective—first person, third person omniscient, etc.—then you’re free to interpret stories not merely as representative of some tradition but as representations of potentially real people and events, reflective of some theme that has real meaning in most people’s lives. Far from seeing the poet or novelist as a vessel for artistic tradition, Henry James suggests in his 1884 essay “The Art of Fiction” that

The only obligation to which in advance we may hold a novel without incurring the accusation of being arbitrary, is that it be interesting. That general responsibility rests upon it, but it is the only one I can think of. The ways in which it is at liberty to accomplish this result (of interesting us) strike me as innumerable and such as can only suffer from being marked out, or fenced in, by prescription. They are as various as the temperament of man, and they are successful in proportion as they reveal a particular mind, different from others. A novel is in its broadest definition a personal impression of life; that, to begin with, constitutes its value, which is greater or less according to the intensity of the impression.

Writing for dead poets the way Eliot suggests may lead to works that are historically interesting. But a novel whose primary purpose is to represent, say, Homer’s Odyssey in some abstract way, a novel which, in other words, takes a piece of literature as its subject matter rather than some aspect of life as it is lived by humans, will likely only ever be interesting to academics. This isn’t to say that writers of the past ought to be ignored; rather, their continuing relevance is likely attributable to their works’ success in being interesting. So when you read Homer you shouldn’t be wondering how you might artistically reconceptualize his epics—you should be attending to the techniques that make them interesting and wondering how you might apply them in your own work, which strives to artistically represent some aspect of life. You go to past art for technical or thematic inspiration, not for traditions with which to carry on some dynamic exchange.

Representation should, as a rule of thumb, take priority over tradition. And to insist, as Eliot does, as an obvious fact or otherwise, that artistic techniques never improve is to admit defeat before taking on the challenge. But this leaves us with the question of how, beyond a devotion to faithful representations of recognizably living details, one manages to be interesting. Things tend to interest us when they’re novel or surprising. That babies direct their attention to incidents which go against their expectations is what allows us to examine what those expectations are. Babies, like their older counterparts, stare longer at bizarre occurrences. If a story consisted of nothing but surprising incidents, however, we would probably lose interest in it pretty quickly because it would strike us as chaotic and incoherent. Citing research showing that surprise is necessary but not sufficient for securing the interest of readers, Sung-Il Kim, a psychologist at Korea University, explains that whatever incongruity causes the surprise must somehow be resolved. In other words, the surprise has to make sense in the shifted context.

In Aspects of the Novel, E.M. Forster makes his famous distinction between flat and round characters with reference to the latter’s ability to surprise readers. He notes however that surprise is only half the formula, since a character who only surprises would seem merely erratic—or would seem like something other than a real person. He writes,

The test of a round character is whether it is capable of surprising in a convincing way. If it never surprises, it is flat. If it does not convince, it is a flat pretending to be round. It has the incalculability of life about it—life within the pages of a book. And by using it sometimes alone, more often in combination with the other kind, the novelist achieves his task of acclimatization and harmonizes the human race with the other aspects of his work. (78)

Kim discovered that this same dynamic is at play even in the basic unit of a single described event, suggesting that the convincing surprise is important for all aspects of the story, not just character. He went on to test the theory that what lies at the heart of our interest in these seeming incongruities that are in time resolved is our tendency to anticipate the resolution. When a brief description involves some element that must be inferred, it is considered more interesting, and it proves more memorable, than when the same incident is described in full detail without any demand for inference. However, when researchers rudely distract readers in experiments, keeping them from being able to infer, the differences in recall and reported interest vanish.

Kim proposes a “causal bridging inference” theory to explain what makes a story interesting. If there aren’t enough inferences to be made, the story seems boring and banal. But if there are too many then the reader gets overwhelmed and spaces out. “Whether inferences are drawn or not,” Kim writes,

depends on two factors: the amount of background knowledge a reader possesses and the structure of a story… In a real life situation, for example, people are interested in new scientific theories, new fashion styles, or new leading-edge products only when they have an adequate amount of background knowledge on the domain to fill the gap between the old and the new… When a story contains such detailed information that there is no gap to fill in, a reader does not need to generate inferences. In this case, the story would not be interesting even if the reader possessed a great deal of background knowledge. (69)

One old-fashioned and intuitive way of thinking about causal bridging inference theory is to see the task of a writer as keeping one or two steps ahead of the reader. If the story runs ahead by more than a few steps it risks being too difficult to follow, and the reader gets lost. If it falls behind, it drags, like the boor who relishes the limelight and so stretches out his anecdotes with excruciatingly superfluous detail.

For a writer, the takeaway is that you want to shock and surprise your readers, which means making your story take unexpected, incongruous turns, but you should also seed the narrative with what in hindsight can be seen as hints to what’s to come so that the surprises never seem random or arbitrary—and so that the reader is trained to seek out further clues to make further inferences. This is what Forster meant when he said characters should change in ways that are both surprising and convincing. It’s perhaps a greater achievement to have character development, plot, and language integrated so that an inevitable surprise in one of these areas has implications for or bleeds into both the others. But we as readers can enjoy on its own an unlikely observation or surprising analogy that we discover upon reflection to be fitting. And of course we can enjoy a good plot twist in isolation too—witness Hollywood and genre fiction.

Naturally, some readers can be counted on to be better at making inferences than others. As Kim points out, this greater ability may be based on a broader knowledge base; if the author makes an allusion, for instance, it helps to know about the subject being alluded to. It can also be based on comprehension skills, awareness of genre conventions, understanding of the physical or psychological forces at play in the plot, and so on. The implication is that keeping those crucial two steps ahead, no more, no less, means targeting readers who are just about as good at making inferences as you are and working hard through inspiration, planning, and revision to maintain your lead. If you’re erudite and agile of mind, you’re going to bore yourself trying to write for those significantly less so—and those significantly less so are going to find what is keenly stimulating for you to write all but impossible to comprehend.

Interestingness is also influenced by fundamental properties of stories like subject matter—Percy Fawcett explores the Amazon in search of the lost City of Z is more interesting than Margaret goes grocery shopping—and the personality traits of characters that influence the degree to which we sympathize with them. But technical virtuosity often supersedes things like topic and character. A great writer can write about a boring character in an interesting way. The benefit in interest won through mastery of technique, however, will only be appreciated by those capable of inferring the meaning hinted at by the narration, those able to make the proper conceptual readjustments to accommodate surprising shifts in context and meaning. When mixed martial arts first became popular, for instance, audiences roared over knockouts and body slams, and yawned over everything else. But as Joe Rogan has remarked from ringside at events over the past few years, fans have become so sophisticated that they cheer when one fighter passes the other’s guard.

What this means is that no matter how steadfast your devotion to representation, assuming your skills continually develop, there will be a point of diminishing returns, a point where improving as a writer will mean your work has greater interest but to a shrinking audience. My favorite illustration of this dilemma is Steven Millhauser’s parable “In the Reign of Harad IV,” in which “a maker of miniatures” carves and sculpts tiny representations of a king’s favorite possessions. Over time, though, the miniaturist ceases to care about any praise he receives from the king or anyone else at court and begins working to satisfy an inner “stirring of restlessness.” His creations become smaller and smaller, necessitating ever more powerful magnification to appreciate. No matter how infinitesimal he manages to make his miniatures, upon completion of each work he seeks “a farther kingdom.” It’s one of the most interesting short stories I’ve read in a while.

[Diagram: Causal Bridging Inference Model]

Dennis Junk

What's the Point of Difficult Reading?

For every John Galt, Tony Robbins, and Scheherazade, we may need at least half a Proust. We are still, however, left with quite a dilemma. Some authors really are just assholes who write worthless tomes designed to trick you into wasting your time. But some books that seem impenetrable on the first attempt will reward your efforts to decipher them.

You sit reading the first dozen or so pages of some celebrated classic and gradually realize that having to sort out how the ends of the long sentences fix to their beginnings is taking just enough effort to distract you entirely from the setting or character you’re supposed to be getting to know. After a handful of words you swear are made up and a few tangled metaphors you find yourself riddling over with nary a resolution, the dread sinks in. Is the whole book going to be like this? Is it going to be one of those deals where you get to what’s clearly meant to be a crucial turning point in the plot but for you is just another riddle without a solution, sending you paging back through the forest of verbiage in search of some key succession of paragraphs you spaced out while reading the first time through? Then you wonder if you’re missing some other kind of key, like maybe the story’s an allegory, a reference to some historical event like World War II or some Revolution you once had to learn about but have since lost all recollection of. Maybe the insoluble similes are allusions to some other work you haven’t read or can’t recall. In any case, you’re not getting anything out of this celebrated classic but frustration leading to the dual suspicion that you’re too ignorant or stupid to enjoy great literature and that the whole “great literature” thing is just a conspiracy to trick us into feeling dumb so we’ll defer to the pseudo-wisdom of Ivory Tower elites.

If enough people of sufficient status get together and agree to extol a work of fiction, they can get almost everyone else to agree. The readers who get nothing out of it but frustration and boredom assume that since their professors or some critic in a fancy-pants magazine or the judges of some literary award committee think it’s great they must simply be missing something. They dutifully continue reading it, parrot a few points from a review that sound clever, and afterward toe the line by agreeing that it is indeed a great work of literature, clearly, even if it doesn’t speak to them personally. For instance, James Joyce’s Ulysses, utterly nonsensical to anyone without at least a master’s degree, tops the Modern Library’s list of 100 best novels in the English language. Responding to the urging of his friends to write out an explanation of the novel, Joyce scoffed, boasting,

I’ve put in so many enigmas and puzzles that it will keep the professors busy for centuries arguing over what I meant, and that’s the only way of ensuring one’s immortality.

He was right. To this day, professors continue to love him even as Ulysses and the even greater monstrosity Finnegans Wake do nothing but bore and befuddle everyone else—or else, more fittingly, sit inert or unchecked-out on the shelf, gathering well-deserved dust.

Joyce’s later novels are not literature; they are lengthy collections of loosely connected literary puzzles. But at least his puzzles have actual solutions—or so I’m told. Ulysses represents the apotheosis of the tradition in literature called modernism. What came next, postmodernism, is even more disconnected from the universal human passion for narrative. Even professors aren’t sure what to do with it, so they simply throw their hands up, say it’s great, and explain that the source of its greatness is its very resistance to explanation. Jonathan Franzen, whose 2001 novel The Corrections represented a major departure from the postmodernism he began his career experimenting with, explained the following year in The New Yorker how he’d turned away from the tradition. He’d been reading the work of William Gaddis “as a kind of penance” (101) and not getting any meaning out of it. Of the final piece in the celebrated author’s oeuvre, Franzen writes,

The novel is an example of the particular corrosiveness of literary postmodernism. Gaddis began his career with a Modernist epic about the forgery of masterpieces. He ended it with a pomo romp that superficially resembles a masterpiece but punishes the reader who tries to stay with it and follow its logic. When the reader finally says, Hey, wait a minute, this is a mess, not a masterpiece, the book instantly morphs into a performance-art prop: its fraudulence is the whole point! And the reader is out twenty hours of good-faith effort. (111)

In other words, reading postmodern fiction means not only forgoing the rewards of narratives, having them replaced by the more taxing endeavor of solving multiple riddles in succession, but those riddles don’t even have answers. What’s the point of reading this crap? Exactly. Get it?

You can dig deeper into the meaningless meanderings of pomos and discover there is in fact an ideology inspiring all the infuriating inanity. The super smart people who write and read this stuff point to the willing, eager complicity of the common reader in the propagation of all the lies that sustain our atrociously unjust society (but atrociously unjust compared to what?). Franzen refers to this as the Fallacy of the Stupid Reader,

wherein difficulty is a “strategy” to protect art from cooptation and the purpose of art is to “upset” or “compel” or “challenge” or “subvert” or “scar” the unsuspecting reader; as if the writer’s audience somehow consisted, again and again, of Charlie Browns running to kick Lucy’s football; as if it were a virtue in a novelist to be the kind of boor who propagandizes at friendly social gatherings. (109)

But if the author is worried about art becoming a commodity, does making the art shitty really amount to a solution? And if the goal is to make readers rethink something they take for granted, why not bring the matter up directly, or have a character wrestle with it, or have a character argue with another character about it? The sad fact is that these authors probably just suck, that, as Franzen suspects, “literary difficulty can operate as a smoke screen for an author who has nothing interesting, wise, or entertaining to say” (111).

Not all difficulty in fiction is a smoke screen though. Not all the literary emperors are naked. Franzen writes that “there is no headache like the headache you get from working harder on deciphering a text than the author, by all appearances, has worked on assembling it.” But the essay, titled “Mr. Difficult,” begins with a reader complaint sent not to Gaddis but to Franzen himself. And the reader, a Mrs. M. from Maryland, really gives him the business:

Who is it that you are writing for? It surely could not be the average person who enjoys a good read… The elite of New York, the elite who are beautiful, thin, anorexic, neurotic, sophisticated, don’t smoke, have abortions tri-yearly, are antiseptic, live in penthouses, this superior species of humanity who read Harper’s and The New Yorker. (100)

In this first part of the essay, Franzen introduces a dilemma that sets up his explanation of why he turned away from postmodernism—he’s an adherent of the “Contract model” of literature, whereby the author agrees to share, on equal footing, an entertaining or in some other way gratifying experience, as opposed to the “Status model,” whereby the author demonstrates his or her genius and if you don’t get it, tough. But his coming to a supposed agreement with Mrs. M. about writers like Gaddis doesn’t really resolve Mrs. M.’s conflict with him.

The Corrections, after all, the novel she was responding to, represents his turning away from the tradition Gaddis wrote in. (It must be said, though, that Freedom, Franzen’s next novel, is written in a still more accessible style.)

The first thing we must do to respond properly to Mrs. M. is break down each of Franzen’s models into two categories. The status model includes writers like Gaddis whose difficulty serves no purpose but to frustrate and alienate readers. But Franzen’s own type specimen for this model is Flaubert, much of whose writing, though difficult at first, rewards any effort to re-read and further comprehend with a more profound connection. So it is for countless other writers—the one behind number two on the Modern Library’s ranking, for instance: Fitzgerald and The Great Gatsby. As for the contract model, Franzen admits,

Taken to its free-market extreme, Contract stipulates that if a product is disagreeable to you the fault must be the product’s. If you crack a tooth on a hard word in a novel, you sue the author. If your professor puts Dreiser on your reading list, you write a harsh student evaluation… You’re the consumer; you rule. (100)

Franzen, in declaring himself a “Contract kind of person,” assumes that the free-market extreme can be dismissed for its extremity. But Mrs. M. would probably challenge him on that. For many, particularly right-leaning readers, the market not only can but should be relied on to determine which books are good and which ones belong in some tiny niche. When the Modern Library conducted a readers’ poll to create a popular ranking to balance the one made by experts, the ballot was stuffed by Ayn Rand acolytes and Scientologists. Mrs. M. herself leaves little doubt as to her political sympathies. For her and her fellow travelers, things like literature departments, National Book Awards—like the one The Corrections won—Nobels, and Pulitzers are all an evil form of intervention into the sacred workings of the divine free market: un-American, sacrilegious, communist. According to this line of thinking, authors aren’t much different from whores—except of course literal whoring is condemned in the Bible (except when it isn’t).

A contract with readers who score high on the personality dimension of openness to new ideas and experiences (who tend to be liberal), those who have spent a lot of time in the past reading books like The Great Gatsby or Heart of Darkness or Lolita (the horror!), those who read enough to have developed finely honed comprehension skills—that contract is going to look quite a bit different from one with readers who attend Beck University, those for whom Atlas Shrugged is the height of literary excellence. At the same time, though, the cult of self-esteem is poisoning schools and homes with the idea that suggesting that a student or son or daughter is anything other than a budding genius is a form of abuse. Heaven forbid a young person feel judged or criticized while speaking or writing. And if an author makes you feel the least bit dumb or ignorant, well, it’s an outrage—heroes like Mrs. M. to the rescue.

One of the problems with the cult of self-esteem is that anticipating criticism tends to make people more, not less, creative. And the link between low self-esteem and mental disorders is almost purely mythical. High self-esteem is correlated with school performance, but as far as researchers can tell it’s the performance causing the esteem, not the other way around. More invidious, though, is the tendency to view anything that takes a great deal of education or intelligence to accomplish as an affront to everyone less educated or intelligent. Conservatives complain endlessly about class warfare and envy of the rich—the financially elite—but they have no qualms about decrying intellectual elites and condemning them for flaunting their superior literary achievements. They see the elitist mote in the eye of Nobel laureates without noticing the beam in their own.

         What’s the point of difficult reading? Well, what’s the point of running five or ten miles? What’s the point of eating vegetables as opposed to ice cream or Doritos? Difficulty need not preclude enjoyment. And discipline in the present is often rewarded in the future. It very well may be that the complexity of the ideas you’re capable of understanding is influenced by how many complex ideas you attempt to understand. No matter how vehemently true believers in the magic of markets insist otherwise, markets don’t have minds. And though an individual’s intelligence need not be fixed, a good way to ensure children never get any smarter than they already are is to make them feel fantastically wonderful about their mediocrity. We just have to hope that despite these ideological traps there are enough people out there determined to wrap their minds around complex situations depicted in complex narratives about complex people told in complex language, people who will in the process develop the types of minds and intelligence necessary to lead the rest of our lazy asses into a future that’s livable and enjoyable. For every John Galt, Tony Robbins, and Scheherazade, we may need at least half a Proust. We are still, however, left with quite a dilemma. Some authors really are just assholes who write worthless tomes designed to trick you into wasting your time. But some books that seem impenetrable on the first attempt will reward your efforts to decipher them. How do we get the rewards without wasting our time?

Dennis Junk

Useless Art and Beautiful Minds

The paradox of art is that the artist conveys more of him or herself by focusing on the subject of the work. The artists who cast the most powerful spells are the ones who can get the most caught up in things other than themselves. Falling in love exposes as much of the lover as it does of the one who’s loved.

When you experience a work of art you can’t help imagining the mind behind it. Many people go so far as to imagine the events of the natural world as attempts by incorporeal minds to communicate their intentions. Creating art demands effort and clear intentionality, and so we automatically try to understand the creator’s message. Ghanaian artist El Anatsui says the difference between the tools and consumer artifacts that go into his creations and the creations themselves is that you are meant to use bottle caps and can lids, but you are meant to contemplate works of art. 

Ai Weiwei’s marble sculpture of a surveillance camera, for instance, takes its form from an object that has a clear function and transforms it into an object that stands inert, useless but for the irresistible compulsion it evokes to ponder what it means to live under the watchful gaze of an oppressive state. Mastering the challenge posed by his tormenters, taking their tools and turning them into objects of beauty and contemplation, is an obvious intention and thus an obvious message. We look at the sculpture and we feel we understand what Ai Weiwei meant in creating it. 

Not all art is conducive to such ease of recognition, and sometimes unsettledness of meaning is its own meaning. We are given to classifying objects or images, primarily by their use. Asking the question, what is this, is usually the same as asking, what is it for? If we see an image surrounded by a frame hanging on a wall, even the least artistically inclined of us will assume the picture in some way pleases the man or woman who put it there. It could be a picture of a loved one. It could be an image whose symmetry and colors and complexity strike most people as beautiful. It could signify some aspect of group identity. 

Not all art pleases, and sometimes the artist’s intention is to disturb. John Keats believed what he called negative capability, a state in which someone “is capable of being in uncertainties, mysteries, doubts, without any irritable reaching after fact and reason,” to be central to the creation and appreciation of art. If everything we encounter fits neatly into our long-established categories, we will never experience such uncertainty. The temptation is always to avoid challenges to what we know because being so challenged can be profoundly discomfiting. But if our minds are never challenged they atrophy.

        While artists often challenge us to contemplate topics like the ties between the liquor trade and the slave trade, the relation between industrial manufacturing and the natural world, or surveillance and freedom of expression, the mystery that lies at the foundation of art is the mind of the artist. Once we realize we’re to have an experience of art, we stop wondering, what is this for, and begin pondering what it means.

        Art isn’t, however, simply an expression of the artist’s thoughts or emotions, and neither should it be considered merely an attempt at rendering some aspect of the real world through one of the representative media. How Anatsui conveys his message about the slave trade is just as important as any attempt to decipher what that message might be. The paradox of art is that the artist conveys more of him or herself by focusing on the subject of the work. The artists who cast the most powerful spells are the ones who can get the most caught up in things other than themselves. Falling in love exposes as much of the lover as it does of the one who’s loved.

Music and narrative arts rely on the dimension of time, so they illustrate the point more effectively. The pace of the rhythms and the pitch of voices and instruments convey emotion with immediacy and force. Musicians must to some degree experience the emotions they hope to spread through their performances (though they may begin in tranquility and only succumb afterward, affected by their own performance). They are like actors. But audiences do not assume that the man who plays low-pitched, violent music is angry when the song is over. Nor do they assume the singer who croons a plangent love song is at that moment in her life in the throes of an infatuation. The musicians throw themselves into their performances, and perhaps into the writing and composing of the songs, and, to the extent that we forget we’re listening to a performer as we feel or relive the anger or the pangs of love their music immerses us in, they achieve a transcendence we recognize as an experience of true art. We in the audience attribute that transcendence to the musicians, and infer that even though they may not currently be in the state their song inspired they must know a great deal about it.

Likewise a fiction writer accomplishes the most by betraying little or no interest in him or herself. Line by line, scene by scene, if the reader is thinking of the author and not the characters the work is a failure. When the story ceases to be a story and the fates of characters become matters of real concern for the reader, the author has achieved that same artistic transcendence as the musician whose songs take hold of our hearts and make us want to rage, to cry, to dance. But, finishing the chapter, leaving the company of the characters, reemerging from the story, we can marvel at the seeming magic that so consumed us. Contemplating the ultimate outcomes as the unfolding of the plot comes to an end, we’re free to treat the work holistically and see in it the vision of the writer.

The presence of the artist’s mind need not distract from the subject we are being asked to contemplate. But all art can be thought of as an exercise in empathy. More than that, though, the making strange of familiar objects and experiences, the communion of minds assumed to be separated by some hitherto unbridgeable divide, these experiences inspire an approach to living and socializing that is an art in its own right. Sometimes to see something clearly we have to be able to imagine it otherwise. To really hear someone else we have to appreciate ourselves. To break free of our own habits, it helps to know how others think and live.

Also read:
WHAT IS A STORY? AND WHAT ARE YOU SUPPOSED TO DO WITH ONE?
