I was recently asked what it takes to become a writer. Three things, I answered: first, one must cultivate incompetence at almost every other form of profitable work. This must be accompanied, second, by a haughty contempt for all the forms of work that one has established one cannot do. To these two must be joined, third, the nuttiness to believe that other people can be made to care about your opinions and views and be charmed by the way you state them. Incompetence, contempt, lunacy—once you have these in place, you are set to go.
But why bother writing at all? What would motivate anyone to take up what often turns out to be a life fraught with many obstacles and few palpable rewards? This vexing question has received a number of usually unsatisfactory answers. They include the notions that serious writers are divinely inspired; that they have a preternatural love of aesthetic order; that they are in relentless pursuit of the truth (as they understand it); and, on the somewhat less complimentary side, that they are ego-driven and therefore attention-craving beyond all reckoning.
On each of these points, thus far, we have chiefly had to take the word of writers themselves. Even Sigmund Freud, though he had many zany ideas about the act of writing, the most famous of them being sublimation, threw in his cards on the question of motivation: “Before the problem of the creative artist,” Freud wrote in an essay on Dostoevsky, “analysis must, alas, lay down its arms.”
Nor has science, serious science, weighed in heavily on the issue of creativity—at least not until now. A prescient person might have seen the change coming when, a few years ago, the air began to be filled with cliché-ridden talk about the different sides of the brain, and about which side is responsible for what. Such as this:
The left [side] seems to have the more important role in logical or sequential thinking, while the right [side] is somewhat more significant for spatial and emotional thinking, musical appreciation, and visual imagery. The specializations of the two hemispheres are relative, however, not absolute.
That particular passage, complete with its characteristic qualifications, comes from a recent book, The Midnight Disease, by Alice W. Flaherty.1 A neurologist at the Massachusetts General Hospital in Boston and a teacher at the Harvard Medical School, Dr. Flaherty here tries to cover three related pieces of territory. She surveys, first, what has been written about the origin of the drive to write. Then she asks what causes writer's block, that dread “nightmare” of all writers. And finally she meditates on the factors that make for literary creativity, stopping along the way to inquire into the origin and function of metaphor, the question of inspiration, and the relation between emotion and expression. Underlying the whole exercise is her conviction that “everything in our personalities, sick or well, comes from our brains—modified, of course, by experience.”
Dr. Flaherty is an earnest and, more important, honest person. Thanks to her honesty, her inquiries into the effects of the brain on writing are more hedged about with the conditional mood and assorted qualifiers than a weather forecast for October 22, 2057. “May,” “can,” “seem,” “could,” “apparently,” “appears,” “perhaps,” “probably,” “relatively,” “it is likely that,” “lines of evidence point to,” are words and phrases that turn up again and again in her sentences. It gives one a distinct feeling of oddness to come upon these elaborate hedgings in the presence of such weighty and exact terminology as “occipital cortex,” “hypothalamus,” “transcranial magnetic stimulation,” “neurotransmitters,” “hippocampus,” “limbic system,” “amygdala,” and the rest.
A standard passage in The Midnight Disease runs:
But several higher-level visual areas in temporal and parietal cortex show increased activation, even though there is no visual stimulus. Some increased frontal cortex activation occurs, which may reflect multimodal sensory processing. (Emphasis added.)
What is it that Humphrey Bogart says of the gunsel played by Elisha Cook, Jr. in The Maltese Falcon: “The cheaper the crook, the gaudier the patter”?
There is a reason for all this qualification. Neuroscience, in its current state, is less than dramatically impressive in the precision of its findings. Despite the high technology at its service, which makes possible various kinds of brain scans and imaging, a great deal is still in the realm of speculation, deduction, educated guessing. Paul McHugh, the Johns Hopkins psychiatrist and neurologist, has said that, so primitive is the contemporary study of the brain, it remains without its William Harvey, the physician who first mapped the circulatory system.
Much of what is known about the brain has been acquired through the study and correction of injuries: strokes, aphasias, tumors, and various sad jigeroos. Everyone has encountered such dirty tricks at work. I had a dentist who, after a stroke, lost the power to recall proper names or to follow the simplest narratives. He killed himself. An acquaintance who suffers from bipolar disorder, considering shock therapy, was told that its exact effects could not be predicted; it was, the neurologist in charge added, like “restarting one's computer,” which, as we all know, sometimes works wonders and sometimes is no help at all.
None of this is to suggest that the study of the brain is in any way fraudulent—phrenology with technology tacked on. But, because it has not found the major systems or forces that propel and operate the brain, neurology wants the satisfying rigor and precision that other divisions of medical science have long been able to count upon. Often able to tell what is wrong, it can only occasionally make things right. In its present state, it is most useful for psychopharmacology, for aiding in the teaching of elementary reading and writing, and especially for the treatment of dyslexia and other learning disabilities. Neurology is at its best when it describes how the body works in connection with the nervous system, as in the treatment of Parkinson's disease and epilepsy.
But neurology is at its worst when it attempts to explain how consciousness arises and how it works, how thought derives from brain tissue—how, in other words, the brain produces the mind. The pretensions of neurology in this area need always to be carefully examined for extravagance. Early in her book, Dr. Flaherty asserts enthusiastically “that science can help us write.” To adapt the Bill Clinton gambit, I would say that all depends on what you mean by “help,” “us,” and “write.”
The question that looms over The Midnight Disease is the extent to which writing, and the moods stemming from it, can be traced to brain abnormality. Sometimes Dr. Flaherty appears convinced that they can be. She concentrates on writers known to have had epilepsy, Dostoevsky and Flaubert notable among them. She confidently (and, in my opinion, crudely) diagnoses Henry James as “a unipolar depressive,” and speculates about what might happen to J. D. Salinger's famous decades-long block if he took an antidepressant like Paxil (how does she know that he is not already on it, and that it has not stopped him permanently from writing?). Quoting various experts on the question of literary motivation, she cites the belief of one of them, Hanna Segal, that (in Dr. Flaherty's paraphrase) “artistic creation is a response to the emptiness of depression.” Even as she worries whether the tendency to treat writing as an “abnormal brain state” is a wise thing to do, or whether it does not amount to “pathologizing an activity that should be praised,” pathologizing, I fear, is what she finally does.
Certainly we have plenty of psychologically wounded writers: Dostoevsky, Melville, Baudelaire, Conrad. Then there are the drunks: Joyce, Hemingway, Fitzgerald, Dorothy Parker, and countless others. And let us not forget the dear drug addicts: Samuel Taylor Coleridge, Thomas De Quincey, William S. Burroughs, and the entire charmless Beat generation. At one point, Dr. Flaherty remarks that depression among writers is eight to ten times higher than among the general population. My own nonscientific response to this is that it makes very good sense, since there must be eight to ten times more people writing than there ought to be. These people, being in the wrong line of work, have earned their depression.
Dr. Flaherty writes from the standpoint of someone who has herself suffered depression. Along with regaling us with her own experiences as a writer—she has published one previous book, The Massachusetts General Hospital Handbook of Neurology—she tosses in her credentials as a former mental patient. She suffered postpartum depression after giving birth, at what one gathers was a fairly close interval, to two sets of twins (the first of whom were stillborn), and required hospitalization. One of the results of her stay in the psychiatric ward—she does not say for how long—was manic hypergraphia, an unbounded energy and appetite for writing. Such was her happiness while scratching away that she seems to have felt the disease was much to be preferred to any cure.
While sometimes Dr. Flaherty makes writing seem a form of madness by other means, at others she makes it seem a matter of good intellectual hygiene. Thus, she traces much of the sad problem of writer's block—that is, the inability to produce words—to mood disorders of various intensities. Perfectionism, procrastination in its differing stripes, writer's cramp as a psychosomatic symptom, even lowered testosterone levels are brought into the discussion as possible reasons for writer's block, with mood-altering pills and psychotherapy being considered as the possible cure.
Dr. Flaherty's book is filled with standard quotations on writers' problems, including such matters as deadlines and their effect on productivity. A pity she does not appear to know the truth-laden aphorism of Karl Kraus, the Viennese wit: “A journalist, given time, writes worse.” When it comes to blocks, most serious writers will tell you that the chief cause of the malady is a loss of confidence in one's ideas, or in the plan behind one's work. Here mood-altering pills and psychotherapy are unlikely to help. What a block means to a professional writer, at least one determined not to give way to neurotic caperings, is that he has not yet become smart enough to gain mastery over his subject.
That writing itself may be a neurotic act is an idea with a fairly short history; so, too, the contrasting notion that writing can be a way of dealing with and even overcoming neurosis. It would be better if both were ideas whose time had passed, but that is unlikely to happen soon. For her part, Dr. Flaherty appears to subscribe to the writers-are-essentially-performing-therapy-by-other-means school. That is hardly surprising. In an age that has seen the triumph of the therapeutic, writing, once regarded as an act of intellectual and artistic discovery, is now regularly extolled as an act of self-expression, of working out one's mental and emotional problems. This is no doubt what prompts Dr. Flaherty to use her own experiences, both emotional and literary, to illustrate her points.
Alas, these experiences are a good deal less than convincing, and for one simple reason: the person telling us about them is less than a first-class writer. She belongs, rather, to the category of the cheerful amateur. “Reading the New York Times Book Review every week,” Dr. Flaherty notes, “was a major part of my literary education”—a statement akin to claiming that one has learned how to fly by reading Superman comics. As a writer, not only does Dr. Flaherty use language in a loose and often dopey way, not only does she split infinitives with the easy exuberance of young Abe Lincoln splitting logs, but she provides no striking phrases or arresting metaphors, she over-dramatizes her own experience, lapses into cuteness and unconscious self-gratulation, and everywhere betrays many other marks of the amateur scribbler.
Something about The Midnight Disease suggests that it is written for other amateurs. If so, it may be the most dangerous how-to book in existence, offering all sorts of advice on anti-depressants, altering sleep cycles, changing the lighting in one's life, and other therapies to improve verbal flow. True, it also contains a warning from the author not to take any of the drugs mentioned except “under the supervision of a trained physician in the field.” Or, as the old knife jugglers used to say, “Kids, don't try any of this at home.”
My own early drive to write was, I think, fairly typical, but since it conforms to none of Flaherty's categories, I wonder how she would deal with it. As a young would-be writer, I harbored no elevated notions of bringing truth or beauty into the world. Instead I wanted, ardently, to bring me into the world: to call me to its attention. My desire to write, which began when I was twenty, was never separable from wanting to write for print. In my own little behaviorist Skinner box, I pecked at the lever that, I hoped, would deliver small but delicious pellets of praise. Later on, my writing drive was also impelled by a sense that other writers were describing the world wrongly, and by the pleasure I took from exploring the richness of life and the mysteries of human character with such power as my acquisition of craft made available to me.
I have been lucky enough to scribble away at a fairly steady rate over many decades. Does this suggest that I suffer from what Dr. Flaherty calls hypergraphia, and what others used to call graphomania? Might I also have an enlarged temporal lobe, right side, that makes all this scribbling possible? Or an overactive limbic system, or hypothalamus, perhaps?
I do not believe any of these parts of my brain have diddly to do with it. Nor do I believe, as Dr. Flaherty seems to, in the debt that inspiration owes to the muses. (If a belief in muses sounds strange coming from the mouth of a neurologist, it ought to be noted that Dr. Flaherty manages to reconcile her devotion to science with her penchant for mysticism by reporting “evidence that the temporal lobe underlies mystical experience as well.”) For the professional writer, at any rate, awaiting inspiration or the visit of a muse or the telltale twitch of the limbic system will produce no better results than waiting for Godot, who, when I last checked, had yet to show up. My allusion to Godot came, I believe, not straight from my intact temporal lobe but from my careful study of how metaphors work and sheer jolly cleverness.
But here is where Dr. Flaherty, like Dr. Freud before her, throws in her cards. “Science,” she suggests, “is the mouthpiece of determinism, and literature the last holdout of free will.” Safe to say that neuroscience these days holds brain chemistry and anatomy to be more decisive for human behavior and the formation of character than free will; whether it acknowledges even the existence of free will is another question, and one that Flaherty explicitly declines to adjudicate. Every serious writer will be firmly on the other side. Writers must come down on the side of free will. What, you might say, choice have we? Without free will there would be no literature in the first place: no drama, no insights into human nature, little, really, but the drab playing-out of the hands we have been dealt, with the aid of pills to bolster our lagging spirits. Artists are the natural opponents of determinism—which is why, for example, so many of them have mocked the heavily deterministic doctrines of Sigmund Freud.
Although it has often been wrong and then corrected itself, modern science, alone of all human endeavors, has until now kept its promises. One wonders if its current promises—to reproduce life, to explain all behavior through brain anatomy and chemistry—might be so extravagant as to cost it some of its prestige. At least as it is set out in the pages of The Midnight Disease, neuroscience is distinctly unhelpful in explaining the mysteries that continue to surround literary creation—in connection both with the need to write and with the mechanisms that are at the heart of the act of writing.
The choice of writing as a living and a way of life is more complex than is likely to show up in a neurologist's PET scan. Nor does literary talent, unlike talent in other artistic fields such as music and the visual arts, make such a life any easier by appearing early. “No Mozarts in literature” is more than a well-known saying; it is a fact. There are not too many Joseph Conrads, either, and Conrad published his first book when he was thirty-eight.
Nor, despite all the programs and creative-writing classes, can writing really be taught. In his “Dialogues on Art,” Paul Valéry observes: “There are products of the mind that cannot be reduced to neat formulas of expression or systematic methods and practices.” Writing happens to be one of them. One cannot teach a person to love language, to be smart, witty, have a dramatic sense, to be observant, to be someone on whom, in Henry James's phrase, “nothing is lost.” All this comes through on-the-job training, through solitary learning of a craft. It is a long-shot bet that someday one's undeveloped talent will come near one's already highly developed ambition. For most, it never comes close.
I taught would-be novelists, poets, and essayists for three decades at Northwestern University. Many of them demonstrated much greater ability than I possessed at their age, yet nothing much has happened to the vast majority of them. Or, rather, the world happened to them, intervening in their grand plans to become serious writers by placing genuine obstacles in their way or by holding out other prospects and possibilities: marriage and family, honorable and better-paying work, the temptations of journalism. However high the degree of their talent, the desire, I have to assume, was not sufficiently intense in them to do what was required. As for whence the desire itself derives, that is yet another mystery.
One of the obstacles is the writing life itself, which is generally much less glamorous than is imagined. Although every so often a serious writer resoundingly rings the great commercial gong, sending him from welfare recipient to multi-millionaire, the number of men and women in the United States today who are able to live exclusively off their writing and do not need to supplement their literary income with revenue from teaching, lecturing, public readings (if they are lucky enough to get them), working on movie or television scripts, or performing jobs unrelated to writing is probably well below 500.
Things get worse. Mordecai Richler, the Canadian novelist, once said that he divided his life between the time before he decided to become a writer and the time after—and the time before was better. What I believe Richler meant was that once one determines to write, one no longer confronts experience directly; it becomes “copy,” recyclable in stories, articles, essays, poems. True, nothing in a skilled writer's life is wasted. But there is something mildly—and sometimes more than mildly—gruesome about collecting experience for one's work the way a certain kind of person collects grievances. I, for one, would never make the mistake my wife did of marrying a writer.
If one cannot learn one's craft through formal schooling, if one is highly unlikely to earn a good living at it, and if the writer's vocation tends to bring out the less pleasant side of one's nature, why would anyone want to risk it? (H. L. Mencken had no tolerance for the little Iliad of woes I have just compiled. When writers complained to him about the arduousness of their lot, he used to propose that they go try a week on an assembly line.)
When the going is good for a writer, though, it cannot be bettered. For writers whose productivity comes to fruition in regular publication, the activity itself is its own reward—and practicing it beats all other regular employment. Once one has achieved a relative mastery over one's craft, the pleasures of composition are like few others: certainly none that I have known. Constructing well-made sentences, in which words and thought appear to make a seamless fit, causing the small but intense light of insight to click on, can only be compared, I should imagine, to the delight of dancing faultlessly to one's own choreography.
Where do the words come from? The same mysterious place, I suspect, where notes of music go. They precede ideas, and are inseparable from them. For myself, I bow my head, touch wood, and utter a small prayer that the flow of them never cease.
Motivation, meanwhile, is as various as the subjects upon which one feels called upon to write, varying from time to time, subject to subject. I should like Dr. Flaherty to know that my two motives in writing this essay have been, first, to collect a decent fee, and, second, to try to knock down her book as an assemblage of profoundly muddled notions that I, given my calling, find mildly but genuinely offensive.
1 Houghton Mifflin, 307 pp., $24.00.