Nearly forty years ago, Edmund Wilson wrote a little essay about an underrated American novelist and called it “Justice to Edith Wharton.” She was in need of justice, he claimed, because “the more commonplace work of her later years had had the effect of dulling the reputation of her earlier and more serious work.” During this last period—a stretch of about seventeen years, from (roughly) 1920 to her death in 1937—Edith Wharton’s novels were best-sellers, her short stories commanded thousands of dollars; but both in mode and motivation she remained, like so many others in the 20’s and 30’s, a 19th-century writer. She believed in portraying character, her characters displayed the higher values, her prose was a platform for her own views. In 1937, when Wilson undertook to invigorate her reputation, the machinery of 19th-century fiction was beginning to be judged not so much as the expression of a long tradition, or (as nowadays we seem to view it) as the exhausted practice of a moribund convention, but more bluntly as a failure of talent. Wilson accounted for that apparent failure in Edith Wharton by speculating on the psychological differences between male and female writers:
It is sometimes true of women writers—less often, I believe, of men—that a manifestation of something like genius may be stimulated by some exceptional emotional strain, but will disappear when the stimulus has passed. With a man, his professional, his artisan’s life is likely to persist and evolve as a partially independent organism through the vicissitudes of his emotional experience. Henry James in a virtual vacuum continued to possess and develop his métier. But Mrs. Wharton had no métier in this sense.
What sort of “justice” is this? A woman typically writes best when her emotions are engaged; the barren female heart cannot seize the writer’s trade? Only a decade ago, such a declaration would have been derided by old-fashioned feminists as a passing insolence. But even the satiric reader, contending in one fashion or another with this passage, would have been able, ten years ago, to pluck the offending notion out as a lapse in the texture of a measured and generally moderating mind.
No longer. Wilson’s idea returns only to hold, and it holds nowhere so much as among the literary proponents of the current women’s movement: Wilson’s lapse is exalted to precept. The idea of Edith Wharton as a “woman writer” in need of constantly renewable internal stimuli, whose gifts are best sustained by “exceptional emotional strain”—all this suits the newest doctrine of sexual exclusiveness in literature. Indeed, one of the outstanding tenets of this doctrine embraces Wilson unrelentingly. “Rarely in the work now being written by women,” according to an article called “Toward a Definition of the Female Sensibility,”
does one feel the presence of writers genuinely penetrating their own experience, risking emotional humiliation and the facing-down of secret fears, unbearable wisdoms. . . . There are works, however, . . . in which one feels the heroic effort stirring,1
and there follow numerous examples of women writing well because of the stimulus of some exceptional emotional strain.
Restitution, then (one supposes), is to come to Edith Wharton not from the old-fashioned feminists, but from the newer sort, who embrace the proposition that strong emotion in women, emotion uniquely female, is what will best nourish a female literature. What we are to look for next, it follows, is an ambitious new-feminist critical work studying Wharton’s “vicissitudes of . . . emotional experience” and correlating the most fevered points with the most accomplished of the fictions.
Such a work, it turns out, more extensive and more supple than Wilson’s pioneer brief would suggest, has just made its appearance: Ellen Moers’s Literary Women. Like other new feminists, Moers believes that there is such an entity as the “history of women,” that there are poetic images uniquely female, and even “landscapes charged with female privacy.” She writes of “how much the freedom and tactile sensations of near-naked sea bathing has meant to modern women,” and insists that a scene recounting the sensation of walking through a field of sealike grass provides that “moment when Kate Chopin reveals herself most truly a woman writer.” Edith Wharton’s life—a buried life—ought, properly scrutinized, to feed such a set of sympathies, and to lure the attention of restitution. Literary Women, after all, is conceived of in part as a rescue volume, as a book of rehabilitation and justice: a number of writers, Moers explains, “came to life for me as women writers as they had not done before. Mrs. Gaskell and Anne Brontë had once bored me; Emily Dickinson was an irritating puzzle, as much as a genius; I could barely read Mary Shelley and Mrs. Browning. Reading them anew as women writers taught me how to get excited about these five, and others as well.”
Others as well. But Edith Wharton is omitted from Literary Women. Her name appears only once, as an entry in an appendix. Only The House of Mirth is mentioned there, along with a reference, apparently by way of explanation of the larger omission, to the chapter on Edith Wharton in Alfred Kazin’s On Native Grounds. Pursuing the citation, one discovers that Kazin, like Wilson, like the new feminists, speaks of “the need that drove her to literature.” Whatever the need, it does not engage Moers; or Kazin. He advances the notion that “to Edith Wharton, whose very career as a novelist was the tenuous product of so many maladjustments, the novel became an involuted expression of self.” Unlike the new feminists, Kazin will not celebrate this expression; it represents for him a “failure to fulfill herself in art.” Wharton, he concludes, “remains not a great artist but an unusual American, one who brought the weight of her personal experience to bear upon a modern American literature to which she was spiritually alien.”
Justice to Edith Wharton: where, then, is it to come from? Not taken seriously by the dominant criticism, purposefully ignored by the radical separatist criticism of the new feminists2—she represents an antagonism. The antagonism is not new. Wharton describes it herself in her memoir, A Backward Glance:
My literary success puzzled and embarrassed my old friends far more than it impressed them, and in my own family it created a kind of constraint which increased with the years. None of my relations ever spoke to me of my books, either to praise or blame—they simply ignored them; and among the immense tribe of my cousins, though it included many with whom I was on terms of affectionate intimacy, the subject was avoided as if it were a kind of family disgrace, which might be condoned but could not be forgotten. Only one eccentric widowed cousin, living a life of lonely invalidism, turned to my novels for occasional distraction, and had the courage to tell me so.
She continues: “At first I felt this indifference acutely; but now I no longer cared, for my recognition as a writer had transformed my life.”
So it is here—in this uplifting idea, “my life,” this teleological and novelistic idea above all—that one will finally expect to look for Wharton’s restitution “as a writer.” The justice that criticism perversely fails to bring, biography will achieve.
Perhaps. The biography of a novelist contains a wonderful advantage: it accomplishes, when well executed, a kind of mimicry. A good biography is itself a kind of novel. Like the classic novel, a biography believes in the notion of “a life”—a life as a triumphal or tragic story with a shape, a story that begins at birth, moves on to a middle part, and ends with the death of the protagonist.
Despite the reliable pervasiveness of birth and death, hardly any “real” life is like that. Most simply unfold, or less than that, dreamwalk themselves out. The middle is missing. What governs is not pattern but drift. Most American lives, moreover, fail to recognize that they are sticks in a stream, and are conceived of as novels-of-progress, as purposeful Bildungsromane saturated with an unending hopefulness, with the notion of infinite improvement on the way toward a salubrious goal; the frontier continues to inhabit the American mentality unfailingly.
And most American biographies are written out of this same source and belief. A biography that is most like a novel is least like a life. Edith Wharton’s life, though much of it was pursued outside of America, is an American life in this sense: that, despite certain disciplines, it was predicated on drift, and fell out, rather than fell into place. If other American lives, less free than hers, drift less luckily between the Scylla and Charybdis of obligation and crisis, hers drifted in a setting all horizon, in a perpetual non-circumstance clear of external necessity. She had to invent her own environment and its conditions, and while this may seem the reverse of rudderlessness, what it signifies really is movement having to feign a destination. A life with a “shape” is occasioned by what is present in that life; drift grows out of what is absent. For Edith Wharton there was—outside the writing—no destination, and no obligation to get there. She had houses, she had wealth; she chose, rather than “had,” friends. She had no family (she was estranged from her brothers, and we hear nothing further about the affectionate cousins), she had no husband (though she was married to one for more than half her life), she had no children. For a long time she resented and disliked children, and was obsessed by a love for small dogs. She was Henry James’s ideal American heroine: she was indeed his very heiress of all the ages; she was “free,” she was cultivated both in the conventional and the spiritual sense, she was gifted, acute, mobile; she appeared to be mistress of her destiny.
The destiny of such freedom is drift, and though her life was American in this, it was European in its resignation; she had no illusion that—outside the writing—she was doing more than “filling in.” Her one moment of elevated and secure purpose occurred when, inspired by the model of Walt Whitman in the hospitals of the Civil War, she founded war relief agencies in France during World War I. She supervised brilliantly: she supervised her friendships, her gardeners, her guests, the particulars of her dinner parties, her households; she even, to a degree, supervised the insurmountable Henry James—she took him for long rides in her car, she demanded hours in London and tea at Lamb House, she finagled with his publisher to provide him with a handsome advance (she herself was the secret philanthropist behind the scenes), she politicked to try and get him the Nobel Prize for literature. She supervised and commanded, but since no one demanded anything of her (with a single exception which, like the Gorgon’s head, was not to be gazed at), she was captain, on an uncharted deep, of a ship without any imaginable port. She did everything on her own, to no real end; no one ever asked her to accommodate to any pressure of need, she had no obligations that she did not contrive or duty that she did not devise. Her necessities were self-imposed. Her tub went round and round in a sea of self-pleasing.
All this was outside the writing. One learns it from R. W. B. Lewis’s prize-winning biography,3 which is, like a posthumously uncovered Wharton novel, sustained by the idea of “a life.” It has the fecund progression, the mastery of incident, the affectionate but balanced devotion to its protagonist, the power of suspenseful development, even the unraveling of a mysterious love story, that the “old” novel used to deliver—the novel before it became a self-referring “contemporary” art-object. In its own way it is a thesis novel: it is full of its intention to bring justice to Edith Wharton. A massive biography, almost by its weight, insists on the importance of its subject. Who would dare pass that writer by to whom a scholar-writer has dedicated, as Lewis has, nearly a decade of investigation and discovery? “They are among the handsomest achievements in our literature,” he remarks of her major fictions. And adds: “I have wondered, with other admirers of Edith Wharton, whether her reputation might today stand even higher if she had been a man.”
If the last statement has overtones of the new feminism—glory but for the impediment of sex—the book does not. Lewis sets out to render the life of an artist, not of a “woman artist.” Unexpectedly, though it is the artist he is after, what he succeeds chiefly in giving us is the life of a woman. The “chiefly” is no small thing: it is useful to have a documented narrative of an exceptional upper-class woman of a certain American period. Still, without romanticizing what is meant by the phrase “an artist’s life,” there is a difference between the biography of a writer and the mode of living of a narrow American class.
Can the life justify the writer then? Or, to put it otherwise, can biography take the place of literary judgment? Lewis’s book is a straightforward “tale,” not a critical biography. Nor is it “psycho-biography”: though it yields new and revealing information about Edith Wharton’s sexual experience, it does not propose to illumine the hidden chambers of the writer’s sentience—as, for example, Ruby V. Redinger’s recent inquiry into George Eliot’s relationship to her brother Isaac, with its hunches and conjectures, purports to do, or Quentin Bell’s half-study, half-memoir of Virginia Woolf. Lewis has in common with these others the revelation of a secret. In the case of Quentin Bell, it is the exact extent of Virginia Woolf’s insanity; in the volume on George Eliot, the secret is the dense burden of humiliation imposed by an adored brother more cruel and rigid than society itself. And in Lewis, the secret is an undreamed-of, now minutely disclosed, adulterous affair with a journalist. In all three accounts, the writer is on the whole not there. It is understandable that the writer is mainly absent for the psychobiographer; something else is being sought. It is even more understandable that the writer should be absent for a nephew-biographer, whose preoccupation is with confirming family stories.
But if, for Lewis, the writer is not there, it is not because he fails to look for her but because she is very nearly invisible. What, through luck and diligence, he causes to become visible is almost not the point, however unpredictable and startling his discoveries are. And they are two: the surprising place of Morton Fullerton in Edith Wharton’s middle years, and the appearance of a candid manuscript, written in her seventies, describing, with the lyrical explicitness of an enraptured anatomist, a fictional incestuous coupling. The manuscript and the love affair are so contrary to the established Wharton legend of cold propriety that they go far to make us look again—but only at the woman, not at the writer.
The real secret in Lewis’s biography is devoid of sex, lived or imagined, though its centerpiece is a bed; and it concerns not the woman but the writer. The secret is divulged on page 353, when Wharton is fifty-one, and occupies ten lines in a volume of nearly six hundred pages. The ten lines recount a perplexing incident—“a minor fit of hysterics.” The occasion is mysterious: Edith Wharton and Bernard Berenson, touring the great cities and museums of Europe together, arrive at the Hotel Esplanade in Berlin. They check into their respective rooms, and Edith Wharton, ignoring the view of the city though she has never been there before, begins to rage
because the bed in her hotel was not properly situated; not until it had been moved to face the window did she settle down and begin to find Berlin “incomparable.” Berenson thought this an absurd performance; but because Edith never harped upon the physical requirements of her literary life, he did not quite realize that she worked in bed every morning and therefore needed a bed which faced the light. It had been her practice for more than twenty years; and for a woman . . . who clung seriously to her daily stint, the need was a serious one.
The fit and its moment pass; the ensuing paragraphs tell of German politics snubbed and German music imbibed—we are returned, in short, to the life of an upper-class American expatriate tourist, privileged to travel in the company of a renowned connoisseur. But the plangent moment—an outcry over the position of a bed—dominates the book: dominates what has gone before and what is to come, and recasts both. Either the biographer can stand up to this moment—the woman revealed as writer—or the book falls into the drifting ash of “a life.”
It falls, but it is not the biographer’s fault; or not his fault alone. Edith Wharton—as writer—is to blame. She put a veil over the bed that was her work-place, and screened away the real life that was lived in it. What moves like a long after-image in the wake of reading Lewis is a procession of stately majesties: Edith Wharton always standing, always regal, always stiffly dressed and groomed, standing with her wonderfully vertical spine in the hall of one of her great houses, or in the drawing room of her Paris apartment, with her fine hand out to some equally resplendent guest, or in her gardens, not so much admiring her flowers as instructing or reprimanding the servants of her flowers; or else “motoring” through the dust of some picturesque lane in the French countryside, her chauffeur in peaked hat and leather goggles, like blinders, on a high seat in front of her, indistinguishable from the horse that still headed most vehicles on the road.
If this is the Wharton myth, she made it, she wove it daily. It winds itself out like a vivid movie, yet darkly; it leaves out the window-lit bed. What went on outside the bed does not account for what went on in it. She frequented literary salons, and on a smaller scale held them (after dinner, Henry James reading aloud in the library); she talked bookishly, and with fervor; she was an intellectual. But she was not the only brilliant woman of her time and status; all of that, in the biography of a writer, weighs little.
Visualize the bed: she used a writing board. Her breakfast was brought to her by Gross, the housekeeper, who alone was privy to this inmost secret of the bedchamber. Out of bed, she would have had to be, according to her code, properly dressed, and this meant stays. In bed, her body was free, and freed her pen.
There is a famous photograph of Edith Wharton seated at a desk; we know now, thanks to the “minor fit of hysterics” at the Hotel Esplanade, how the camera lies—even though it shows us everything we might want to know about a way of life. The time is in the 1890’s, the writer is in her early thirties. The desk is vast, shining, with a gold-tooled leather top; at the rear of its far surface is a decorated rack holding half a dozen books, but these are pointless—not only because anyone using this desk would need an impossibly long reach, but because all the volumes are faced away from the writer, with their backs and titles to the open room. Two tall electrified candlestick-lamps (the wire drags awkwardly) stand sentinel over two smaller candlesticks; there is a single letter, already stamped; otherwise the desk is clear, except for a pair of nervous, ringed hands fiddling with a bit of paper.
The hands belong to a young woman got up, to our eyes, as theatrically as some fanciful notion of royalty: she is plainly a lady of fashion, with a constricted waist and a constricting tall collar; her dress is of the whitest fabric, all eyeleted, embroidered, sashed; her hair is elaborately rolled and ringleted; an earring makes a white dot below the high dark eave of her hair; her back is straight, even as she leans forward with concentrated mouth and lost eyes, in the manner of a writer in trance. Mellifluous folds hide her feet; a lady has no legs. She is sitting on a graceful chair, with whorled feet—rattan framed by the most beautiful carved and burnished wood. (A rattan chair with not a single hole? No one could ever have worked in such a chair; the photographer defrauds us—nothing more important than a letter will ever be written at this desk.) The Oriental carpet, with its curious and dense figures, is most explicitly in focus, and over the edge of it a tail of skirt spills, reflected white on a floor as sleek as polished glass. In the background, blurred to the camera’s lens but instructive to ours: a broad-shouldered velvet chair, a marble bust on an ebony pedestal, a table with a huge porcelain sculpture, a lofty shut oak or walnut door—in short, an “interior,” reminding us that the woman at the unused desk has undertaken, as her first writing venture, a collaborative work called The Decoration of Houses.
There are other portraits in this vein, formal, posed, poised, “intellectual” (meaning the subject muses over a seeming letter or book), all jeweled clips and chokers and pearls in heavy rows, pendants, feathered hats, lapdogs, furs, statuesque burdens of flounced bosom and grand liquescent sleeve, queenly beyond our bourgeois imaginings. And the portraits of houses: multiple chimneys, balconies, cupolas, soaring Romanesque windows, immense stone staircases, summer awnings of palatial breadth, shaped ivy, topiary like oversized chess pieces, walks, vistas, clouds of flower beds.
What are we (putting aside Marxist thoughts) to make of this avalanche of privilege? It is not enough to say: money. The class she derived from never talked of money; the money was invisible, like the writing in bed, and just as secret, and just as indispensable. The “love of beauty,” being part of class-habit, does not explain it; perhaps the class-habit does. It was the class-habit that kept her on the move, the class-habit that is restlessness and drift. She wore out houses and places, or else her spirit wore out in them: New York, Newport, Lenox—finally America. In France there was the Paris apartment in the Rue de Varenne, then a small estate in St. Brice-sous-Forêt, in the country north of Paris, then an old chateau in Hyères, on the warm Mediterranean coast. Three times in her life she supervised the total renovation of a colossal mansion and its grounds, in effect building and furnishing and landscaping from scratch; and once, in Lenox, she bought a piece of empty land and really did start from scratch, raising out of the earth an American palace called The Mount. All of this exacted from her the energy, attentiveness, and insatiable governing impulses of a corporation chief executive, or the head of a small state.
In an architectural lull, she would travel. All her life she traveled compulsively, early in her marriage with her husband, touring Europe from February to June, afterward with various male companions, with the sense, and with the propriety, of leading a retinue. Accumulating “scenes”—hotels, landscapes, seascapes, museums, villages, ruins—she saw all the fabled cities of Europe, the islands of the Aegean, Tunis, Algiers, Carthage, the Sahara.
And all the while she was surrounded by a crowd. Not simply while traveling: the crowd was part of the daily condition of her houses and possessions. She had a household staff consisting of maids (“housemaids” and “chambermaids”—there appears to be a difference), a chief gardener and several under-gardeners, cook, housekeeper, major-domo, chauffeur, personal maid, “traveling” maid, secretary, “general agent,” footmen. (One of the latter, accompanying her to I Tatti, the Berenson villa in Italy, inconveniently fell in love with a Berenson maid, and had to be surrendered.) These “establishments,” Lewis remarks, “gave her what her bountiful nature desired: an ordered life, a carefully tended beauty of surroundings, and above all, total privacy.” The “above all” engenders skepticism. Privacy? Surveying that mob of servants, even imagining them crossing silent carpets on tiptoe, one takes the impression, inevitably, of a hive. Her solitude was the congested solitude of a monarch; she was never, like other solitary-minded American writers (one thinks of Poe, or of course Emily Dickinson, or even Scott Fitzgerald), completely alone in the house. But these hectic movements of the hive were what she required; perhaps she would not have known how to do without them. Chekhov could sit at a table in the middle of the din of a large impoverished family, ignoring voices and footsteps in order to concentrate on the scratch of his pen. Edith Wharton sat up in bed with her writing board, in the middle of the active business of a house claiming her attention, similarly shutting out the only family she had. A hired family, an invented one. When she learned that her older brother Freddy, living not far away in Paris, had suffered a stroke, she was “unresponsive”; but when Gross, her housekeeper of long standing, and Elise, her personal maid, both grew fatally ill within a short space, she wrote in her diary, “All my life goes with those two dying women.”
Nicky Mariano, in her memoir of her life as secretary-companion to Berenson, recalls how Edith Wharton treated her with indifference—until one day, aboard a yacht near Naples, she happened to ask after Elise. She was at once dispatched to the cabin below to visit with the maid. “From then on I became aware of a complete change in Edith’s manner to me. There was a warmth, a tone of intimacy I had never heard before.” And again, describing how Wharton “looked after her servants with affectionate zeal and took a lively interest in all their joys and sorrows,” she produces another anecdote:
I remember how once during one of our excursions with her, she was deeply hurt and angry when on leaving a villa near Siena after a prolonged visit she discovered that neither her maid nor her chauffeur had been asked into the house.
What is the effect on a writer of being always encircled by servants? What we are to draw from this is not so much the sadness of purchased affections, or even the parasitism (once, left without much help for a brief period, she was bewildered about her daily survival), but something more perplexing: the moment-by-moment influence of continuous lower-class companionship. Room ought to be given to considering this; it took room in Wharton’s life: she was with her servants all the time, she was with her friends and peers only some of the time. E. M. Forster sought out the common people in the belief that too much education atrophies the senses; in life and in art he went after the lower orders because he thought them the embodiment of the spontaneous gods of nature. In theory, at least—perhaps it was only literary theory—Forster wanted to become “instinctual,” and instinct was with the working class. But Edith Wharton kept her distance even as she drew close; she remained mistress always. It made her a kind of double exile. As an expatriate settled in France, she had cut herself off from any direct infusion of the American sensibility and the American language. Through her attachment to her servants, she became intimately bound to illiterate lives remote from her mentality, preoccupations, habitual perceptions—a second expatriation as deliberate as the more obvious one. Nor did her servants give her access to “ordinary” life (she was no Lady Chatterley, there was no gamekeeper for her)—no one is “ordinary” while standing before the monarch of the house. Still, she fussed over her army of hirelings; it was a way of inventing claims. For her servants she provided pensions; she instituted a trust fund as a private charity for three Belgian children; she sent regular checks to her sister-in-law, divorced from her brother a quarter of a century and therefore clearly not to be taken for family. For family, in short, she substituted claims indisputably of her own making. 
She could feel responsible for servants and acquired dependents as others feel responsible for parents, brothers, children: but there was a tether made of money, and the power-end of the tether was altogether in her hand. With servants, there is no murkiness—as there sometimes is in friendship—about who is beholden to whom.
With her friends it was more difficult to invent claims; friendship has a way of resisting purchase, and she had to resort to ruses. When she wanted to release Morton Fullerton from the entangling blackmail of his former French mistress, she arranged with Henry James to make it seem as if the money were coming impersonally from a publisher. Fullerton having been, however briefly, her lover, it was hardly possible to hand over one hundred pounds and call it a “pension”; the object was not so much to keep Fullerton’s friendship free as to establish the illusion of such freedom. It was enough for the controlling end of the money-tether to know the tether was there; and anyhow the tether had a witness and an accomplice. “Please consider,” James wrote, entering into the plot, “that I will play my mechanical part in your magnificent combination with absolute piety, fidelity, and punctuality.”
But when it was James himself who came to be on the receiving end of the golden tether, he thundered against the tug of opulence, and the friendship was for a while impaired. The occasion was a proposal for his seventieth birthday: Edith Wharton, enlisting about forty moneyed Americans, thought to raise “not less than $5,000,” the idea being “that he should choose a fine piece of old furniture, or something of the kind”—but to James it all smelled blatantly of charity, meddling, pity, and cash. Once he got wind of the plan he called it a “reckless and indiscreet undertaking,” and announced in a cable that he was beginning “instant prohibitive action. Please express to individuals approached my horror. Money absolutely returned.”
It was returned, but within a few months James was hooked anyhow on that same line—hooked like Morton Fullerton, without being aware of it. This time the accomplice was Charles Scribner, who forwarded to James a phony “advance” of eight thousand dollars intended to see him through the writing of The Ivory Tower—but the money was taken out of Wharton’s own advance, from another publisher, of fifteen thousand dollars. The reluctant agent of the scheme, far from celebrating “your magnificent combination,” saw it rather as “our fell purpose.” “I feel rather mean and caddish and must continue so to the end of my days,” Charles Scribner grumbled. “Please never give me away.” In part this sullenness may have been guilt for not having himself volunteered, as James’s publisher, to keep a master artist free from money-anxiety, but beyond that there was a distaste for manipulation and ruse.
This moral confusion about proprieties—whom it is proper to tip, and whom not—expressed itself in other strange substitutions. It was not only that she wanted to pay her lover and her friend for services rendered, sexual or literary—clearly she had little overt recognition of the quid pro quo uses of philanthropy. It was not only that she loved her maid Gross more than her mother, and Arthur White her “man” more than her brother—it is understood that voluntary entanglements are not really entanglements at all. But there were more conspicuous replacements. Lacking babies, she habitually fondled small dogs: there is an absurd photograph of Edith Wharton as a young woman of twenty-eight, by then five years into her marriage, with an angry-looking Pekingese on each mutton-leg shoulder; the animals, pressed against her cheeks, nearly obscure her face; the face is cautious and contemplative, as of one not wanting to jar precious things. A similar photograph shows her husband gazing straight out at us with rather empty pale eyes over a nicely-trimmed mustache and a perfect bow tie—on his lap, with no special repugnance, he is holding three small dogs, two of them of that same truculent breed, and though the caption reads “Teddy Wharton with his dogs,” somehow we know better whose dogs they are. His body is detached, his expression, very correct and patient, barely hides—though Lewis argues otherwise—how he is being put upon by such a pose.
Until late in life, she never knew a child. Effie, the little girl in The Reef, is a child observed from afar—she runs, she enters, she departs, she is sent, she is summoned, at one moment she is presented as very young, at another she is old enough to be having lessons in Latin. She is a figment of a child. But the little dogs, up to the end of Edith Wharton’s life, were always understood, always thought to have souls, always in her arms and in her bed; they were, Lewis says, “among the main joys of her being.” Drawing up a list of her “ruling passions” at forty-two, she put “Dogs” second after “Justice and Order.” At sixty-two she wrote in her journal of “the usness” in the eyes of animals, “with the underlying not-usness which belies it,” and meditated on their “eternal inarticulateness and slavery. Why? their eyes seem to ask us.”
The fellow feeling she had for the not-usness of her Pekingese she did not have for her husband, who was, from her point of view, also “not-us.” He too was inarticulate and mired in the slavery of a lesser intellect. He was a good enough man, interested (like his wife) in being perfectly clothed, vigorous and humorous and kind and compliant (so compliant that he once actually tried to make his way through James’s The Golden Bowl)—undistinguished in any jot, the absolute product of his class. He had no work to do, and sought none. One of Edith Wharton’s friends—a phrase instantly revealing, since her friends were practically never his; the large-hearted Henry James was nearly the only one to cross this divide—observed that Teddy Wharton’s “idleness was busy and innocent.” His ostensible employment was the management of his wife’s trust funds, but he filled his days with sports and hunting, and his glass with fine wine. Wine was the one thing he had a connoisseur’s familiarity with; and, of all the elegant good things of the world, wine was the one thing his wife disliked. When he was fifty-three he began to go mad, chiefly, it would seem, because he had married the wrong wife, with no inkling that she would turn out to be the wrong wife. Edith Newbold Jones at twenty-three was exactly what Edward Wharton, a dozen years older, had a right to expect for himself: she had heritage (her ancestor, Ebenezer Stevens, was an enterprising artillery officer in the Revolutionary War), she had inheritance (the Joneses owned the Chemical Bank of New York and much of the West Side). In brief, family and money. The dominant quality—what he had married her for, with that same idle innocence that took note only of the pleasantly obvious—was what Edith Wharton was afterward to call “tribe.” The Whartons and the Joneses were of the same tribe—old Protestant money—and he could hardly predict that his wife would soon replace him in the nuptial bed with a writing board.
At first he was perplexed but proud: Louis Auchincloss quotes a description of Teddy Wharton from Consuelo Vanderbilt’s memoirs as “more of an equerry than an equal, walking behind [his wife] and carrying whatever paraphernalia she happened to discard,” and once (Lewis tells us), walking as usual behind her, Teddy exclaimed to one of her friends, “Look at that waist! No one would ever guess that she had written a line of poetry in her life.” She, meanwhile, was driven to writing in her journal, “Oh, Gods of derision! And you’ve given me over twenty years of it!” This outcry occurred immediately after having shown her husband, during a wearying train journey, “a particularly interesting passage” in a scientific volume called Heredity and Variation. His response was not animated. “I heard the key turn in my prison-lock,” she recorded, in the clear metaphorical style of her fiction.
A case can be made that it was she who turned the key on him. His encroaching madness altered him—he began to act oddly, out of character; or, rather, more in character than he had ever before dared. The equerry of the paraphernalia undertook to behave as if he were master of the paraphernalia—in short, he embezzled a part of the funds it had been his duty to preserve and augment. And, having been replaced in bed by a writing board, he suddenly confessed to his wife (or perhaps feverishly bragged) that he had recently gone to live with a prostitute in a Boston apartment, filling its remaining rooms with chorus girls; the embezzled funds paid for the apartment. The story was in the main confirmed. His madness had the crucial sanity of needs that are met.
His wife, who—granted that philanthropy is not embezzlement—was herself capable of money-ruse, and who had herself once rapturously fallen from merely spiritual friendship, locked him up for it. Against his protestations, and those of his sister and brother, he was sent to a sanatorium. Teddy had stolen, Teddy had fallen; he was an adulterer. She had never stolen (though there is a robust if mistaken critical tradition that insists she stole her whole literary outlook from Henry James); but she had fallen, she was an adulteress. Teddy’s sexual disgrace was public; hers went undivulged until her biographer came upon it more than three decades after her death. But these sardonic parallels and opposites illumine little beyond the usual ironies of the pot and the kettle. What had all at once happened in Edith Wharton’s life was that something had happened. Necessity intervened, her husband was irrefutably a manic-depressive. He had hours of excitement and accusation; more often he was in a state of self-castigation. He begged her for help, he begged to be taken back and to be given a second chance. “. . . When you came back last year,” she told him, “I was ready to overlook everything you had done, and to receive you as if nothing had happened.” This referred to the Boston apartment; she herself had been in a London hotel with Fullerton at nearly the same time. In the matter of her money she was more unyielding. Replying to his plea to be allowed to resume the management of her trusts and property, she took the tone of a mistress with a servant who has been let go, and who is now discovered still unaccountably loitering in the house. “In order that no further questions of this kind should come up, the only thing left for me to do is to suggest that you should resign your Trusteeship. . . . 
Your health unfortunately makes it impossible for you to take any active part in the management of my affairs.” Gradually, over months, she evolved a policy: she did everything for him that seemed sensible, as long as it was cold-hearted. He was removed, still uncured, from the sanatorium, and subjected to a regime of doctors, trips, traveling companions, scoldings. In the end, when he was most sick and most desperate, she discarded him, handing him over to the doctors the way one hands over impeding paraphernalia to an equerry. She discarded him well before she divorced him; divorce, at that period and in her caste, took deliberation. She discarded him because he impeded, he distracted, he was a nuisance, he drained her, he wore her out. As a woman she was contemptuous of him, as a writer she fought off his interruptions. The doctors were more polite than Henry James, who characterized Teddy Wharton as able to “hold or follow no counter-proposal, no plan of opposition, of his own, for as much as a minute or two; he is immediately off—irrelevant and childish . . . one’s pity for her is at the best scarce bearable.”
She too pitied herself, and justly, though she forgot to pity him. He had lost all trust in himself; whatever he said he timidly or ingratiatingly or furiously took back. He was flailing vainly after the last flashes of an autonomy his wife had long ago stripped from him. And during all that angry space, when she was bitterly engaged in fending off the partisan ragings of his family, and coldly supervising his medical and traveling routines, she, in the stern autonomy of her morning bed, was writing Ethan Frome, finishing The Reef, bringing off short stories. She could do all this because she did not look into her husband’s eyes and read there, as she had read in the eyes of her little dogs, the helpless pathos of “Why?” It was true that she did not and could not love him, but her virtue was always according to principle, not passion. Presumably she also did not love the French soldiers who were sick with tuberculosis contracted in the trenches of World War I; nevertheless for them she organized a cure program, which she termed “the most vital thing that can be done in France now.” Whatever the most vital thing for Teddy might have been—perhaps there was nothing—she relinquished it at last. The question of the tubercular soldiers was, like all the claims on her spirit which she herself initiated, volitional and opportune. She had sought out these tragedies, they were not implicated in the conditions of her own life, that peculiar bed she had made for herself—“such a great big uncompromising 4-poster,” James called it. For the relief of tubercular soldiers and other good works, she earned a French medal, and was made a Chevalier of the Legion of Honor. An arena of dazzling public exertion. But in the lesser frame of private mess she did nothing to spare her husband the humiliation of his madness. It is one thing to go mad, it is another to be humiliated for it.
The one time in her life drift stopped dead in its trackless spume, and a genuine claim made as if to seize her—necessity, redder in tooth and claw than any sacrifice one grandly chooses for oneself—she turned away. For her, such a claim was the Gorgon’s head, to gaze on which was death.
Writer’s death. This is something most writers not only fear but sweat to evade, though most do not practice excision with as clean a knife-edge as cut away “irrelevant and childish” Teddy from Edith Wharton’s life. “Friend, client, child, sickness, fear, want, charity, all knock at once at thy closet door and say—‘Come out unto us.’ But keep thy state,” Emerson advised, “come not into their confusion.” And Mann’s Tonio Kröger declaims that “one must die to life to be utterly a creator.” This ruthless romantic idea—it cannot be lived up to by weaklings who succumb to conscience, let alone to love—is probably at bottom less romantic than pragmatic. But it is an idea very nearly the opposite of Wilson’s and Kazin’s more affecting view of Edith Wharton: that joylessness was her muse, that her troubles energized her for fiction—the stimulus of “some exceptional emotional strain,” according to Wilson, “so many maladjustments,” according to Kazin, which made the novelist possible. If anything made the novelist possible, it was the sloughing off of the sources of emotional strain and personal maladjustment. As for the parallel new-feminist opinion that a woman writes best when she risks “unbearable wisdoms,” it does not apply: what wisdom Edith Wharton found unbearable she chose not to bear.
The rest was chatter. Having turned away from the Gorgon’s head, she spent the remainder of her life—indeed, nearly the whole of it—in the mainly insipid, sometimes inspired, adventure of elevated conversation. She had her friends. There were a few women—whether because she did not encounter her equals among women, or because she avoided them, her biographer yields no hint. The majority were men (one should perhaps say “gentlemen”)—Lapsley, Lubbock, Berenson, Fullerton, Simmons, James, Bourget, D’Humières, Berry, Sturgis, Hugh-Smith, Maynard, Gregory, Grant, Scott . . . the list is longer still. Lewis fleshes out all these names brilliantly, particularly Berry and Fullerton; the great comic miraculous James needs no fleshing out. James was in a way afraid of her. She swooped down on him to pluck him away for conversation or sightseeing, and he matched the “commotion and exhaustion” of her arrivals against the vengeance of Bonaparte, Attila, and Tamerlane. “Her powers of devastation are ineffable,” he reported, and got into the habit of calling her the Angel of Devastation. She interrupted his work with the abruptness of a natural force (she might occur at any time) and at her convenience (she had particular hours for her work, he had all hours for his). He read her novels and dispatched wondrous celebrating smokescreens of letters (“I applaud, I mean I value, I egg you on”) to hide the insufficiency of his admiration. As for her “life,” it was a spectacle that had from the beginning upset him: her “desolating, ravaging, burning, and destroying energy.” And again: “Such a nightmare of perpetually renewable choice and decision, such a luxury of bloated alternatives.” “What an incoherent life!” he summed it up.
Lewis disagrees, and reproaches James for partial views and a probable fear of strong women; but it may be, on all the lavish evidence Lewis provides, that the last word will after all lie with drift, exactly as James perceived it in her rushing aimlessness aimed at him.
Before Lewis’s landmark discovery of the Wharton-Fullerton liaison, Walter Van Rensselaer Berry—Wharton’s distant cousin, an international lawyer and an aristocrat—was commonly regarded as the tender center and great attachment of her life. Lewis does not refute this connection, though he convincingly drains it of sexual particularity, and gives us the portrait of a conventionally self-contained dry-hearted lifelong bachelor, a man caught, if not in recognizable drift, then in another sort of inconclusiveness. But Walter Berry was Edith Wharton’s first literary intellectual—a lightning-bolt of revelation that, having struck early, never lost its electrical sting. Clearly, she fed on intellectuals—but in a withdrawn and secretive way: she rarely read her work aloud, though she rejoiced to hear James read his. She brooded over history and philosophy, understood everything, but was incapable in fiction or elsewhere of expressing anything but the most commonplace psychology. This was, of course, her strength: she knew how human beings behave, she could describe and predict and surprise. Beyond that, she had a fertile capacity for thinking up stories. Plots and permutations of plots teemed. She was scornful of writers who agonized after subject matter. Subjects, she said, swarmed about her “like mosquitoes,” until she felt stifled by their multiplicity and variety.
The truth is she had only one subject, the 19th century’s unique European literary subject: society. Standard American criticism, struggling to “place” Edith Wharton in a literary environment unused to her subject, has contrived for her the role of a lesser Henry James. This has served to indict her as an imitative figure. But on no significant level is the comparison with James pertinent, except to say that by and large they wrote about the same kinds of people, derived from the same class. Otherwise the difference can be seized in a breath: James was a genius, Wharton not. James invented an almost metaphysical art, Wharton’s insights lay close against their molds: what she saw she judged. James became an American in the most ideal sense, Wharton remained an estranged New Yorker. James was an uncanny moralist, Wharton a canny realist. James scarcely ever failed—or, at least, his few failures when they occurred were nevertheless glorious in aspiration and seamless in execution. When Wharton failed, she fell into an embarrassing triteness of language and seeing.
It is a pity that her name is attached so unrelentingly—thanks to the American high school—to Ethan Frome, a desolate, even morbid, narrow, soft-at-the-center and at the last unsurprising novella not at all typical of her range. It is an outdoor book that ends mercilessly indoors; she was an indoor novelist. She achieved two permanent novels, one—The House of Mirth—a spoiled masterpiece, a kind of latter-day reverse Scarlet Letter, very direct yet eerie, the other The Age of Innocence, a combination of ode and elegy to the New York of her childhood, affirmation and repudiation both. A good many of her short stories and some of the novellas (“The Old Maid,” for instance) are marvels of shapeliness and pointedness. This applies also to stories written during her late period, when she is widely considered to have debased her gift. The common accusation—Wilson makes it—is that her prose finally came to resemble women’s magazine fiction. One can venture that she did not so much begin to sound like the women’s magazines, as that they began to sound like her, a condition that obtains until this moment. No one has explored Wharton’s ongoing subliminal influence on current popular fiction (see almost any issue of Redbook); such an investigation would probably be striking in its disclosure of the strength of her legacy. Like any hokey imitation long after the model is lost to consciousness, it is not a bad compliment, though it may be awkward to admit it. (One of the least likely tributes to the Roman Empire, after all, is the pervasiveness of 19th-century American civic architecture.) But The House of Mirth and The Age of Innocence are, like everything unsurpassable because deeply idiosyncratic, incapable of spawning versions of themselves; in these two novels she is in command of an inwardness commensurate with structure. 
In them she does not simply grab hold of society, or judge it merely; she turns society into an exulting bird of prey, with blood on its beak, steadily beating its wings just over our heads; she turns society into an untamable idea. The reader, apprehensive, yet lured by the bird’s lyric form, covers his face.
She could do all that; she had that power. Lewis, writing to justify and defend, always her sympathetic partisan, nevertheless hedges. Having acknowledged that she had “begun to locate herself—with a certain assurance, though without vanity—in the developing course of American literature,” he appends a doubt:
But in another part of her, there remained something of the conviction drilled into her in old New York that it was improper for a lady to write fiction. One could do so only if one joked about it—if one treated it, to borrow Lubbock’s word, as “an amusement.” She sometimes sounded as if her writing were her entertainingly guilty secret, and in her memoirs she referred to it (borrowing the title of a popular children’s book of her own New York youth) as her “secret garden.”
But in the winter of 1911 [she was then at work on The Reef], as on perhaps half a dozen other occasions, it was the believing artist that was in ascendancy during the hard-driving morning hours.
Somehow it is easy to doubt that she had this doubt—or, if she once had it, that she held it for long. To believe in her doubt is to make the bad case of the orthodox critics who, unlike Lewis, have shrunk from taking her seriously as an artist because as an American aristocrat she was born shockingly appurtenanced, and therefore deserves to be patronized for her sorrows. To believe in her doubt is to make the bad case of the new feminists, for whom female sex is, always and everywhere, an impediment difficult to transcend—even when, for an obsessed writer of talent, there is nothing to transcend. To believe in her doubt is to reverse the terms of her life and her work. Only “half a dozen other occasions” when Wharton was a “believing artist”? Only so few? This would mean that the life outside her bed—the dressed life of conversation and travel, the matchstick life of drift—was the primary life, and the life with her writing board—the life of the believing artist—the deviation, the anomaly, the distraction.
But we know, and have always known (Freud taught us only how to reinforce this knowledge), that the secret self is the true self, that obsession is confession. For Edith Wharton that is the only acceptable evaluation, the only possible justice. She did not doubt her allegiance. The writing came first. That she kept it separate from the rest was a misrepresentation and a mistake, but it may also have been a species of holy instinct—it was the one uncontaminated zone of her being: the place unprofaned. Otherwise she can be defined only by the horrific gyrations of “a life”—by the spiraling solipsism and tragic drift that led her to small dogs instead of babies, servants instead of family, high-minded male distance instead of connubial friendship, public virtue instead of private conscience, infatuation instead of the love that sticks. Only the writing board could justify these ugly substitutions. And some would say—myself not among them—that not even the writing board justified them.
1 Vivian Gornick, the Village Voice, May 31, 1973.
2 Though, to be fair, I have heard of at least one new-feminist literature class which has studied The House of Mirth—evidently because it is so easy to interpret its heroine as the ideal victim.
3 Edith Wharton: A Biography, Harper & Row, 592 pp., $15.00. The prizes are: the Pulitzer, the National Book Critics Circle Award, and Columbia University’s Bancroft Prize.
Exactly one week later, a Star Wars cantina of the American extremist right featuring everyone from David Duke to a white-nationalist Twitter personality named “Baked Alaska” gathered in Charlottesville, Virginia, to protest the removal of a statue honoring the Confederate general Robert E. Lee. A video promoting the gathering railed against “the international Jewish system, the capitalist system, and the forces of globalism.” Amid sporadic street battles between far-right and “antifa” (anti-fascist) activists, a neo-Nazi drove a car into a crowd of peaceful counterprotestors, killing a 32-year-old woman.
Here, in the time span of just seven days, was the dual nature of contemporary American anti-Semitism laid bare. The most glaring difference between these two displays of hate lies not so much in their substance—both adhere to similar conspiracy theories articulating nefarious, world-altering Jewish power—but rather their self-characterization. The animosity expressed toward Jews in Charlottesville was open and unambiguous, with demonstrators proudly confessing their hatred in the familiar language of Nazis and European fascists.
The socialists in Chicago, meanwhile, though calling for a literal second Holocaust on the shores of the Mediterranean, would fervently and indignantly deny they are anti-Semitic. On the contrary, they claim the mantle of “anti-fascism” and insist that this identity naturally makes them allies of the Jewish people. As for those Jews who might oppose their often violent tactics, they are at best bystanders to fascism, at worst collaborators in “white supremacy.”
So, whereas white nationalists explicitly embrace a tribalism that excludes Jews regardless of their skin color, the progressives of the DSA and the broader “woke” community conceive of themselves as universalists—though their universalism is one that conspicuously excludes the national longings of Jews and Jews alone. And whereas the extreme right-wingers are sincere in their anti-Semitism, the socialists who called for the elimination of Israel are just as sincere in their belief that they are not anti-Semitic, even though anti-Semitism is the inevitable consequence of their rhetoric and worldview.
The sheer bluntness of far-right anti-Semitism makes it easier to identify and stigmatize as beyond the pale; individuals like David Duke and the hosts of the “Daily Shoah” podcast make no pretense of residing within the mainstream of American political debate. But the humanist appeals of the far left, whose every libel against the Jewish state is paired with a righteous invocation of “justice” for the Palestinian people, invariably trigger repetitive and esoteric debates over whether this or that article, allusion, allegory, statement, policy, or political initiative is anti-Semitic or just critical of Israel. What this difference in self-definition means is that there is rarely, if ever, any argument about the substantive nature of right-wing anti-Semitism (despicable, reprehensible, wicked, choose your adjective), while the very existence of left-wing anti-Semitism is widely doubted and almost always indignantly denied by those accused of practicing it.T o be sure, these recent manifestations of anti-Semitism occur on the left and right extremes. And statistics tell a rather comforting story about the state of anti-Semitism in America. Since the Anti-Defamation League began tracking it in 1979, anti-Jewish hate crime is at an historic low; indeed, it has been declining since a recent peak of 1,554 incidents in 2006. America, for the most part, remains a very philo-Semitic country, one of the safest, most welcoming countries for Jews on earth. A recent Pew poll found Jews to be the most admired religious group in the United States.1 If American Jews have anything to dread, it’s less anti-Semitism than the loss of Jewish peoplehood through assimilation, that is being “loved to death” by Gentiles.2 Few American Jews can say that anti-Semitism has a seriously deleterious impact on their life, that it has denied them educational or employment opportunities, or that they fear for the physical safety of themselves or their families because of their Jewish identity.
The question is whether the extremes are beginning to move in on the center. In the past year alone, the DSA’s rolls tripled from 8,000 to 25,000 dues-paying members, who have established a conspicuous presence on social media reaching far beyond what their relatively miniscule numbers attest. The DSA has been the subject of widespread media coverage, ranging from the curious to the adulatory. The white supremacists, meanwhile, found themselves understandably heartened by the strange difficulty President Donald Trump had in disavowing them. He claimed, in fact, that there had been some “very fine people” among their ranks. “Thank you President Trump for your honesty & courage to tell the truth about #Charlottesville,” tweeted David Duke, while the white-nationalist Richard Spencer said, “I’m proud of him for speaking the truth.”
Indeed, among the more troubling aspects of our highly troubling political predicament—and one that, from a Jewish perspective, provokes not a small amount of angst—is that so many ideas, individuals, and movements that could once reliably be categorized as “extreme,” in the literal sense of articulating the views of a very small minority, are no longer so easily dismissed. The DSA is part of a much broader revival of the socialist idea in America, as exemplified by the growing readership of journals like Jacobin and Current Affairs, the popularity of the leftist Chapo Trap House podcast, and the insurgent presidential campaign of self-described democratic socialist Bernie Sanders—who, according to a Harvard-Harris poll, is now the most popular politician in the United States. Since 2015, the average age of a DSA member dropped from 64 to 30, and a 2016 Harvard poll found a majority of Millennials do not support capitalism.
Meanwhile, the Republican Party of Donald Trump offers “nativism and culture war wedges without the Reaganomics,” according to Nicholas Grossman, a lecturer in political science at the University of Illinois. A party that was once reliably internationalist and assertive against Russian aggression now supports a president who often preaches isolationism and never has even a mildly critical thing to say about the KGB thug ruling over the world’s largest nuclear arsenal.
Like ripping the bandage off an ugly and oozing wound, Trump’s presidential campaign unleashed a bevy of unpleasant social forces that at the very least have an indirect bearing on Jewish welfare. The most unpleasant of those forces has been the so-called alternative right, or “alt-right,” a highly race-conscious political movement whose adherents are divided on the “JQ” (Jewish Question). Throughout last year’s campaign, Jewish journalists (this author included) were hit with a barrage of luridly anti-Semitic Twitter messages from self-described members of the alt-right. The tamer missives instructed us to leave America for Israel, others superimposed our faces onto the bodies of concentration camp victims.3
I do not believe Donald Trump is himself an anti-Semite, if only because anti-Semitism is mainly a preoccupation—as distinct from a prejudice—and Trump is too narcissistic to indulge any preoccupation other than himself. And there is no evidence to suggest that he subscribes to the anti-Semitic conspiracy theories favored by his alt-right supporters. But his casual resort to populism, nativism, and conspiracy theory creates a narrative environment highly favorable to anti-Semites.
Nativism, of which Trump was an early and active practitioner, is never good for the Jews, no matter how affluent or comfortable they may be and notwithstanding whether they are even the target of its particular wrath. Racial divisions, which by any measure have grown significantly worse in the year since Trump was elected, hurt all Americans, obviously, but they have a distinct impact on Jews, who are left in a precarious position as racial identities calcify. Not only are the newly emboldened white supremacists of the alt-right invariably anti-Semites, but in the increasingly racialist taxonomy of the progressive left—which more and more mainstream liberals are beginning to parrot—Jews are considered possessors of “white privilege” and, thus, members of the class to be divested of its “power” once the revolution comes. In the racially stratified society that both extremes envision, Jews lose out, simultaneously perceived (by the far right) as wily allies and manipulators of ethnic minorities in a plot to mongrelize America and (by the far left) as opportunistic “Zionists” ingratiating themselves with a racist and exploitative “white” establishment that keeps minorities down.T his politics is bad for all Americans, and Jewish Americans in particular. More and more, one sees the racialized language of the American left being applied to the Middle East conflict, wherein Israel (which is, in point of fact, one of the most racially diverse countries in the world) is referred to as a “white supremacist” state no different from that of apartheid South Africa. 
In a book just published by MIT Press, ornamented with a forward by Cornel West and entitled “Whites, Jews, and Us,” a French-Algerian political activist named Houria Bouteldja asks, “What can we offer white people in exchange for their decline and for the wars that will ensue?” Drawing the Jews into her race war, Bouteldja, according to the book’s jacket copy, “challenges widespread assumptions among the left in the United States and Europe—that anti-Semitism plays any role in Arab–Israeli conflicts, for example, or that philo-Semitism doesn’t in itself embody an oppressive position.” Jew-hatred is virtuous, and appreciation of the Jews is racism.
Few political activists of late have done more to racialize the Arab–Israeli conflict—and, through insidious extension of the American racial hierarchy, designate American Jews as oppressors—than the Brooklyn-born Arab activist Linda Sarsour. An organizer of the Women’s March, Sarsour has seamlessly insinuated herself into a variety of high-profile progressive campaigns, a somewhat incongruent position given her reactionary views on topics like women’s rights in Saudi Arabia. (“10 weeks of PAID maternity leave in Saudi Arabia,” she tweets. “Yes PAID. And ur worrying about women driving. Puts us to shame.”) Sarsour, who is of Palestinian descent, claims that one cannot simultaneously be a feminist and a Zionist, when it is the exact opposite that is true: No genuine believer in female equality can deny the right of Israel to exist. The Jewish state respects the rights of women more than do any of its neighbors. In an April 2017 interview, Sarsour said that she had become a high-school teacher for the purpose of “inspiring young people of color like me.” Just three months earlier, however, in a video for Vox, Sarsour confessed, “When I wasn’t wearing hijab I was just some ordinary white girl from New York City.” The donning of Muslim garb, then, confers a racial caste of “color,” which in turn confers virtue, which in turn confers a claim on political power.
This attempt to describe the Israeli–Arab conflict in American racial vernacular marks Jews as white (a perverse mirror of Nazi biological racism) and thus implicates them as beneficiaries of “structural racism,” “white privilege,” and the whole litany of benefits afforded to white people at birth in the form of—to use Ta-Nehisi Coates’s abstruse phrase—the “glowing amulet” of “whiteness.” “It’s time to admit that Arthur Balfour was a white supremacist and an anti-Semite,” reads the headline of a recent piece in—where else? —the Forward, incriminating Jewish nationalism as uniquely perfidious by dint of the fact that, like most men of his time, a (non-Jewish) British official who endorsed the Zionist idea a century ago held views that would today be considered racist. Reading figures like Bouteldja and Sarsour brings to mind the French philosopher Pascal Bruckner’s observation that “the racialization of the world has to be the most unexpected result of the antidiscrimination battle of the last half-century; it has ensured that the battle continuously re-creates the curse from which it is trying to break free.”
If Jews are white, and if white people—as a group—enjoy tangible and enduring advantages over everyone else, then this racially essentialist rhetoric ends up with Jews accused of abetting white supremacy, if not being white supremacists themselves. This is one of the overlooked ways in which the term “white supremacy” has become devoid of meaning in the age of Donald Trump, with everyone and everything from David Duke to James Comey to the American Civil Liberties Union accused of upholding it. Take the case of Ben Shapiro, the Jewish conservative polemicist. At the start of the school year, Shapiro was scheduled to give a talk at UC Berkeley, his alma mater. In advance, various left-wing groups put out a call for protest in which they labeled Shapiro—an Orthodox Jew—a “fascist thug” and “white supremacist.” An inconvenient fact ignored by Shapiro’s detractors is that, according to the ADL, he was the top target of online abuse from actual white supremacists during the 2016 presidential election. (Berkeley ultimately had to spend $600,000 protecting the event from leftist rioters.)
A more pernicious form of this discourse is practiced by left-wing writers who, insincerely claiming to have the interests of Jews at heart, scold them and their communal organizations for not doing enough in the fight against anti-Semitism. Criticizing Jews for not fully signing up with the “Resistance” (which in form and function is fast becoming the 21st-century version of the interwar Popular Front), they then slyly indict Jews for being complicit in not only their own victimization but that of the entire country at the hands of Donald Trump. The first and foremost practitioner of this bullying and rather artful form of anti-Semitism is Jeet Heer, a Canadian comic-book critic who has achieved some repute on the American left due to his frenetic Twitter activity and availability when the New Republic needed to replace its staff that had quit en masse in 2014. Last year, when Heer came across a video of a Donald Trump supporter chanting “JEW-S-A” at a rally, he declared on Twitter: “We really need to see more comment from official Jewish groups like ADL on way Trump campaign has energized anti-Semitism.”
But of course “Jewish groups” have had plenty to say about the anti-Semitism expressed by some Trump supporters—too much, in the view of their critics. Just two weeks earlier, the ADL had released a report analyzing over 2 million anti-Semitic tweets targeting Jewish journalists over the previous year. This wasn’t the first time the ADL raised its voice against Trump and the alt-right movement he emboldened, nor would it be the last. Indeed, two minutes’ worth of Googling would have shown Heer that his pronouncements about organizational Jewish apathy were wholly without foundation.4
It’s tempting to dismiss Heer’s observation as mere “concern trolling,” a form of Internet discourse characterized by insincere expressions of worry. But what he did was nastier. Immediately presented with evidence for the inaccuracy of his claims, he sneered back with a bit of wisdom from the Jewish sage Hillel the Elder, yet cast as mild threat: “If I am not for myself, who will be for me?” In other words: How can you Jews expect anyone to care about your kind if you don’t sufficiently oppose—as arbitrarily judged by moi, Jeet Heer—Donald Trump?
If this sort of critique were coming from a Jewish donor upset that his preferred organization wasn’t doing enough to combat anti-Semitism, or a Gentile with a proven record of concern for Jewish causes, it wouldn’t have turned the stomach. What made Heer’s interjection revolting is that, to put it mildly, he’s not exactly known for being sympathetic toward the Jewish plight. In 2015, Heer put his name to a petition calling upon an international comic-book festival to drop the Israeli company SodaStream as a co-sponsor because the Jewish state is “built on the mass ethnic cleansing of Palestinian communities and sustained through racism and discrimination.” Heer’s name appeared alongside that of Carlos Latuff, a Brazilian cartoonist who won second place in the Iranian government’s 2006 International Holocaust Cartoon Competition. For his writings on Israel, Heer has been praised as being “very good on the conflict” by none other than Philip Weiss, proprietor of the anti-Semitic hate site Mondoweiss.
In light of this track record, Heer’s newfound concern about anti-Semitism appeared rather dubious. Indeed, the bizarre way in which he expressed this concern—as, ultimately, a critique not of anti-Semitism per se but of the country’s foremost Jewish civil-rights organization—suggests he cares about anti-Semitism insofar as its existence can be used as a weapon to beat his political adversaries. And since the incorrigibly Zionist American Jewish establishment ranks high on that list (just below that of Donald Trump and his supporters), Heer found a way to blame it for anti-Semitism. And what does that tell you? It tells you that—presented with a 16-second video of a man chanting “JEW-S-A” at a Donald Trump rally—Heer’s first impulse was to condemn not the anti-Semite but the Jews.
Heer isn’t the only leftist (or New Republic writer) to assume this rhetorical cudgel. In a piece entitled “The Dismal Failure of Jewish Groups to Confront Trump,” one Stephen Lurie attacked the ADL for advising its members to stay away from the Charlottesville “Unite the Right Rally” and let police handle any provocations from neo-Nazis. “We do not have a Jewish organizational home for the fight against fascism,” he quotes a far-left Jewish activist, who apparently thinks that we live in the Weimar Republic and not a stable democracy in which law-enforcement officers and not the balaclava-wearing thugs of antifa maintain the peace. Like Jewish Communists of yore, Lurie wants to bully Jews into abandoning liberalism for the extreme left, under the pretext that mainstream organizations just won’t cut it in the fight against “white supremacy.” Indeed, Lurie writes, some “Jewish institutions and power players…have defended and enabled white supremacy.” The main group he fingers with this outrageous slander is the Republican Jewish Coalition, the implication being that this explicitly partisan Republican organization’s discreet support for the Republican president “enables white supremacy.”
It is impossible to imagine Heer, Lurie, or other progressive writers similarly taking the NAACP to task for its perceived lack of concern about racism, or castigating the Human Rights Campaign for insufficiently combating homophobia. No, it is only the cowardice of Jews that is condemned—condemned for supposedly ignoring a form of bigotry that, when expressed on the left, these writers themselves ignore or even defend. The logical gymnastics of these two New Republic writers are what happens when, at base, one fundamentally resents Jews: You end up blaming them for anti-Semitism. Blaming Jews for not caring enough about anti-Semitism is emotionally the same as claiming that Jews are to blame for anti-Semitism. Both signal an envy and resentment of Jews predicated upon a belief that they have some kind of authority that the claimant doesn’t and therefore needs to undermine.
This past election, one could not help but notice how the media seemingly discovered anti-Semitism when it emanated from the right, and then only when its targets were Jews on the left. It was enough to make one ask where they had been when left-wing anti-Semitism had been a more serious and pervasive problem. From at least 1996 (the year Pat Buchanan made his last serious attempt at securing the GOP presidential nomination) to 2016 (when the Republican presidential nominee did more to earn the support of white supremacists and neo-Nazis than any of his predecessors), anti-Semitism was primarily a preserve of the American left. In that two-decade period—spanning the collapse of the Oslo Accords and rise of the Second Intifada to the rancorous debate over the Iraq War and obsession with “neocons” to the presidency of Barack Obama and the 2015 Iran nuclear deal—anti-Israel attitudes and anti-Semitic conspiracy made unprecedented inroads into respectable precincts of the American academy, the liberal intelligentsia, and the Democratic Party.
The main form that left-wing anti-Semitism takes in the United States today is unhinged obsession with the wrongs, real or perceived, of the state of Israel, and the belief that its Jewish supporters in the United States exercise a nefarious control over the levers of American foreign policy. In this respect, contemporary left-wing anti-Semitism is not altogether different from that of the far right, though it usually lacks the biological component deeming Jews a distinct and inferior race. (Consider the left-wing anti-Semite’s eagerness to identify and promote Jewish “dissidents” who can attest to their co-religionists’ craftiness and deceit.) The unholy synergy of left and right anti-Semitism was recently epitomized by former CIA agent and liberal stalwart Valerie Plame’s hearty endorsement, on Twitter, of an article written for an extreme right-wing website by a fellow former CIA officer named Philip Giraldi: “America’s Jews Are Driving America’s Wars.” Plame eventually apologized for sharing the article with her 50,000 followers, but not before insisting that “many neocon hawks are Jewish” and that “just FYI, I am of Jewish descent.”
The main forum in which left-wing anti-Semitism appears is academia. According to the ADL, anti-Semitic incidents on college campuses doubled from 2014 to 2015, the latest year for which data are available. Writing in National Affairs, Ruth Wisse observes that “not since the war in Vietnam has there been a campus crusade as dynamic as the movement of Boycott, Divestment, and Sanctions against Israel.” Every academic year, a surfeit of controversies erupts on campuses across the country involving the harassment of pro-Israel students and organizations, the disruption of events involving Israeli speakers (even ones who identify as left-wing), and blatantly anti-Semitic outbursts by professors and student activists. There was the Oberlin professor of rhetoric, Joy Karega, who posted statements on social media claiming that Israel had created ISIS and had orchestrated the murderous attack on Charlie Hebdo in Paris. There is the Rutgers associate professor of women’s and gender studies, Jasbir Puar, who popularized the ludicrous term “pinkwashing” to defame Israel’s LGBT acceptance as a massive conspiracy to obscure its oppression of Palestinians. Her latest book, The Right to Maim, academically peer-reviewed and published by Duke University Press, attacks Israel for sparing the lives of Palestinian civilians, accusing its military of “shooting to maim rather than to kill” so that it may keep “Palestinian populations as perpetually debilitated, and yet alive, in order to control them.”
One could go on and on about such affronts not only to Jews and supporters of Israel but to common sense, basic justice, and anyone who believes in the prudent use of taxpayer dollars. That several organizations exist solely for the purpose of monitoring anti-Israel and anti-Semitic agitation on American campuses attests to the pervasiveness of the problem. But it’s unclear just how representative these isolated examples of the college experience really are. A 2017 Stanford study purporting to examine the issue interviewed 66 Jewish students at five California campuses noted for “being particularly fertile for anti-Semitism and for having an active presence of student groups critical of Israel and Zionism.” It concluded that “contrary to widely shared impressions, we found a picture of campus life that is neither threatening nor alarmist…students reported feeling comfortable on their campuses, and, more specifically, comfortable as Jews on their campuses.” To the extent that Jewish students do feel pressured, the report attempted to spread the blame around, indicting pro-Israel activists alongside those agitating against Israel. “[Survey respondents] fear that entering political debate, especially when they feel the social pressures of both Jewish and non-Jewish activist communities, will carry social costs that they are unwilling to bear.”
Yet by its own admission, the report “only engaged students who were either unengaged or minimally engaged in organized Jewish life on their campuses.” Researchers made a study of anti-Semitism, then, by interviewing the Jews least likely to experience it. “Most people don’t really think I’m Jewish because I look very Latina…it doesn’t come up in conversation,” one such student said in an interview. Ultimately, the report revealed more about the attitudes of unengaged (and, thus, uninformed) Jews than about the state of anti-Semitism on college campuses. That may certainly be useful in its own right as a means of understanding how unaffiliated Jews view debates over Israel, but it is not an accurate marker of developments on college campuses more broadly.
A more extensive 2016 Brandeis study of Jewish students at 50 schools found 34 percent agreed at least “somewhat” that their campus has a hostile environment toward Israel. Yet the variation was wide; at some schools, only 3 percent agreed, while at others, 70 percent did. Only 15 percent reported a hostile environment toward Jews. Anti-Semitism was found to be more prevalent at public universities than private ones, with the determinative factor being the presence of a Students for Justice in Palestine chapter on campus. Important context often lost in conversations about campus anti-Semitism, and reassuring to those concerned about it, is that it is simply not the most important issue roiling higher education. “At most schools,” the report found, “fewer than 10 percent of Jewish students listed issues pertaining to either Jews or Israel as among the most pressing on campus.”
For generations, American Jews have depended on anti-Semitism’s remaining within a moral quarantine, a cordon sanitaire, and America has reliably kept this societal virus contained. While there are no major signs that this barricade is breaking down in the immediate future, there are worrying indications on the political horizon.
Surveying the situation at the international level, the declining global position of the United States—both in terms of its hard military and economic power relative to rising challengers and its position as a credible beacon of liberal democratic values—does not portend well for Jews, American or otherwise. American leadership of the free world has, in addition to ensuring Israel’s security, underwritten the postwar liberal world order. And it is the constituent members of that order, the liberal democratic states, that have served as the best guarantor of the Jews’ life and safety over their long history. Were America’s global leadership role to diminish or evaporate, it would not only facilitate the rise of authoritarian states like Iran and terrorist movements such as al-Qaeda, committed to the destruction of Israel and the murder of Jews, but inexorably lead to a worldwide rollback of liberal democracy, an outcome that would inevitably redound to the detriment of Jews.
Domestically, political polarization and the collapse of public trust in every American institution save the military are demolishing what little confidence Americans have left in their system and governing elites, not to mention preparing the ground for some ominous political scenarios. Widely cited survey data reveal that the percentage of American Millennials who believe it “essential” to live in a liberal democracy hovers at just over 25 percent. If Trump is impeached or loses the next election, a good 40 percent of the country will be outraged and susceptible to belief in a stab-in-the-back theory accounting for his defeat. Whom will they blame? Perhaps the “neoconservatives,” who disproportionately make up the ranks of Trump’s harshest critics on the right?
Ultimately, the degree to which anti-Semitism becomes a problem in America hinges on the strength of the antibodies within the country’s communal DNA to protect its pluralistic and liberal values. But even if this resistance to tribalism and the cult of personality is strong, it may not be enough to abate the rise of an intellectual and societal disease that, throughout history, thrives upon economic distress, xenophobia, political uncertainty, ethnic chauvinism, conspiracy theory, and weakening democratic norms.
1 Somewhat paradoxically, according to FBI crime statistics, the majority of religiously based hate crimes target Jews, more than double the amount targeting Muslims. This indicates more the commitment of the country’s relatively small number of hard-core anti-Semites than pervasive anti-Semitism.
4 The ADL has had to maintain a delicate balancing act in the age of Trump, coming under fire by many conservative Jews for a perceived partisan tilt against the right. This makes Heer’s complaint all the more ignorant — and unhelpful.
Review of ‘The Once and Future Liberal’ by Mark Lilla
Lilla, a professor at Columbia University, tells us that “the story of how a successful liberal politics of solidarity became a failed pseudo-politics of identity is not a simple one.” And about this, he’s right. Lilla quotes from the feminist authors of the 1977 Combahee River Collective Manifesto: “The most profound and potentially most radical politics come directly out of our own identity, as opposed to working to end somebody else’s oppression.” Feminists looked to instantiate the “radical” and electrifying phrase which insisted that “the personal is political.” The phrase, argues Lilla, was generally seen in “a somewhat Marxist fashion to mean that everything that seems personal is in fact political.”
The upshot was fragmentation. White feminists were deemed racist by black feminists—and both were found wanting by lesbians, who also had black and white contingents. “What all these groups wanted,” explains Lilla, “was more than social justice and an end to the [Vietnam] war. They also wanted there to be no space between what they felt inside and what they saw and did in the world.” He goes on: “The more obsessed with personal identity liberals become, the less willing they become to engage in reasoned political debate.” In the end, those on the left came to a realization: “You can win a debate by claiming the greatest degree of victimization and thus the greatest outrage at being subjected to questioning.”
But Lilla’s insights into the emotional underpinnings of political correctness are undercut by an inadequate, almost bizarre sense of history. He appears to be referring to the 1970s when, zigzagging through history, he writes that “no recognition of personal or group identity was coming from the Democratic Party, which at the time was dominated by racist Dixiecrats and white union officials of questionable rectitude.”
What is he talking about? Is Lilla referring to the Democratic Party of Lyndon Johnson, Hubert Humphrey, and George McGovern? Is he referring obliquely to George Wallace? If so, why is Wallace never mentioned? Lilla seems not to know that it was the 1972 McGovern Democratic Convention that introduced minority seating to be set aside for blacks and women.
At only 140 pages, this is a short book. But even so, Lilla could have devoted a few pages to Frankfurt ideologist Herbert Marcuse and his influence on the left. In the 1960s, Marcuse argued that leftists and liberals were entitled to restrain centrist and conservative speech on the grounds that the universities had to act as a counterweight to society at large. But this was not just rhetoric; in the campus disruption of the early 1970s at schools such as Yale, Cornell, and Amherst, Marcuse’s ideals were pushed to the fore.
If Lilla’s argument comes off as flaccid, perhaps that’s because the aim of The Once and Future Liberal is more practical than principled. “The only way” to protect our rights, he tells the reader, “is to elect liberal Democratic governors and state legislators who’ll appoint liberal state attorneys.” According to Lilla, “the paradox of identity liberalism” is that it undercuts “the things it professes to want,” namely political power. He insists, rightly, that politics has to be about persuasion but then contradicts himself in arguing that “politics is about seizing power to defend the truth.” In other words, Lilla wants a better path to total victory.
Given what Lilla, descending into hysteria, describes as “the Republican rage for destruction,” liberals and Democrats have to win elections lest the civil rights of blacks, women, and gays are rolled back. As proof of the ever-looming danger, he notes that when the “crisis of the mid-1970s threatened…the country turned not against corporations and banks, but against liberalism.” Yet he gives no hint of the trail of liberal failures that led to the crisis of the mid-’70s. You’d never know reading Lilla, for example, that the Black Power movement intensified racial hostilities that were then further exacerbated by affirmative action and busing. And you’d have no idea that, at considerable cost, the poverty programs of the Great Society failed to bring poorer African Americans into the economic mainstream. Nor does Lilla deal with the devotion to Keynesianism that produced inflation without economic growth during the Carter presidency.
Despite his discursive ambling through the recent history of American political life, Lilla has a one-word explanation for identity politics: Reaganism. “Identity,” he writes, is “Reaganism for lefties.” What’s crucial in combating Reaganism, he argues, is to concentrate on our “shared political” status as citizens. “Citizenship is a crucial weapon in the battle against Reaganite dogma because it brings home the fact that we are part of a legitimate common enterprise.” But then he asserts that the “American right uses the term citizenship today as a means of exclusion.” The passage might lead the reader to think that Lilla would take up the question of immigration and borders. But he doesn’t, and the closing passages of the book dribble off into characteristic zigzags. Lilla tells us that “Black Lives Matter is a textbook example of how not to build solidarity” but then goes on, without evidence, to assert the accuracy of the Black Lives Matter claim that African-Americans have been singled out for police mistreatment.
It would be nice to argue that The Once and Future Liberal is a near miss, a book that might have had enduring importance if only it went that extra step. But Lilla’s passing insights on the perils of a politically correct identity politics drown in the rhetoric of conventional bromides that fill most of the pages of this disappointing book.
In Athens several years ago, I had dinner with a man running for the national parliament. I asked him whether he thought he had a shot at winning. He was sure of victory, he told me. “I have hired a very famous political consultant from Washington,” he said. “He is the man who elected Reagan. Expensive. But the best.”
The political genius he then described was a minor political flunky I had met in Washington long ago, a more-or-less anonymous member of the Republican National Committee before he faded from view at the end of Ronald Reagan’s second term. Mutual acquaintances told me he still lived in a nice neighborhood in Northern Virginia, but they never could figure out what the hell he did to earn his money. (This is a recurring mystery throughout the capital.) I had to come to Greece to find the answer.
It is one of the dark arts of Washington, this practice of American political hacks traveling to faraway lands and suckering foreign politicians into paying vast sums for splashy, state-of-the-art, essentially worthless “services.” And it’s perfectly legal. Paul Manafort, who briefly managed Donald Trump’s campaign last summer, was known as a pioneer of the globe-trotting racket. If he hadn’t, as it were, veered out of his gutter into the slightly higher lane of U.S. presidential politics, he likely could have hoovered cash from the patch pockets of clueless clients from Ouagadougou to Zagreb for the rest of his natural life and nobody in Washington would have noticed.
But he veered, and now he and a colleague find themselves indicted by Robert Mueller, the Inspector Javert of the Russian-collusion scandal. When those indictments landed, they instantly set in motion the familiar scramble. Trump fans announced that the indictments were proof that there was no collusion between the Trump campaign and the Russians—or, in the crisp, emphatic phrasing of a tweet by the world’s Number One Trump Fan, Donald Trump: “NO COLLUSION!!!!” The Russian-scandal fetishists in the press corps replied in chorus: It’s still early! Javert required more time, and so will Mueller, and so will they.
A good Washington scandal requires a few essential elements. One is a superabundance of information. From these data points, conspiracy-minded reporters can begin to trace associations, warranted or not, and from the associations, they can infer motives and objectives with which, stretched together, they can limn a full-blown conspiracy theory. The Manafort indictment released a flood of new information, and at once reporters were pawing for nuggets that might eventually form a compelling case for collusion.
They failed to find any because Manafort’s indictment, in essence, involved his efforts to launder his profits from his international political work, not his work for the Trump campaign. Fortunately for the obsessives, another element is required for a good scandal: a colorful cast. The various Clinton scandals brought us Asian money-launderers and ChiCom bankers, along with an entire Faulkner novel’s worth of bumpkins, sharpies, and backwoods swindlers, plus that intern in the thong. Watergate, the mother lode of Washington scandals, featured a host of implausible characters, from the central-casting villain G. Gordon Liddy to Sam Ervin, a lifelong segregationist and racist who became a hero to liberals everywhere.
Here, at last, is one area where the Russian scandal has begun to show promise. Manafort and his business partner seem too banal to hold the interest of anyone but a scandal obsessive. Beneath the pile of paper Mueller dumped on them, however, another creature could be seen peeking out shyly. This would be the diminutive figure of George Papadopoulos. An unpaid campaign adviser to Trump, Papadopoulos pled guilty to lying to the FBI about the timing of his conversations with Russian agents. He is quickly becoming the stuff of legend.
Papadopoulos is an exemplar of a type long known to American politics. He is the nebbish bedazzled by the big time—achingly ambitious, though lacking the skill, or the cunning, to climb the greasy pole. So he remains at the periphery of the action, ever eager to serve. Papadopoulos’s résumé, for a man under 30, is impressively padded. He said he served as the U.S. representative to the Model United Nations in 2012, though nobody recalls seeing him there. He boasted of a four-year career at the Hudson Institute, though in fact he spent one year there as an unpaid intern and three doing contract research for one of Hudson’s scholars. On his LinkedIn page, he listed himself as a keynote speaker at a Greek American conference in 2008, but in fact he participated only in a panel discussion. The real keynoter was Michael Dukakis.
With this hunger for achievement, real or imagined, Papadopoulos could not let a presidential campaign go by without climbing aboard. In late 2015, he somehow attached himself to Ben Carson’s campaign. He was never paid and lasted four months. His presence went largely unnoticed. “If there was any work product, I never saw it,” Carson’s campaign manager told Time. The deputy campaign manager couldn’t even recall his name. Then suddenly, in April 2016, Papadopoulos appeared on a list of “foreign-policy advisers” to Donald Trump—and, according to Mueller’s court filings, resolved to make his mark by acting as a liaison between Trump’s campaign and the Russian government.
While Mueller tells the story of Papadopoulos’s adventures in the dry, Joe Friday prose of a legal document, it could easily be the script for a Peter Sellers movie from the Cold War era. The young man’s résumé is enough to impress the campaign’s impressionable officials as they scavenge for foreign-policy advisers: “Hey, Corey! This dude was in the Model United Nations!”
Papadopoulos (played by Sellers) sets about his mission. A few weeks after signing on to the campaign, he travels to Europe, where he meets a mysterious “Professor” (Peter Ustinov). “Initially the Professor seemed uninterested in Papadopoulos,” says Mueller’s indictment. A likely story! Yet when Papadopoulos lets drop that he’s an adviser to Trump, the Professor suddenly “appeared to take great interest” in him. They arrange a meeting in London to which the Professor invites a “female Russian national” (Elke Sommer). Without much effort, the femme fatale convinces Papadopoulos that she is Vladimir Putin’s niece. (“I weel tell z’American I em niece of Great Leader! Zat idjut belief ennytink!”) Over the next several months our hero sends many emails to campaign officials and to the Professor, trying to arrange a meeting between them. As far as we know from the indictment, nothing came of his mighty efforts.
And there matters lay until January 2017, when the FBI came calling. Agents asked Papadopoulos about his interactions with the Russians. Even though he must have known that hundreds of his emails on the subject would soon be available to the FBI, he lied and told the agents that the contacts had occurred many months before he joined the campaign. History will record Papadopoulos as the man who forgot that emails carry dates on them. After the FBI interview, according to the indictment, he tried to destroy evidence with the same competence he has brought to his other endeavors. He closed his Facebook account, on which several communications with the Russians had taken place. He threw out his old cellphone. (That should do it!) After that, he began wearing a blindfold, on the theory that if he couldn’t see the FBI, the FBI couldn’t see him.
I made that last one up, obviously. For now, the great hope of scandal hobbyists is that Papadopoulos was wearing a wire between the time he secretly pled guilty and the time his plea was made public. This would have allowed him to gather all kinds of incriminating dirt in conversations with former colleagues. And the dirt is there, all right, as the Manafort indictment proves. Unfortunately for our scandal fetishists, so far none of it shows what their hearts most desire: active collusion between Russia and the Trump campaign.
An Affair to Remember
All this changed with the release in 1967 of Arthur Penn’s Bonnie and Clyde and Mike Nichols’s The Graduate. These two films, made in nouveau European style, treated familiar subjects—a pair of Depression-era bank robbers and a college graduate in search of a place in the adult world—in an unmistakably modern manner. Both films were commercial successes that catapulted their makers and stars into the top echelon of what came to be known as “the new Hollywood.”
Bonnie and Clyde inaugurated a new era in which violence on screen simultaneously became bloodier and more aestheticized, and it has had enduring impact as a result. But it was The Graduate that altered the direction of American moviemaking with its specific appeal to younger and hipper moviegoers who had turned their backs on more traditional cinematic fare. When it opened in New York in December, the movie critic Hollis Alpert reported with bemusement that young people were lining up in below-freezing weather to see it, and that they showed no signs of being dismayed by the cold: “It was as though they all knew they were going to see something good, something made for them.”
The Graduate, whose aimless post-collegiate title character is seduced by the glamorous but neurotic wife of his father’s business partner, is part of the common stock of American reference. Now, a half-century later, it has become the subject of a book-length study, Beverly Gray’s Seduced by Mrs. Robinson: How The Graduate Became the Touchstone of a Generation.1 As is so often the case with pop-culture books, Seduced by Mrs. Robinson is almost as much about its self-absorbed Baby Boomer author (“The Graduate taught me to dance to the beat of my own drums”) as its subject. It has the further disadvantage of following in the footsteps of Mark Harris’s magisterial Pictures at a Revolution: Five Movies and the Birth of the New Hollywood (2008), in which the film is placed in the context of Hollywood’s mid-’60s cultural flux. But Gray’s book offers us a chance to revisit this seminal motion picture and consider just why it was that The Graduate spoke to Baby Boomers in a distinctively personal way.

The Graduate began life in 1963 as a novella of the same name by Charles Webb, a California-born writer who saw his book not as a comic novel but as a serious artistic statement about America’s increasingly disaffected youth. It found its way into the hands of a producer named Lawrence Turman who saw The Graduate as an opportunity to make the cinematic equivalent of Salinger’s The Catcher in the Rye. Turman optioned the book, then sent it to Mike Nichols, who in 1963 was still best known for his comic partnership with Elaine May but had just made his directorial debut with the original Broadway production of Barefoot in the Park.
Both men saw that The Graduate posed a problem to anyone seeking to put it on the screen. In Turman’s words, “In the book the character of Benjamin Braddock is sort of a whiny pain in the fanny [whom] you want to shake or spank.” To solve that problem, they turned to Buck Henry, who had co-created the popular TV comedy Get Smart with Mel Brooks, to write a screenplay that would retain much of Webb’s dryly witty dialogue (“I think you’re the most attractive of all my parents’ friends”) while making Benjamin less priggish.
Nichols’s first major act was casting Dustin Hoffman, an obscure New York stage actor pushing 30, for the title role. No one but Nichols seems to have thought him suitable in any way. Not only was Hoffman short and nondescript-looking, but he was unmistakably Jewish, whereas Benjamin is supposedly the scion of a newly monied WASP family from southern California. Nevertheless, Nichols decided he wanted “a short, dark, Jewish, anomalous presence, which is how I experience myself,” in order to underline Benjamin’s alienation from the world of his parents.
Nichols filled the other roles in equally unexpected ways. He hired the Oscar winner Anne Bancroft, only six years Hoffman’s senior, to play the unbalanced temptress who lures Benjamin into her bed, then responds with volcanic rage when he falls in love with her beautiful daughter Elaine. He and Henry also steered clear of on-screen references to the campus protests that had only recently started to convulse America. Instead, he set The Graduate in a timeless upper-middle-class milieu inhabited by people more interested in social climbing than self-actualization—the same milieu from which Benjamin is so alienated that he is reduced to near-speechlessness whenever his family and their friends ask him what he plans to do now that he has graduated.
The film’s only explicit allusion to its cultural moment is the use on the soundtrack of Simon & Garfunkel’s “The Sound of Silence,” the painfully earnest anthem of youthful angst that is for all intents and purposes the theme song of The Graduate. Nevertheless, Henry’s screenplay leaves little doubt that the film was in every way a work of its time and place. As he later explained to Mark Harris, it is a study of “the disaffection of young people for an environment that they don’t seem to be in sync with.…Nobody had made a film specifically about that.”
This aspect of The Graduate is made explicit in a speech by Benjamin that has no direct counterpart in the novel: “It’s like I was playing some kind of game, but the rules don’t make any sense to me. They’re being made up by all the wrong people. I mean, no one makes them up. They seem to make themselves up.”
The Graduate was Nichols’s second film, following his wildly successful movie version of Edward Albee’s Who’s Afraid of Virginia Woolf?. Albee’s play was a snarling critique of the American dream, which he believed to be a snare and a delusion. The Graduate had the same skeptical view of postwar America, but its pessimism was played for laughs. When Benjamin is assured by a businessman in the opening scene that the secret to success in America is “plastics,” we are meant to laugh contemptuously at the smugness of so blinkered a view of life. Moreover, the contempt is as real as the laughter: The Graduate has it both ways. For the same reason, the otherwise farcical climactic scene (in which Benjamin breaks up Elaine’s marriage to a handsome young WASP and carts her off to an unknown fate) is played without musical underscoring, a signal that what Benjamin is doing is really no laughing matter.
The youth-oriented message of The Graduate came through loud and clear to its intended audience, which paid no heed to the mixed reviews from middle-aged reviewers unable to grasp what Nichols and Henry were up to. Not so Roger Ebert, the newly appointed 25-year-old movie critic of the Chicago Sun-Times, who called The Graduate “the funniest American comedy of the year…because it has a point of view. That is to say, it is against something.”
Even more revealing was the response of David Brinkley, then the co-anchor of NBC’s nightly newscast, who dismissed The Graduate as “frantic nonsense” but added that his college-age son and his classmates “liked it because it said about the parents and others what they would have said about us if they had made the movie—that we are self-centered and materialistic, that we are licentious and deeply hypocritical about it, that we try to make them into walking advertisements for our own affluence.”
A year after the release of The Graduate, a film-industry report cited in Pictures at a Revolution revealed that “48 percent of all movie tickets in America were now being sold to filmgoers under the age of 24.” A very high percentage of those tickets were to The Graduate and Bonnie and Clyde. At long last, Hollywood had figured out what the Baby Boomers wanted to see.

And how does The Graduate look a half-century later? To begin with, it now appears to have been Mike Nichols’s creative “road not taken.” In later years, Nichols became less an auteur than a Hollywood director who thought like a Broadway director, choosing vehicles of solid middlebrow-liberal appeal and serving them faithfully without imposing a strong creative vision of his own. In The Graduate, by contrast, he revealed himself to be powerfully aware of the same European filmmaking trends that shaped Bonnie and Clyde. Within a naturalistic framework, he deployed non-naturalistic “new wave” cinematographic techniques with prodigious assurance—and he was willing to end The Graduate on an ambiguous note instead of wrapping it up neatly and pleasingly, letting the camera linger on the unsure faces of Hoffman and Katharine Ross as they ride off into an unsettling future.
It is this ambiguity, coupled with Nichols’s prescient decision not to allow The Graduate to become a literal portrayal of American campus life in the troubled mid-’60s, that has kept the film fresh. But The Graduate is fresh in a very particular way: It is a young person’s movie, the tale of a boy-man terrified by the prospect of growing up to be like his parents. Therein lay the source of its appeal to young audiences. The Graduate showed them what they, too, feared most, and hinted at a possible escape route.
In the words of Beverly Gray, who saw The Graduate when it first came out in 1967: “The Graduate appeared in movie houses just as we young Americans were discovering how badly we wanted to distance ourselves from the world of our parents….That polite young high achiever, those loving but smothering parents, those comfortable but slightly bland surroundings: They combined to form an only slightly exaggerated version of my own cozy West L.A. world.”
Yet to watch The Graduate today—especially if you first saw it when much younger—is also to be struck by the extreme unattractiveness of its central character. Hoffman plays Benjamin not as the comically ineffectual nebbish of Jewish tradition but as a near-catatonic robot who speaks by turns in a flat monotone and a frightened nasal whine. It is impossible to understand why Mrs. Robinson would want to go to bed with such a mousy creature, much less why Elaine would run off with him—an impression that has lately acquired an overlay of retrospective irony in the wake of accusations that Hoffman has sexually harassed female colleagues on more than one occasion. Precisely because Benjamin is so unlikable, it is harder for modern-day viewers to identify with him in the same way as did Gray and her fellow Boomers. To watch a Graduate-influenced film like Noah Baumbach’s Kicking and Screaming (1995), a poignant romantic comedy about a group of Gen-X college graduates who deliberately choose not to get on with their lives, is to see a closely similar dilemma dramatized in an infinitely more “relatable” way, one in which the crippling anxiety of the principal characters is presented as both understandable and pitiable, thus making it funnier.
Be that as it may, The Graduate is a still-vivid snapshot of a turning point in American cultural history. Before Benjamin Braddock, American films typically portrayed men who were not overgrown, smooth-faced children but full-grown adults, sometimes misguided but incontestably mature. After him, permanent immaturity became the default position of Hollywood-style masculinity.
For this reason, it will be interesting to see what the Millennials, so many of whom demand to be shielded from the “triggering” realities of adult life, make of The Graduate if and when they come to view it. I have a feeling that it will speak to a fair number of them far more persuasively than it did to those of us who—unlike Benjamin Braddock—longed when young to climb the high hill of adulthood and see for ourselves what awaited us on the far side.
1 Algonquin, 278 pages
“I think that’s best left to states and locales to decide,” DeVos replied. “If the underlying question is . . .”
Murphy interrupted. “You can’t say definitively today that guns shouldn’t be in schools?”
“Well, I will refer back to Senator Enzi and the school that he was talking about in Wapiti, Wyoming, I think probably there, I would imagine that there’s probably a gun in the school to protect from potential grizzlies.”
Murphy continued his line of questioning unfazed. “If President Trump moves forward with his plan to ban gun-free school zones, will you support that proposal?”
“I will support what the president-elect does,” DeVos replied. “But, senator, if the question is around gun violence and the results of that, please know that my heart bleeds and is broken for those families that have lost any individual due to gun violence.”
Because all this happened several million outrage cycles ago, you may have forgotten what happened next. Rather than mention DeVos’s sympathy for the victims of gun violence, or her support for federalism, or even her deference to the president, the media elite fixated on her hypothetical aside about grizzly bears.
“Betsy DeVos Cites Grizzly Bears During Guns-in-Schools Debate,” read the NBC News headline. “Citing grizzlies, education nominee says states should determine school gun policies,” reported CNN. “Sorry, Betsy DeVos,” read a headline at the Atlantic, “Guns Aren’t a Bear Necessity in Schools.”
DeVos never said that they were, of course. Nor did she “cite” the bear threat in any definitive way. What she did was decline the opportunity to make a blanket judgment about guns and schools because, in a continent-spanning nation of more than 300 million people, one standard might not apply to every circumstance.
After all, there might be—there are—cases when guns are necessary for security. Earlier this year, Virginia Governor Terry McAuliffe signed into law a bill authorizing some retired police officers to carry firearms while working as school guards. McAuliffe is a Democrat.
In her answer to Murphy, DeVos referred to a private meeting with Senator Enzi, who had told her of a school in Wyoming that has a fence to keep away grizzly bears. And maybe, she reasoned aloud, the school might have a gun on the premises in case the fence doesn’t work.
As it turns out, the school in Wapiti is gun-free. But we know that only because the Washington Post treated DeVos’s offhand remark as though it were the equivalent of Alexander Butterfield’s revealing the existence of the secret White House tapes. “Betsy DeVos said there’s probably a gun at a Wyoming school to ward off grizzlies,” read the Post headline. “There isn’t.” Oh, snap!
The article, like the one by NBC News, ended with a snarky tweet. The Post quoted user “Adam B.,” who wrote, “‘We need guns in schools because of grizzly bears.’ You know what else stops bears? Doors.” Clever.
And telling. It becomes more difficult every day to distinguish between once-storied journalistic institutions and the jabbering of anonymous egg-avatar Twitter accounts. The eagerness with which the press misinterprets and misconstrues Trump officials is something to behold. The “context” the best and brightest in media are always eager to provide us suddenly goes poof when the opportunity arises to mock, impugn, or castigate the president and his crew. This tendency is especially pronounced when the alleged gaffe fits neatly into a prefabricated media stereotype: that DeVos is unqualified, say, or that Rick Perry is, well, Rick Perry.
On November 2, the secretary of energy appeared at an event sponsored by Axios.com and NBC News. He described a recent trip to Africa:
It’s going to take fossil fuels to push power out to those villages in Africa, where a young girl told me to my face, “One of the reasons that electricity is so important to me is not only because I won’t have to try to read by the light of a fire, and have those fumes literally killing people, but also from the standpoint of sexual assault.” When the lights are on, when you have light, it shines the righteousness, if you will, on those types of acts. So from the standpoint of how you really affect people’s lives, fossil fuels is going to play a role in that.
This heartfelt story of the impact of electrification on rural communities was immediately distorted into a metaphor for Republican ignorance and cruelty.
“Energy Secretary Rick Perry Just Made a Bizarre Claim About Sexual Assault and Fossil Fuels,” read the Buzzfeed headline. “Energy Secretary Rick Perry Says Fossil Fuels Can Prevent Sexual Assault,” read the headline from NBC News. “Rick Perry Says the Best Way to Prevent Rape Is Oil, Glorious Oil,” said the Daily Beast.
“Oh, that Rick Perry,” wrote Gail Collins in a New York Times column. “Whenever the word ‘oil’ is mentioned, Perry responds like a dog on the scent of a hamburger.” You will note that the word “oil” is not mentioned at all in Perry’s remarks.
You will note, too, that what Perry said was entirely commonsensical. While the precise relation between public lighting and public safety is unknown, who can doubt that brightly lit areas feel safer than dark ones—and that, as things stand today, cities and towns are most likely to be powered by fossil fuels? “The value of bright street lights for dispirited gray areas rises from the reassurance they offer to some people who need to go out on the sidewalk, or would like to, but lacking the good light would not do so,” wrote Jane Jacobs in The Death and Life of Great American Cities. “Thus the lights induce these people to contribute their own eyes to the upkeep of the street.” But c’mon, what did Jane Jacobs know?
No member of the Trump administration so rankles the press as the president himself. On the November morning I began this column, I awoke to outrage that President Trump had supposedly violated diplomatic protocol while visiting Japan and its prime minister, Shinzo Abe. “President Trump feeds fish, winds up pouring entire box of food into koi pond,” read the CNN headline. An article on CBSNews.com headlined “Trump empties box of fish food into Japanese koi pond” began: “President Donald Trump’s visit to Japan briefly took a turn from formal to fishy.” A Bloomberg reporter traveling with the president tweeted, “Trump and Abe spooning fish food into a pond. (Toward the end, @potus decided to just dump the whole box in for the fish).”
Except that’s not what Trump “decided.” In fact, Trump had done exactly what Abe had done a few seconds before. That fact was buried in write-ups of the viral video of Trump and the fish. “President Trump was criticized for throwing an entire box of fish food into a koi pond during his visit to Japan,” read a tweet from the New York Daily News, linking to a report on phony criticism Trump received because of erroneous reporting from outlets like the News.
There’s an endless, circular, Möbius-strip-like quality to all this nonsense. Journalists are so eager to catch the president and his subordinates doing wrong that they routinely traduce the very canons of journalism they are supposed to hold dear. Partisan and personal animus, laziness, cynicism, and the oversharing culture of social media are a toxic mix. The press in 2017 is a lot like those Japanese koi fish: frenzied, overstimulated, and utterly mindless.
Review of ‘Lessons in Hope,’ by George Weigel
Standing before the eternal flame, a frail John Paul shed silent tears for 6 million victims, including some of his own childhood friends from Krakow. Then, after reciting verses from Psalm 31, he began: “In this place of memories, the mind and heart and soul feel an extreme need for silence. … Silence, because there are no words strong enough to deplore the terrible tragedy of the Shoah.” Parkinson’s disease strained his voice, but it was clear that the pope’s irrepressible humanity and spiritual strength had once more stood him in good stead.
George Weigel watched the address from NBC’s Jerusalem studios, where he was providing live analysis for the network. As he recalls in Lessons in Hope, his touching and insightful memoir of his time as the pope’s biographer, “Our newsroom felt the impact of those words, spoken with the weight of history bearing down on John Paul and all who heard him: normally a place of bedlam, the newsroom fell completely silent.” The pope, he writes, had “invited the world to look, hard, at the stuff of its redemption.”
Weigel, a senior fellow at the Ethics and Public Policy Center, published his biography of John Paul in two volumes, Witness to Hope (1999) and The End and the Beginning (2010). His new book completes a John Paul triptych, and it paints a more informal, behind-the-scenes portrait. Readers, Catholic and otherwise, will finish the book feeling almost as though they knew the 264th successor of Peter. Lessons in Hope is also full of clerical gossip. Yet Weigel never loses sight of his main purpose: to illuminate the character and mind of the “emblematic figure of the second half of the twentieth century.”
The book’s most important contribution comes in its restatement of John Paul’s profound political thought at a time when it is sorely needed. Throughout, Weigel reminds us of the pope’s defense of the freedom of conscience; his emphasis on culture as the primary engine of history; and his strong support for democracy and the free economy.
When the Soviet Union collapsed, the pope continued to promote these ideas in such encyclicals as Centesimus Annus. The 1991 document reiterated the Church’s opposition to socialist regimes that reduce man to “a molecule within the social organism” and trample his right to earn “a living through his own initiative.” Centesimus Annus also took aim at welfare states for usurping the role of civil society and draining “human energies.” The pope went on to explain the benefits, material and moral, of free enterprise within a democratic, rule-of-law framework.
Yet a libertarian manifesto Centesimus Annus was not. It took note of free societies’ tendency to breed spiritual poverty, materialism, and social incohesion, which in turn could lead to soft totalitarianism. John Paul called on state, civil society, and people of God to supply the “robust public moral culture” (in Weigel’s words) that would curb these excesses and ensure that free-market democracies are ordered to the common good.
When Weigel emerged as America’s preeminent interpreter of John Paul, in the 1980s and ’90s, these ideas were ascendant among Catholic thinkers. In addition to Weigel, proponents included the philosopher Michael Novak and Father Richard John Neuhaus of First Things magazine (both now dead). These were faithful Catholics (in Neuhaus’s case, a relatively late convert) nevertheless at peace with the free society, especially the American model. They had many qualms with secular modernity, to be sure. But with them, there was no question that free societies and markets are preferable to unfree ones.
How things have changed. Today all the energy in those Catholic intellectual circles is generated by writers and thinkers who see modernity as beyond redemption and freedom itself as the problem. For them, the main question is no longer how to correct the free society’s course (by shoring up moral foundations, through evangelization, etc.). That ship has sailed or perhaps sunk, according to this view. The challenges now are to protect the Church against progressivism’s blows and to see beyond the free society as a political horizon.
Certainly the trends that worried John Paul in Centesimus Annus have accelerated since the encyclical was issued. “The claim that agnosticism and skeptical relativism are the philosophy and the basic attitude which correspond to democratic forms of political life” has become even more hegemonic than it was in 1991. “Those who are convinced that they know the truth and firmly adhere to it” increasingly get treated as ideological lepers. And with the weakening of transcendent truths, ideas are “easily manipulated for reasons of power.”
Thus a once-orthodox believer finds himself or herself compelled to proclaim that there is no biological basis to gender; that men can menstruate and become pregnant; that there are dozens of family forms, all as valuable and deserving of recognition as the conjugal union of a man and a woman; and that speaking of the West’s Judeo-Christian patrimony is tantamount to espousing white supremacy. John Paul’s warnings read like a description of the present.
The new illiberal Catholics—a label many of these thinkers embrace—argue that these developments aren’t a distortion of the idea of the free society but represent its very essence. This is a mistake. Basic to the free society is the freedom of conscience, a principle enshrined in democratic constitutions across the West and, I might add, in the Catholic Church’s post–Vatican II magisterium. Under John Paul, religious liberty became Rome’s watchword in the fight against Communist totalitarianism, and today it is the Church’s best weapon against the encroachments of secular progressivism. The battle is far from lost, moreover. There is pushback in the courts, at the ballot box, and online. Sometimes it takes demagogic forms that should discomfit people of faith. Then again, there is a reason such pushback is called “reaction.”
A bigger challenge for Catholics prepared to part ways with the free society as an ideal is this: What should Christian politics stand for in the 21st century? Setting aside dreams of reuniting throne and altar and similar nostalgia, the most cogent answer offered by Catholic illiberalism is that the Church should be agnostic with respect to regimes. As Harvard’s Adrian Vermeule has recently written, Christians should be ready to jettison all “ultimate allegiances,” including to the Constitution, while allying with any party or regime when necessary.
What at first glance looks like an uncompromising Christian politics—cunning, tactical, and committed to nothing but the interests of the Church—is actually a rather passive vision. For a Christianity that is “radically flexible” in politics is one that doesn’t transform modernity from within. In practice, it could easily look like the Vatican Ostpolitik diplomacy that sought to appease Moscow before John Paul was elected.
Karol Wojtyla discarded Ostpolitik as soon as he took the Petrine office. Instead, he preached freedom and democracy—and meant it. Already as archbishop of Krakow under Communism, he had created free spaces where religious and nonreligious dissidents could engage in dialogue. As pope, he expressed genuine admiration for the classically liberal and decidedly secular Vaclav Havel. He hailed the U.S. Constitution as the source of “ordered freedom.” And when, in 1987, the Chilean dictator Augusto Pinochet asked him why he kept fussing about democracy, seeing as “one system of government is as good as another,” the pope responded: No, “the people have a right to their liberties, even if they make mistakes in exercising them.”
The most heroic and politically effective Christian figure of the 20th century, in other words, didn’t follow the path of radical flexibility. His Polish experience had taught him that there are differences between regimes—that some are bound to uphold conscience and human dignity, even if they sometimes fall short of these commitments, while others trample rights by design. The very worst of the latter kind could even whisk one’s boyhood friends away to extermination camps. There could be no radical Christian flexibility after the Holocaust.