We do not yet have a full-scale history of intellectuals in the United States, but when that book comes to be written one of its central themes will surely be that our intellectuals have done their work mostly in isolation. Even the groups we locate in the past—the Transcendentalists encircling Emerson, the writers and critics following Van Wyck Brooks during the Seven Arts period—are groups mainly by courtesy of retrospect. The figures we see within them were not nearly so close to one another in experience nor so allied in opinion as our need for historical reconstruction makes them out to have been. The kind of inner fraternity we associate with literary groups in Paris and London has rarely been characteristic of American intellectual life. It is hardly an accident that one of our most poignant cultural legends concerns the brief friendship between Hawthorne and Melville and then its long sequel of separation. Ours is a culture in which people rattle around.
A seeming exception is the group of writers who have come to be known, these past few decades, as the New York intellectuals. They appear to have a common history, prolonged now for more than thirty years; a common political outlook, even if marked by ceaseless internecine quarrels; a common style of thought and perhaps composition; a common focus of intellectual interests; and once you get past politeness—which becomes, these days, easier and easier—a common ethnic origin. They are, or until recently have been, anti-Communist; they are, or until some time ago were, radicals; they have a fondness for ideological speculation; they write literary criticism with a strong social emphasis; they revel in polemic; they strive self-consciously to be “brilliant”; and by birth or osmosis, they are Jews.
The New York intellectuals are perhaps the only group America has ever had that could be described as an intelligentsia. This term comes awkwardly to our lips, and for good reason: it suggests, as Martin Malia, a historian of Russian culture, writes, “more than intellectuals in the ordinary sense. Whether merely ‘critical-thinking’ or actively oppositional, their name indicates that [in Russia] they thought of themselves as the embodied ‘intelligence’ . . . or ‘consciousness’ of the nation. They clearly felt an exceptional sense of apartness from the society in which they lived.”
Malia's phrase about “consciousness of the nation” seems special to the problems of the Russian intellectuals under Tsarism, but the rest of his description fits the New York intellectuals rather well: the stress upon “critical thinking,” the stance of active opposition, the sense of apartness. Or perhaps more accurately, it is a description which fits the past of the New York intellectuals. And just as the Russian “intelligentsia” was marked by a strongly Westernizing outlook, a wish to bring Russian culture out of its provincial limits and into a close relationship with the culture of Western Europe, so the New York intellectuals have played a role in the internationalization of American culture, serving as a liaison between American readers and Russian politics, French ideas, European writing.
A more complicated approach to the problem of the intelligentsia is provided by Renato Poggioli in his book The Theory of the Avant Garde. He describes the Russian intelligentsia as “an intellectual order from the lower ranks . . . created by those who were rejected by other classes: an intellectual order whose function was not so much cultural as political. . . .” Poggioli remarks that in Russia the term referred to a “cultural proletariat,” but
these intellectuals are not so much proletarian as proletarianizing . . . they may become ideologically and politically bound to the mass of workers and peasants but they are not, at bottom, an order economically bound to the interests of those masses. A member of the intelligentsia is not born but made.
I suspect there may be a contradiction between regarding the intelligentsia as an order “from the lower ranks” and concluding that a member of this order “is not born but made.” But Poggioli's description is valuable insofar as it suggests that the intelligentsia is defined primarily by its position in society rather than by its relation to culture. Poggioli wishes sharply to distinguish the intelligentsia from an “intellectual elite,” which he regards as a self-mobilized group whose raison d'être is a cultural attitude, and in our time, a positive commitment to modernist literature. In respect to late 19th and early 20th-century Russia, this distinction is useful; when we turn to America, we are obviously dealing with loose analogies, yet useful ones too, for Poggioli's distinction should help us, a little later, to see the precise nature and limits of the New York intellectuals as a group.
Reflecting upon the experience of these writers, one begins to wonder whether—apart from a few years during the late 30's—they ever did constitute a coherent and self-defined group. The steady exchange of ideas, the reading of manuscripts, the preliminary discussion of work, all these characteristics of European intellectuals were not often evident in New York. On the contrary. In their work habits the New York intellectuals have mostly been loners, and in their relationships with one another, closer to the vision of life we associate with Hobbes than with Kropotkin. Repeatedly I have been struck by the way writers commonly associated with this group will hotly deny that it exists, or will say that if indeed it does exist they—they!—would not be so docile as to be part of it. Certain New York intellectuals like Harold Rosenberg and Lionel Abel have never been very strong in sentiments of group fraternity, and Rosenberg, in the course of a polemic against other New York writers, once coined the memorable phrase, “a herd of independent minds.” Some, like myself, have seen themselves as only in part and then ambivalently related, since we are also caught up with a separate political milieu.1 After a time, in Europe, it became a source of pride for writers to say they had once been associated with the Bloomsbury group or the Scrutiny critics or the socialists led by Gorky before the Revolution; but for whatever reasons, that point has not been reached among the New York writers. I doubt that it ever will be. Contentious and, by virtue of their origins and history, uncertain as to their relationship with American culture, the New York intellectuals wish, so far as I can tell, to form a loose and unacknowledged tribe.
Yet the mere fact that there does exist a commonly shared perception of a New York intellectual group, even if that perception is held mainly by hostile academics and a parasitic mass media, must be taken as decisive. That people “out there” believe in the reality of the New York group makes it a reality of sorts. And in all candor there is something else: the New York writers dislike being labeled, they can speak bitterly about each other's work and opinions, they may not see one another from year's start to year's end, but they are nervously alert to one another's judgments. Attention is paid—whether from warranted respect or collective vanity or provincial narrowness, it hardly matters.
Such groups approach a fragile state of coherence only at the point where writers are coming together and the point where they are drifting apart. Especially does this seem true at the end, when there comes that tremor of self-awareness which no one would have troubled to feel during the years of energy and confidence. A tradition in process of being lost, a generation facing assault and ridicule from ambitious younger men—the rekindled sense of group solidarity is brought to a half-hour's flame by the hardness of dying. And it is at such moments that the mass media, never more than twenty years late, become aware of the problem: their publicity signals recognition and recognition a certificate of death.
The social roots of the New York writers are not hard to trace. With a few delightful exceptions—a tendril from Yale, a vine from Seattle—they stem from the world of the immigrant Jews, either workers or petty bourgeois.2 They come at a moment in the development of immigrant Jewish culture when there is a strong drive not only to break out of the ghetto but also to leave behind the bonds of Jewishness entirely. Earlier generations had known such feelings, and through many works of fiction, especially those by Henry Roth, Michael Gold, and Daniel Fuchs, one can return to the classic pattern of a fierce attachment to the provincialism of origins as it becomes entangled with a fierce eagerness to plunge into the Gentile world of success, manners, freedom. As early as the 1890's this pattern had already come into view, and with diminishing intensity it has continued to control Jewish life deep into the 20th century; perhaps its last significant expression comes in Philip Roth's stories, where the sense of Jewish tradition is feeble but the urge to escape its suburban ruins extremely strong.
The New York intellectuals were the first group of Jewish writers to come out of the immigrant milieu who did not define themselves through a relationship, nostalgic or hostile, to memories of Jewishness. They were the first generation of Jewish writers for whom the recall of an immigrant childhood does not seem to have been completely overwhelming. (Is that perhaps one reason few of them tried to write fiction?) That this severance from Jewish roots and immigrant sources would later come to seem a little suspect, is another matter. All I wish to stress here is that, precisely at the point in the 30's when the New York intellectuals began to form themselves into a loose cultural-political tendency, Jewishness as idea and sentiment played no significant role in their expectations—apart, to be sure, from a bitter awareness that no matter what their political or cultural desires, the sheer fact of their recent emergence had still to be regarded, and not least of all by themselves, as an event within Jewish American life.
For decades the life of the East European Jews, both in the old country and the new, might be compared to a tightly-gathered spring, trembling with unused force, which had been held in check until the climactic moment of settlement in America. Then the energies of generations came bursting out, with an ambition that would range from pure to coarse, disinterested to vulgar, and indeed would mix all these together, but finally—this ambition—would count for more as an absolute release than in any of its local manifestations. What made Sammy run was partly that his father and his father's father had been bound hand and foot. And in all the New York intellectuals there was and had to be a fraction of Sammy. All were driven by a sense of striving, a thrust of will, an unspoken conviction that time had now to be regained.
The youthful experiences described by Alfred Kazin in his autobiography are, apart from his distinctive outcroppings of temperament, more or less typical of the experiences of many New York intellectuals—except, at one or two points, for the handful that involved itself deeply in the radical movement. It is my impression, however, that Kazin's affectionate stress on the Jewish sources of his experience is mainly a feeling of retrospect, mainly a recognition in the 50's and 60's that no matter how you might try to shake off your past, it would still cling to your speech, gestures, skin and nose, it would still shape, with a thousand subtle movements, the way you did your work and raised your children. In the 30's, however, it was precisely the idea of discarding the past, breaking away from families, traditions, and memories which excited intellectuals. They meant to declare themselves citizens of the world and, that succeeding, perhaps consider becoming writers of this country.
The Jewish immigrant world branded upon its sons and daughters marks of separateness even while encouraging them to dreams of universalism. This subculture may have been formed to preserve ethnic continuity, but it was the kind of continuity that would reach its triumph in self-disintegration. It taught its children both to conquer the Gentile world and to be conquered by it, both to leave an intellectual impress and to accept the dominant social norms. By the 20's and 30's the values dominating Jewish immigrant life were mostly secular, radical, and universalist, and if these were often conveyed through a parochial vocabulary, they nonetheless carried some remnants of European culture. Even as they were moving out of a constricted immigrant milieu, the New York intellectuals were being prepared by it for the tasks they would set themselves during the 30's. They were being prepared for the intellectual vocation as one of assertiveness, speculation, and free-wheeling; for the strategic maneuvers of a vanguard, at this point almost a vanguard in the abstract, with no ranks following in the rear; and for the union of politics and culture, with the politics radical and the culture cosmopolitan. What made this goal all the more attractive was that the best living American critic, Edmund Wilson, had triumphantly reached it: he was the author of both The Triple Thinkers and To the Finland Station, he served as a model for emulation, and he gave this view of the intellectual life a special authority in that he seemed to come out of the mainstream of American life.
That the literary avant garde and the political Left were not really comfortable partners would become clear with the passage of time; in Europe it already had. But during the years the New York intellectuals began to appear as writers and critics worthy of some attention, there was a feeling in the air that a union of the advanced—critical consciousness and political conscience—could be forged.
Throughout the 30's the New York intellectuals believed, somewhat naively, that this union was not only a desirable possibility but a tie both natural and appropriate. Except, however, for the Surrealists in Paris, and it is not clear how seriously this instance should be taken, the paths of political radicalism and cultural modernism have seldom met. To use Poggioli's terms, the New York writers were more an “intelligentsia” than an “intellectual elite,” and more inclined to an amorphous “proletarianizing” than to an austere partisanship for modernism.
The history of the West in the last century offers many instances in which Jewish intellectuals played an important role in the development of political radicalism; but almost always this occurred when there were sizable movements, with the intellectuals serving as spokesmen, propagandists, and functionaries of a party. In New York, by contrast, the intellectuals had no choice but to begin with a dissociation from the only significant radical movement in this country, the Communist party. What for European writers like Koestler, Silone, and Malraux would be the end of the road was here a beginning. In a fairly short time, the New York writers found that the meeting of political and cultural ideas which had stirred them to excitement could also leave them stranded and distressed. Radicalism, in both its daily practice and ethical biases, proved inhospitable to certain aspects of modernism—and not always, I now think, mistakenly. Literary modernism often had a way of cavalierly dismissing the world of daily existence, a world that remained intensely absorbing to the New York writers. Literary modernism could sometimes align itself with reactionary movements, a fact that was intensely embarrassing and required either torturous explanations or complex dissociations. The New York writers discovered, as well, that their relationship to modernism as a purely literary phenomenon was less authoritative and more ambiguous than they had wished to feel. The great battles for Joyce and Eliot and Proust had been fought in the 20's and mostly won; and now, while clashes with entrenched philistinism might still take place, they were mostly skirmishes or mopping-up operations (as in the polemics against the transfigured Van Wyck Brooks). The New York writers came at the end of the modernist experience, just as they came at what may yet have to be judged the end of the radical experience, and as they certainly came at the end of the Jewish experience. One shorthand way of describing their situation, a cause of both their feverish brilliance and their recurrent instability, is to say that they came late.
During the 30's and 40's their radicalism was anxious, problematic, and beginning to decay at the very moment it was adopted. They had no choice: the crisis of socialism was worldwide, profound, with no end in sight, and the only way to avoid that crisis was to bury oneself, as a few did, in the left-wing sects. Some of the New York writers had gone through the “political school” of Stalinism, a training in coarseness from which not all recovered; some had even spent a short time in the organizational coils of the Communist party. By 1936, when the anti-Stalinist Partisan Review was conceived, the central figures of that moment—Philip Rahv, William Phillips, Sidney Hook—had shed whatever sympathies they once felt for Stalinism, but the hope that they could find another ideological system, some cleansed version of Marxism associated perhaps with Trotsky or Luxemburg, was doomed to failure. Some gravitated for a year or two toward the Trotskyist group, but apart from admiration for Trotsky's personal qualities and dialectical prowess, they found little satisfaction there; no version of orthodox Marxism could retain a hold on intellectuals who had gone through the trauma of abandoning the Leninist Weltanschauung and had experienced the depth to which the politics of this century, most notably the rise of totalitarianism, called into question the once-sacred Marxist categories. From now on, the comforts of system would have to be relinquished.
Though sometimes brilliant in expression and often a stimulus to the kind of cultural speculation at which they excelled, the radicalism of the New York intellectuals during the 30's was not a deeply-grounded experience. It lacked roots in a popular movement which might bring intellectuals into relationship with the complexities of power and stringencies of organization. From a doctrine it became a style, and from a style a memory. It was symptomatic that the Marxist Quarterly, started in 1937 by a spectrum of Left intellectuals and probably the most distinguished Marxist journal ever published in this country, could survive no more than a year. The differences among its founders, some like James Burnham holding to a revolutionary Marxist line and others like Sidney Hook and Lewis Corey moving toward versions of liberalism and social democracy, proved too severe for collaboration. And even the radicalism of the Partisan Review editors and writers during its vivid early years—how deeply did it cut, except as a tool enabling them to break away from Marxism? Which of those writers and readers who now look back nostalgically have troubled to examine the early files of this important magazine and read—with embarrassment? amusement? pleasure?—the political essays it printed?
Yet if the radicalism of the New York intellectuals seems to have been without much political foundation or ideological strength, it certainly played an important role in their own development. For the New York writers, and even I suspect those among them who would later turn sour on the whole idea of radicalism (including the few who in the mid-60's would try to erase the memory of having turned sour), the 30's represented a time of intensity and fervor, a reality or illusion of engagement, a youth tensed with conviction and assurance: so that even Dwight Macdonald, who at each point in his life has made a specialty out of mocking his previous beliefs, could not help displaying tender feelings upon remembering his years, God help us, as a “revolutionist.” The radicalism of the 30's gave the New York intellectuals their distinctive style: a flair for polemic, a taste for the grand generalization, an impatience with what they regarded (often parochially) as parochial scholarship, an internationalist perspective, and a tacit belief in the unity—even if a unity beyond immediate reach—of intellectual work.
By comparison with competing schools of thought, the radicalism of the anti-Stalinist Left, as it was then being advanced in Partisan Review, seemed cogent, fertile, alive: it could stir good minds to argument, it could gain the attention of writers abroad, it seemed to offer a combination of system and independence. With time the anti-Stalinist intellectuals came to enjoy advantages somewhat like those which have enabled old radicals to flourish in the trade unions: they could talk faster than anyone else, they knew their way around better, they were quicker on their feet. Brief and superficial as their engagement with Marxism may have been, it gave the intellectuals the advantage of dialectic, sometimes dialectic as it lapsed into mere double-talk.
Yet in fairness I should add that this radicalism did achieve something of substantial value in the history of American culture. It helped destroy—once and for all, I would have said until recently—Stalinism as a force in our intellectual life, and with Stalinism those varieties of populist sentimentality which the Communist movement of the late 30's exploited with notable skill. If certain sorts of manipulative soft-headedness have been all but banished from serious American writing, and the kinds of rhetoric once associated with Archibald MacLeish and Van Wyck Brooks cast into permanent disrepute, at least some credit for this ought to go to the New York writers.
It has recently become fashionable, especially in the pages of the New York Review of Books, to sneer at the achievements of anti-Stalinism by muttering darkly about “the cold war.” But we ought to have enough respect for the past at least to avoid telescoping several decades. The major battle against Stalinism as a force within intellectual life, and in truth a powerful force, occurred before anyone heard of the cold war; it occurred in the late 30's and early 40's. In our own moment we see “the old crap,” as Marx once called it, rise to the surface with unnerving ease; there is something dizzying in an encounter with Stalin's theory of “social fascism,” particularly when it comes from the lips of young people who may not even be quite sure when Stalin lived. Still, I think there will not and probably cannot be repeated in our intellectual life the ghastly large-scale infatuation with a totalitarian regime which disgraced the 30's. Some achievements, a very few, seem beyond destruction.
A little credit is therefore due. Whatever judgments one may have about Sidney Hook's later political writings, and mine have been very critical, it is a matter of decency to recall the liberating role he played in the 30's as spokesman for a democratic radicalism and a fierce opponent of all the rationalizations for totalitarianism a good many intellectuals allowed themselves. One reason people have recently felt free to look down their noses at “anti-Communism” as if it were a mass voodoo infecting everyone from far Right to democratic Left, is precisely the toughness with which the New York intellectuals fought against Stalinism. Neither they nor anybody else could reestablish socialism as a viable politics in the United States; but for a time they did help to salvage the honor of the socialist idea—which meant primarily to place it in the sharpest opposition to all totalitarian states and systems. What many intellectuals now say they take for granted, had first to be won through bitter and exhausting struggle.
I should not like to give the impression that Stalinism was the beginning and end of whatever was detestable in American intellectual life during the 30's. Like the decades to come, perhaps like all decades, this was a “low dishonest” time. No one who grew up in, or lived through, these years should wish for a replay of their ideological melodramas. Nostalgia for the 30's is a sentiment possible only to the very young or the very old, those who have not known and those who no longer remember. Whatever distinction can be assigned to the New York intellectuals during the 30's lies mainly in their persistence as a small minority, in their readiness to defend unpopular positions against apologists for the Moscow trials and the vigilantism of Popular Front culture. Some historians, with the selectivity of retrospect, have recently begun to place the New York intellectuals at the center of cultural life in the 30's—but this is both a comic misapprehension and a soiling honor. On the contrary; their best hours were spent on the margin, in opposition.
Later, in the 40's and 50's, most of the New York intellectuals would abandon the effort to find a renewed basis for a socialist politics—to their serious discredit, I believe. Some would vulgarize anti-Stalinism into a politics barely distinguishable from reaction. Yet for almost all New York intellectuals the radical years proved a decisive moment in their lives. And for a very few, the decisive moment.
I have been speaking here as if the New York intellectuals were mainly political people, but in reality this was true for only a few of them, writers like Hook, Macdonald, and perhaps Rahv. Most were literary men or journalists with no experience in any political movement; they had come to radical politics through the pressures of conscience and a flair for the dramatic; and even in later years, when they abandoned any direct political involvement, they would in some sense remain “political.” They would maintain an alertness toward the public event. They would respond with eagerness to historical changes, even if these promised renewed favor for the very ideas they had largely discarded. They would continue to structure their cultural responses through a sharp, perhaps excessively sharp, kind of categorization, in itself a sign that political styles and habits persisted. But for the most part, the contributions of the New York intellectuals were not to political thought. Given the brief span of time during which they fancied themselves agents of a renewed Marxism, there was little they could have done. Sidney Hook wrote one or two excellent books on the sources of Marxism, Harold Rosenberg one or two penetrating essays on the dramatics of Marxism; and not much more. The real contribution of the New York writers was toward creating a new, and for this country almost exotic, style of work. They thought of themselves as cultural radicals even after they had begun to wonder whether there was much point in remaining political radicals. But what could this mean? Cultural radicalism was a notion extremely hard to define and perhaps impossible to defend, as Richard Chase would discover in the late 50's, when against the main drift of New York opinion he put forward the idea of a radicalism without immediate political ends but oriented toward criticism of a meretricious culture. What Chase did not live long enough to see was that his idea, much derided at the time, would lend itself a decade later to caricature through success.
Chase was seriously trying to preserve a major impetus of New York intellectual life: the exploration and defense of literary modernism.3 He failed to see, however, that this was a task largely fulfilled and, in any case, taking on a far more ambiguous and less militant character in the 50's than it would have had twenty or thirty years earlier. The New York writers had done useful work in behalf of modernist literature. Without fully realizing it, they were continuing a cultural movement that had begun in the United States during the mid-19th century: the return to Europe, not as provincials knocking humbly at the doors of the great, but as equals in an enterprise which by its very nature had to be international. We see this at work in Howells's reception of Ibsen and Tolstoy; in Van Wyck Brooks's use of European models to assault the timidities of American literature; in the responsiveness of The Little Review and The Dial to European experiments; and somewhat paradoxically, in the later fixation of the New Critics, despite an ideology of cultural provincialism, on modernist writing from abroad.
The New York critics, and most notably Partisan Review, helped complete this process of internationalizing American culture (also, by the way, Americanizing international culture). They gave a touch of glamor to that style which the Russians and Poles now call “cosmopolitan.” Partisan Review was the first journal in which it was not merely respectable but a matter of pride to print one of Eliot's Four Quartets side by side with Marxist criticism. And not only did the magazine break down the polar rigidities of the hard-line Marxists and the hard-line nativists; it also sanctioned the idea, perhaps the most powerful cultural idea of the last half century, that there existed an all but incomparable generation of modern masters, some of them still alive, who in this terrible age represented the highest possibilities of the human imagination. On a more restricted scale, Partisan Review helped win attention and respect for a generation of European writers—Silone, Orwell, Malraux, Koestler, Serge—who were not quite of the first rank as novelists but had yielded themselves to and suffered the failure of socialism.
If the Partisan critics came too late for a direct encounter with new work from the modern masters, they did serve the valuable end of placing that work in a cultural context more vital and urgent than could be provided by any other school of American criticism. For many young people up to and through the Second World War, the Partisan critics helped to mold a new sensibility, a mixture of rootless radicalism and a de-sanctified admiration for writers like Joyce, Eliot, and Kafka. I can recall that even in my orthodox Marxist phase I felt that the central literary expression of the time was a now half-forgotten poem by a St. Louis writer called “The Waste Land.”
In truth, however, the New York critics were then performing no more than an auxiliary service. They were following upon the work of earlier, more fortunate critics. And even in the task of cultural consolidation, which soon had the unhappy result of overconsolidating the modern masters in the academy, the New York critics found important allies among their occasional opponents in the New Criticism. As it turned out, the commitment to literary modernism proved insufficient either as a binding literary purpose or as a theme that might inform the writings of the New York critics. By now modernism was entering its period of decline; the old excitements had paled and the old achievements been registered. Modernism had become successful; it was no longer a literature of opposition, and thereby had begun that metamorphosis signifying its ultimate death. The problem was no longer to fight for modernism, the problem was now to consider why the fight had so easily ended in triumph. And as time went on, modernism surfaced an increasing portion of its limitations and ambiguities, so that among some critics earlier passions of advocacy gave way to increasing anxieties of judgment. Yet the moment had certainly not come when a cool and objective reconsideration could be undertaken of works that had formed the sensibility of our time. The New York critics, like many others, were trapped in a dilemma from which no escape could be found but which lent itself to brilliant improvisation: it was too late for unobstructed enthusiasm, it was too soon for unobstructed valuation, and meanwhile the literary work that was being published, though sometimes distinguished, was composed in the heavy shadows of the modernists. At almost every point this work betrayed the marks of having come after.
Except for Harold Rosenberg, who would make “the tradition of the new” a signature of his criticism, the New York writers slowly began to release those sentiments of uneasiness they had been harboring about the modernist poets and novelists. One instance was the notorious Pound case,4 in which literary and moral values, if not jammed into a head-on collision, were certainly entangled beyond easy separation. Essays on writers like D. H. Lawrence—what to make of his call for “blood consciousness,” what one's true response might be to his notions of the leader cult—began to appear. A recent book by John Harrison, The Reactionaries, which contains a full-scale attack on the politics of several modernist writers, is mostly a compilation of views that had been gathering force over the last few decades. And then, as modernism stumbled into its late period, those recent years in which its early energies have evidently reached a point of exhaustion, the New York critics became still more discomfited. There was a notable essay several years ago by Lionel Trilling in which he acknowledged mixed feelings toward the modernist writers he had long praised and taught. There was a cutting attack by Philip Rahv on Jean Genet, a perverse genius in whose fiction the compositional resources of modernism seem all but severed from its moral—one might even say, its human—interests. And more recently there has been an essay by myself ending with the gloomy expectation that no dignified funeral awaits modernism, only noisy prolongation “in publicity and sensation, the kind of savage parody which may indeed be the only fate worse than death.”
For the New York intellectuals in the 30's and 40's there was still another focus of interest, never quite as strong as radical politics or literary modernism but seeming, for a brief time, to promise a valuable new line of discussion. In the essays of writers like Clement Greenberg and Dwight Macdonald, more or less influenced by the German neo-Marxist school of Adorno-Horkheimer, there were beginnings at a theory of “mass culture,” that mass-produced pseudo-art characteristic of industrialized urban society, together with its paralyzed audiences, its inaccessible sources, its parasitic relation to high culture. More insight than system and more intuition than knowledge, this slender body of work, which appeared mostly in Politics and COMMENTARY, was nevertheless a contribution to the study of that hazy area where culture and society meet. It was attacked by writers like Edward Shils as being haughtily elitist, on the ground that it assumed a condescension to the tastes and experiences of the masses. It was attacked by writers like Harold Rosenberg, who charged that only people taking a surreptitious pleasure in dipping their noses into trash would study the “content” (he had no objection to sociological investigations) of mass culture. Even at its most penetrating, the criticism of mass culture was beset by uncertainty and improvisation; perhaps all necessary for a beginning.
Then, almost as if by common decision, the whole subject was dropped. For years hardly a word could be found in the advanced journals about what a little earlier had been called a crucial problem of the modern era. One reason was that the theory advanced by Greenberg and Macdonald turned out to be static: it could be stated but apparently not developed. It suffered from weaknesses parallel to those of Hannah Arendt's theory of totalitarianism: by positing a cul de sac, a virtual end of days, for 20th-century man and his culture it proposed a suffocating relationship between high or minority culture and the ever-multiplying mass culture. From this relationship there seemed neither relief nor escape, and if one accepted this view, nothing remained but to refine the theory and keep adding grisly instances.
In the absence of more complex speculations, there was little point in continuing to write about mass culture. Besides, hostility toward the commercial pseudo-arts was hard to maintain with unyielding intensity, mostly because it was hard to remain all that interested in them—only in Macdonald's essays did both hostility and interest survive intact. Some felt that the whole matter had been inflated and that writers should stick to their business, which was literature, and intellectuals to theirs, which was ideas. Others felt that the movies and TV were beginning to show more ingenuity and resourcefulness than the severe notions advanced by Greenberg and Macdonald allowed for—though no one could have anticipated that glorious infatuation with trash which Marshall McLuhan would make acceptable. And still others felt that the multiplication of insights, even if pleasing as an exercise, failed to yield significant results: a critic who contributes a nuance to Dostoevsky criticism is working within a structured tradition, while one who throws off a clever observation about Little Orphan Annie is simply showing that he can do what he has done.
There was another and more political reason for the collapse of mass culture criticism. One incentive toward this kind of writing was the feeling that industrial society had reached a point of affluent stasis where no major upheavals could be expected and where change could now be registered much more vividly in culture than in economics. While aware of the dangers of reductionism here, I think the criticism of mass culture did serve, as some of its critics charged, conveniently to replace the criticism of bourgeois society. If you couldn't stir the proletariat to action, you could denounce Madison Avenue in comfort. Once, however, it began to be felt among intellectuals in the 50's that there was no longer so overwhelming a need for political criticism, and then, at the other pole, once it began to seem in the 60's that there were new openings for political criticism, the appetite for cultural surrogates became less keen.
Greenberg now said little more about mass culture; Macdonald made no serious effort to extend his theory or test it against new events; and in recent years, younger writers have seemed to feel that the whole approach of these men was heavy and humorless. An influential critic like Susan Sontag has proposed a cheerfully eclectic view which undercuts just about everything written from the Greenberg-Macdonald position. Now everyone is to do “his thing,” high, middle, or low; the old puritan habit of interpretation and judgment, so inimical to sensuousness, gives way to a programmed receptivity; and thus we are enlightened by lengthy studies of the ethos of the Beatles.
By the end of the Second World War, the New York writers had reached a point of severe intellectual crisis, though as frequently happens at such moments, they themselves often felt they were entering a phase of enlarged influence and power. Perhaps indeed there was a relation between inner crisis and external influence. Everything that had kept them going—the idea of socialism, the advocacy of literary modernism, the assault on mass culture, a special brand of literary criticism—was judged to be irrelevant to the postwar years. But as a group, just at the time their internal disintegration had seriously begun, the New York writers could be readily identified. The leading critics were Rahv, Phillips, Trilling, Rosenberg, Abel, and Kazin. The main political theorist was Hook. Writers of poetry and fiction related to the New York milieu were Delmore Schwartz, Saul Bellow, Paul Goodman, and Isaac Rosenfeld. And the recognized scholar, as also inspiring moral force, was Meyer Schapiro.
A sharp turn occurs, or is completed, soon after the Second World War. The intellectuals now go racing or stumbling from idea to idea, notion to notion, hope to hope, fashion to fashion. This instability often derives from a genuine eagerness to capture all that seems new—or threatening—in experience, sometimes from a mere desire to capture a bitch-goddess whose first name is Novelty. The abandonment of ideology can be liberating: a number of talents, thrown back on their own resources, begin to grow. The surrender of “commitment” can be damaging: some writers find themselves rattling about in a gray and chilly freedom. The culture opens up, with both temptation and generosity, and together with intellectual anxieties there are public rewards, often deserved. A period of dispersion; extreme oscillations in thought; and a turn in politics toward an increasingly conservative kind of liberalism—reflective, subtle, acquiescent.
The postwar years were marked by a sustained discussion of the new political and intellectual problems raised by the totalitarian state. Nothing in received political systems, neither Marxist nor liberal, adequately prepared one for the frightful mixture of terror and ideology, the capacity to sweep along the plebeian masses and organize a warfare state, and above all the readiness to destroy entire peoples, which characterized totalitarianism. Still less was anyone prepared—who had heeded the warning voices of the Russian socialist Martov or the English liberal Russell?—for the transformation of the revolutionary Bolshevik state, either through a “necessary” degeneration or an internal counterrevolution, into one of the major totalitarian powers. Marxist theories of fascism—the “last stage” of capitalism, with the economy statified to organize a permanent war machine and mass terror employed to put down rebellious workers—came to seem, if not entirely mistaken, then certainly insufficient. The quasi- or pseudo-Leninist notion that “bourgeois democracy” was merely a veiled form of capitalist domination, no different in principle from its open dictatorship, proved to be a moral and political disaster. The assumption that socialism was an ordained “next step,” or that nationalization of industry constituted a sufficient basis for working-class rule, was as great a disaster. No wonder intellectual certainties were shattered and these years marked by frenetic improvisation! At every point, with the growth of Communist power in Europe and with the manufacture of the Bomb at home, apocalypse seemed the name of tomorrow.
So much foolishness and malice has been written about the New York intellectuals and their anti-Communism, either by those who have signed a separate peace with the authoritarian idea or those who lack the courage to defend what is defensible in their own past, that I want here to be both blunt and unyielding.
Given the enormous growth of Russian power after the Second World War and the real possibility of a Communist takeover in Europe, the intellectuals—and not they alone—had to reconsider their political responses.5 An old-style Marxist declaration of rectitude, a plague repeated on both their houses? Or the difficult position of making foreign policy proposals for the United States, while maintaining criticism of its social order, so as to block totalitarian expansion without resort to war? Most intellectuals decided they had to choose the second course, and as far as that goes, I think they were right.
Like anti-capitalism, anti-Communism was a tricky politics, all too open to easy distortion. Like anti-capitalism, anti-Communism could be put to the service of ideological racketeering and reaction. Just as ideologues of the fanatic Right insisted that by some ineluctable logic anti-capitalism led to a Stalinist terror, so ideologues of the authoritarian Left, commandeering the same logic, declared that anti-Communism led to the politics of Dulles and Rusk. There is, of course, no “anti-capitalism” or “anti-Communism” in the abstract; these take on political flesh only when linked with a larger body of programs and values, so that it becomes clear what kind of “anti-capitalism” or “anti-Communism” we are dealing with. It is absurd, and indeed disreputable, for intellectuals in the 60's to write as if there were a unified “anti-Communism” which can be used to enclose the views of everyone from William Buckley to Michael Harrington.
There were difficulties. A position could be worked out for conditional support of the West when it defended Berlin or introduced the Marshall Plan or provided economic help to underdeveloped countries; but in the course of daily politics, in the effort to influence the foreign policy of what remained a capitalist power, intellectuals could lose their independence and slip into vulgarities of analysis and speech.
Painful choices had to be faced. When the Hungarian revolution broke out in 1956, most intellectuals sympathized strongly with the rebels yet feared that active intervention by the West might provoke a world war. For a rational and humane mind, anti-Communism could not be the sole motive, it could only be one of several, in political behavior and policy; and even those intellectuals who had by now swung a considerable distance to the Right did not advocate military intervention in Hungary. There was simply no way out—as, just recently, there was none in Czechoslovakia.
It became clear, furthermore, that U.S. military intervention in underdeveloped countries could help local reactionaries in the short run, and the Communists in the long run. These difficulties were inherent in postwar politics, and they ruled out—though for that very reason, also made tempting—a simplistic moralism. These difficulties were also exacerbated by the spread among intellectuals of a crude anti-Communism, often ready to justify whatever the U.S. might do at home and abroad. For a hard-line group within the American Committee for Cultural Freedom, all that seemed to matter in any strongly-felt way was a sour hatred of the Stalinists, historically justifiable but more and more a political liability even in the fight against Stalinism. The dangers in such a politics now seem all too obvious, but I should note, for whatever we may mean by the record, that in the early 50's they were already being pointed out by a mostly unheeded minority of intellectuals around Dissent. Yet, with all these qualifications registered, the criticism to be launched against the New York intellectuals in the postwar years is not that they were strongly anti-Communist but rather that many of them, through disorientation or insensibility, allowed their anti-Communism to become something cheap and illiberal.
Nor is the main point of moral criticism that the intellectuals abandoned socialism. We have no reason to suppose that the declaration of a socialist opinion induces a greater humaneness than does acquiescence in liberalism. It could be argued (I would) that in the ease with which ideas of socialism were now brushed aside there was something shabby. It was undignified, at the very least, for people who had made so much of their Marxist credentials now to put to rest so impatiently the radicalism of their youth. Still, it might be said by some of the New York writers that reality itself had forced them to conclude socialism was no longer viable or had become irrelevant to the American scene, and that while this conclusion might be open to political argument, it was not to moral attack.
Let us grant that for a moment. What cannot be granted is that the shift in ideologies required or warranted the surrender of critical independence which was prevalent during the 50's. In the trauma—or relief—of ideological ricochet, all too many intellectuals joined the American celebration. It was possible, to cite but one of many instances, for Mary McCarthy to write: “Class barriers disappear or tend to become porous [in the U.S.]; the factory worker is an economic aristocrat in comparison with the middle-class clerk. . . . The America . . . of vast inequalities and dramatic contrasts is rapidly ceasing to exist”6 (emphasis added). Because the New York writers all but surrendered their critical perspective on American society—that is why they were now open to attack.7
It was the growth of McCarthyism which brought most sharply into question the role of the intellectuals. Here, presumably, all men of good will could agree; here the interests of the intellectuals were beyond dispute and directly at stake. The record is not glorious. In New York circles it was often said that Bertrand Russell exaggerated wildly in describing the U.S. as “subject to a reign of terror” and that Simone de Beauvoir retained Stalinist clichés in her reportage from America. Yet it should not be forgotten that, if not “a reign of terror,” McCarthyism was frightful and disgusting, and that a number of Communists and fellow-travelers, not always carefully specified, suffered serious harm.
A magazine like Partisan Review was of course opposed to McCarthy's campaign, but it failed to take the lead on the issue of freedom which might once again have imbued the intellectuals with fighting spirit. Unlike some of its New York counterparts, it did print sharp attacks on the drift toward conservatism, and it did not try to minimize the badness of the situation in the name of anti-Communism. But the magazine failed to speak out with enough force and persistence, or to break past the hedgings of those intellectuals who led the American Committee for Cultural Freedom.
COMMENTARY, under Elliot Cohen's editorship, was still more inclined to minimize the threat of McCarthyism. In September 1952, at the very moment McCarthy became a central issue in the Presidential campaign, Cohen could write: “McCarthy remains in the popular mind an unreliable, second-string blowhard; his only support as a great national figure is from the fascinated fears of the intelligentsia”—a mode of argument all too close to that of the anti-anti-Communists who kept repeating that Communism was a serious problem only in the minds of anti-Communists.
In the American Committee for Cultural Freedom the increasingly conformist and conservative impulses of the New York intellectuals, or at least of a good number of them, found formal expression. I quote at length from Michael Harrington in a 1955 Dissent, first because it says precisely what needs to be said and second because it has the value of contemporary evidence:
In practice the ACCF has fallen behind Sidney Hook's views on civil liberties. Without implying any “conspiracy” theory of history . . . one may safely say that it is Hook who has molded the decisive ACCF policies. His Heresy Yes, Conspiracy No articles were widely circulated by the Committee, which meant that in effect it endorsed his systematic, explicit efforts to minimize the threat to civil liberties and to attack those European intellectuals who, whatever their own political or intellectual deficiencies, took a dim view of American developments. Under the guidance of Hook and the leadership of Irving Kristol, who supported Hook's general outlook, the American Committee cast its weight not so much in defense of these civil liberties which were steadily being nibbled away, but rather against those few remaining fellow-travelers who tried to exploit the civil-liberties issue.
At times this had an almost comic aspect. When Irving Kristol was executive secretary of the ACCF, one learned to expect from him silence on those issues that were agitating the whole intellectual and academic world, and enraged communiqués on the outrages performed by people like Arthur Miller and Bertrand Russell in exaggerating the dangers to civil liberties in the U.S.
Inevitably this led to more serious problems. In an article by Kristol, which first appeared in COMMENTARY and was later circulated under the ACCF imprimatur, one could read such astonishing and appalling statements as “there is one thing the American people know about Senator McCarthy; he, like them, is unequivocally anti-Communist. About the spokesmen for American liberalism, they feel they know no such thing. And with some justification.” This in the name of defending cultural freedom!
Harrington then proceeded to list several instances in which the ACCF had “acted within the United States in defense of freedom.” But
these activities do not absorb the main attention or interest of the Committee; its leadership is too jaded, too imbued with the sourness of indiscriminate anti-Stalinism to give itself to an active struggle against the dominant trend of contemporary intellectual life in America. What it really cares about is a struggle against fellow-travelers and “neutralists”—that is, against many European intellectuals. . . .
One of the crippling assumptions of the Committee has been that it would not intervene in cases where Stalinists or accused Stalinists were involved. It has rested this position on the academic argument . . . that Stalinists, being enemies of democracy, have no “right” to democratic privileges. . . . But the actual problem is not the metaphysical one of whether enemies of democracy (as the Stalinists clearly are) have a “right” to democratic privileges. What matters is that the drive against cultural freedom and civil liberties takes on the guise of anti-Stalinism.
Years later came the revelations that the Congress for Cultural Freedom, which had its headquarters in Paris and with which the American Committee was for a time affiliated, had received secret funds from the CIA. Some of the people, it turned out, with whom one had sincerely disagreed were not free men at all; they were knowing accomplices of an intelligence service. What a sad denouement! And yet not the heart of the matter, as the malicious Ramparts journalists have tried to make out. Most of the intellectuals who belonged to the ACCF seem not to have had any knowledge of the CIA connection—on this, as on anything else, I would completely accept the word of Dwight Macdonald. It is also true, however, that these intellectuals seem not to have inquired very closely into the Congress's sources of support. That a few, deceiving their closest associates, established connections with the CIA was not nearly so important, however, as that a majority within the Committee acquiesced to a politics of acquiescence. We Americans have a strong taste for conspiracy theories, supposing that if you scratch a trouble you'll find a villain. But history is far more complicated, and squalid as the CIA tie was, it should not be used to smear honest people who had nothing to do with secret services even as they remain open to criticism for what they did say and do.
At the same time, the retrospective defenses offered by some New York intellectuals strike me as decidedly lame. Meetings and magazines sponsored by the Congress, Daniel Bell has said, kept their intellectual freedom and contained criticism of U.S. policy—true but hardly to the point, since the issue at stake is not the opinions the Congress tolerated but the larger problem of good faith in intellectual life. The leadership of the Congress did not give its own supporters the opportunity to choose whether they wished to belong to a CIA-financed group. Another defense, this one offered by Sidney Hook, is that private backing was hard to find during the years it was essential to publish journals like Preuves and Encounter in Europe. Simply as a matter of fact, I do not believe this. For the Congress to have raised its funds openly, from non-governmental sources, would have meant discomfort, scrounging, penny-pinching: all the irksome things editors of little magazines have always had to do. By the postwar years, however, leading figures of both the Congress and the Committee no longer thought or behaved in that tradition.
Dwight Macdonald did. His magazine Politics was the one significant effort during the late 40's to return to radicalism. Enlivened by Macdonald's ingratiating personality and his table-hopping mind, Politics brought together sophisticated muckraking with torturous revaluations of Marxist ideology. Macdonald could not long keep in balance the competing interests which finally tore apart his magazine: lively commentary on current affairs and unavoidable if depressing retrospects on the failure of the Left. As always with Macdonald, honesty won out (one almost adds, alas) and the “inside” political discussion reached its climax with his essay The Root Is Man, in which he arrived at a kind of anarcho-pacifism based on an absolutist morality. This essay was in many ways the most poignant and authentic expression of the plight of those few intellectuals—Nicola Chiaromonte, Paul Goodman, Macdonald—who wished to dissociate themselves from the postwar turn to Realpolitik but could not find ways of transforming sentiments of rectitude and visions of Utopia into a workable politics. It was also a perfect leftist rationale for a kind of internal emigration of spirit and mind, with some odd shadings of similarity to the Salinger cult of the late 50's.8
The overwhelming intellectual drift, however, was toward the Right. Arthur Schlesinger Jr., with moony glances at Kierkegaard, wrote essays in which he maintained that American society had all but taken care of its economic problems and could now concentrate on raising its cultural level. The “end of ideology” became a favorite shield for intellectuals in retreat, though it was never entirely clear whether this phrase meant the end of “our” ideology (partly true) or that all ideologies were soon to disintegrate (not true) or that the time had come to abandon the nostalgia for ideology (at least debatable). And in the mid 50's, as if to codify things, there appeared in Partisan Review a symposium, “Our Country and Our Culture,” in which all but three or four of the thirty participants clearly moved away from their earlier radical views. The rapprochement with “America the Beautiful,” as Mary McCarthy now called it in a tone not wholly ironic, seemed almost complete.
In these years there also began that series of gyrations in opinion, interest, and outlook—so frenetic, so unserious—which would mark our intellectual life. In place of the avant-garde idea we now had the style of fashion, though to suggest a mere replacement may be too simple, since as Poggioli remarks, fashion has often shadowed the avant garde as a kind of dandified double. Some intellectuals turned to a weekend of religion, some to a semester of existentialism,9 some to a holiday of Jewishness without faith or knowledge, some to a season of genteel conservatism. Leslie Fiedler, no doubt by design, seemed to go through more of such episodes than anyone else: even his admirers could not always be certain whether he was davenning or doing a rain dance.
These twists and turns were lively, and they could all seem harmless if only one could learn to look upon intellectual life as a variety of play, like potsie or king of the hill. What struck one as troubling, however, was not this or that fashion (tomorrow morning would bring another), but the dynamic of fashion itself, the ruthlessness with which, to remain in fashion, fashion had to keep devouring itself.
It would be unfair to give the impression that the fifteen years after the war were without significant growth or achievement among the New York writers. The attempt of recent New Left ideologues to present the 40's and 50's as if they were no more than a time of intellectual sterility and reaction is an oversimplification. Together with the turn toward conservative acquiescence, there were serious and valuable achievements. Hannah Arendt's book on totalitarianism may now seem open to many criticisms, but it certainly must rank as a major piece of work which, at the very least, made impossible—I mean, implausible—those theories of totalitarianism which, before and after she wrote, tended to reduce fascism and Stalinism to a matter of class rule or economic interest. Daniel Bell's writing contributed to the rightward turn of these years, but some of it, such as his excellent little book Work and Its Discontents, constitutes a permanent contribution, and one that is valuable for radicals too. The stress upon complexity of thought which characterized intellectual life during these years could be used as a rationale for conservatism, and perhaps even arose from the turn toward conservatism; but in truth, the lapsed radicalism of earlier years had proved to be simplistic, the world of late capitalism was perplexing, and for serious people complexity is a positive value. Even the few intellectuals who resisted the dominant temper of the 50's underwent during these years significant changes in their political outlooks and styles of thought: e.g., those around Dissent who cut whatever ties of sentiment still held them to the Bolshevik tradition and made the indissoluble connection between democracy and socialism a crux of their thought. Much that happened during these years is to be deplored and dismissed, but not all was waste; the increasing sophistication and complication of mind was a genuine gain, and it would be absurd, at this late date, to forgo it.
In literary criticism there were equivalent achievements. The very instability that might make a shambles out of political thought could have the effect of magnifying the powers required for criticism. Floundering in life and uncertainty in thought could make for an increased responsiveness to art. In the criticism of men like Trilling, Rahv, Chase, and Dupee there was now a more authoritative relation to the literary text and a richer awareness of the cultural past than was likely to be found in their earlier work. And a useful tension was also set up between the New York critics, whose instinctive response to literature was through a social-moral contextualism, and the New Critics, whose formalism may have been too rigid yet proved of great value to those who opposed it.
Meanwhile, the world seemed to be opening up, with all its charms, seductions, and falsities. In the 30's the life of the New York writers had been confined: the little magazine as island, the radical sect as cave. Partly they were recapitulating the pattern of immigrant Jewish experience: an ingathering of the flock in order to break out into the world and taste the Gentile fruits of status and success. Once it became clear that waiting for the revolution might turn out to be steady work and that the United States would neither veer to fascism nor sink into depression, the intellectuals had little choice but to live within (which didn't necessarily mean, become partisans of) the existing society.
There was money to be had from publishers, no great amounts but more than in the past. There were jobs in the universities, even for those without degrees. Some writers began to discover that publishing a story in The New Yorker or Esquire was not a sure ticket to Satan; others to see that the academy, while perhaps less exciting than the Village, wasn't invariably a graveyard for intellect and might even provide the only harbor in which serious people could do their own writing and perform honorable work. This dispersion involved losses, but usually there was nothing sinister about it—unless one clung, past an appropriate age, to the fantasy of being a momentarily unemployed “professional revolutionist.” Writers ought to know something about the world; they ought to test their notions against the reality of the country in which they live. Worldly involvements would, of course, bring risks, and one of these was power, really a very trifling kind of power, but still enough to raise the fear of corruption. That power corrupts everyone knows by now, but we ought also to recognize that powerlessness, if not corrupting, can be damaging—as in the case of Paul Goodman, a very courageous writer who stuck to his anarchist beliefs through years in which he was mocked and all but excluded from the New York journals, yet who could also come to seem, in his very rectitude, an example of asphyxiating righteousness.
What brought about these changes? Partly ideological adaptation, a feeling that capitalist society was here to stay and there wasn't much point in maintaining a radical position or posture. Partly the sly workings of prosperity. But also a loosening of the society itself, the start of that process which only now is in full swing—I mean the remarkable absorptiveness of modern society, its readiness to abandon traditional precepts for a moment of excitement, its growing permissiveness toward social criticism, perhaps out of indifference, or security, or even tolerance.
In the 60's well-placed young professors and radical students would denounce the “success,” sometimes the “sellout” of the New York writers. Their attitude reminds one a little of George Orwell's remark about wartime France: only a Pétain could afford the luxury of asceticism, ordinary people had to live by the necessities of materialism. But really, when you come to think of it, what did this “success” of the intellectuals amount to? A decent or a good job, a chance to earn extra money by working hard, and in the case of a few, like Trilling and Kazin, some fame beyond New York—rewards most European intellectuals would take for granted, so paltry would they seem. For the New York writers who lived through the 30's expecting never to have a job at all, a regular pay check might be remarkable; but in the American scale of things it was very modest indeed. And what the “leftist” prigs of the 60's, sons of psychiatrists and manufacturers, failed to understand—or perhaps understood only too well—was that the “success” with which they kept scaring themselves was simply one of the possibilities of adult life, a possibility, like failure, heavy with moral risks and disappointment. Could they imagine that they too might have to face the common lot?—I mean the whole business: debts, overwork, varicose veins, alimony, drinking, quarrels, hemorrhoids, depletion, the recognition that one might not prove to be another T.S. Eliot, but also some good things, some lessons learned, some “rags of time” salvaged and precious.
Here and there you could find petty greed or huckstering, now and again a drop into opportunism; but to make much of this would be foolish. Common clay, the New York writers had their share of common ambition. What drove them, and sometimes drove them crazy, was not, however, the quest for money, nor even a chance to “mix” with White House residents; it was finally, when all the trivia of existence was brushed aside, a gnawing ambition to write something, even three pages, that might live.
The intellectuals should have regarded their entry into the outer world as utterly commonplace, at least if they kept faith with the warning of Stendhal and Balzac that one must always hold a portion of the self forever beyond the world's reach. Few of the New York intellectuals made much money on books and articles. Few reached audiences beyond the little magazines. Few approached any centers of power, and precisely the buzz of gossip attending the one or two sometimes invited to a party beyond the well-surveyed limits of the West Side showed how confined their life still was. What seems most remarkable in retrospect is the innocence behind the assumption, sometimes held by the New York writers themselves with a nervous mixture of guilt and glee, that whatever recognition they won was cause for either preening or embarrassment. For all their gloss of sophistication, they had not really moved very far into the world. The immigrant milk was still on their lips.
In their published work during these years, the New York intellectuals developed a characteristic style of exposition and polemic. With some admiration and a bit of irony, let us call it the style of brilliance. The kind of essay they wrote was likely to be wide-ranging in reference, melding notions about literature and politics, sometimes announcing itself as a study of a writer or literary group but usually taut with a pressure to “go beyond” its subject, toward some encompassing moral or social observation. It is a kind of writing highly self-conscious in mode, with an unashamed vibration of bravura and display. Nervous, strewn with knotty or flashy phrases, impatient with transitions and other concessions to dullness, willfully calling attention to itself as a form or at least an outcry, fond of rapid twists, taking pleasure in dispute, dialectic, dazzle—such, at its best or most noticeable, was the essay cultivated by the New York writers. Until recently its strategy of exposition was likely to be impersonal (the writer did not speak much as an “I”) but its tone and bearing were likely to be intensely personal (the audience was to be made aware that the aim of the piece was not judiciousness but rather a strong impress of attitude, a blow of novelty, a wrenching of accepted opinion, sometimes a mere indulgence of vanity).
In most of these essays there was a sense of tournament, the writer as gymnast with one eye on other rings, or as skilled infighter juggling knives of dialectic. Polemics were harsh, often rude. And audiences nurtured, or spoiled, on this kind of performance, learned not to form settled judgments about a dispute until all sides had registered their blows: surprise was always a possible reward.
This style may have brought new life to the American essay, but in contemporary audiences it often evoked a strong distaste and even fear. “Ordinary” readers could be left with the fretful sense that they were not “in,” the beauties of polemic racing past their sluggish eye. Old-line academics, quite as if they had just crawled out of The Dunciad, enjoyed dismissing the New York critics as “unsound.” And for some younger souls, the cliffs of dialectic seemed too steep. Seymour Krim has left a poignant account of his disablement before “the overcerebral, Europeanish, sterilely citified, pretentiously alienated” New York intellectuals. Resentful at the fate which drove them to compare themselves with “the over-cerebral etc. etc.,” Krim writes that he and his friends “were often tortured and unappeasably bitter about being the offspring of this unhappily unique-ingrown-screwed-up breed.” Similar complaints could be heard from other writers and would-be writers who felt that New York intellectualism threatened their vital powers.
At its best the style of brilliance reflected a certain view of the intellectual life: free-lance dash, peacock strut, daring hypothesis, knockabout synthesis. For better or worse it was radically different from the accepted modes of scholarly publishing and middle-brow journalism. It celebrated the idea of the intellectual as anti-specialist, or as a writer whose speciality was the lack of a speciality: the writer as dilettante-connoisseur, Luftmensch of the mind, roamer among theories. But it was a style which also lent itself with peculiar ease to a stifling mimicry and decadence. Sometimes it seemed—no doubt mistakenly—as if any sophomore, indeed any parrot, could learn to write one of those scintillating Partisan reviews, so thoroughly could manner consume matter. In the 50's the cult of brilliance became a sign that writers were offering not their work or ideas but themselves, the persona as content; and this was but a step or two away from the exhibitionism of the 60's. Brilliance could become a sign of intellect unmoored: the less assurance, the more pyrotechnics. In making this judgment I ought to be frank enough to register the view that serious writers may prove to be brilliant and take pleasure in the proving, but insofar as they are serious, their overriding aim must be absolute lucidity.
If to the minor genre of the essay the New York writers made a major contribution, to the major genres of fiction and poetry they made only a minor contribution. As a literary group and no more than a literary group, they will seem less important than, say, the New Critics, who did set in motion a whole school of poetry. A few poets—Berryman, Lowell, Jarrell, perhaps Kunitz—have been influenced by the New York intellectuals, though in ways hard to specify and hardly comprising a major pressure on their work: all were finished writers by the time they brushed against the New York milieu. For one or two poets, the influence of New York meant becoming aware of the cultural pathos resident in the idea of the Jew (not always distinguished from the idea of Delmore Schwartz). But the main literary contribution of the New York milieu has been to legitimate a subject and tone we must uneasily call American Jewish writing. The fiction of urban malaise, second-generation complaint, Talmudic dazzle, woeful alienation, and dialectical irony, all found its earliest expression in the pages of COMMENTARY and Partisan Review—fiction in which the Jewish world is not merely regained in memory as a point of beginnings, an archetypal Lower East Side of spirit and place, but is also treated as a portentous metaphor of man's homelessness and wandering.
Such distinguished short fictions as Bellow's “Seize the Day,” Schwartz's “In Dreams Begin Responsibilities,” Mailer's “The Man Who Studied Yoga,” and Malamud's “The Magic Barrel” seem likely to survive the cultural moment in which they were written. And even if one concludes that these and similar pieces are not enough to warrant speaking of a major literary group, they certainly form a notable addition—a new tone, a new sensibility—to American writing. In time, these writers may be regarded as the last “regional” group in American literature, parallel to recent Southern writers in both sophistication of craft and a thematic dissociation from the values of American society. Nor is it important that during the last few decades both of these literary tendencies, the Southern and the Jewish, have been overvalued. The distance of but a few years has already made it clear that except for Faulkner, Southern writing consists of a scatter of talented minor poets and novelists; and in a decade or so a similar judgment may be commonly accepted about most of the Jewish writers—though in regard to Bellow and Mailer settled opinions are still risky.
What is clear from both Southern and Jewish writing is that in a society increasingly disturbed about its lack of self-definition, the recall of regional and traditional details can be intensely absorbing in its own right, as well as suggestive of larger themes transcending the region. (For the Jewish writers New York was not merely a place, it was a symbol, a burden, a stamp of history.) Yet the writers of neither school have thus far managed to move from their particular milieu to a grasp of the entire culture; the very strengths of their localism define their limitations; and especially is this true for the Jewish writers, in whose behalf critics have recently overreached themselves. The effort to transform a Jewishness without religious or ethnic content into an emblem of universal dismay can easily lapse into sentimentality.
Whatever the hopeful future of individual writers, the “school” of American Jewish writing is by now in an advanced state of decomposition: how else explain the attention it has lately enjoyed? Or the appearance of a generation of younger Jewish writers who, without authentic experience or memory to draw upon, manufacture fantasies about the lives of their grandfathers? Or the popularity of Isaac Bashevis Singer who, coming to the American literary scene precisely at the moment when writers composing in English had begun to exhaust the Jewish subject, could, by dazzling contrast, extend it endlessly backward in time and deeper in historical imagination?
Just as there appear today young Jewish intellectuals who no longer know what it is that as Jews they do not know, so in fiction the fading immigrant world offers a thinner and thinner yield to writers of fiction. It no longer presses on memory, people can now choose whether to care about it. We are almost at the end of a historic experience, and it now seems unlikely that there will have arisen in New York a literary school comparable to the best this country has had. Insofar as the New York intellectual atmosphere has affected writers like Schwartz, Rosenfeld, Bellow, Malamud, Mailer, Goodman, and Roth (some of these would hotly deny that it has), it seems to have been too brittle, too contentious, too insecure for major creative work. What cannot yet be estimated is the extent to which the styles and values of the New York world may have left a mark on the work of American writers who never came directly under its influence or have been staunchly hostile to all of its ways.
Thinking back upon intellectual life in the 40's and 50's, and especially the air of malaise that hung over it, I find myself turning to a theme as difficult to clarify as it is impossible to evade. And here, for a few paragraphs, let me drop the porous shield of impersonality and speak openly in the first person.
We were living directly after the Holocaust of the European Jews. We might scorn our origins; we might crush America with discoveries of ardor; we might change our names. But we knew that but for an accident of geography we might also now be bars of soap. At least some of us could not help feeling that in our earlier claims to have shaken off all ethnic distinctiveness there had been something false, something shaming. Our Jewishness might have no clear religious or national content, it might be helpless before the criticism of believers; but Jews we were, like it or not, and liked or not.
To recognize that we were living after one of the greatest and least explicable catastrophes of human history, and one for which we could not claim to have adequately prepared ourselves either as intellectuals or as human beings, brought a new rush of feelings, mostly unarticulated and hidden behind the scrim of consciousness. It brought a low-charged but nagging guilt, a quiet remorse. Sartre's brilliant essay on authentic and inauthentic Jews left a strong mark. Hannah Arendt's book on totalitarianism had an equally strong impact, mostly because it offered a coherent theory, or at least a coherent picture, of the concentration camp universe. We could no longer escape the conviction that, blessing or curse, Jewishness was an integral part of our life, even if—and perhaps just because—there was nothing we could do or say about it. Despite a few simulated seders and literary raids on Hasidism, we could not turn back to the synagogue; we could only express our irritation with “the community” which kept nagging us like disappointed mothers; and sometimes we tried, through imagination and recall, to put together a few bits and pieces of the world of our fathers. I cannot prove a connection between the Holocaust and the turn to Jewish themes in American fiction, at first urgent and quizzical, later fashionable and manipulative. I cannot prove that my own turn to Yiddish literature during the 50's was due to the shock following the war years. But it would be foolish to scant the possibility.
The violent dispute which broke out among the New York intellectuals when Hannah Arendt published her book on Eichmann had as one of its causes a sense of guilt concerning the Jewish tragedy—a guilt pervasive, unmanageable, yet seldom declared at the surface of speech or act. In the quarrel between those attacking and those defending Eichmann in Jerusalem there were polemical excesses on both sides, insofar as both were acting out of unacknowledged passions. Yet even in the debris of this quarrel there was, I think, something good. At least everyone was acknowledging emotions that had long gone unused. Nowhere else in American academic and intellectual life was there such ferocity of concern with the problems raised by Hannah Arendt. If left to the rest of the American intellectual world, her book would have been praised as “stimulating” and “thoughtful,” and then everyone would have gone back to sleep. Nowhere else in the country could there have been the kind of public forum sponsored on this subject by Dissent: a debate sometimes ugly and outrageous, yet also urgent and afire—evidence that in behalf of ideas we were still ready to risk personal relationships. After all, it had never been dignity that we could claim as our strong point.
Nothing about the New York writers is more remarkable than the sheer fact of their survival. In a country where tastes in culture change more rapidly than lengths of skirts, they have succeeded in maintaining a degree of influence, as well as a distinctive milieu, for more than thirty years. Apart from reasons intrinsic to the intellectual life, let me note a few that are somewhat more worldly in nature.
- There is something, perhaps a quasi-religious dynamism, about an ideology, even a lapsed ideology that everyone says has reached its end, which yields force and coherence to those who have closely experienced it. A lapsed Catholic has tactical advantages in his apostasy which a lifelong skeptic does not have. And just as Christianity kept many 19th-century writers going long after they had discarded religion, so Marxism gave bite and edge to the work of 20th-century writers long after they had turned from socialism.
- The years in which the New York writers gained some prominence were those in which the style at which they had arrived—irony, ambiguity, complexity, the problematic as mode of knowledge—took on a magnified appeal for the American educated classes. After the Second World War the cultivation of private sensibility and personal responsibility were values enormously popular among reflective people, to whom the very thought of public life smacked of betrayal and vulgarity.
- An intelligentsia flourishes in a capital: Paris, St. Petersburg, Berlin. The influence of the New York writers grew at the time New York itself, for better or worse, became the cultural center of the country. And thereby, to return to Poggioli's categories, the New York writers slowly shed the characteristics of an intelligentsia and transformed themselves into—an Establishment?
Perhaps. But what precisely is an Establishment? Vaguely sinister in its overtones, the term is used these days with gay abandon on the American campus; but except as a spread-eagle put-down it has no discernible meaning, and if accepted as a put-down, the problem then becomes to discover who, if anyone, is not in the Establishment. In England the term has had a certain clarity of usage, referring to an intellectual elite which derives from the same upper and middle classes as the men who wield political power and which shares with these men Oxbridge education and Bloomsbury culture. But except in F. R. Leavis's angrier tirades, “Establishment” does not bear the conspiratorial overtones we are inclined to credit in this country. What it does in England is to locate the social-cultural stratum guiding the tastes of the classes in power and thereby crucially affecting the tastes of the country as a whole.
In this sense, neither the New York writers nor any other group can be said to comprise an American Establishment, simply because no one in this country has ever commanded an equivalent amount of cultural power. The New York writers have few, if any, connections with a stable class of upper-rank civil servants or with a significant segment of the rich. They are notably without connections in Washington. They do not shape official or dominant tastes. And they cannot exert the kind of control over cultural opinion that the London Establishment is said to have maintained until recently. Critics like Trilling and Kazin are listened to by people in publishing, Rosenberg and Greenberg by people in the art world; but this hardly constitutes anything so formidable as an Establishment. Indeed, at the very time mutterings have been heard about a New York literary Establishment, there has occurred a rapid disintegration of whatever group ties may still have remained among the New York writers. They lack—and it is just as well—the first requirement for an Establishment: that firm sense of internal discipline which enables it to impose its values and tastes on a large public.
During the last few years the talk about a New York Establishment has taken an extremely unpleasant turn. Whoever does a bit of lecturing about the country is likely to encounter, after a few drinks, literary academics who inquire enviously, sometimes spitefully, about “what's new in New York.” Such people seem to feel that exile in outlying regions means they are missing something remarkable (and so they are: the Balanchine company). The cause of their cultural envy is, I think, a notion that has become prevalent in our English departments that scholarship is somehow unworthy and the “real” literary life is to be found in the periodical journalism of New York. Intrinsically this is a dubious notion and, for the future of American education, a disastrous one; when directed against the New York writers it leads to some painful situations. As polite needling questions are asked about the cultural life of New York, a rise of sweat comes to one's brow, for everyone knows what no one says: New York means Jews.10
Whatever the duration or extent of the influence enjoyed by the New York intellectuals, it is now reaching an end. There are signs of internal disarray: unhealed wounds, a dispersal of interests, the damage of time. More important, however, is the appearance these last few years of a new and powerful challenge to the New York writers. And here I shall have to go off on what may appear to be a long digression, since one cannot understand the present situation of the New York writers without taking into detailed account the cultural-political scene of America in the 60's.
There is a rising younger generation of intellectuals: ambitious, self-assured, at ease with prosperity while conspicuously alienated, unmarred by the traumas of the totalitarian age, bored with memories of defeat, and attracted to the idea of power. This generation matters, thus far, not so much for its leading figures and their meager accomplishments, but for the political-cultural style—what I shall call the new sensibility—it thrusts into absolute opposition both to the New York writers and to other groups. It claims not to seek penetration into, or accommodation with, our cultural and academic institutions; it fancies the prospect of a harsh generational fight; and given the premise with which it begins—that everything touched by older men reeks of betrayal—its claims and fancies have a sort of propriety. It proposes a revolution, I would call it a counterrevolution, in sensibility. Though linked to New Left politics, it goes beyond any politics, making itself felt, like a spreading blot of anti-intellectualism, in every area of intellectual life. Not yet fully cohered, this new cultural group cannot yet be fully defined, nor is it possible fully to describe its projected sensibility, since it declares itself through a refusal of both coherence and definition.
There is no need to discuss once more the strengths and weaknesses of the New Left, its moral energies and intellectual muddles. Nor need we be concerned with the tactical issues separating New Left politics from that of older left-wing intellectuals. Were nothing else at stake than, say, “coalition politics,” the differences would be both temporary and tolerable. But in reality a deeper divergence of outlook has begun to show itself. The new intellectual style, insofar as it approximates a politics, mixes sentiments of anarchism with apologies for authoritarianism; bubbling hopes for “participatory democracy” with manipulative elitism; unqualified populist majoritarianism with the reign of the cadres.
A confrontation of intellectual outlooks is unavoidable. And a central issue is certain to be the problem of liberalism, not liberalism as one or another version of current politics, nor even as a theory of power, but liberalism as a cast of mind, a structure of norms by means of which to humanize public life. For those of us who have lived through the age of totalitarianism and experienced the debacle of socialism, this conflict over liberal values is extremely painful. We have paid heavily for the lesson that democracy, even “bourgeois democracy,” is a precious human achievement, one that, far from being simply a mode of mass manipulation, has been wrested through decades of struggle by the labor, socialist, and liberal movements. To protect the values of liberal democracy, often against those who call themselves liberals, is an elementary task for the intellectuals as a social group.
Yet what I have just been saying, axiomatic as it may seem, has in the last few years aroused opposition, skepticism, open contempt among professors, students, and intellectuals. On the very crudest, though by no means unpopular, level, we find a vulgarization of an already vulgar Marxism. The notion that we live in a society that can be described as “liberal fascism” (a theoretic contribution from certain SDS leaders) isn't one that serious people can take seriously; but the fact that it is circulated in the academic community signifies a counterrevolution of the mind: a refusal of nuance and observation, a willed return to the kind of political primitivism which used to declare the distinctions of bourgeois rule—democratic, authoritarian, totalitarian—as slight in importance.
For the talk about “liberal fascism” men like Norman Mailer must bear a heavy responsibility, insofar as they have recklessly employed the term “totalitarian” as a descriptive for present-day American society. Having lived through the ghastliness of the Stalinist theory of “social fascism” (the grand-daddy of “liberal fascism”) I cannot suppose any literate person really accepts this kind of nonsense, yet I know that people can find it politically expedient to pretend that they do. It is, in Ernst Nolte's phrase, “a lie which the intellect sees for what it is but which is [felt to be] at one with the deeper motivations of life.”
There are sophisticated equivalents. One of these points to the failings and crises of democracy, concluding that the content of decision has been increasingly separated from the forms of decision-making. Another emphasizes the manipulation of the masses by communication media and declares them brainwashed victims incapable of rational choice and acquiescing in their own subjugation. A third decries the bureaucratic entanglements of the political process and favors some version, usually more sentiment than scheme, of direct plebiscitary rule. With varying intelligence, all point to acknowledged problems of democratic society; and there could be no urgent objection were these criticisms not linked with the premise that the troubles of democracy can be overcome by undercutting or bypassing representative institutions. Thus, it is quite true that the masses are manipulated, but to make that the crux of a political analysis is to lead into the notion that elections are mere “formalities” and majorities mere tokens of the inauthentic; what is needed, instead, is Marcuse's “educational dictatorship” (in which, I hope, at least some of the New York intellectuals would require the most prolonged reeducation). And in a similar vein, all proposals for obligatory or pressured “participation,” apart from violating the democratic right not to participate, have a way of discounting those representative institutions and limitations upon power which can alone provide a degree of safeguard for liberal norms.
Perhaps the most sophisticated and currently popular of anti-democratic notions is that advanced by Herbert Marcuse: his contempt for tolerance on the ground that it is a veil for subjection, a rationale for maintaining the status quo, and his consequent readiness to suppress “regressive” elements of the population lest they impede social “liberation.” About these theories, which succeed in salvaging the worst of Leninism, Henry David Aiken has neatly remarked: “Whether garden-variety liberties can survive the ministrations of such ‘liberating tolerance’ is not a question that greatly interests Marcuse.” Indeed not.
Such theories are no mere academic indulgence or sectarian irrelevance; they have been put to significant use on the American campus as rationalizations for schemes to break up meetings of political opponents and as the justification for imaginary coups d'état by tiny minorities of enraged intellectuals. How depressing that “men of the Left,” themselves so often victims of repression, should attack the values of tolerance and freedom.11
These differences concerning liberal norms run very deep and are certain to affect American intellectual life in the coming years; yet they do not quite get to the core of the matter. In the Kulturkampf now emerging there are issues more consequential than the political ones, issues that have to do with basic views concerning the nature of human life.
One of these has been with us for a long time, and trying now to put it into simple language, I feel a measure of uneasiness, as if it were bad form to violate the tradition of antinomianism in which we have all been raised.
What, for “emancipated” people, is the surviving role of moral imperatives, or at least moral recommendations? Do these retain for us a shred of sanctity or at least of coercive value? The question to which I am moving is not, of course, whether the moral life is desirable or men should try to live it; no, the question has to do with the provenance and defining conditions of the moral life. Do moral principles continue to signify insofar as they do not come into conflict with spontaneous impulses, and, more urgently still, can we conceive of moral principles retaining some validity if they do come into conflict with spontaneous impulses? Are we still to give credit to the idea, one of the few meeting-points between traditional Christianity and modern Freudianism, that there occurs and must occur a deep-seated clash between instinct and civilization, nature and nurture, or can we now, with a great sigh of collective relief, dismiss this as still another hangup, perhaps the supreme hangup, of Western civilization?
For more than 150 years there has been a line of Western thought, as also of sentiment in modern literature, which calls into question not one or another moral commandment or regulation, but the very idea of commandment and regulation; which insists that the ethic of control, like the ethic of work, should be regarded as spurious, a token of a centuries-long heritage of repression. Sometimes this view comes to us as a faint residue of Christian heresy, more recently as the blare of Nietzschean prophecy, and in our own day as a psychoanalytic gift.
Now, even those of us raised on the premise of revolt against received values, against the whole system of bourgeois constriction and anti-pleasure, did not—I suppose it had better be said outright—imagine ourselves to be exempt from the irksome necessity of regulation, even if we had managed to escape the reach of the commandments. Neither primitive Christians nor romantic naïfs, we did not suppose that we could entrust ourselves entirely to the beneficence of nature, or the signals of our bodies, as a sufficient guide to conduct. My very use of the word “conduct,” freighted as it is with normative associations, puts the brand of time on what I am saying.
By contrast, the emerging new sensibility rests on a vision of innocence: an innocence through lapse or will or recovery, an innocence through a refusal of our and perhaps any other culture, an innocence not even to be preceded by the withering away of the state, since in this view of things the state could wither away only if men learned so to be at ease with their desires that all need for regulation would fade. This is a vision of life beyond good and evil, not because these experiences or possibilities of experience have been confronted and transcended, but because the categories by which we try to designate them have been dismissed. There is no need to taste the apple: the apple brings health to those who know how to bite it; and, look more closely, there is no apple at all, it exists only in your sickened imagination.
The new sensibility posits a theory that might be called the psychology of unobstructed need: men should satisfy those needs which are theirs, organic to their bodies and psyches, and to do this they now must learn to discard or destroy all those obstructions, mostly the result of cultural neurosis, which keep them from satisfying their needs. This does not mean that the moral life is denied; it only means that in the moral economy costs need not be entered as a significant item. In the current vocabulary, it becomes a matter of everyone doing “his own thing,” and once all of us are allowed to do “his own thing,” a prospect of easing harmony unfolds. Sexuality is the ground of being, and vital sexuality the assurance of the moral life.
Whether this outlook is compatible with a high order of culture or a complex civilization I shall not discuss here; Freud thought they were not compatible, though that does not foreclose the matter. More immediately, and on a less exalted plane, one is troubled by the following problem: what if the needs and impulses of human beings clash, as they seem to do, and what if the transfer of energies from sexuality to sociality does not proceed with the anticipated abundance and smoothness? The new sensibility, as displayed in the writings of Norman Brown and Norman Mailer, falls back upon a curious analogue to laissez faire economics, Adam Smith's invisible hand, by means of which innumerable units in conflict with one another achieve a resultant of cooperation. Is there, however, much reason to suppose that this will prove more satisfactory in the economy of moral conduct than it has in the morality of economic relations?
Suppose that, after reading Mailer's “The White Negro,” my “thing” happens to be that, to “dare the unknown” (as Mailer puts it), I want to beat in the brains of an aging candystore keeper; or after reading LeRoi Jones, I should like to cut up a few Jews, whether or not they keep stores—how is anyone going to argue against the outpouring of my need? Who will declare himself its barrier? Against me, against my ideas it is possible to argue, but how, according to this new dispensation, can anyone argue against my need? Acting through violence I will at least have realized myself, for I will have entered (to quote Mailer) “a new relation with the police” and introduced “a dangerous element” into my life; thereby, too, I will have escaped the cell-block of regulation which keeps me from the free air of self-determination. And if you now object that this very escape may lead to brutality, you reveal yourself as hopelessly linked to imperfection and original sin. For why should anyone truly heeding his nature wish to kill or wound or do anything but love and make love? That certain spokesmen of the new sensibility seem to be boiling over with fantasies of blood, or at least suppose that a verbal indulgence in such fantasies is a therapy for the boredom in their souls, is a problem for dialecticians. And as for skeptics, what have they to offer but evidence from history, that European contamination?
When it is transposed to a cultural setting, this psychology—in earlier times it would have been called a moral psychology—provokes a series of disputes over “complexity” in literature. Certain older critics find much recent writing distasteful and tiresome because it fails to reach or grasp for that complexity which they regard as intrinsic to the human enterprise. More indulgent critics, not always younger, find the same kind of writing forceful, healthy, untangled. At first this seems like a problem in taste, a pardonable difference between those who like their poems and novels knotty and those who like them smooth; but soon it becomes clear that this clash arises from a meeting of incompatible world-outlooks. For if the psychology of unobstructed need is taken as a sufficient guide to life, it all but eliminates any need for complexity—or rather, the need for complexity comes to be seen as a mode of false consciousness, an evasion of true feelings, a psychic bureaucratism in which to trap the pure and the strong. If good sex signifies good feeling; good feeling, good being; good being, good action; and good action, a healthy polity, then we have come the long way round, past the Reichian way or the Lawrentian way, to an Emersonian romanticism minus Emerson's complicatedness of vision. The world snaps back into a system of burgeoning potentialities, waiting for free spirits to attach themselves to the richness of natural object and symbol—except that now the orgasmic blackout is to replace the Oversoul as the current through which pure transcendent energies will flow.
We are confronting, then, a new phase in our culture, which in motive and spring represents a wish to shake off the bleeding heritage of modernism and reinstate one of those periods of the collective naïf which seem endemic to American experience. The new sensibility is impatient with ideas. It is impatient with literary structures of complexity and coherence, only yesterday the catchwords of our criticism. It wants instead works of literature—though literature may be the wrong word—that will be as absolute as the sun, as unarguable as orgasm, and as delicious as a lollipop. It schemes to throw off the weight of nuance and ambiguity, legacies of high consciousness and tired blood. It is weary of the habit of reflection, the making of distinctions, the squareness of dialectic, the tarnished gold of inherited wisdom. It cares nothing for the haunted memories of old Jews. It has no taste for the ethical nail-biting of those writers of the Left who suffered defeat and could never again accept the narcotic of certainty. It is sick of those magnifications of irony that Mann gave us, sick of those visions of entrapment to which Kafka led us, sick of those shufflings of daily horror and grace that Joyce left us. It breathes contempt for rationality, impatience with mind, and a hostility to the artifices and decorums of high culture. It despises liberal values, liberal cautions, liberal virtues. It is bored with the past: for the past is a fink.
Where Marx and Freud were diggers of intellect, mining deeper and deeper into society and the psyche, and forever determined to strengthen the dominion of reason, today the favored direction of search is not inward but sideways, an “expansion of consciousness” through the kick of drugs. The new sensibility is drawn to images of sickness, but not, as with the modernist masters, out of dialectical canniness or religious blasphemy; it takes their denials literally and does not even know the complex desperations that led them to deny. It seeks to charge itself into dazzling sentience through chemicals and the rhetoric of violence. It gropes for sensations: the innocence of blue, the ejaculations of red. It ordains life's simplicity. It chooses surfaces as against relationships, the skim of texture rather than the weaving of pattern. Haunted by boredom, it transforms art into a sequence of shocks which, steadily magnified, yield fewer and fewer thrills, so that simply to maintain a modest frisson requires mounting exertions. It proposes an art as disposable as a paper dress, to which one need give nothing but a flicker of notice. Especially in the theater it resurrects tattered heresies, trying to collapse aesthetic distance in behalf of touch and frenzy. (But if illusion is now worn out, what remains but staging the realities of rape, fellatio, and murder?) Cutting itself off from a knowledge of what happened before the moment of its birth, it repeats with a delighted innocence most of what did in fact happen: expressionist drama reduced to skit, agit-prop tumbled to farce, Melvillean anguish slackened into black humor. It devalues the word, which is soaked up with too much past history, and favors monochromatic cartoons, companionate grunts, and glimpses of the ineffable in popular ditties. It has some humor, but not much wit. Of the tragic it knows next to nothing. Where Dostoevsky made nihilism seem sinister by painting it in jolly colors, the new American sensibility does something no other culture could have aspired to: it makes nihilism seem casual, good-natured, even innocent. No longer burdened by the idea of the problematic, it arms itself with the paraphernalia of post-industrial technique and crash-dives into a Typee of neo-primitivism.
Its high priests are Norman Brown, Herbert Marcuse, and Marshall McLuhan,12 all writers with a deeply conservative bias: all committed to a stasis of the given: the stasis of unmoving instinct, the stasis of unmovable society, the stasis of endlessly moving technology. Classics of the latest thing, these three figures lend the new sensibility an aura of profundity. Their prestige can be employed to suggest an organic link between cultural modernism and the new sensibility, though in reality their relation to modernism is no more than biographical.
Perhaps because it is new, some of the new style has its charms—mainly along the margins of social life, in dress, music, and slang. In that it captures the yearnings of a younger generation, the new style has more than charm: a vibration of moral desire, a desire for goodness of heart. Still, we had better not deceive ourselves. Some of those shiny-cheeked darlings adorned with flowers and tokens of love can also be campus enragés screaming “Up Against the Wall, Motherfuckers, This Is a Stickup” (a slogan that does not strike one as a notable improvement over “Workers of the World, Unite”).
That finally there should appear an impulse to shake off the burdens and entanglements of modernism need come as no surprise. After all the virtuosos of torment and enigma we have known, it would be fine to have a period in Western culture devoted to relaxed pleasures and surface hedonism. But so far this does not seem possible: the century forbids it. What strikes one most forcefully about a great deal of the new writing and theater is its grindingly ideological tone, even if now the claim is for an ideology of pleasure. And what strikes one even more is the air of pulsing ressentiment which pervades this work, an often unearned and seemingly inexplicable hostility. If one went by the cues of a critic like Susan Sontag, one might suppose that the ethical torments of Kamenetz-Podolsk and the moral repressiveness of Salem, Massachusetts, had finally been put to rest, in favor of creamy delights in texture, color, and sensation. But nothing of the sort is true, at least not yet; it is only advertised.
Keen on tactics, the spokesmen for the new sensibility proclaim it to be still another turn in the endless gyrations of modernism, still another revolt in the permanent revolution of 20th-century sensibility. This approach is very shrewd, since it can disarm in advance those older New York (and other) critics who still respond with enthusiasm to the battlecries of modernism. But several objections or qualifications need to be registered:
- Modernism, by its very nature, is uncompromisingly a minority culture, creating and defining itself through opposition to a dominant culture. Today, however, nothing of the sort is true. Floodlights glaring and tills overflowing, the new sensibility is a success from the very start. The middle-class public, eager for thrills and humiliations, welcomes it; so do the mass media, always on the alert for exploitable sensations; and naturally there appear intellectuals with handy theories. The new sensibility is both embodied and celebrated in the actions of Norman Mailer, whose condition as a swinger in America is not quite comparable with that of Joyce in Trieste or Kafka in Prague or Lawrence anywhere; it is reinforced with critical exegesis by Susan Sontag, a publicist able to make brilliant quilts from grandmother's patches; it is housed and braced by Robert Brustein, who has been writing drama reviews as if thumbing one's nose on the stage were a sufficient act of social criticism.13 And on a far lower level, it has even found its Smerdyakov in LeRoi Jones, that parodist of apocalypse who rallies Jewish audiences with calls for Jewish blood. Whatever one may think of this situation, it is surely very different from the classical picture of a besieged modernism.
- By now the search for the “new,” often reduced to a trivializing of form and matter, has become the predictable old. To suppose that we keep moving from cultural breakthrough to breakthrough requires a collective wish to forget what happened yesterday and even the day before: ignorance always being a great spur to claims for originality. Alienation has been transformed from a serious and revolutionary concept into a motif of mass culture, and the content of modernism into the decor of kitsch. As Harold Rosenberg has pungently remarked:
The sentiment of the diminution of personality is an historical hypothesis upon which writers have constructed a set of literary conventions by this time richly equipped with theatrical machinery and symbolic allusions. . . . The individual's emptiness and inability to act have become an irrefrangible cliché, untiringly supported by an immense, voluntary phalanx of latecomers to modernism. In this manifestation, the notion of the void has lost its critical edge and is thoroughly reactionary.
- The effort to assimilate new cultural styles to the modernist tradition brushes aside problems of value, quality, judgment. It rests upon a philistine version of the theory of progress in the arts: all must keep changing, and change signifies a realization of progress. Yet even if an illicit filiation can be shown, there is a vast difference in seriousness and accomplishment between the modernism of some decades ago and what we have now. The great literary modernists (to cite but one instance) put at the center of their work a confrontation and struggle with the demons of nihilism; the literary swingers of the 60's, facing a nihilist violation, cheerfully remove the threat by what Fielding once called “a timely compliance.” Just as in the verse of Swinburne echoes of Romanticism sag through the stanzas, so in much current writing there is indeed a continuity with modernism, but a continuity of grotesque and parody, through the doubles of fashion.
Still, it would be foolish to deny that in the Kulturkampf of the 60's, the New York intellectuals are at a severe disadvantage. Some have simply gone over to the other camp. A critic like Susan Sontag employs the dialectical skills and accumulated knowledge of intellectual life in order to bless the new sensibility as a dispensation of pleasure, beyond the grubby reach of interpretation and thereby, it would seem, beyond the tight voice of judgment. That her theories are skillfully rebuilt versions of aesthetic notions long familiar and discarded; that in her own critical writing she interprets like mad and casts an image anything but hedonistic, relaxed, or sensuous—none of this need bother her admirers, for a highly literate spokesman is very sustaining to those who have discarded or not acquired intellectual literacy. Second only to Miss Sontag in trumpeting the new sensibility is Leslie Fiedler, a critic with an amiable weakness for thrusting himself at the head of parades marching into sight.14 But for those New York (or any other) writers not quite enchanted with the current scene there are serious difficulties.
They cannot be quite sure. Having fought in the later battles for modernism, they must acknowledge to themselves the possibility that, now grown older, they have lost their capacity to appreciate innovation. Why, they ask themselves with some irony, should “their” cultural revolution have been the last one, or the last good one? From the publicists of the new sensibility they hear the very slogans, catchwords, and stirring appeals which a few decades ago they were hurling in behalf of modernism and against such diehards as Van Wyck Brooks and Bernard DeVoto. And given the notorious difficulties in making judgments about contemporary works of art, how can they be certain that Kafka is a master of despair and Burroughs a symptom of disintegration, Pollock a pioneer of innovation and Warhol a triviality of pop? The capacity for self-doubt, the habit of self-irony, which is the reward of decades of experience, renders them susceptible to the simplistic cries of the new.
Well, the answer is that there can be no certainty: we should neither want nor need it. One must speak out of one's taste and conviction, and let history make whatever judgments it will care to. But this is not an easy stand to take, for it means that after all these years one may have to face intellectual isolation and perhaps dismissal, and there are moments when it must seem as if the best course is to be promiscuously “receptive,” swinging along with a grin of resignation.
In the face of this challenge, surely the most serious of the last twenty-five years, the New York intellectuals have not been able to mount a coherent response, certainly not a judgment sufficiently inclusive and severe. There have been a few efforts, some intellectual polemics by Lionel Abel and literary pieces by Philip Rahv; but no more. Yet if ever there was a moment when our culture needed an austere and sharp criticism—the one talent the New York writers supposedly find it death to hide—it is today. One could imagine a journal with the standards, if hopefully not the parochialism, of Scrutiny. One could imagine a journal like Partisan Review stripping the pretensions of the current scene with the vigor it showed in opposing the Popular Front and neo-conservative cultures. But these are fantasies. In its often accomplished pages Partisan Review betrays a hopeless clash between its editors' capacity to apply serious standards and their yearnings to embrace the moment. Predictably, the result satisfies no one.
One example of the failure of the New York writers to engage in criticism is their relation to Norman Mailer. He is not an easy man to come to grips with, for he is “our genius,” probably the only one, and in more than a merely personal way he is a man of enormous charm. Yet Mailer has been the central and certainly most dramatic presence in the new sensibility, even if in reflective moments he makes clear his ability to brush aside its incantations.15 Mailer as thaumaturgist of orgasm; as metaphysician of the gut; as psychic herb-doctor; as advance man for literary violence;16 as dialectician of unreason; and above all, as a novelist who has laid waste his own formidable talent—these masks of brilliant, nutty restlessness, these papery dikes against squalls of boredom—all require sharp analysis and criticism. Were Mailer to read these lines he would surely grin and exclaim that, whatever else, his books have suffered plenty of denunciation. My point, however, is not that he has failed to receive adverse reviews, including some from such New York critics as Norman Podhoretz, Elizabeth Hardwick, and Philip Rahv; perhaps he has even had too many adverse reviews, given the scope and brightness of his talent. My point is that the New York writers have failed to confront Mailer seriously as an intellectual spokesman, a cultural agent, and instead have found it easier to regard him as a hostage to the temper of our times. What has not been forthcoming is a recognition, surely a painful one, that in his major public roles he has come to represent values in deep opposition to liberal humaneness and rational discourse. That the New York critics have refused him this confrontation is both a disservice to Mailer and a sign that, whatever it may once have been, the New York intellectual community no longer exists as a significant force.
An equally telling sign is the recent growth in popularity and influence of the New York Review of Books. Emerging at least in part from the New York intellectual milieu, this journal has steadily moved away from the styles and premises with which it began. Its early dependence on those New York writers who lent their names to it and helped establish it seems all but over. The Jewish imprint has been blotted out; the New York Review, for all its sharp attacks on current political policies, is thoroughly at home in the worlds of American culture, publishing, and society. It features a strong Anglophile slant in its literary pieces, perhaps in accord with the New Statesman formula of blending leftish (and at one time, fellow-traveling) politics with Bloomsbury culture, Kingsley Martin with tips on wine. More precisely, what the New York Review has managed to achieve—I find it quite fascinating as a portent of things to come—is a link between campus “leftism” and East Side stylishness, the worlds of Tom Hayden and George Plimpton. Opposition to Communist politics and ideology is frequently presented in the pages of the New York Review as if it were an obsolete, indeed a pathetic, hangover from a discredited past or worse yet, a dark sign of the CIA. A snappish and crude anti-Americanism has swept over much of its political writing—and to avoid misunderstanding, let me say that by this I do not mean anything so necessary as attacks on the ghastly Vietnam war or on our failures in the cities. And in the hands of writers like Andrew Kopkind (author of the immortal phrase, “morality . . . starts at the barrel of a gun”), liberal values and norms are treated with something very close to contempt.
Though itself too sophisticated to indulge in the more preposterous New Left notions, such as “liberal fascism” and “confrontationism,” the New York Review has done the New Left the considerable service of providing it with a link of intellectual respectability to the academic world. In the materials it has published by Kopkind, Tom Hayden, Philip Rahv, Edgar Z. Friedenberg, Jason Epstein, and others, one finds not an acceptance of the fashionable talk about “revolution” which has become an indoor and outdoor sport on the American campus, but a kind of rhetorical violence, a verbal “radicalism,” which gives moral and intellectual encouragement to precisely such fashionable (self-defeating) talk.
This is by no means the only kind of political material to have appeared in the New York Review; at least in my own experience I have found its editors prepared to print articles of a sharply different kind; and in recent years it has published serious political criticism by George Lichtheim, Theodore Draper, and Walter Laqueur.
And because it is concerned with maintaining a certain level of sophistication and accomplishment, the New York Review has not simply taken over the new sensibility. What is at stake here, rather, is the dominant tone of this skillfully edited paper, an editorial keenness in responding to the current academic and intellectual temper—as for instance in that memorable issue with a cover featuring, no doubt for the benefit of its university readers, a diagram explaining how to make a Molotov cocktail. The genius of the New York Review, and it has been a genius of sorts, is not, in either politics or culture, for swimming against the stream.
Perhaps it is all too late. Perhaps there is no longer available among the New York writers enough energy and coherence to make possible a sustained confrontation with the new sensibility. Still, one would imagine that their undimmed sense of the Zeitgeist would prod them to sharp responses, precise statements, polemical assaults. What, after all, would be risked in saying that we have entered a period of overwhelming cultural sleaziness?
Having been formed by, and through opposition to, the New York intellectual experience, I cannot look with joy at the prospect of its ending. But neither with dismay. Such breakups are inevitable, and out of them come new voices and energies. Yet, precisely at this moment of dispersion, might not some of the New York writers achieve renewed strength if they were to struggle once again for whatever has been salvaged from these last few decades? For the values of liberalism, for the politics of a democratic radicalism, for the norms of rationality and intelligence, for the standards of literary seriousness, for the life of the mind as a humane dedication—for all this it should again be worth finding themselves in a minority, even a beleaguered minority, and not with fantasies of martyrdom but with a quiet recognition that for the intellectual this is likely to be his usual condition.
1 Is it “they” or “we”? To speak of the New York intellectuals as “they” might seem coy or disloyal; to speak of “we” self-assertive or cozy. Well, let it be “they,” with the proviso that I do not thereby wish, even if I could, to exempt myself from judgment.
2 In placing this emphasis on the Jewish origins of the New York intellectuals, I am guilty of a certain—perhaps unavoidable—compression of the realities. Were I writing a book rather than an essay, I would have to describe in some detail the relationship between the intellectuals who came on the scene in the 30's and those of earlier periods. There were significant ties between Partisan Review and The Dial, Politics and the Masses. But I choose here to bypass this historical connection because I wish to stress what has been distinctive and perhaps unique.
A similar qualification has to be made concerning those intellectuals who have been associated with this milieu but have not been Jewish. I am working on the premise that in background and style there was something decidedly Jewish about the intellectuals who began to cohere as a group around Partisan Review in the late 30's—and one of the things that was “decidedly Jewish” was that most were of Jewish birth! Perhaps it ought to be said, then, that my use of the phrase “New York intellectuals” is simply a designation of convenience. I don't mean to suggest that there have been or will be no other intellectuals in New York. I am using the phrase as a shorthand for what might awkwardly be spelled out as “the intellectuals of New York who began to appear in the 30's, most of whom were Jewish.”
3 In a lengthy essay printed in this journal, “The Culture of Modernism,” November 1967, I have tried to suggest what this term can signify.
4 In 1948 Ezra Pound, who had spent the war years as a propagandist for Mussolini and whose writings contained strongly anti-Semitic passages, was awarded the prestigious Bollingen Prize. The committee voting for this award contained a number of ranking American poets. After the award was announced, there occurred in the pages of Partisan Review, COMMENTARY, and other journals a harsh dispute as to its appropriateness.
5 Some recent historians, under New Left inspiration, have argued that in countries like France and Italy the possibility of a Communist seizure of power was really quite small. Perhaps; counter-factuals are hard to dispose of. What matters is the political consequences these historians would retrospectively have us draw, if they were at all specific on this point. Was it erroneous, or reactionary, to believe that resistance had to be created in Europe against further Communist expansion? What attitude, for example, would they have had intellectuals, or anyone else, take during the Berlin crisis? Should the city, in the name of peace, have been yielded to the East Germans? Did the possibility of Communist victories in Western Europe require an extraordinary politics? And to what extent are present reconsiderations of Communist power in postwar Europe made possible by the fact that it was, in fact, successfully contained?
6 Fifteen years later, again swept along by the Zeitgeist, Miss McCarthy would write that the Communist societies, because of their concentration of ownership, made economic planning more feasible than did capitalist societies. She is perhaps the last intellectual in the world who seems not to have heard about the disasters of “planning” in totalitarian society (e.g., recent reports from Czechoslovakia).
7 One such attack was an essay by myself, “This Age of Conformity,” Partisan Review, 1954. Looking at it again I believe that, apart from some gratuitous polemical sentences, its main thrust still holds. No close, let alone sympathetic, analysis can be found in this essay as to why intellectuals now felt themselves so much more at home in capitalist society than they had in the 30's or why they felt themselves driven to an intransigent anti-Communism. I wrote as a polemicist, not as a historian or a sociologist of knowledge; and if that limited the scope it did not, I think, blunt the point of my attack.
8 It is not clear whether Macdonald still adheres to The Root Is Man. In a recent BBC broadcast he said about the student uprising at Columbia: “I don't approve of their methods, but Columbia will be a better place afterwards.” Perhaps it will, perhaps it won't; but I don't see how the author of The Root Is Man could say this, since the one thing he kept insisting was that means could not be separated from ends, as the Marxists too readily separated them. He would surely have felt that if the means used by the students were objectionable, then their ends would be contaminated as well—and thereby the consequences of their action. But in the swinging 60's not many people trouble to remember their own lessons.
9 The most lasting contribution this school of thought seems to have made to America is an adjective, as in “existential crisis,” which communicates the sensation of depth without the burden of content.
10 Not quite no one. In an attack on the New York writers (Hudson Review, Autumn 1965) Richard Kostelanetz speaks about “Jewish group-aggrandizement” and “the Jewish American push.” One appreciates the delicacy of his phrasing.
11 That Marcuse chooses not to apply his theories to the area of society in which he himself functions is a tribute to his personal realism, or perhaps merely a sign of a lack of intellectual seriousness. In a recent public discussion, recorded by the New York Times Magazine (May 26, 1968), there occurred the following exchange:
Hentoff: We've been talking about new institutions, new structures, as the only way to get fundamental change. What would that mean to you, Mr. Marcuse, in terms of the university, in terms of Columbia?
Marcuse: I was afraid of that because I now finally reveal myself as a fink. I have never suggested or advocated or supported destroying the established universities and building new anti-institutions instead. I have always said that no matter how radical the demands of the students and no matter how justified, they should be pressed within the existing universities. . . . I believe—and this is where the finkdom comes in—that American universities, at least quite a few of them, today are still enclaves of relatively critical thought and relatively free thought.
12 John Simon has some cogent things to say about Brown and McLuhan, the pop poppas of the new: “. . . like McLuhan, Brown fulfills the four requirements for our prophets: (1) to span and reconcile, however grotesquely, various disciplines to the relief of a multitude of specialists; (2) to affirm something, even if it is something negative, retrogressive, mad; (3) to justify something vulgar or sick or indefensible in us, whether it be television-addiction (McLuhan) or schizophrenia (Brown); (4) to abolish the need for discrimination, difficult choices, balancing mind and appetite, and so reduce the complex orchestration of life to the easy strumming of a monochord. Brown and McLuhan have nicely apportioned the world between them: the inward madness for the one, the outward manias for the other.”
13 Reviewing a theatrical grope-in called Dionysus in 69 (New Republic, August 10, 1968), Brustein pulls back a little from his enthusiasm for the swinging new. He remarks that in Dionysus “the pelvis becomes the actor's primary organ of expression” and that “only about a third of Euripides's play” The Bacchae is used in this “adaptation.” (But why even a third? Who needs words at all?) And then says Brustein: “The off-off-Broadway movement which began so promisingly with America Hurrah, MacBird, and the experimental probes of the Open Theatre, is now cultivating its worst faults, developing an anarchic Philistinism which virtually throws the writer out of the theater.”
But if what the Dean of the Yale Drama School counter-poses to Dionysus in 69 is America Hurrah and MacBird—noisy, coarse, derivative, and third-rate—how can he possibly bring to bear serious critical standards?
14 Fiedler's essay “The New Mutants” (Partisan Review, Fall 1965) is a sympathetic charting of the new sensibility, with discussions of “porno-esthetics,” the effort among young people to abandon habits and symbols of masculinity in favor of a feminized receptiveness, “the aspiration to take the final evolutionary leap and cast off adulthood completely,” and above all, the role of drugs as “the crux of the futurist revolt.”
With uncharacteristic forbearance, Fiedler denies himself any sustained or explicit judgments of this “futurist revolt,” so that the rhetorical thrust of his essay is somewhere between acclaim and resignation. He cannot completely suppress his mind, perhaps because he has been using it too long, and so we find this acute passage concerning the responses of older writers to “the most obscene forays of the young”:
. . . after a while, there will be no more Philip Rahvs and Stanley Edgar Hymans left to shock—anti-language becoming mere language with repeated use and in the face of acceptance; so that all sense of exhilaration will be lost along with the possibility of offense. What to do then except to choose silence, since raising the ante of violence is ultimately self-defeating; and the way of obscenity in any case leads as naturally to silence as to further excess?
About drugs Fiedler betrays no equivalent skepticism, so that it is hard to disagree with Lionel Abel's judgment that, “while I do not want to charge Mr. Fiedler with recommending the taking of drugs, I think his whole essay is a confession that he cannot call upon one value in whose name he could oppose it.”
15 Two examples:
“Tom Hayden began to discuss revolution with Mailer. ‘I'm for Kennedy,’ said Mailer, ‘because I'm not so sure I want a revolution. Some of those kids are awfully dumb.’ Hayden the Revolutionary said a vote for George Wallace would further his objective more than a vote for RFK.” (Village Voice, May 30, 1968—and by the way, some Revolutionary!)
“If he still took a toke of marijuana from time to time for Auld Lang Syne, or in recognition of the probability that good sex had to be awfully good before it was better than on pot, yet, still!—Mailer was not in approval of any drug, he was virtually conservative about it, having demanded of his 18-year-old daughter . . . that she not take marijuana, and never LSD, until she completed her education, a mean promise to extract in these apocalyptic times.” (The Armies of the Night).
16 In this regard the editor of Dissent bears a heavy responsibility. When he first received the manuscript of “The White Negro,” he should have expressed in print his objections to the passage in which Mailer discusses the morality of beating up a fifty-year-old storekeeper. That he could not bring himself to risk losing a scoop is no excuse whatever.
The U.S. will endanger itself if it accedes to Russian and Chinese efforts to change the international system to their liking
A “sphere of influence” is traditionally understood as a geographical zone within which the most powerful actor can impose its will. And nearly three decades after the close of the superpower struggle that Churchill’s speech heralded, spheres of influence are back. At both ends of the Eurasian landmass, the authoritarian regimes in China and Russia are carving out areas of privileged influence—geographic buffer zones in which they exercise diplomatic, economic, and military primacy. China and Russia are seeking to coerce and overawe their neighbors. They are endeavoring to weaken the international rules and norms—and the influence of opposing powers—that stand athwart their ambitions in their respective “near abroads.” Chinese island-building and maritime expansionism in the South China Sea and Russian aggression in Ukraine and intimidation of the Baltic states are part and parcel of the quasi-imperial projects these revisionist regional powers are now pursuing.
Historically speaking, a world made up of rival spheres is more the norm than the exception. Yet such a world is in sharp tension with many of the key tenets of the American foreign-policy tradition—and with the international order that the United States has labored to construct and maintain since the end of World War II.
To be sure, Washington carved out its own spheres of influence in the Western Hemisphere beginning in the 19th century, and America’s myriad alliance blocs in key overseas regions are effectively spheres by another name. And today, some international-relations observers have welcomed the return of what the foreign-policy analyst Michael Lind has recently called “blocpolitik,” hoping that it might lead to a more peaceful age of multilateral equilibrium.
But for more than two centuries, American leaders have generally opposed the idea of a world divided into rival spheres of influence and have worked hard to deny other powers their own. And a reversion to a world dominated by great powers and their spheres of influence would thus undo some of the strongest traditions in American foreign policy and take the international system back to a darker, more dangerous era.I n an extreme form, a sphere of influence can take the shape of direct imperial or colonial control. Yet there are also versions in which a leading power forgoes direct military or administrative domination of its neighbors but nonetheless exerts geopolitical, economic, and ideological influence. Whatever their form, spheres of influence reflect two dominant imperatives of great-power politics in an anarchic world: the need for security vis-à-vis rival powers and the desire to shape a nation’s immediate environment to its benefit. Indeed, great powers have throughout history pursued spheres of influence to provide a buffer against the encroachment of other hostile actors and to foster the conditions conducive to their own security and well-being.
The Persian Empire, Athens and Sparta, and Rome all carved out domains of dominance. The Chinese tribute system—which combined geopolitical control with the spread of Chinese norms and ideas—profoundly shaped the trajectory of East Asia for hundreds of years. The 19th and 20th centuries saw the British Empire, Japan’s East Asian Co-Prosperity Sphere, and the Soviet bloc.
America, too, has played the spheres-of-influence game. From the early-19th century onward, American officials strove for preeminence in the Western Hemisphere—first by running other European powers off much of the North American continent and then by pushing them out of Latin America. With the Monroe Doctrine, first enunciated in 1823, America staked its claim to geopolitical primacy from Canada to the Southern Cone. Over the succeeding generations, Washington worked to achieve military dominance in that area, to tie the countries of the Western Hemisphere to America geopolitically and economically, and even to help pick the rulers of countries from Mexico to Brazil.
If this wasn’t a sphere of influence, nothing was. In 1895, Secretary of State Richard Olney declared that “the United States is practically sovereign on this continent and its fiat is law upon the subjects to which it confines its interposition.” After World War II, moreover, a globally predominant United States steadily expanded its influence into Europe through NATO, into East Asia through various military alliances, and into the Middle East through a web of defense, diplomatic, and political arrangements. The story of global politics over the past 200 years has, in large part, been the story of expanding U.S. influence.
Nonetheless, there has always been something ambivalent—critics would say hypocritical—about American views of this matter. For as energetic as Washington has been in constructing its geopolitical domain, a “spheres-of-influence world” is in perpetual tension with four strong intellectual traditions in U.S. strategy. These are hegemony, liberty, openness, and exceptionalism.
First, hegemony. The myth of America as an innocent isolationist country during its first 170 years is powerful and enduring; it’s also wrong. From the outset, American statesmen understood that the country’s favorable geography, expanding population, and enviable resource endowments gave it the potential to rival, and ultimately overtake, the European states that dominated world politics. America might be a fledgling republic, George Washington said, but it would one day attain “the strength of a giant.” From the revolution onward, American officials worried, with good reason, that France, Spain, and the United Kingdom would use their North American territories to strangle or contain the young republic. Much of early American diplomacy was therefore geared toward depriving the European powers of their North American possessions, using measures from coercive diplomacy to outright wars of conquest. “The world shall have to be familiarized with the idea of considering our proper dominion to be the continent of North America,” wrote John Quincy Adams in 1819. The only regional sphere of influence that Americans would accept as legitimate was their own.
By the late-19th century, the same considerations were pushing Americans to target spheres of influence further abroad. As the industrial revolution progressed, it became clear that geography alone might not protect the nation. Aggressive powers could now generate sufficient military strength to dominate large swaths of Europe or East Asia and then harness the accumulated resources to threaten the United States. Moreover, as America itself became an increasingly mighty country that sought to project its influence overseas, its leaders naturally objected to its rivals’ efforts to establish their own preserves from which Washington would be excluded. If much of America’s 19th-century diplomacy was dedicated to denying other powers spheres of influence in the Western Hemisphere, much of the country’s 20th-century diplomacy was an effort to break up or deny rival spheres of influence in Europe and East Asia.
From the Open Door policy, which sought to prevent imperial powers from carving up China, to U.S. intervention in the world wars, to the confrontation with the Soviet Empire in the Cold War, the United States repeatedly acted on the belief that it could be neither as secure nor influential as it desired in a world divided up and dominated by rival nations. The American geopolitical tradition, in other words, has long contained a built-in hostility to other countries’ spheres of influence.
The American ideological tradition shares this sense of preeminence, as reflected in the second key tenet: liberty. America’s founding generation did not see the revolution merely as the birth of a future superpower; they saw it as a catalyst for spreading political liberty far and wide. Thomas Paine proclaimed in 1775 that Americans could “begin the world anew”; John Quincy Adams predicted, several decades later, that America’s liberal ideology was “destined to cover the surface of the globe.” Here, too, the new nation was not cursed with excessive modesty—and here, too, the existence of rival spheres of influence threatened this ambition.
Rival spheres of influence—particularly within the Western Hemisphere—imperiled the survival of liberty at home. If the United States were merely one great power among many on the North American continent, the founding generation worried, it would be forced to maintain a large standing military establishment and erect a sort of 18th-century “garrison state.” Living in perpetual conflict and vigilance, in turn, would corrode the very freedoms for which the revolution had been fought. “No nation,” wrote James Madison, “can preserve its freedom in the midst of continual warfare.” Just as Madison argued, in Federalist No. 10, that “extending the sphere”—expanding the republic—was a way of safeguarding republicanism at home, expanding America’s geopolitical domain was essential to providing the external security that a liberal polity required to survive.
Rival spheres of influence also constrained the prospects for liberty abroad. Although the question of whether the United States should actively support democratic revolutions overseas has been a source of unending controversy, virtually all American strategists have agreed that the country would be more secure and influential in a world where democracy was widespread. Given this mindset, Americans could hardly be desirous of foreign powers—particularly authoritarian powers—establishing formidable spheres of influence that would allow them to dominate the international system or suppress liberal ideals. The Monroe Doctrine was a response to the geopolitical dangers inherent in renewed imperial control of South America; it was also a response to the ideological danger posed by European nations that would “extend the political system to any portion” of the Western Hemisphere. Similar concerns have been at the heart of American opposition to the British Empire and the Soviet bloc.
Economic openness, the third core dynamic of American policy, has long served as a commercial counterpart to America’s ideological proselytism. Influenced as much by Adam Smith as by Alexander Hamilton, early American statecraft promoted free trade, neutral rights, and open markets, both to safeguard liberty and enrich a growing nation. This mission has depended on access to the world’s seas and markets. When that access was circumscribed—by the British in 1812 and by the Germans in 1917—Americans went to war to preserve it. It is unsurprising, then, that Americans also looked askance at efforts by other powers to establish areas that might be walled off from U.S. trade and investment—and from the spread of America’s capitalist ideology.
A brief list of robust policy endeavors underscores the persistent U.S. hostility to an economically closed, spheres-of-influence world: the Model Treaty of 1776, designed to promote free and reciprocal trade; John Hay’s Open Door policy of 1899, designed to prevent any outside power from dominating trade with China; Woodrow Wilson’s advocacy in his “Fourteen Points” speech of 1918 for the removal “of all economic barriers and the establishment of an equality of trade conditions among all nations”; and the focus of the 1941 Atlantic Charter on reducing trade restrictions while promoting international economic cooperation (assuming the allies would emerge triumphant from World War II).
Fourth and finally, there’s exceptionalism. Americans have long believed that their nation was created not simply to replicate the practices of the Old World, but to revolutionize how states and peoples interact with one another. The United States, in this view, was not merely another great power out for its own self-interest. It was a country that, by virtue of its republican ideals, stood for the advancement of universal rights, and one that rejected the back-alley methods of monarchical diplomacy in favor of a more principled statecraft. When Abraham Lincoln said America represented “the last best hope of earth,” or when Woodrow Wilson scorned secret agreements in favor of “open covenants, openly arrived at,” they demonstrated this exceptionalist strain in American thinking. There is some hypocrisy here, of course, for the United States has often acted in precisely the self-interested, cutthroat manner its statesmen deplored. Nonetheless, American exceptionalism has had a pronounced effect on American conduct.
Compare how Washington led its Western European allies during the Cold War—the extent to which NATO rested on the authentic consent of its members, the way the United States consistently sought to empower rather than dominate its partners—with how Moscow managed its empire in Eastern Europe. In the same way, Americans have often recoiled from arrangements that reeked of the old diplomacy. Franklin Roosevelt might have tolerated a Soviet-dominated Eastern Europe after World War II, for instance, but he knew he could not admit this publicly. Likewise, the Helsinki Accords of 1975, which required Washington to acknowledge the diplomatic legitimacy of the Soviet sphere, proved controversial inside the United States because they seemed to represent just the sort of cynical, old-school geopolitics that American exceptionalism abhors.
To be clear, U.S. hostility to a spheres-of-influence world has always been leavened with a dose of pragmatism; American leaders have pursued that hostility only so far as power and prudence allowed. The Monroe Doctrine warned European powers to stay out of the Americas, but the quid pro quo was that a young and relatively weak United States would accept, for a time, a sphere of monarchical dominance within Europe. Even during the Cold War, U.S. policymakers generally accepted that Washington could not break up the Soviet bloc in Eastern Europe without risking nuclear war.
But these were concessions to expediency. As America gained greater global power, it more actively resisted the acquisition or preservation of spheres by others. From gradually pushing the Old World out of the New, to helping vanquish the German and Japanese Empires by force of arms, to assisting the liquidation of the British Empire after World War II, to containing and ultimately defeating the Soviet bloc, the United States was present at the destruction of spheres of influence possessed by adversaries and allies alike.
The acme of this project came in the quarter-century that followed the Cold War. With the collapse of the Warsaw Pact and the Soviet Union itself, it was possible to envision a world in which what Thomas Jefferson called America’s “empire of liberty” could attain global dimensions, and traditional spheres of influence would be consigned to history. The goal, as George W. Bush’s 2002 National Security Strategy proclaimed, was to “create a balance of power that favors human freedom.” This meant an international environment in which the United States and its values were dominant and there was no balance of power whatsoever.
Under presidents from George H.W. Bush to Barack Obama, this project entailed working to spread democracy and economic liberalism farther than ever before. It involved pushing American influence and U.S.-led institutions into regions—such as Eastern Europe—that were previously dominated by other powers. It meant maintaining the military primacy necessary to stop regional powers from establishing new spheres of influence, as Washington did by rolling back Saddam Hussein’s conquest of Kuwait in 1990 and by deterring China from coercing Taiwan in 1995–96. Not least, this American project involved seeking to integrate potential rivals—foremost Russia and China—into the post–Cold War order, in hopes of depriving them of even the desire to challenge it. This multifaceted effort reflected the optimism of the post–Cold War era, as well as the influence of tendencies with deep roots in the American past. Yet try as Washington might to permanently leave behind a spheres-of-influence world, that prospect is once again upon us.
Begin with China’s actions in the Asia-Pacific region. The sources of Chinese conduct are diverse, ranging from domestic insecurity to the country’s confidence as a rising power to its sense of historical destiny as “the Middle Kingdom.” All these influences animate China’s bid to establish regional mastery. China is working, first, to create a power vacuum by driving the United States out of the Western Pacific, and second, to fill that vacuum with its own influence. A Chinese admiral made this ambition clear when he remarked—supposedly in jest—to an American counterpart that, in the future, the two powers should simply split the Pacific with Hawaii as the dividing line. Yang Jiechi, then China’s foreign minister, echoed this sentiment in a moment of frustration by lecturing the nations of Southeast Asia. “China is a big country,” he said, “and other countries are small countries, and that’s just a fact.”
Policy has followed rhetoric. To undercut America’s position, Beijing has harassed American ships and planes operating in international waters and airspace. The Chinese have warned U.S. allies they may be caught in the crossfire of a Sino-American war unless Washington accommodates China or the allies cut loose from the United States. China has simultaneously worked to undermine the credibility of U.S. alliance guarantees by using strategies designed to shift the regional status quo in ways even the mighty U.S. Navy finds difficult to counter. Through a mixture of economic aid and diplomatic coercion, Beijing has also successfully divided international bodies, such as the Association of Southeast Asian Nations, through which the United States has sought to rally opposition to Chinese assertiveness. And in the background, China has been steadily building, over the course of more than two decades, formidable military tools designed to keep the United States out of the region and give Beijing a free hand in dealing with its weaker neighbors. As America’s sun sets in the Asia-Pacific, Chinese leaders calculate, the shadow China casts over the region will only grow longer.
To that end, China has claimed, dubiously, nearly all of the South China Sea as its own and constructed artificial islands as staging points for the projection of military power. Military and paramilitary forces have teased, confronted, and violated the sovereignty of countries from Vietnam to the Philippines; China is likewise intensifying the pressure on Japan in the East China Sea. Economically, Beijing uses its muscle to reward those who comply with China’s policies and punish those not willing to bow to its demands. It is simultaneously advancing geoeconomic projects, such as the Belt and Road Initiative, the Asian Infrastructure Investment Bank, and the Regional Comprehensive Economic Partnership (RCEP), that are designed to bring the region into its orbit.
Strikingly, China has also moved away from its long-professed principle of noninterference in other countries’ domestic politics by extending the reach of Chinese propaganda organs and using investment and even bribery to co-opt regional elites. Payoffs to Australian politicians are as critical to China’s regional project as development of “carrier-killer” missiles. Finally, far from subscribing to liberal concepts of democracy and human rights, Beijing emphasizes its rejection of these values and its desire to create “Asia for Asians.” In sum, China is pursuing a classic spheres-of-influence project. By blending intimidation with inducement, Beijing aims to sunder its neighbors’ bonds with America and force them to accept a Sino-centric order—a new Chinese tribute system for the 21st century.
At the other end of Eurasia, Russia is playing geopolitical hardball of a different sort. The idea that Moscow should dominate its “near abroad” is as natural to many Russians as American regional primacy is to Americans. The loss of the Kremlin’s traditional buffer zone was, therefore, one of the most painful legacies of the Cold War’s end. And so it is hardly surprising that, as Russia has regained a degree of strength in recent years, it has sought to reassert its supremacy.
It has done so, in fact, through more overtly aggressive means than those employed by China. Moscow has twice seized opportunities to humiliate and dismember former Soviet republics that committed the sin of tilting toward the West or throwing out pro-Russian leaders, first in Georgia in 2008 and then in Ukraine in 2014. It has regularly reminded its neighbors that they live on Russia’s doorstep, through coercive activities such as conducting cyberattacks on Estonia in 2007 and holding aggressive military exercises on the frontiers of the Baltic states. In the same vein, the Kremlin has essentially claimed a veto over the geopolitical alignments of neighbors from the Caucasus to Scandinavia, whether by creating frozen conflicts on their territory or threatening to target them militarily—perhaps with nuclear weapons—should they join NATO.
Military muscle is not Moscow’s only tool. Russia has simultaneously used energy exports to keep the states on its periphery economically dependent, and it has exported corruption and illiberalism to non-aligned states in the former Warsaw Pact area to prevent further encroachment of liberal values. Not least, the Kremlin has worked to undermine NATO and the European Union through political subversion and intervention in Western electoral processes. And while Russia’s activities are most concentrated in Eastern Europe and Central Asia, it’s also projecting its influence farther afield. Russian forces intervened successfully in Syria in 2015 to prop up Bashar al-Assad, preserve access to warm-water ports on the Mediterranean, and demonstrate the improved accuracy and lethality of Russian arms. Moscow continues to make inroads in the Middle East, often in cooperation with another American adversary: Iran.
To be sure, the projects that China and Russia are pursuing today are vastly different from each other, but the core logic is indisputably the same. Authoritarian powers are re-staking their claim to privileged influence in key geostrategic areas.
So what does this mean for American interests? Some observers have argued that the United States should make a virtue of necessity and accept the return of such arrangements. By this logic, spheres of influence create buffer zones between contending great powers; they diffuse responsibility for enforcing order in key areas. Indeed, for those who think that U.S. policy has left the country exhausted and overextended, a return to a world in which America no longer has the burden of being the dominant power in every region may seem attractive. The great sin of American policy after the Cold War, many realist scholars argue, was the failure to recognize that even a weakened Russia would demand privileged influence along its frontiers and thus be unalterably opposed to NATO expansion. Similarly, they lament the failure to understand that China would not forever tolerate U.S. dominance along its own periphery. It is not surprising, then, to hear analysts such as Australia’s Hugh White or America’s John Mearsheimer argue that the United States should learn to “share power” with China in the Pacific, or that it must yield ground in Eastern Europe in order to avoid war with Russia.
Such claims are not meritless; there are instances in which spheres of influence led to a degree of stability. The division of Europe into rival blocs fostered an ugly sort of stasis during the Cold War; closer to home, America’s dominance in the Western Hemisphere has long muted geopolitical competition in our own neighborhood. For all the problems associated with European empires, they often partially succeeded in limiting scourges such as communal violence.
And yet the allure of a spheres-of-influence world is largely an illusion, for such a world would threaten U.S. interests, traditions, and values in several ways.
First, basic human rights and democratic values would be less respected. China and Russia are not liberal democracies; they are illiberal autocracies that see the spread of democratic values as profoundly corrosive to their own authority and security. Just as the United States has long sought to create a world congenial to its own ideological predilections, Beijing and Moscow would certainly do likewise within their spheres of dominance.
They would, presumably, bring their influence to bear in support of friendly authoritarian regimes. And they would surely undermine democratic governments seen to pose a threat of ideological contagion or insubordination to Russian or Chinese prerogatives. Russia has taken steps to prevent the emergence of a Western-facing democracy in Ukraine and to undermine liberal democracies in Europe and elsewhere; China is snuffing out political freedoms in Hong Kong. Such actions offer a preview of what we will see when these countries are indisputably dominant along their peripheries. Further aggressions, in turn, would not simply be offensive to America’s ideological sensibilities. For given that the spread of democracy has been central to the absence of major interstate war in recent decades, and that the spread of American values has made the U.S. more secure and influential, a less democratic world will also be a more dangerous world.
Second, a spheres-of-influence world would be less open to American commerce and investment. After all, the United States itself saw geoeconomic dominance in Latin America as the necessary counterpart to geopolitical dominance. Why would China take a less self-interested approach? China already reaps the advantages of an open global economy even as it embraces protectionism and mercantilism. In a Chinese-dominated East Asia, all economic roads will surely lead to Beijing, as Chinese officials will be able to use their leverage to ensure that trade and investment flows are oriented toward China and geopolitical competitors like the United States are left on the outside. Beijing’s current geoeconomic projects—namely, RCEP and the Belt and Road Initiative—offer insight into a regional economic future in which flows of commerce and investment are subject to heavy Chinese influence.
Third, as spheres of influence reemerge, the United States will be less able to shape critical geopolitical events in crucial regions. The reason Washington has long taken an interest in events in faraway places is that East Asia, Europe, and the Middle East are the areas from which major security challenges have emerged in the past. Since World War II, America’s forward military presence has been intended to suppress incipient threats and instability; that presence has gone hand in glove with energetic diplomacy that amplifies America’s voice and protects U.S. interests. In a spheres-of-influence world, Washington would no longer enjoy the ability to act with decisive effect in these regions; it would find itself reacting to global events rather than molding them.
This leads to a final, and crucial, issue. America would be more likely to find its core security interests challenged because world orders based on rival spheres of influence have rarely been as peaceful and settled as one might imagine.
To see this, just work backward from the present. During the Cold War, a bipolar balance did help avert actual war between Moscow and Washington. But even in Europe—where the spheres of influence were best defined—there were continual tensions and crises as Moscow tested the Western bloc. And outside Europe, violence and proxy wars were common as the superpowers competed to extend their reach into the Third World. In the 1930s, the emergence of German and Japanese spheres of influence led to the most catastrophic war in global history. The empires of the 19th century—spheres of influence in their own right—continually jostled one another, leading to wars and near-wars over the course of decades; the Peace of Amiens between England and Napoleonic France lasted a mere 14 months. And looking back to the ancient world, there were not one, but three Punic Wars fought between Rome and Carthage as two expanding empires came into conflict. A world defined by spheres of influence is often a world characterized by tensions, wars, and competition.
The reasons for this are simple. As the political scientist William Wohlforth observed, unipolar systems—such as the U.S.-dominated post–Cold War order—are anchored by a hegemonic power that can act decisively to maintain the peace. In a unipolar system, Wohlforth writes, there are few incentives for revisionist powers to incur the “focused enmity” of the leading state. Truly multipolar systems, by contrast, have often been volatile. When the major powers are more evenly matched, there is a greater temptation to aggression by those who seek to change the existing order of things. And seek to change things they undoubtedly will.
The idea that spheres of influence are stabilizing holds only if one assumes that the major powers are motivated only by insecurity and that concessions to the revisionists will therefore lead to peace. Churchill described this as the idea that if one “feeds the crocodile enough, the crocodile will eat him last.”
Unfortunately, today’s rising or resurgent powers are also motivated—as is America—by honor, ambition, and the timeless desire to make their international habitats reflect their own interests and ideals. It is a risky gamble indeed, then, to think that ceding Russia or China an uncontested sphere of influence would turn a revisionist authoritarian regime into a satisfied power. The result, as Robert Kagan has noted, might be to embolden those actors all the more, by giving them freer rein to bring their near-abroads under control, greater latitude and resources to pursue their ambitions, and enhanced confidence that the U.S.-led order is fracturing at its foundations. For China, dominance over the first island chain might simply intensify desires to achieve primacy in the second island chain and beyond; for Russia, renewed mastery in the former Soviet space could lead to desires to bring parts of the former Warsaw Pact to heel, as well. To observe how China is developing ever longer-range anti-access/area denial capabilities, or how Russia has been projecting military power ever farther afield, is to see this process in action.
The reemergence of a spheres-of-influence world would thus undercut one of the great historical achievements of U.S. foreign policy: the creation of a system in which America is the dominant power in each major geopolitical region and can act decisively to shape events and protect its interests. It would foster an environment in which democratic values are less prominent, authoritarian models are ascendant, and mercantilism advances as economic openness recedes. And rather than leading to multipolar stability, this change could simply encourage greater revisionism on the part of powers whose appetite grows with the eating. This would lead the world away from the relative stability of the post–Cold War era and back into the darker environment it seemed to have relegated to history a quarter-century ago. The phrase “spheres of influence” may sound vaguely theoretical and benign, but its real-world effects are likely to be tangible and pernicious.
Fortunately, the return of a spheres-of-influence world is not yet inevitable. Even as some nations will accept incorporation into a Chinese or Russian sphere of influence as the price of avoiding conflict, or maintaining access to critical markets and resources, others will resist because they see their own well-being as dependent on the preservation of the world order that Washington has long worked to create. The Philippines and Cambodia seem increasingly to fall into the former group; Poland and Japan, among many others, make up the latter. The willingness of even this latter group to take actions that risk incurring Beijing and Moscow’s wrath, however, will be constantly calibrated against an assessment of America’s own ability to continue leading the resistance to a spheres-of-influence world. Averting that outcome is becoming steadily harder, as the relative power and ambition of America’s authoritarian rivals rise and U.S. leadership seems to falter.
Harder, but not impossible. The United States and its allies still command a significant preponderance of global wealth and power. And the political, economic, and military weaknesses of its challengers are legion. It is far from fated, then, that the Western Pacific and Eastern Europe will slip into China’s and Russia’s respective orbits. With sufficient creativity and determination, Washington and its partners might still be able to resist the return of a dangerous global system. Doing so will require difficult policy work in the military, economic, and diplomatic realms. But ideas precede policy, and so simply rediscovering the venerable tradition of American hostility to spheres of influence—and no less, the powerful logic on which that tradition is based—would be a good start.
What does the man with the baton actually do?
Why, then, are virtually all modern professional orchestras led by well-paid conductors instead of performing on their own? It’s an interesting question. After all, while many celebrity conductors are highly trained and knowledgeable, there have been others, some of them legendary, whose musical abilities were and are far more limited. It was no secret in the world of classical music that Serge Koussevitzky, the music director of the Boston Symphony from 1924 to 1949, found it difficult to read full orchestral scores and sometimes learned how to lead them in public by first practicing with a pair of rehearsal pianists whom he “conducted” in private.
Yet recordings show that Koussevitzky’s interpretations of such complicated pieces of music as Aaron Copland’s El Salón México and Maurice Ravel’s orchestral transcription of Mussorgsky’s Pictures at an Exhibition (both of which he championed) were immensely characterful and distinctive. What made them so? Was it the virtuosic playing of the Boston Symphony alone? Or did Koussevitzky also bring something special to these performances—and if so, what was it?
Part of what makes this question so tricky to answer is that scarcely any well-known conductors have spoken or written in detail about what they do. Only two conductors of the first rank, Thomas Beecham and Bruno Walter, have left behind full-length autobiographies, and neither one features a discussion of its author’s technical methods. For this reason, the publication of John Mauceri’s Maestros and Their Music: The Art and Alchemy of Conducting will be of special interest to those who, like my friend, wonder exactly what it is that conductors contribute to the performances that they lead.1
An impeccable musical journeyman best known for his lively performances of film music with the Hollywood Bowl Orchestra, Mauceri has led most of the world’s top orchestras. He writes illuminatingly about his work in Maestros and Their Music, leavening his discussions of such matters as the foibles of opera directors and music critics with sharply pointed, sometimes gossipy anecdotes. Most interesting of all, though, are the chapters in which he talks about what conductors do on the podium. To read Maestros and Their Music is to come away with a much clearer understanding of what its author calls the “strange and lawless world” of conducting—and to understand how conductors whose technique is deficient to the point of seeming incompetence can still give exciting performances.
Prior to the 19th century, conductors of the modern kind did not exist. Orchestras were smaller then—most of the ensembles that performed Mozart’s symphonies and operas contained anywhere from two to three dozen players—and their concerts were “conducted” either by the leader of the first violins or by the orchestra’s keyboard player.
As orchestras grew larger in response to the increasing complexity of 19th-century music, however, it became necessary for a full-time conductor both to rehearse them and to control their public performances, normally by standing on a podium placed in front of the musicians and beating time in the air with a baton. Most of the first men to do so were composers, including Hector Berlioz, Felix Mendelssohn, and Richard Wagner. By the end of the century, however, it was becoming increasingly common for musicians to specialize in conducting, and some of them, notably Arthur Nikisch and Arturo Toscanini, came to be regarded as virtuosos in their own right. Since then, only three important composers—Benjamin Britten, Leonard Bernstein, and Pierre Boulez—have also pursued parallel careers as world-class conductors. Every other major conductor of the 20th century was a specialist.
What did these men do in front of an orchestra? Mauceri’s description of the basic physical process of conducting is admirably straightforward:
The right hand beats time; that is, it sets the tempo or pulse of the music. It can hold a baton. The left hand turns pages [in the orchestral score], cues instrumentalists with an invitational or pointing gesture, and generally indicates the quality of the notes (percussive, smoothly linked, sustained, etc.).
Beyond these elements, though, all bets are off. Most of the major conductors of the 20th century were filmed in performance, and what one sees in these films is so widely varied that it is impossible to generalize about what constitutes a good conducting technique.2 Most of them used batons, but several, including Boulez and Leopold Stokowski, conducted with their bare hands. Bernstein and Beecham gestured extravagantly, even wildly, while others, most famously Fritz Reiner, restricted themselves to tightly controlled hand movements. Toscanini beat time in a flowing, beautifully expressive way that made his musical intentions self-evident, but Wilhelm Furtwängler and Herbert von Karajan often conducted so unclearly that it is hard to see how the orchestras they led were able to follow them. (One exasperated member of the London Philharmonic claimed, partly in jest, that Furtwängler’s baton signaled the start of a piece “only after the thirteenth preliminary wiggle.”) Conductors of the Furtwängler sort tend to be at their best in front of orchestras with which they have worked for many years and whose members have learned from experience to “speak” their gestural language fluently.
Nevertheless, all of these men were pursuing the same musical goals. Beyond stopping and starting a given piece, it is the job of a conductor to decide how it will be interpreted. How loud should the middle section of the first movement be—and ought the violins to be playing a bit softer so as not to drown out the flutes? Someone must answer questions such as these if a performance is not to sound indecisive or chaotic, and it is far easier for one person to do so than for 100 people to vote on each decision.
Above all, a conductor controls the tempo of a performance, varying it from moment to moment as he sees fit. It is impossible for a full-sized symphony orchestra to play a piece with any degree of rhythmic flexibility unless a conductor is controlling the performance from the podium. Bernstein put it well when he observed in a 1955 TV special that “the conductor is a kind of sculptor whose element is time instead of marble.” These “sculptural” decisions are subjective, since traditional musical notation cannot indicate tempo with exactitude. As Mauceri reminds us, Toscanini and Beecham both recorded La Bohème, having previously discussed their interpretations with Giacomo Puccini, the opera’s composer, and Toscanini conducted its 1896 premiere. Yet Beecham’s performance is 14 minutes longer than Toscanini’s. Who is “right”? It is purely a matter of individual taste, since both interpretations are powerfully persuasive.
Beyond the not-so-basic task of setting, maintaining, and varying tempos, it is the job of a conductor to inspire an orchestra—to make its members play with a charged precision that transcends mere unanimity. The first step in doing so is to persuade the players of his musical competence. If he cannot run a rehearsal efficiently, they will soon grow bored and lose interest; if he does not know the score in detail, they will not take him seriously. This requires extensive preparation on the part of the conductor, and an orchestra can tell within seconds of the downbeat whether he is adequately prepared—a fact that every conductor knows. “I’m extremely humble about whatever gifts I may have, but I am not modest about the work I do,” Bernstein once told an interviewer. “I work extremely hard and all the time.”
All things being equal, it is better for a conductor to have a clear technique than not, if only because it simplifies and streamlines the process of rehearsing an orchestra. Fritz Reiner, who taught Bernstein among others, did not exaggerate when he claimed that he and his pupils could “stand up [in front of] an orchestra they have never seen before and conduct correctly a new piece at first sight without verbal explanation and by means only of manual technique.”
While orchestra players prefer this kind of conducting, a conductor need not have a technique as fully developed as that of a Reiner or Bernstein if he knows how to rehearse effectively. Given sufficient rehearsal time, decisive and unambiguous verbal instructions will produce the same results as a virtuoso stick technique. This was how Willem Mengelberg and George Szell distinguished themselves on the podium. Their techniques were no better than adequate, but they rehearsed so meticulously that their performances were always brilliant and exact.
It also helps to supply the members of the orchestra with carefully marked orchestra parts. Beecham’s manual technique was notoriously messy, but he marked his musical intentions into each player’s part so clearly and precisely that simply reading the music on the stand would produce most of the effects that he desired.
What players do not like is to be lectured. They want to be told what to do and, if absolutely necessary, how to do it, at which point the wise conductor will stop talking and start conducting. Mauceri recalls the advice given to a group of student conductors by Joseph Silverstein, the concertmaster of the Boston Symphony: “Don’t talk to us about blue skies. Just tell us ‘longer-shorter,’ ‘faster-slower,’ ‘higher-lower.’” Professional musicians cannot abide flowery speeches about the inner meaning of a piece of music, though they will readily respond to a well-turned metaphor. Mauceri makes this point with a Toscanini anecdote:
One of Toscanini’s musicians told me of a moment in a rehearsal when the sound the NBC Symphony was giving him was too heavy. … In this case, without saying a word, he reached into his pocket and took out his silk handkerchief, tossed it into the air, and everyone watched it slowly glide to earth. After seeing that, the orchestra played the same passage exactly as Toscanini wanted.
Conducting, like all acts of leadership, is in large part a function of character. The violinist Carl Flesch went so far as to call it “the only musical activity in which a dash of charlatanism is not only harmless, but positively necessary.” While that is putting it too cynically, Flesch was on to something. I did a fair amount of conducting in college, but even though I practiced endlessly in front of a mirror and spent hours poring over my scores, I lacked the personal magnetism without which no conductor can hope to be more than merely competent at best.
On the other hand, a talented musician with a sufficiently compelling personality can turn himself into a conductor more or less overnight. Toscanini had never conducted an orchestra before making his unrehearsed debut in a performance of Verdi’s Aida at the age of 19, yet the players hastened to do his musical bidding. I once saw the modern-dance choreographer Mark Morris, whose knowledge of classical music is profound, lead a chorus and orchestra in the score to Gloria, a dance he had made in 1981 to a piece by Vivaldi. It was no stunt: Morris used a baton and a score and controlled the performance with the assurance of a seasoned pro. Not only did he have a strong personality, but he had also done his musical homework, and he knew that one was as important as the other.
The reverse, however, is no less true: The success of conductors like Serge Koussevitzky is at least as much a function of their personalities as of their preparation. To be sure, Koussevitzky had been an instrumental virtuoso (he played the double bass) before taking up conducting, but everyone who worked with him in later years was aware of his musical limitations. Yet he was still capable of imposing his larger-than-life personality on players who might well have responded indifferently to his conducting had he been less charismatic. Leopold Stokowski functioned in much the same way. He was widely thought by his peers to have been far more a showman than an artist, to the point that Toscanini contemptuously dismissed him as a “clown.” But he had, like Koussevitzky, a richly romantic musical imagination coupled with the showmanship of a stage actor, and so the orchestras that he led, however skeptical they might be about his musical seriousness, did whatever he wanted.
All great conductors share this same ability to impose their will on an orchestra—and that, after all, is the heart of the matter. A conductor can be effective only if the orchestra does what he wants. It is not like a piano, whose notes automatically sound when the keys are pressed, but a living organism with a will of its own. Conducting, then, is first and foremost an act of persuasion, as Mauceri acknowledges:
The person who stands before a symphony orchestra is charged with something both impossible and improbable. The impossible part is herding a hundred musicians to agree on something, and the improbable part is that one does it by waving one’s hands in the air.
This is why so many famous conductors have claimed that the art of conducting cannot be taught. In the deepest sense, they are right. To be sure, it is perfectly possible, as Reiner did, to teach the rudiments of clear stick technique and effective rehearsal practice. But the mystery at the heart of conducting is, indeed, unteachable: One cannot tell a budding young conductor how to cultivate a magnetic personality, any more than an actor can be taught how to have star quality. What sets the Bernsteins and Bogarts of the world apart from the rest of us is very much like what James M. Barrie said of feminine charm in What Every Woman Knows: “If you have it, you don’t need to have anything else; and if you don’t have it, it doesn’t much matter what else you have.”
2 Excerpts from many of these films were woven together into a two-part BBC documentary, The Art of Conducting, which is available on home video and can also be viewed in its entirety on YouTube.
Not that he tries. What was remarkable about the condescension in this instance was that Franken directed it at women who accused him of behaving “inappropriately” toward them. (In an era of strictly enforced relativism, we struggle to find our footing in judging misbehavior, so we borrow words from the prissy language of etiquette. The mildest and most common rebuke is unfortunate, followed by the slightly more serious inappropriate, followed by the ultimate reproach: unacceptable, which, depending on the context, can include both attempted rape and blowing your nose into your dinner napkin.) Franken’s inappropriateness entailed, so to speak, squeezing the bottoms of complete strangers, and cupping the occasional breast.
Franken himself did not use the word “inappropriate.” By his account, he had done nothing to earn the title. His earlier vague denials of the allegations, he told his fellow senators, “gave some people the false impression that I was admitting to doing things that, in fact, I haven’t done.” How could he have confused people about such an important matter? Doggone it, it’s that damn sensitivity of his. The nation was beginning a conversation about sexual harassment—squeezing strangers’ bottoms, stuff like that—and “I wanted to be respectful of that broader conversation because all women deserve to be heard and their experiences taken seriously.”
Well, not all women. The women with those bottoms and breasts he supposedly manhandled, for example—their experiences don’t deserve to be taken seriously. We’ve got Al’s word on it. “Some of the allegations against me are not true,” he said. “Others, I remember very differently.” His accusers, in other words, fall into one of two camps: the liars and the befuddled. You know how women can be sometimes. It might be a hormonal thing.
But enough about them, Al seemed to be saying: Let’s get back to Al. “I know the work I’ve been able to do has improved people’s lives,” Franken said, but he didn’t want to get into any specifics. “I have used my power to be a champion of women.” He has faith in his “proud legacy of progressive advocacy.” He’s been passionate and worked hard—not for himself, mind you, but for his home state of Minnesota, by which he’s “blown away.” And yes, he would get tired or discouraged or frustrated once in a while. But then that big heart of his would well up: “I would think about the people I was doing this for, and it would get me back on my feet.” Franken recently published a book about himself: Giant of the Senate. I had assumed the title was ironic. Now I’m not sure.
Yet even in his flights of self-love, the problem that has ever attended Senator Franken was still there. You can’t take him seriously. He looks as though God made him to be a figure of fun. Try as he might, his aspect is that of a man who is going to try to make you laugh, and who is built for that purpose and no other—a close cousin to Bert Lahr or Chris Farley. And for years, of course, that’s the part he played in public life, as a writer and performer on Saturday Night Live. When he announced nine years ago that he would return to Minnesota and run for the Senate—when he came out of the closet and tried to present himself as a man of substance—the effect was so disorienting that I, and probably many others, never quite recovered. As a comedian-turned-politician, he was no longer the one and could never quite become the other.
The chubby cheeks and the perpetual pucker, the slightly crossed eyes behind Coke-bottle glasses, the rounded, diminutive torso straining to stay upright under the weight of an enormous head—he was the very picture of Comedy Boy, and suddenly he wanted to be something else: Politics Boy. I have never seen the famously tasteless tearjerker The Day the Clown Cried, in which Jerry Lewis stars as a circus clown imprisoned in a Nazi death camp, but I’m sure watching it would be a lot like watching the ex-funnyman Franken deliver a speech about farm price supports.
Then he came to Washington and slipped right into place. His career is testament to a dreary fact of life here: Taken in the mass, senators are pretty much interchangeable. Party discipline determines nearly every vote they cast. Only at the margins is one Democrat or Republican different in a practical sense from another Democrat or Republican. Some of us held out hope, despite the premonitory evidence, that Franken might use his professional gifts in service of his new job. Yet so desperate was he to be taken seriously that he quickly passed serious and swung straight into obnoxious. It was a natural fit. In no time at all, he mastered the senatorial art of asking pointless or showy questions in committee hearings, looming from his riser over fumbling witnesses and hollering “Answer the question!” when they didn’t respond properly.
It’s not hard to be a good senator, if you have the kind of personality that frees you to simulate chumminess with people you scarcely know or have never met and will probably never see again. There’s not much to it. A senator has a huge staff to satisfy his every need. There are experts to give him brief, personal tutorials on any subject he will be asked about, writers to write his questions for his committee hearings and an occasional op-ed if an idea strikes him, staffers to arrange his travel and drive him here or there, political aides to guard his reputation with the folks back home, press aides to regulate his dealings with reporters, and legislative aides to write the bills should he ever want to introduce any. The rest is show biz.
Oddly, Franken was at his worst precisely when he was handling the show-biz aspects of his job. While his inquisitions in committee hearings often showed the obligatory ferocity and indignation, he could also appear baffled and aimless. His speeches weren’t much good, and he didn’t deliver them well. As if to prove the point, he published a collection of them earlier this year, Speaking Franken. Until Pearl Harbor, he’d been showing signs of wanting to run for president. Liberal pundits were talking him up as a national candidate. Speaking Franken was likely intended to do for him what Profiles in Courage did for John Kennedy, another middling senator with presidential longings. Unfortunately for Franken, Ted Sorensen is still dead.
The final question raised by Franken’s resignation is why so many fellow Democrats urged him to give up his seat so suddenly, once his last accuser came forward. The consensus view involved Roy Moore, in those dark days when he was favored to win Alabama’s special election. With the impending arrival of an accused pedophile on the Republican side of the aisle, Democrats didn’t want an accused sexual harasser in their own ranks to deflect what promised to be a relentless focus on the GOP’s newest senator. This is bad news for any legacy Franken once hoped for himself. None of his work as a senator will commend him to history. He will be remembered instead for two things: as a minor TV star, and as Roy Moore’s oldest victim.
Review of 'Lioness' by Francine Klagsbrun
Golda Meir, Israel’s fourth prime minister, moved to Palestine from America in 1921, at the age of 22, to pursue Socialist Zionism. She was instrumental in transforming the Jewish people into a state; signed that state’s Declaration of Independence; served as its first ambassador to the Soviet Union, as labor minister for seven years, and as foreign minister for a decade. In 1969, she became the first female head of state in the Western world, serving from the aftermath of the 1967 Six-Day War through the nearly catastrophic but ultimately victorious 1973 Yom Kippur War. She resigned in 1974 at the age of 76, after five years as prime minister. Her involvement at the forefront of Zionism and the leadership of Israel thus extended more than half a century.
This is the second major biography of Golda Meir in the last decade, after Elinor Burkett’s excellent Golda in 2008. Klagsbrun’s portrait is even grander in scope. Her epigraph comes from Ezekiel’s lamentation for Israel: What a lioness was your mother / Among the lions! / Crouching among the great beasts / She reared her cubs. The “mother” was Israel; the “cubs,” her many ancient kings; the “great beasts,” the hostile nations surrounding her. One finishes Klagsbrun’s monumental volume, which is both a biography of Golda and a biography of Israel in her time, with a deepened sense that modern Israel, its prime ministers, and its survival is a story of biblical proportions.
Golda Meir’s story spans three countries—Russia, America, and Israel. Before she was Golda Meir, she was Golda Meyerson; and before that, she was Golda Mabovitch, born in 1898 in Kiev in the Russian Empire. Her father left for America after the horrific Kishinev pogrom in 1903, found work in Milwaukee as a carpenter, and in 1906 sent for his wife and three daughters, who escaped using false identities and border bribes. Golda said later that what she took from Russia was “fear, hunger and fear.” It was an existential fear that she never forgot.
In Milwaukee, Golda found socialism in the air: The city had both a socialist mayor and a socialist congressman, and she was enthralled by news from Palestine, where Jews were living out socialist ideals in kibbutzim. She immersed herself in Poalei Zion (Workers of Zion), a movement synthesizing Zionism and socialism, and in 1917 married a fellow socialist, Morris Meyerson. As soon as conditions permitted, they moved to Palestine, where the marriage ultimately failed—a casualty of the extended periods she spent away from home working for Socialist Zionism and her admission that the cause was more important to her than her husband and children. Klagsbrun writes that Meir might appear to be the consummate feminist: She asserted her independence from her husband, traveled continually and extensively on her own, left her husband and children for months to pursue her work, and demanded respect as an individual rather than special treatment based on her gender. But she never considered herself a feminist and indeed denigrated women’s organizations as reducing issues to women’s interests only, and she gave minimal assistance to other women. Klagsbrun concludes that questions about Meir as a feminist figure ultimately “hang in the air.”
Her American connection and her unaccented American English became strategic assets for Zionism. She understood American Jews, spoke their language, and conducted many fundraising trips to the United States, tirelessly raising tens of millions of dollars of critically needed funds. David Ben-Gurion called her the “woman who got the money which made the state possible.” Klagsbrun provides the schedule of her 1932 trip as an example of her efforts: Over the course of a single month, the 34-year-old Zionist pioneer traveled to Kansas City, Tulsa, Dallas, San Antonio, Los Angeles, San Francisco, Seattle, and three cities in Canada. She became the face of Zionism in America—“The First Lady,” in the words of a huge banner at a later Chicago event, “of the Jewish People.” She connected with American Jews in a way no other Zionist leader had done before her.
In her own straightforward way, she mobilized the English language and sent it into battle for Zionism. While Abba Eban denigrated her poor Hebrew—“She has a vocabulary of two thousand words, okay, but why doesn’t she use them?”—she had a way of crystallizing issues in plainspoken English. Of British attempts to prevent the growth of the Jewish community in Palestine, she said Britain “should remember that Jews were here 2,000 years before the British came.” Of expressions of sympathy for Israel: “There is only one thing I hope to see before I die, and that is that my people should not need expressions of sympathy anymore.” And perhaps her most famous saying: “Peace will come when the Arabs love their children more than they hate us.”
Once she moved to the Israeli foreign ministry, she changed her name from Meyerson to Meir, in response to Ben-Gurion’s insistence that ministers assume Israeli names. She began a decade-long tenure there as the voice and face of Israel in the world. At a Madison Square Garden rally after the 1967 Six-Day War, she observed sardonically that the world called Israelis “a wonderful people,” complimented them for having prevailed “against such odds,” and yet wanted Israel to give up what it needed for its self-defense:
“Now that they have won this battle, let them go back where they came from, so that the hills of Syria will again be open for Syrian guns; so that Jordanian Legionnaires, who shoot and shell at will, can again stand on the towers of the Old City of Jerusalem; so that the Gaza Strip will again become a place from which infiltrators are sent to kill and ambush.” … Is there anybody who has the boldness to say to the Israelis: “Go home! Begin preparing your nine and ten year olds for the next war, perhaps in ten years.”
The next war would come not in ten years, but in six, and while Meir was prime minister.
Klagsbrun’s extended discussion of Meir’s leadership before, during, and after the 1973 Yom Kippur War is one of the most valuable parts of her book, enabling readers to make informed judgments about that war and assess Meir’s ultimate place in Israeli history. The book makes a convincing case that there was no pre-war “peace option” that could have prevented the conflict. Egypt’s leader, Anwar Sadat, was insisting on a complete Israeli withdrawal before negotiations could even begin, and Meir’s view was, “We had no peace with the old boundaries. How can we have peace by returning to them?” She considered the demand part of a plan to push Israel back to the ’67 lines “and then bring the Palestinians back, which means no more Israel.”
A half-century later, after three Israeli offers of a Palestinian state on substantially all the disputed territories—with the Palestinians rejecting each offer, insisting instead on an Israeli retreat to indefensible lines and recognition of an alleged Palestinian “right of return”—Meir’s view looks prescient.
Klagsbrun’s day-by-day description of the ensuing war is largely favorable to Meir, who relied on assurances from her defense minister, Moshe Dayan, that the Arabs would not attack, and assurances from her intelligence community that, even if they did, Israel would have 48 hours’ notice—enough time to mobilize the reserves that constituted more than 75 percent of its military force. Both sets of assurances proved false, and the joint Egyptian-Syrian attack took virtually everyone in Israel by surprise. Dayan had something close to a mental breakdown, but Meir remained calm and in control after the initial shock, making key military decisions. She was able to rely on the excellent personal relationships she had established with President Nixon and his national security adviser, Henry Kissinger, and the critical resupply of American arms that enabled Israel—once its reserves were called into action—to take the war into Egyptian and Syrian territories, with Israeli forces camped in both countries by its end.
Meir had resisted the option of a preemptive strike against Egypt and Syria when it suddenly became clear, 12 hours before the war started, that coordinated Egyptian and Syrian attacks were coming. On the second day of the war, she told her war cabinet that she regretted not having authorized the IDF to act, and she sent a message to Kissinger that Israel’s “failure to take such action is the reason for our situation now.” After the war, however, she testified that, had Israel begun the war, the U.S. would not have sent the crucial assistance that Israel needed (a point on which Kissinger agreed), and that she therefore believed she had done the right thing. A preemptive response, however, or a massive call-up of the reserves in the days before the attacks, might have avoided a war in which Israel lost 2,600 soldiers—the demographic equivalent of all the American losses in the Vietnam War.
It is hard to fault Meir’s decision, given the erroneous information and advice she was uniformly receiving from all her defense and intelligence subordinates, but it is a reminder that for Israeli prime ministers (such as Levi Eshkol in the Six-Day War, Menachem Begin with the Iraq nuclear reactor in 1981, and Ehud Olmert with the Syrian one in 2007), the potential necessity of taking preemptive action always hangs in the air. Klagsbrun’s extensive discussion of the Yom Kippur War is a case study of that question, and an Israeli prime minister may yet again face that situation.
The Meir story is also a tale of the limits of socialism as an organizing principle for the modern state. Klagsbrun writes about “Golda’s persistent—and hopelessly utopian—vision of how a socialist society should be conducted,” exemplified by her dream of instituting commune-like living arrangements for urban families, comparable to those in the kibbutzim, where all adults would share common kitchens and all the children would eat at school. She also tried to institute a family wage system, in which people would be paid according to their needs rather than their talents, a battle she lost when the unionized nurses insisted on being paid as professionals, based on their education and experience, and not the sizes of their families.
Socialism foundered not only on the laws of economics and human nature but also in the realm of foreign relations. In 1973, enraged that the socialist governments and leaders in Europe had refused to come to Israel’s aid during the Yom Kippur War, Meir convened a special London conference of the Socialist International, attended by eight heads of state and a dozen other socialist-party leaders. Before the conference, she told Willy Brandt, Germany’s socialist chancellor, that she wanted “to hear for myself, with my own ears, what it was that kept the heads of these socialist governments from helping us.”
In her speech at the conference, she criticized the Europeans for not even permitting “refueling the [American] planes that saved us from destruction.” Then she told them, “I just want to understand … what socialism is really about today”:
We are all old comrades, long-standing friends. … Believe me, I am the last person to belittle the fact that we are only one tiny Jewish state and that there are over twenty Arab states with vast territories, endless oil, and billions of dollars. But what I want to know from you today is whether these things are the decisive factors in Socialist thinking, too?
After she concluded her speech, the chairman asked whether anyone wanted to reply. No one did, and she thus effectively received her answer.
One wonders what Meir would think of the Socialist International today. On the centenary of the Balfour Declaration last year, the World Socialist website called it “a sordid deal” that launched “a nakedly colonial project.” Socialism was part of the cause for which she went to Palestine in 1921, and it has not fared well in history’s judgment. But the other half—Zionism—became one of the great successes of the 20th century, in significant part because of the lifelong efforts of individuals such as she.
Golda Meir has long been a popular figure in the American imagination, particularly among American Jews. Her ghostwritten autobiography was a bestseller; Ingrid Bergman played her in a well-received TV film; Anne Bancroft played her on the Broadway stage. But her image as the “71-year-old grandmother,” as the press frequently referred to her when she became prime minister, has always obscured the historic leader beneath that façade. She was a woman with strengths and weaknesses who willed herself into half a century of history. Francine Klagsbrun has given us a magisterial portrait of a lioness in full.
Back in 2016, then–deputy national-security adviser Ben Rhodes gave an extraordinary interview to the New York Times Magazine in which he revealed how President Obama exploited a clueless and deracinated press to steamroll opposition to the Iranian nuclear deal. “We created an echo chamber,” Rhodes told journalist David Samuels. “They”—writers and bloggers and pundits—“were saying things that validated what we had given them to say.”
Rhodes went on to explain that his job was made easier by structural changes in the media, such as the closing of foreign bureaus, the retirement of experienced editors and correspondents, and the shift from investigative reporting to aggregation. “The average reporter we talk to is 27 years old, and their only reporting experience consists of being around political campaigns,” he said. “That’s a sea change. They literally know nothing.”
And they haven’t learned much. It was dispiriting to watch in December as journalists repeated the arguments against the Jerusalem decision that Rhodes presented on Twitter. On December 5, quoting Mahmoud Abbas’s threat that moving the U.S. Embassy to Jerusalem would have “dangerous consequences,” Rhodes tweeted, “Trump seems to view all foreign policy as an extension of a patchwork of domestic policy positions, with no regard for the consequences of his actions.” He seemed blissfully unaware that the same could be said of his old boss.
The following day, Rhodes tweeted, “In addition to making goal of peace even less possible, Trump is risking huge blowback against the U.S. and Americans. For no reason other than a political promise he doesn’t even understand.” On December 8, quoting from a report that the construction of a new American Embassy would take some time, Rhodes asked, “Then why cause an international crisis by announcing it?”
Rhodes made clear his talking points for the millions of people inclined to criticize President Trump: Acknowledging Israel’s right to name its own capital is unnecessary and self-destructive. Rhodes’s former assistant, Ned Price, condensed the potential lines of attack in a single tweet on December 5. “In order to cater to his political base,” Price wrote, “Trump appears willing to: put U.S. personnel at great risk; risk C-ISIL [counter-ISIL] momentum; destabilize a regional ally; strain global alliances; put Israeli-Palestinian peace farther out of reach.”
Prominent media figures happily reprised their roles in the echo chamber. Susan Glasser of Politico: “Just got this in my in box from Ayman Odeh, leading Arab Israeli member of parliament: ‘Trump is a pyromaniac who could set the entire region on fire with his madness.’” BBC reporter Julia Macfarlane: “Whether related or not, everything that happens from now on in Israel and the Pal territories will be examined in the context of Trump signaling to move the embassy to Jerusalem.” Neither Rhodes nor Price could have asked for more.
Network news broadcasts described the president’s decision as “controversial” but only reported on the views of one side in the controversy. Guess which one. “There have already been some demonstrations,” reported NBC’s Richard Engel. “They are expected to intensify, with Palestinians calling for three days of rage if President Trump goes through with it.” Left unmentioned was the fact that Hamas calls for days of rage like you and I call for pizza.
Throughout Engel’s segment, the chyron read: “Controversial decision could lead to upheaval.” On ABC, George Stephanopoulos said, “World leaders call the decision dangerous.” On CBS, Gayle King chimed in: “U.S. allies and leaders around the world say it’s a big mistake that will torpedo any chance of Middle East peace.” Oh? What were the chances of Middle East peace prior to Trump’s speech?
On CNN, longtime peace processor Aaron David Miller likened recognizing Jerusalem to hitting “somebody over the head with a hammer.” On MSNBC, Chris Matthews fumed: “Deaths are coming.” That same network featured foreign-policy gadfly Steven Clemons of the Atlantic, who said Trump “stuck a knife in the back of the two-state process.” Price and former Obama official Joel Rubin also appeared on the network to denounce Trump. “American credibility is shot, and in diplomacy, credibility relies on your word, and our word is, at this moment, not to be trusted from a peace-process perspective, certainly,” Rubin said. This from the administration that gave new meaning to the words “red line.”
Some journalists were so devoted to Rhodes’s tendentious narrative of Trump’s selfishness and heedlessness that they mangled the actual story. “He had promised this day would come, but to hear these words from the White House was jaw-dropping,” said Martha Raddatz of ABC. “Not only signing a proclamation reversing nearly 70 years of U.S. policy, but starting plans to move the embassy to Jerusalem. No one else on earth has an embassy there!” How dare America take a brave stand for a small and threatened democracy!
In fact, Trump was following U.S. policy as legislated by Congress in 1995, reaffirmed in the Senate by a 90–0 vote just last June, and supported (in word if not in deed) by his three most recent predecessors as well as the last four Democratic Party platforms. Most remarkably, the debate surrounding the Jerusalem policy ignored a crucial section of the president’s address. “We are not taking a position on any final-status issues,” he said, “including the specific boundaries of Israeli sovereignty in Jerusalem, or the resolution of contested borders. Those questions are up to the parties involved.” What we did, then, was simply accept the reality that the city that houses the Knesset and where the head of government receives foreign dignitaries is the capital of Israel.
However, just as had happened during the debate over the Iran deal, the facts were far less important to Rhodes than the overarching strategic goal. In this case, the objective was to discredit and undermine President Trump’s policy while isolating the conservative government of Israel. Yet there were plenty of reasons to be skeptical of the disingenuous duo of Rhodes and Price. Trump’s announcement was bold, for sure, but Arab capitals, more worried about the rise of Iran (which Rhodes and Price facilitated) than about the Palestinian issue, offered only tepid protests, suggesting that the “Arab street” would sit this one out.
Which is what happened. Moreover, verbal disagreement aside, there is no evidence that the Atlantic alliance is in jeopardy. Nor has the war on ISIS lost momentum. As for putting “Israeli–Palestinian peace farther out of reach,” if third-party recognition of Jerusalem as Israel’s capital forecloses a deal, perhaps no deal was ever possible. Rhodes and Price would like us to overlook the fact that the two sides weren’t even negotiating during the Obama administration—an administration that did as much as possible to harm relations between Israel and the United States.
This most recent episode of the Trump show was a reminder that some things never change. Jerusalem was, is, and will be the capital of the Jewish state. President Trump routinely ignores conventional wisdom and expert opinion. And whatever nonsense President Obama and his allies say today, the press will echo tomorrow.