My office in the English department at Northwestern University is in University Hall, an American gothic joke of a building whose architect might have been inspired by Charles Addams, if the building weren’t much older than the cartoonist. Yet I have grown enamored of this absurd pile of limestone, where for the past twelve years I have had an office and done most of my teaching. Located on the second floor, my office overlooks what is known on campus as “the rock.” The rock is a boulder, roughly seven feet high, perhaps six feet in diameter, and a marker and meeting place for undergraduates. Fraternity and sorority pledge classes regularly paint the rock; students set up tables near it to sell T-shirts or yearbooks or campus magazines or to collect for charities. From ten minutes before the hour until the hour is struck, there is fairly heavy traffic near the rock, with students passing on their way to classes or back to their apartments, dormitories, fraternities, and sororities.
Fancying myself old Mr. Chips(tein), sometimes I stand at the window of my office and gaze down upon the students as they congregate around the rock or on the steps of University Hall. I note especially students who have been in my classes, many of whose names, only a year or two later, I have quite forgotten or will soon forget. As I watch them pass, I wonder what plots life has in store for the two upper-middle-class girls from a suburb outside Minneapolis who are in Ralph Lauren duds, or for the punkily dressed theater major who was such an atrocious speller, or for the Chinese pre-med student who did so well in my course in advanced prose composition. I wonder, too, about the future of the students who are members of the campus political organization that is known as InCAR (standing for International Committee Against Racism), who frequent the purlieus of the rock perhaps more than anyone else and are always there on heavy political business: to collect funds for striking coal miners in the north of England, to aid the guerrillas in El Salvador, to crush the guerrillas in Nicaragua, to beseech the university to divest from South Africa, to halt a showing of the film Birth of a Nation, and more, always much more. Like the village idiot hired by the shtetl in which he lived to await by the village gates the coming of the messiah, the students of InCAR appear never to be out of work.
No one would ever accuse the InCAR kids of having flair. Say what you like against the 60’s—and I would say a very great deal against them—the students who took part in the tumult of those years at least appeared hugely to enjoy the Dionysian fringe benefits that went along with their ostensibly Apollonian goals. The pleasure of the 60’s, after all, was in doing exactly what one pleased while appearing at the same time to be doing good. This alluring combination has left many students of the current generation with what I think of as “60’s envy,” or regret at missing a whacking idealistic good time. But the students in InCAR in no way suggest those of the 60’s. In their regression they jump all the way back to the 1930’s, without the excuse of not knowing for certain what Stalin was doing in Russia during those years.
The students who belong to InCAR manage to achieve a grayness, a grimness, a joylessness that almost seems studied. There is a dimness about their dress, a bleakness about their response to the pleasant surroundings in which they live—Northwestern’s is a lush campus set along the shore of Lake Michigan in Evanston, Illinois—that does not seem altogether natural. Whatever the season, winter seems to be in their faces as they stand near the rock, blaring the word through bullhorns or passing out leaflets for one or another of their causes—leaflets written in a tone and style that resemble not so much political argument as a ransom note. A local joke on campus asks, “How many members of InCAR does it take to change a light bulb?” “None,” the answer is, “They don’t change it—they smash it.”
For the students who have joined it—and, undergraduate and graduate students together, they appear to number fewer than thirty—InCAR obviously gives something of the pleasure in collectivity that a fraternity or sorority provides, though much intensified. Among other pleasures, it gives that of being in total and permanent opposition on a campus whose student body is otherwise middle and upper-middle class in tone and feeling. Unlike a fraternity or sorority, InCAR gives its members a complete outlook on life: a way of understanding the world and a language to help explain it. There is also the sense of a “movement,” for InCAR is not restricted to Northwestern University but claims an international membership in the thousands, ranging, or so it says, from coal miners in Kent in England to farm workers in the San Joaquin Valley in California. It is an offshoot—the impolite word is “front”—of the Communist Progressive Labor party, which has been in existence for fifteen years.
But only in the past five or so years has InCAR been a felt presence at Northwestern. A few of its members have wandered into my classes during this time. They tend to be very earnest, rather more passionate than the general run of Northwestern student, sometimes bright but never brilliant. The passion and the brightness, when they exist, come from the infusions of ideology that InCAR has given them. But what they gain on the straightaway they lose on the curve: that same ideology makes InCAR students leaden and mechanistic in their response to literature and ideas. They have minds so coarse no feelings can violate them. They have no notion that what seem to them hot new ideas are clichés pickled in the brine and blood of more than five decades. They write classroom papers with titles like “Sister Carrie as a Commodity” and “Joseph Conrad, Counter-Revolutionary.” They will respond to a point made by a classmate by accusing him of being “imprisoned in bourgeois ideology,” using the phrase as if it were quite as fresh and penetrating as an aphorism discovered in the middle of Proust.
As a writing major at NU, last year I took a course in American poetry which was taught by a certain Prof. X who claimed that Whitman was inherently racist and sexist in his naive representation of democracy, William Carlos Williams was racist in his depiction of individuals content within their socioeconomic positions, and that Dickinson, Frost, and other later New England writers were guilty of adhering to an ultimately “bourgeois” attitude of individualism combined with some sort of belief in religion, an after-life, and a sense of higher purpose: a belief which X considered “the opiate of the masses. . . .” X’s message to aspiring writers was implicit; as writers, our goal should be to search out and criticize each and every fault in our culture. Any desire to affirm any aspect of that culture, we should stifle with our “critical intellects.”
Although these views would seem to be congruent with Prof. Foley’s, I cannot be certain that she is Prof. X, for there have been other professors in the English department in which I teach who might also be pleased to claim the same views. I do not say that such views are dominant at Northwestern, or in the American university at large, but they do nowadays crop up with a fair frequency, and not in English departments alone. I recall three years ago taking two young men, graduating seniors who had been through two of my courses, to lunch, in the middle of which they unfurled a number of straightforwardly Marxist notions about American foreign policy. “Do you guys show slides with these clichés?” I inquired. I also asked where they had acquired such views. It turned out that they had just completed a course in American diplomatic history whose bottom, middle, top, and every other line was that all American foreign policy was a cover for the imperialist ventures of American business interests abroad. These were bright fellows, each of them with a fine sense of humor; one was headed for a career in journalism, the other for the foreign service; and I was disheartened to think that, as they were leaving my university, they were lugging such crude notions along with them.
A difference that never fails to astonish me between undergraduate education now and then—“then” being roughly thirty years ago, when I was an undergraduate at the University of Chicago—is that now university teachers who have strong political views feel no need to suppress them in the name of fairness or disinterestedness or a higher allegiance to the subject being taught. I may have been a political naïf when young, but thinking back upon my own undergraduate education I cannot recall the political opinions of any of my teachers. The reason I cannot, I suspect, is that they kept their politics to themselves. Their politics were nobody’s business but their own, and, while they were in the classroom or lecture hall, not even their own.
Academic freedom, which earlier generations of professors had struggled to obtain, was about the right to hold any political views one wished outside the classroom. One of the most common meanings of academic freedom, in the words of Edward Shils, “refers to the freedom of university—and college—teachers to enjoy the freedom of speech and action that other citizens are constitutionally, legally, and conventionally empowered to exercise.” But that meaning has in recent decades been extended to include the right to teach one’s political views as part of the subject matter of one’s courses. Not only is it considered a right, but, in many quarters, it is thought a fine thing. Thus university academic departments nowadays seek out feminists, Marxists, and others in whom the political impulse runs stronger than any other, to teach their bias—and to do so in the name of intellectual diversity. If a strong English-department chairman were today to learn about the teaching of Prof. X, quoted above, and tried to tell him or her to knock it off in the name of a higher seriousness, that chairman would no doubt be accused of interfering with Prof. X’s academic freedom.
In the old days one can imagine a strong English-department chairman, in the approved English-professor manner, taking Prof. X aside to say, “Look here, X, do be a good fellow and forget that rot about the bourgeois attitudes of the New England writers I understand you are teaching. Publish it if you like—that is your business. But our business, as teachers, is sticking to the text.” If Prof. X were tenured, the request would be a friendly one; if he were not yet tenured, the request would no doubt be more insistent. Today, however, our chairman would be accused, at a minimum, of McCarthyism, fascism, and troglodyticism. The American Association of University Professors might be called in. An article might appear in the Chronicle of Higher Education. Litigation might be set in motion. But, not to worry, no chairman is likely to suggest that Prof. X knock it off. It just isn’t done.
It isn’t done—not at least at large universities mindful of their prestige—because a college teacher’s classroom has become his castle, and he is free to do there as he pleases. Colleagues do not make judgments about a fellow teacher’s teaching. Instead, under the new dispensation, students do. Students always have done so, of course, but whereas earlier they did so informally, now, through something called evaluation forms, they do so formally. On the final days of a class, with perhaps ten or twenty minutes remaining, a professor passes out evaluation forms on which students remark on the strengths and deficiencies of his course. In cases where a professor is coming up for tenure, these evaluations are considered by his colleagues with some care. Tenured faculty, in these instances, do not directly judge the teacher; they judge the students’ judgments, which is not quite the same thing.
Not that judging teaching is easy. As everyone who has been to college knows, a popular teacher can be inefficient and a dull teacher can sometimes leave a lasting impress. What seems exciting in one’s youth, ten years later seems facile, if not silly. Teaching, especially teaching the large, so-called soft subjects in the humanities, where mastery of specific problems is not the chief business at hand, but asking the right questions is, is a subtle art. Student evaluations of one’s own teaching do not help. These evaluations can capture real delinquency, citing a professor’s many absences or his obvious unpreparedness. But beyond that, in the realm where useful distinctions might be made, they leave everything to be desired. Evaluations of my own teaching tend to be quite positive, and my teaching is almost always held to be—most ambiguous of words—“interesting.” But how much gets through, how long it will remain, I haven’t the least idea, and my guess is that neither does any other teacher. The most touching student evaluation I have ever received noted: “I did well in his course because I would have been ashamed not to do well for him.” But of the content of what I teach, and of the quality of thought that goes into this content—nothing. Undergraduate students can hardly be expected to be fit to judge this, and by and large they do not.
Barbara Foley, too, is thought to be an “interesting” teacher, some say an “exciting” teacher, some say the best teacher they have had at Northwestern. A small percentage of students—perhaps 10 percent—say that they are put off by what they term her “ideology.” About this ideology she is apparently, as they say today, “up front.” She makes clear her political point of view and then teaches it. A bit of a Jenny One-Note, Prof. Foley teaches courses that feature the political; their titles have included “Race and Racial Attitudes in American Literature,” “The Radical Tradition in American Literature,” “Proletarian Writers of the 1930’s,” and “The American Dream: Myth or Reality?” (anyone out there who cares to bet that the dream is a reality, please get in touch with me immediately). These courses, I have gathered from talking to students who have taken them and have since graduated, offer strong Marxistical readings of American books, with occasional eye-opening insights such as that, one of my former students recalls, Mark Twain was a “liberal racist.” You don’t have to believe Prof. Foley, you don’t have to swallow her line to do well in her courses, but it is, evidently, no easy chore to buck her directly. Still, the vast majority of her students, according to student evaluations, walk away satisfied customers.
In the scuttlebutt way one picks up on these things, one hears occasional murmurs of reaction against the heavy dosage of politics in Prof. Foley’s political teaching. A student who adored Emily Dickinson was greatly unsettled by Prof. Foley’s announcement that Emily Dickinson had rendered herself permanently minor by ignoring the political subject in her work. In another case, a young black student, who had been in two of my courses, dropped in one morning to ask if I thought that he, an undergraduate, was intellectually prepared to take a graduate-level course taught by Barbara Foley. I replied that I thought if he took special pains he was indeed up to it. “There’s only one thing I worry about, then,” he said. When I asked what that was, he replied, “The word is out that Prof. Foley gives almost all black students A’s. I really wouldn’t want to get an A that way.” Was this true? Was Barbara Foley practicing in her own classroom the redistributive justice she longed for in the world? Short of checking all her students’ grades for the six years she has taught at Northwestern, there is no way of knowing. It is interesting, though, that this is the word among black students.
Something that no one in the English department at Northwestern has had the ill manners to talk about is whether Prof. Foley uses her classroom to recruit members for InCAR. In a brief profile in the Daily Northwestern of an undergraduate InCAR member named Becki Huntman, the walls of whose rooms are festooned with pictures of Lenin, Stalin, Malcolm X, and Friedrich Engels, Miss Huntman is quoted as saying that she first learned of InCAR and Communist ideology during her sophomore year in a “Marxism in Literature” course taught by Prof. Foley. “It was the first class,” Miss Huntman added, “where things made sense.” I asked a former student of mine, who had taken one of Barbara Foley’s courses, if the subject of InCAR ever came up in class. She replied that sometimes, toward the close of a class, Prof. Foley would pass out an announcement of an InCAR meeting, where, she would say, some of the things that had been discussed in class would be talked about in greater detail. Often, too, my former student said, InCAR members would be waiting at the beginning of a class outside the door with petitions to sign or leaflets to hand out. Once, when my former student called on Prof. Foley to discuss a forthcoming classroom paper, she found her extremely helpful, but on the way out Prof. Foley attempted to sell her the current issue of the Progressive Labor party newspaper. “Did you buy it?” I asked. “No,” she said. “Was Prof. Foley angry that you didn’t?” “Not at all,” she said. “I guess she figured it was worth a try.”
Something else that no one at Northwestern talks about is Barbara Foley’s exact relationship with InCAR. The group is made up of students, but she is a professor. During the student tumult of the late 60’s and early 70’s there was many a professor who was in entire sympathy with groups such as Students for a Democratic Society, but were there any professors, above the level of graduate-student instructors, who were also members of student organizations? Such groups have generally had faculty advisers, but Prof. Foley’s role in InCAR goes far beyond the advisory. If there is an InCAR picket, she is on the line; she puts in her time with the bullhorn; she works the rock. There is nothing of the standard dilettante, BMW-owning, Marxoid, let-the-kids-do-all-the-dirty-work contemporary radical university professor about Barbara Foley. The movement is neither an amusement nor an avocation for her. It is of her blood and bone; she is wedded to it; it is her life. Not many people at Northwestern seem to want to talk about this, either.
In fact, Barbara Foley probably would never have been a subject of more than ordinary interest—her type, after all, is scarcely original—but for an incident that took place on the Northwestern campus on the evening of April 13, 1985, an incident that has come to be known locally as the Calero Event. On that night Adolfo Calero, then the commander-in-chief of the Nicaraguan Democratic Forces (FDN), the largest group of contras, as the guerrillas fighting against the Sandinista government in Nicaragua are known, was to speak at Northwestern under the auspices of two university organizations, the Conservative Council and the International Policy Forum. The Reagan administration’s campaign to increase American support for the contras was already well under way, and one assumes that Adolfo Calero was to speak at Northwestern, as he would at other universities, to make the case for this policy.
One has to assume this, for in fact Calero never got to speak. His talk was scheduled for 7:00 P.M. and was open to the university community and to the public free of charge. Well before the talk, however, protesters from at least five different left-wing organizations—four from around the city of Chicago and InCAR from Northwestern—ranged themselves around Harris Hall, the place of the scheduled talk, some carrying pickets and shouting and chanting against Calero and the contras. In the room in Harris Hall in which the Calero talk was to be given—a room that accommodates roughly 300 people—some ten or fifteen minutes before Adolfo Calero was to appear, Barbara Foley walked up to the podium and began to speak. (The day before, she had discussed the forthcoming Calero talk in her English class and urged her students to attend.) She identified herself as a member of the International Committee Against Racism, and then announced, in the words of an ad-hoc panel document,1 that “this was the first fascist rally on campus in some time, she suspected more were being planned and they should be stopped.” She said that “Adolfo Calero was a monster who would be attempting to speak about freedom, democracy, and liberty. By that he meant the right to reappropriate his Nicaraguan business holdings”; that “Calero had the blood of thousands on his hands and no respect for the rights to life and free speech of the people he helped slaughter with the CIA’s help”; that “He had no right to speak that night” and—the following are Prof. Foley’s words—“We are not going to let him speak,” and he “should feel lucky to get out of [Harris Hall] alive.” She went on in this vein for two or three minutes. Other people followed her to the podium to offer their opinions.
When Adolfo Calero arrived there was a great deal of chanting and shouting in opposition to his presence. His talk was delayed some ten or fifteen minutes. Before he could begin someone—not Barbara Foley—rushed to the stage and threw a red liquid at him. This liquid had been variously described as paint and as animal blood. At this point, with a good deal of shouting in the hall, Adolfo Calero, his suit coat bespattered with the red liquid, was led from the hall by security men and did not speak that evening. Barbara Foley acknowledges joining in the chanting during the tumult. A witness claims that she also shouted that “the only way to get anything done would be to kill him [Calero],” though she and another witness, a graduate student who is also a member of InCAR, deny that she said this. In any case, what is apparently technically known as “a shout-down”—though the throwing of the red liquid went well beyond shouting—was successful. It was not a memorable night for “dialogue” at Northwestern University.
The Calero Event took place on a Saturday evening. The following Monday a photograph of the bespattered Adolfo Calero along with an account of the incidents surrounding the event appeared in the Daily Northwestern. The account quoted Barbara Foley as having said, “He has no right to speak here tonight and we are not going to let him. He should feel lucky to get out alive.” (This quotation would come up again and again.) The account also mentioned that protesters later gathered outside the apartment at which Calero was staying, threatening violence, and had to be removed by police. Prof. Foley claimed that InCAR was not responsible for what had happened but that she did not regret that it happened. (She would never express regret, let alone offer apology, throughout the months that followed; nor has she done so since. But more about this presently.) She was also quoted as saying, “I think it’s terrific that people saw the fascists.”
Soon thereafter Arnold Weber, in his first year as president of Northwestern University, issued a strong but quite general statement about the impermissibility of any person or group invading the rights of students or faculty by disrupting regularly scheduled university events. Weber had previously been president of the University of Colorado and still earlier had worked for George Shultz at the Office of Management and Budget, but, apart from having a reputation as a strong and capable administrator, his general views or politics were not known. But a stronger and more specific statement came from the university provost, a man named W. Raymond Mack, a sociologist by training and, by all accounts, a liberal in politics. As a liberal should, Mack was said to be outraged by this violation of free speech and academic freedom, and, by way of a press release, he let it be known that he intended to propose that Barbara Foley be suspended for two academic quarters, and suspended without pay. Provost Mack also felt that the tenure procedure on Prof. Foley, which was to begin the following autumn, ought to be delayed for a full year.
Like Provost Mack, the majority of the faculty at Northwestern’s College of Arts and Sciences are also generally liberal in their political views, with a sprinkling of Marxists and conservatives popping up here and there in various departments. But if most members of the faculty were outraged by the incidents connected with the Calero Event, they managed to keep it to themselves. When the autumn 1985 term began, such organizing efforts as were in evidence were on behalf of Barbara Foley. InCAR was firing up the Xerox machine with handouts headed “Foley Tenure Case Shows Need to Shed Liberal Illusions,” pointing out that Northwestern University was clearly going the way of Nazi Germany. A group of faculty members in the English department wrote a lecture to Provost Mack on the subject of “due process” that ran as a full-page ad in the Daily Northwestern, signed by more than eighty members of the faculty. One would hear a good deal about “due process” in the days ahead. Except from the InCAR students, one would not hear much more about fascism and Nazi Germany. Liberals at a place like Northwestern are not so crude; they prefer instead to talk about “McCarthyism.” One would also hear a great many sentences that began, “Of course I find Barbara Foley’s views abhorrent, but. . . .” But, yet, still, and however. “The sin of nearly all left-wingers from 1933 onward,” wrote George Orwell, “is that they have wanted to be anti-fascist without being anti-totalitarian.”
I found the ad-hoc panel impressive in its competence. As befits professors, its members had done their homework. Relevant documents had been exchanged between contending parties; the ground rules and procedures were clearly laid out. The chairwoman made plain that if there were any demonstrations or disruptions, she would clear the auditorium. The hearing was conducted with a certain gravity of spirit that was entirely appropriate to the business at hand. The business at hand was serious business—a young woman’s career could ride on the outcome.
Before responding to the specific charges, Barbara Foley made a general opening statement. The political speech is a form with which she is at ease. To make a longish speech short, she said that subjecting her to these charges at all was farcical, and the charges showed that the administration at Northwestern University, in its accusations against her, was really aligned with the Reagan administration’s policies in Central America. (I have to admit that this connection had not occurred to me.) After this statement, Barbara Foley’s lawyer began calling witnesses. The first among them was a graduate student on whose dissertation committee Barbara Foley sat. A small and extremely tidy young woman, she shocked—at least she shocked me—with the quiet violence of her views. To her it was perfectly obvious that Adolfo Calero, being evil, had no right to speak, and she had no qualms either about the red liquid that was thrown at him. Others of Prof. Foley’s students, graduate and undergraduate, agreed that Calero had been treated as he deserved and that shouting him down was a perfectly appropriate response to such a man. Allowing such witnesses to speak on Barbara Foley’s behalf seemed to me a very foolish tack for her lawyer to take; the best defense can surely never be a bad offense.
As a university teacher, I am not sure I should wish to be judged by what my students thought they had learned from me. But in Barbara Foley’s case, there was no question that her students had got it right; their views, at any rate, were perfectly congruent with her own on the subjects of freedom of speech, academic freedom, and political discussion: to those deemed enemies, none is permitted. While Prof. Foley was not being judged by the ad-hoc panel as a teacher, the sad show her students put on could not have helped her cause. The open hearing, too, made it impossible for Barbara Foley to express the least contrition for what she had done, assuming that she felt any or even wished to fake some. The best that could be got from her with her InCAR crowd in the audience was that—I am quoting again from the UFRPTDAP document cited earlier—“although she did not agree with a judgment that condemns her behavior, she would live within the limits prescribed and would not repeat the prescribed conduct. She acknowledged that were she to repeat it, she would do so knowingly and at full risk.”
At the intermission of the first session of the ad-hoc panel hearing, I ran into a friend who is also a member of the English department at Northwestern. He looked very glum, not to say downright depressed. He told me that he was soon likely to lose the allegiance of some of his best graduate students, who were adamantly in Barbara Foley’s camp. “After what I have just heard in that room,” he said, referring to the ad-hoc panel proceedings, “there is no way I can ever hope to support Barbara.” As things turned out, he, along with the majority of the faculty, would find a way.
It was not until January of 1986 that the UFRPTDAP ad-hoc panel delivered its report and recommendation on the Foley case. The report has a somewhat pompous tone, possibly owing to the panel members’ repeated insistence on the scrupulosity of their procedures and bases for judgment. As for the judgment itself, it went crushingly against Barbara Foley, sustaining eight of the administration’s eleven charges against her. I quote from the summary of the panel’s findings on the charges:
The ad-hoc panel finds that Professor Foley’s conduct on the evening of April 13, 1985, was violative of the speaker’s rights to speak and be heard and of the audience’s right to hear, and consequently constitutes “grave professional misconduct” as charged. We believe that the conduct was violative of academic freedom, a widely accepted principle essential to the central purpose of a university. We believe that the charges sustained describe conduct which is inimical to an atmosphere of open inquiry, antithetical to the principles of free speech, and unacceptable in a university community.
That Barbara Foley was up for tenure the same year in which she was in effect tried for violations of academic freedom was a circumstance not without its ironies—for tenure, at its inception, was meant to protect the academic freedom of university teachers. Far from being the job-security arrangement it has since become, tenure began as a means of ensuring the rights of professors as citizens outside the university. Thus a university teacher might vote or speak or even organize on behalf of, say, labor unions and, with tenure, need not fear the retaliation of anti-labor men on his school’s board of trustees. Academic freedom also ensured that a teacher would be able to pursue lines of scholarly or scientific inquiry that his colleagues or even the community might think odd or heretical. But academic freedom always carried certain responsibilities. Most obviously, it never included the freedom to abrogate someone else’s freedom. It never included the freedom to turn the classroom into a political pulpit or guerrilla theater. It never included using the job of teacher to propagandize for political or religious conversions. The guardians of academic freedom were to be, precisely, its beneficiaries, the professors themselves, who might be expected to guard it jealously, at least if they knew what was good for their own interests.
Nowadays tenure is awarded, traditionally, after a young professor has been at an institution for two three-year terms, and on three bases: scholarship, teaching, and academic citizenship. The emphases laid on teaching and scholarship can differ from institution to institution. At some smaller colleges a teacher’s publishing record may be of negligible import next to his performance in the classroom. At larger universities, especially those concerned about their prestige as research institutions, no publication, or insufficient publication, means no tenure. Although one hears the phrase less frequently now than formerly, “publish or perish” is still very much the rule at most larger universities.
The third basis for tenure, academic citizenship, is one that always has to be considered but is only in exceptional cases emphasized. It is a judgment, at least in part, of character; it is also a judgment about willingness to conform to the rules, explicit and implicit, that govern institutions of higher learning. If a young teacher shows himself irresponsible in his committee assignments, if he misses classes owing to drunkenness, if he seduces his young students, if he shows no regard for the fundamental beliefs of the institution, he could, theoretically, be faulted on academic citizenship and hence denied tenure. (With tenure, it occurs to me to add, the same teacher could today do any of the things mentioned in the previous sentence and probably keep his job.) Owing either to the widened tolerance for human misbehavior or to the lapse in standards of deportment on the part of professors, the category of academic citizenship is invoked in tenure decisions only in exceptional cases. Would Barbara Foley’s be such a case?
Although I have been teaching in the English department at Northwestern University since 1974, I continue to do so, largely by choice, without tenure. Since only members of the department who already have tenure are permitted to vote on tenure decisions, I did not attend the meeting at which the decision about Barbara Foley’s tenure was made. On the other hand, had I been at that meeting, I should have been bound by confidentiality not to speak about it. What I know about this meeting, therefore, I know from things that have leaked out and from surmise. It quickly leaked out, for example, that the English department’s vote was ten to five (with one member abstaining) in favor of recommending Prof. Foley for tenure. As such recommendations go, this was not a strong one; it is more usual for a candidate for tenure to get a unanimous recommendation, or one with only one or two votes of dissent.
It also transpired that, during the tenure vote, discussions of Prof. Foley’s politics were ruled out of order. Whether she was pushing a clear and hard political line in her classes, whether it was appropriate for a member of the faculty to be a major figure in a radical student organization, whether the teaching of literature was a proper occasion for attacking what Prof. Foley deemed the rampant racism and brutalities of capitalism in American life—all this, apparently, was proscribed from discussion, lest politics be thought to determine a decision that ought to be made exclusively on grounds of scholarship and pedagogy. As for Prof. Foley’s behavior during the Calero Event, this, too, was apparently ruled out of bounds for discussion.
If one is not permitted to talk about the politics of so thoroughly political a woman as Barbara Foley, and forbidden to talk not only about her political views but (much more important) about her on-campus political activities, not much is left to talk about. There were the students’ evaluations of her teaching, and these, as mentioned earlier, were vastly approving. There was her scholarship, as a teacher’s publishing record is now conventionally called, and here she had clearly published in sufficient quantity to guard against her perishing: along with a fairly thick sheaf of reviews and articles in academic journals, she had a book on the documentary novel about to appear from Cornell University Press. Was Prof. Foley’s academic citizenship discussed? I do not know, though afterward, in defense of the English department’s decision, its chairman would say that he knew no better citizen of the English department than Barbara Foley. By this he meant that she was a willing and helpful member of department committees, attended meetings and other departmental events, was reliable in performing intramural house- and record-keeping functions. There is no reason to doubt his word on this, but his remains an astonishingly narrow view of academic citizenship. The vote, to repeat, was ten to five in favor of recommending tenure.
To plunge a bit further into the forest of surmise, what was in the minds of the ten faculty members who voted in favor of recommending Barbara Foley for tenure? None is a Marxist, though some have been known to mutter the cliché that “Marxism remains an indispensable tool of literary analysis.” (Odd that the best literary critics have, without much difficulty, been able to dispense with it.) Some may have felt themselves crippled by the ground rules of the discussion. Almost all of them have said to me, singly, at one time or another, that “I find Barbara’s views abhorrent, but . . . still . . . yet . . . however. . . .” But some find her abhorrent views useful for giving students a wide diversity of opinion. Still, some like her personally and say that in private she can be a charming woman. Yet some feel they have no right to claim that their own views are superior to the next person’s, though I have never encountered anyone who finds his own views abhorrent. However, some feel that to censure even views and actions they find abhorrent is to be guilty of, yes, McCarthyism.
It is, of course, also possible that many members of the Northwestern English department were greatly impressed by Prof. Foley’s scholarship, so greatly impressed that denying tenure to so considerable a scholar would have seemed grievously wrong (especially since there are now members of the English department who were given tenure before they even published a book). “Scholarship, scholarship, scholarship,” said the chairman of the English department on another occasion in defending his department’s decision on tenure, “that is what wins tenure at a place like Northwestern.”
I fear that “scholarship” has in academic circles become one of those words that no longer means what it once did; “research” is another. Where scholarship once stood for the kind of work done by people like E. H. Gombrich, Arnaldo Momigliano, and Frances A. Yates, it now merely means published work. In this rather less impressive conception of scholarship, Prof. Foley had committed her share, and I decided to have a closer look at it (which, I believe, is called “research”).
Reading Prof. Foley’s book, Telling the Truth: The Theory and Practice of Documentary Fiction,2 I recalled that one of her graduate students, while defending her conduct during the Calero Event, remarked that she admired Prof. Foley’s expertise in “social awareness literature” and added that Prof. Foley was at the forefront of “the theoretical debate that is shaping literature.” There is something in what this graduate student said. In her book Prof. Foley sets out to demonstrate that she is very much au courant with the new literary theory but is not about to let it swamp the larger questions of social awareness that, when one gets right down to it, no one has posed with greater clarity, penetration, and wisdom than Karl Marx and Vladimir Ilyich Lenin.
Part of the difficulty with Telling the Truth is that one does not get right down to it very quickly. Its first 103 pages are given over to the new literary theory gradually becoming regnant in English graduate studies. Devotees of this theory will find many old friends cited in her pages: Foucault, Bakhtin, Derrida, Barthes, Lacan. The reliable Marxist gang is also there: Christopher Caudwell, Lucien Goldmann, Georg Lukács, and Louis Althusser. When either Marx or Lenin is quoted, he is given scriptural authority. The highest new jargonese is everywhere employed. The prose is consequently very dense. Few actual works of literature are cited in this first part of the book. Instead something called “the text” is continually mentioned; also such barbed-wire words as “intertextuality” and “extra-textuality.” Nonprofessionals will find it all very hard, almost impenetrable, going. Reading the first part of Prof. Foley’s book I was reminded of Robert Frost’s remark that writing free verse is like playing tennis with the net down; perhaps the best way to describe the new critical theory is to say that it is like playing ping-pong without the ball.
But the real purpose of the dense theoretical discussion of the first part of Telling the Truth is to make the point that the new literary theory cannot finally supplant that old indispensable tool of literary analysis, standard Marxism. In the second part of her book Prof. Foley puts a ball in play and discusses actual works in the genre she rather loosely defines as the documentary novel. “Contextualization,” “hypostatization,” “re-concretization of the referent,” “fictionality”—the lyrics may be different but the melody is the traditional Marxist one. For, as long as we are telling the truth, the only documentary fiction Prof. Foley appears to approve is that which confronts the class struggle and the alienation she believes is the inevitable result of capitalism as part of an “oppositional program.” Or, to put it in the not uncharacteristic prose style of Telling the Truth: “Though some writers manage to make use of modernist defamiliarization as a powerful tool in the critique of reification, most accede to the thoroughgoing fetishization of social relations that characterizes what Lukács called the ‘problem of commodities’ in the early 20th century.”
That, patently, is not entertainment, but is it scholarship? Evidently, to judge by English-studies standards of the day, and by the subsequent career of Prof. Foley’s tenure recommendation, it is indeed. Once such a recommendation leaves an academic department, its approval is far from certain. Not many years back the recommendation for tenure of an extremely popular teacher of philosophy was shot down at the level of the dean’s office, presumably because the teacher’s published work was felt by the dean not to have enough intellectual weight. But the recommendation for Prof. Foley sailed on through—from the university’s ad-hoc tenure committee, thence to the College of Arts and Sciences Promotion and Tenure Committee, and thence to the dean of the College of Arts and Sciences—everywhere endorsed, and reputed to have been strongly endorsed. It was reported that where usually somewhere between eight and twelve outside readers are called in for judgment upon a tenure decision, in Prof. Foley’s instance there were between thirty-five and thirty-eight outside readers, and these endorsed her tenure overwhelmingly (a fact that by itself says a great deal about the fallen standards of English studies in recent years). Case, one would have thought, closed.
Not quite, as it turned out. On May 21, 1986, Barbara Foley received a letter from the dean of the College of Arts and Sciences whose first sentence read: “I must inform you that the Provost did not approve the recommendation to promote you to the rank of associate professor with tenure.” The brief letter went on to say that she would be given a final year of employment as an assistant professor. Attached was a copy of Provost W. Raymond Mack’s letter rejecting the appointment. In his letter the provost cited the fact that Prof. Foley “failed to receive a majority vote from her tenured colleagues in the department of English who had recommended her retention.” (There are twenty-one members of the department with tenure, and five had missed the meeting.) But, obviously, a more significant item for the provost—and, clearly, for President Arnold Weber for whom he was also speaking—was Barbara Foley’s behavior during the Calero Event:
Professor Foley’s record includes the fact that the administration of Northwestern University last year charged her with grave professional misconduct, in that she violated widely accepted principles of academic freedom and responsibility including those stated in the Northwestern University Faculty Handbook (p. 1). She asked for, and received, an open hearing on these charges before an ad-hoc panel of the elected faculty appeals committee.
The provost then quoted from the summary section of the ad-hoc panel’s report, and ended by saying that he agreed with that summary. “I support the finding of the university faculty panel that her conduct is ‘unacceptable in a university community.’ I therefore have decided against offering her further appointment as a member of the Northwestern University faculty.”
The faculty reaction to the administration’s rejection of tenure for Barbara Foley was much stronger than its reaction to the Calero Event. The coolest reaction was perhaps Prof. Foley’s own. Despite the cliché nature of her revolutionary rhetoric, she is no fool and far from being an unsavvy politician. She immediately sent around Xerox copies of the letter from the dean to her and the provost’s letter to the dean, and she was quoted the following day in the university newspaper as saying that rejection of her tenure was “clearly a political attack that revealed nakedly the actual nature of power relations in the university,” adding that the administration, not the faculty, is assuming control of “the academic business of the university.”
Most professors at Northwestern tend to be essentially in business for themselves. They teach, they publish, they apply for grants, they go to their mailboxes ever hopeful that they will have a job offer from Stanford or Michigan, or—O my God, can it be?—Harvard, Yale, or Princeton. Not much in the life of the university community outside their own interests arouses them. But the Foley case did. Of course they almost all found her views “abhorrent,” but still, yet, and however she was one of them, a professor, and the administration was still the administration, a historic enemy. A new issue, then, had arisen, and it went by the name of faculty governance. True, there was nothing illegal about the provost’s action. The provost of the university is supposed to review all tenure decisions and under an active provost these reviews are never perfunctory. Still, in overruling the findings of the various faculty committees in the Foley case, wasn’t the administration undercutting the faculty’s own power? After putting the faculty through the entire rigmarole of tenure review for Barbara Foley, then rejecting its findings, wasn’t the administration acting in bad faith?
Although it was near the end of term, with papers to grade and examinations to give, a flurry of meetings was called. A group of graduate students organized themselves into the Graduate Committee in Support of Barbara Foley. The graduate students and the faculty in the English department met to vote on a resolution first “demanding,” then (in quieter language) “urging” the president of the university to reverse the provost’s decision in the Foley case. Voted on solely by the English faculty in secret ballot, the resolution passed 17 to 2. (Whose was the other dissenting vote, I wondered.) Letters with lengthy lists of signatories explaining faculty governance were sent off to the metropolitan Chicago press in answer to editorials congratulating the administration for taking a strong stand on academic freedom in the Foley case. To add an Orwellian touch of doublethink, small posters began to appear on the walls of university buildings and in classrooms: “Support Diversity of Opinion, Back Foley!” and “Retain Civil Liberties, Keep Foley!”
Just before final-examination week, on May 28, a rally in support of Barbara Foley was held at what was described in a hand-out as the “Mandela/Crown Center.” (The true name of the place is Rebecca Crown Center, the location of the university’s administration offices, donated by the Henry Crown family.) It was, literally, a banner day. A podium was set up before a pink banner with black lettering carrying InCAR’s initials. “Support Foley” banners were rife. A student carried a banner reading, “Fire Mack, Support Foley!” Four or five people came up to me with various petitions to sign. A faculty member wanted me to sign a petition urging the president to understand he was undermining faculty governance. She let me know—here was an original thought—that she found Barbara Foley’s views abhorrent, but a larger issue was at stake. I told her that I didn’t find Barbara Foley’s views abhorrent, merely crude and preposterous; what I found abhorrent was what she had done and her refusal to offer anything approaching an apology for it.
Meanwhile, everyone except those doing the talking was growing logy with boredom. Nearly two hours had passed. What we were all waiting for, of course, was for Barbara Foley to speak. At last she came to the microphone. There were perhaps a hundred and fifty people gathered. She was greeted by strong applause. For the most part, she spoke in pure clichés: about the administration’s political motives in rejecting her for tenure, about its desecration of the principle of faculty governance, about the splendid work of InCAR around the country, about the evil of the contras whom she accused of torturing women and cutting off their breasts. Toward the close of her speech, she said, “When I got up on the stage in Harris 107, I had no idea all this would happen, but I have no regrets,” and then she added, though I did not catch her exact words, that her only regret was that there hadn’t been other faculty members with her on that night.
“You have to admit,” Prof. Foley’s friends say, “that Barbara has guts.” I admit it. She also has a goofy kind of integrity. She calls herself a Marxist-Leninist, but the truth is that she is a poor Leninist indeed. A real Leninist would have apologized at some point—and I myself have little doubt that an apology would have made it very difficult for the provost to do what he did without seeming heartless—and then, in good Leninist fashion, would have gone on with whatever he thought would further the interests of the revolution. It may be that Barbara Foley cannot apologize even to the point of allowing that perhaps she went a bit far on the night of the Calero Event, lest she lose face among the students she has been revving up with her rhetoric over the past several years. My own suspicion is that she feels she has nothing for which to apologize. She is not in the least a phony. I think she is a real revolutionary, at least psychologically, and that, now headed toward forty years of age, she is probably in for the duration. As I walked away from the rally, I recalled that it was not her beloved Marx but Nietzsche who said that every idea has its autobiography, and I wondered what it was in Barbara Foley’s own autobiography that had put the dark and fiery idea of revolution in her head, probably never to be dislodged within her lifetime.
There was to be one more meeting on the Foley case before, so to speak, school was out. With what I am quite certain was unconscious symbolism, the meeting of the College of Arts and Sciences faculty was held in Harris Hall in the same room where Adolfo Calero had been attacked and denied his right to speak. After the reading of various documents by the chairman of the Committee on Promotion and Tenure, the first speaker, a youngish man with a Russian accent, announced that Prof. Foley had outraged any serious conception anyone could possibly hold of the responsibilities of academic citizenship. His accent, which told that he had firsthand experience of totalitarianism and thus cherished freedom all the more, seemed to make little impression on his fellow teachers; another professor quickly rose to say that there was no way to gauge citizenship in a university. Another professor, a teacher of political science, spoke strongly against Prof. Foley’s violation of free speech, and said that by this act she had disqualified herself as a member in good standing of the academic community. A few others rose to say that they did not have enough facts to make a careful judgment on the case.
But the reigning feeling in the room was overwhelmingly, if not pro-Foley, at least anti-administration. Much of the behavior I have come to think of as virtucratic was on display: people rising to speak chiefly to show that their hearts were in the right place. Not everyone apparently found Prof. Foley’s views and actions abhorrent. A man with what I took to be a Swiss-German accent spoke against the contras, and ended by asserting that what Prof. Foley had done was courageous and correct. A left-wing sociologist chipped in with, “I’m sympathetic to speech acts against Mr. Calero.” An economist who had served on the Council of Economic Advisers under Jimmy Carter allowed that he found Prof. Foley’s views abhorrent, but felt she nevertheless deserved another chance; if she were to do something similar in the future, she should then be stripped of her tenure. A man in a page-boy hairdo and a mustache who looked like a minor character in Shakespeare (“poor Yorick,” perhaps) reminded the meeting that Barbara Foley had not been found guilty of a shout-down but of “inciting to a shout-down,” though the force of his distinction, of which he seemed rather proud, was lost on the audience. Another sociologist said that he wanted to hear nothing further about Prof. Foley’s moral conduct, for surely everyone knew that “the morals of professors are lower than that of a snake.” This remark was met with snickering laughter—a roomful of professors took it sitting down. Finally, a resolution was offered to the effect that the Promotion and Tenure Committee write a strong letter to the administration protesting its rejection of Barbara Foley’s tenure as a violation of the principle of faculty governance. It passed by a vast majority.
1 I was not at Harris Hall on the night of April 13. My account of the events of that evening derives solely from a document entitled “Decision, Ad Hoc Panel of UFRPTDAP, Northwestern University, In the Matter of Professor Barbara Foley.” Both Prof. Foley’s lawyer and the lawyer for the provost of Northwestern agreed to and signed a “Stipulation of Facts” about that evening. It is from the ad-hoc panel’s presentation of those agreed-upon facts that I have drawn.
2 Cornell University Press, 273 pp., $24.95.
Realism as Retrenchment
Consider the state of academic realism. Today’s most prominent self-identified realists—Stephen Walt, John Mearsheimer, Barry Posen, and Christopher Layne—advocate a thoroughgoing U.S. retrenchment from global affairs. Whereas Cold War realists were willing to see the world as it was—a world that required unequal burden-sharing and an unprecedented, sustained American commitment to preserve international stability—academic realists now engage in precisely the wishful thinking that earlier realists deplored. They assume that the international order can essentially regulate itself and that America will not be threatened by—and can even profit from—a more unsettled world. They thus favor discarding the policies that have proven so successful over the decades in providing a congenial international climate.
Why has academic realism gone astray? If the Cold War brokered the marriage between realists and American global engagement, the end of the Cold War precipitated a divorce. Following the fall of the Soviet Union, U.S. policymakers continued to pursue an ambitious global agenda based on preserving and deepening both America’s geopolitical advantage and the liberal international order. For many realists, however, the end of the Cold War removed the extraordinary threat—an expansionist USSR—that had led them to support such an agenda in the first place. Academic realists argued that the humanitarian interventions of the 1990s (primarily in the former Yugoslavia) reflected capriciousness rather than a prudent effort to deal with sources of instability. Similarly, they saw key policy initiatives—especially NATO enlargement and the Iraq war of 2003—as evidence that Washington was no longer behaving with moderation and was itself becoming a destabilizing force in global affairs.
These critiques were overstated, but not wholly without merit. The invasion and occupation of Iraq did prove far costlier than expected, as the academic realists had indeed warned. NATO expansion—even as it successfully promoted stability and liberal reform in Eastern Europe—did take a toll on U.S.–Russia relations. Having lost policy arguments that they thought they should have won, academic realists decided to throw the baby out with the bathwater, calling for a radical reformulation of America’s broader grand strategy.
The realists’ preferred strategy has various names—“offshore balancing,” “restraint,” etc.—but the key components and expectations are consistent. Most academic realists argue that the United States should pare back or eliminate its military alliances and overseas troop deployments, going back “onshore” only if a hostile power is poised to dominate a key overseas region. They call on Washington to forgo costly nation-building and counterinsurgency missions overseas and to downgrade if not abandon the promotion of democracy and human rights.
Academic realists argue that this approach will force local actors in Europe, the Middle East, and East Asia to assume greater responsibility for their own security, and that the United States can manipulate—through diplomacy, arms sales, and covert action—the resulting rivalries and conflicts to prevent any single power from dominating a key region and thereby threatening the United States. Should these calculations prove faulty and a hostile power be poised to dominate, Washington can easily swoop in to set things aright, as it did during the world wars. Finally, if even this calculation were to prove faulty, realists argue that America can ride out the danger posed by a regional hegemon because the Atlantic and Pacific Oceans and America’s nuclear deterrent provide geopolitical immunity against existential threats.
Today’s academic realists portray this approach as hard-headed, economical strategy. But in reality, it represents a stark departure from classical American realism. During the Cold War, leading realists placed importance on preserving international stability and heeded the fundamental lesson of World Wars I and II—that the United States, by dint of its power and geography, was the only actor that could anchor international arrangements. Today’s academic realists essentially argue that the United States should dismantle the global architecture that has undergirded the international order—and that Washington can survive and even thrive amid the ensuing disorder. Cold War realists helped erect the pillars of a peaceful and prosperous world. Contemporary academic realists advocate tearing down those pillars and seeing what happens.
The answer is “nothing good.” Contemporary academic realists sit atop a pyramid of faulty assumptions. They assume that one can remove the buttresses of the international system without that system collapsing, and that geopolitical burdens laid down by America will be picked up effectively by others. They assume that the United States does not need the enduring relationships that its alliances have fostered, and that it can obtain any cooperation it needs via purely transactional interactions. They assume that a world in which the United States ceases to promote liberal values will not be a world less congenial to America’s geopolitical interests. They assume that revisionist states will be mollified rather than emboldened by an American withdrawal, and that the transition from U.S. leadership to another global system will not unleash widespread conflict. Finally, they assume that if such upheaval does erupt, the United States can deftly manage and even profit from it, and that America can quickly move to restore stability at a reasonable cost should it become necessary to do so.
The founding generation of American realists had learned not to indulge in the wishful thinking that the international order would create or sustain itself, or that the costs of responding to rampant international disorder would be trivial. Today’s academic realists, by contrast, would stake everything on a leap into the unknown.
For many years, neither Democratic nor Republican policymakers were willing to make such a leap. Now, however, the Trump administration appears inclined to embrace its own version of foreign-policy realism, one that bears many similarities to—and contains many of the same liabilities as—the academic variant. One of the least academic presidents in American history may, ironically, be buying into some of the most misguided doctrines of the ivory tower.
Any assessment of the Trump administration must remain somewhat provisional, given that Donald Trump’s approach to foreign policy is still a work in progress. Yet Trump and his administration have so far taken multiple steps to outline a three-legged-stool vision of foreign policy that they explicitly describe as “realist” in orientation. Like modern-day academic realism, however, this vision diverges drastically from the earlier tradition of American realism and leads to deeply problematic policy.
The first leg is President Trump’s oft-stated view of the international environment as an inherently zero-sum arena in which the gains of other countries are America’s losses. The post–World War II realists, by contrast, believed that the United States could enjoy positive-sum relations with like-minded nations. Indeed, they believed that America could not enjoy economic prosperity and national security unless its major trading partners in Europe and Asia were themselves prosperous and stable. The celebrated Marshall Plan was high-mindedly generous in the sense of addressing urgent humanitarian needs in Europe, yet policymakers very much conceived of it as serving America’s parochial economic and security interests at the same time. President Trump, however, sees a winner and loser in every transaction, and believes—with respect to allies and adversaries alike—that it is the United States that generally gets snookered. The “reality” at the core of Trump’s realism is his stated belief that America is exploited “by every nation in the world virtually.”
This belief aligns closely with the second leg of the Trump worldview: the idea that all foreign policy is explicitly competitive in nature. Whereas the Cold War realists saw a Western community of states, President Trump apparently sees a dog-eat-dog world where America should view every transaction—even with allies—on a one-off basis. “The world is not a ‘global community’ but an arena where nations, nongovernmental actors and businesses engage and compete for advantage,” wrote National Security Adviser H.R. McMaster and National Economic Council Director Gary Cohn in an op-ed. “Rather than deny this elemental nature of international affairs, we embrace it.”
To be sure, Cold War realists were deeply skeptical about “one worldism” and appeals to a global community. But still they saw the United States and its allies as representing the “free world,” a community of common purpose forged in the battle against totalitarian enemies. The Trump administration seems to view U.S. partnerships primarily on an ad hoc basis, and it has articulated something akin to a “what have you done for me lately” approach to allies. The Cold War realists—who understood how hard it was to assemble effective alliances in the first place—would have found this approach odd in the extreme.
Finally, there is the third leg of Trump’s “realism”: an embrace of amorality. President Trump has repeatedly argued that issues such as the promotion of human rights and democracy are merely distractions from “winning” in the international arena and a recipe for squandering scarce resources. On the president’s first overseas trip to the Middle East in May, for instance, he promised not to “lecture” authoritarian countries on their internal behavior, and he made clear his intent to embrace leaders who back short-term U.S. foreign-policy goals no matter how egregious their violations of basic human rights and political freedoms. Weeks later, on a visit to Poland, the president did speak explicitly about the role that shared values played in the West’s struggle against Communism during the Cold War, and he invoked “the hope of every soul to live in freedom.” Yet his speech contained only the most cursory reference to Russia—the authoritarian power now undermining democratic governance and security throughout Europe and beyond. Just as significant, Trump failed to mention that Poland itself—until a few years ago, a stirring exemplar of successful transition from totalitarianism to democracy—is today sliding backwards toward illiberalism (as are other countries within Europe and the broader free world).
At first glance, this approach might seem like a modern-day echo of Cold War debates about whether to back authoritarian dictators in the struggle against global Communism. But, as Jeane Kirkpatrick explained in her famous 1979 Commentary essay “Dictatorships and Double Standards,” and as Kissinger himself frequently argued, Cold War realists saw such tactical alliances of convenience as being in the service of a deeper values-based goal: the preservation of an international environment favoring liberty and democracy against the predations of totalitarianism. Moreover, they understood that Americans would sustain the burdens of global leadership over a prolonged period only if motivated by appeals to their cherished ideals as well as their concrete interests. Trump, for his part, has given only faint and sporadic indications of any appreciation of the traditional role of values in American foreign policy.
Put together, these three elements have profound, sometimes radical, implications for America’s approach to a broad range of global issues. Guided by this form of realism, the Trump administration has persistently chastised and alienated long-standing democratic allies in Europe and the Asia-Pacific and moved closer to authoritarians in Saudi Arabia, China, and the Philippines. The president’s body language alone has been striking: Trump’s summits have repeatedly showcased conviviality with dictators and quasi-authoritarians and painfully awkward interactions with democratic leaders such as Germany’s Angela Merkel. Similarly, Trump has disdained international agreements and institutions that do not deliver immediate, concrete benefits for the United States, even if they are critical to forging international cooperation on key issues or advancing longer-term goods. As Trump has put it, he means to promote the interests of Pittsburgh, not Paris, and he believes that those interests are inherently at odds with each other.
To be fair, President Trump and his proxies do view the war on terror as a matter of defending both American security interests and Western civilization’s values against the jihadist onslaught. This was a key theme of Trump’s major address in Warsaw. Yet the administration has not explained how this civilizational mindset would inform any other aspect of its foreign policy—with the possible exception of immigration policy—and resorts far more often to the parochial lens of nationalism.
The Trump administration seems to be articulating a vision in which America has no lasting friends, little enduring concern with values, and even less interest in cultivating a community of like-minded nations that exists for more than purely deal-making purposes. The administration has often portrayed this as clear-eyed realism, even invoking the founding father of realism, Thucydides, as its intellectual lodestar. This approach does bear some resemblance to classical realism: an unsentimental approach to the world with an emphasis on the competitive aspects of the international environment. And insofar as Trump dresses down American allies, rejects the importance of values, and focuses on transactional partnerships, his version of realism has quite a lot in common with the contemporary academic version.
Daniel Drezner of Tufts University has noted the overlap, declaring in a Washington Post column, “This is [academic] realism’s moment in the foreign policy sun.” Randall Schweller of Ohio State University, an avowed academic realist and Trump supporter, has been even more explicit, noting approvingly that “Trump’s foreign-policy approach essentially falls under the rubric of ‘off-shore balancing’” as promoted by ivory-tower realists in recent decades.
Yet one suspects that the American realists who helped create the post–World War II order would not feel comfortable with either the academic or Trumpian versions of realism as they exist today. For although both of these approaches purport to be about power and concrete results, both neglect the very things that have allowed the United States to use its power so effectively in the past.
Both the academic and Trump versions of realism ignore the fact that U.S. power is most potent when it is wielded in concert with a deeply institutionalized community of like-minded nations. Alliances are less about addition and subtraction—the math of the burden-sharing emphasized by Trump and the academic realists—and more about multiplication, leveraging U.S. power to influence world events at a fraction of the cost of unilateral approaches. The United States would be vastly less powerful and influential in Europe and Central Asia without NATO; it would encounter far greater difficulties in rounding up partners to wage the ongoing war in Afghanistan or defeat the Islamic State; it would find itself fighting alone—rather than with some of the world’s most powerful partners—far more often. Likewise, without its longstanding treaty allies in Asia, the United States would be at an almost insurmountable disadvantage vis-à-vis revisionist powers in that region, namely China.
Both versions of realism also ignore the fact that America has been able to exercise its enormous power with remarkably little global resistance precisely because American leaders, by and large, have paid sufficient regard to the opinions of potential partners. Of course, every administration has sought to “put America first,” but the pursuit of American self-interest has proved most successful when it enjoys the acquiescence of other states. Likewise, the academic and Trump versions of realism too frequently forget that America draws power by supporting values with universal appeal. This is why every American president from Franklin Roosevelt to Barack Obama has recognized that a more democratic world is likely to be one that is both ideologically and geopolitically more congenial to the United States.
Most important, both the academic and Trump versions of realism ignore the fact that the classical post–World War II realists deliberately sought to overcome the dog-eat-dog world that modern variants take as a given. They did so by facilitating cooperation within the free world, suppressing the security competitions that had previously led to cataclysmic wars, creating the basis for a thriving international economy, and thereby making life a little less nasty, brutish, and short for Americans as well as for vast swaths of the world’s population.
If realism is about maximizing power, effectiveness, and security in a competitive global arena, then neither the academic nor the Trump versions of realism merits the name. And if realism is meant to reflect the world as it is, both of these versions are deeply deficient.
This is a tragedy. For if ever there were a moment for an informed realism, it would be now, as the strategic horizon darkens and a more competitive international environment reemerges. There is still time for Trump and his team to adapt, and realism can still make a constructive contribution to American policy. But first it must rediscover its roots—and absorb the lessons of the past 70 years.
The Seven Pillars of Realism
A reformed realism should be built upon seven bedrock insights, which President Trump would do well to embrace.
First, American leadership remains essential to restraining global disorder. Today’s realists channel the longstanding American hope that there would come a time when the United States could slough off the responsibilities it assumed after World War II and again become a country that relies on its advantageous geography to keep the world at arm’s length. Yet realism compels an awareness that America is exceptionally suited to the part it has played for nearly four generations. The combination of its power, geographic location, and values has rendered America uniquely capable of providing a degree of global order in a way that is more reassuring than threatening to most of the key actors in the international system. Moreover, given that today the most ambitious and energetic international actors besides the United States are not liberal democracies but aggressive authoritarian powers, an American withdrawal is unlikely to produce multipolar peace. Instead, it is likely to precipitate the upheaval that U.S. engagement and activism have long been meant to avert. As a corollary, realists must also recognize that the United States is unlikely to thrive amid such upheaval; it will probably find that the disorder spreads and ultimately implicates vital American interests, as was twice the case in the first half of the 20th century.
Second, true realism recognizes the interdependence of hard and soft power. In a competitive world, there is no substitute for American hard power, and particularly for military muscle. Without guns, there will not—over the long term—be butter. But military power, by itself, is an insufficient foundation for American strategy. A crude reliance on coercion will damage American prestige and credibility in the end; hard power works best when deployed in the service of ideas and goals that command widespread international approval. Similarly, military might is most effective when combined with the “softer” tools of development assistance, foreign aid, and knowledge of foreign societies and cultures. The Trump administration has sought to eviscerate these nonmilitary capabilities and bragged about its “hard-power budget”; it would do better to understand that a balance between hard and soft power is essential.
Third, values are an essential part of American realism. Of course, the United States must not undertake indiscriminate interventions in the name of democracy and human rights. But, fortunately, no serious policymaker—not Woodrow Wilson, not Jimmy Carter, not George W. Bush—has ever embraced such a doctrine. What most American leaders have traditionally recognized is that, on balance, U.S. interests will be served and U.S. power will be magnified in a world in which democracy and human rights are respected. Ronald Reagan, now revered for his achievements in improving America’s global position, understood this point and made the selective promotion of democracy—primarily through nonmilitary means—a key part of his foreign policy. While paying due heed to the requirements of prudence and the limits of American power, then, American realists should work to foster a climate in which those values can flourish.
Fourth, a reformed realism requires aligning relations with the major powers appropriately—especially today, as great-power tensions rise. That means appreciating the value of institutions that have bound the United States to some of the most powerful actors in the international system for decades and thereby given Washington leadership of the world’s dominant geopolitical coalition. It means not taking trustworthy allies for granted or picking fights with them gratuitously. It also means not treating actual adversaries, such as Vladimir Putin’s Russia, as if they were trustworthy partners (as Trump has often talked of doing) or as if their aggressive behavior were simply a defensive response to American provocations (as many academic realists have done). A realistic approach to American foreign policy begins by seeing great-power relations through clear eyes.
Fifth, limits are essential. Academic realists are wrong to suggest that values should be excised from U.S. policy; they are wrong to argue that the United States should pull back dramatically from the world. Yet they are right that good statecraft requires an understanding of limits—particularly for a country as powerful as the United States, and particularly at a time when the international environment is becoming more contested. The United States cannot right every wrong, fix every problem, or defend every global interest. America can and should, however, shoulder more of the burden than modern academic and Trumpian realists believe. The United States will be effective only if it chooses its battles carefully; it will need to preserve its power for dealing with the most pressing threat to its national interests and the international order—the resurgence of authoritarian challenges—even if that means taking an economy-of-force approach to other issues.
Sixth, realists must recognize that the United States has not created and sustained a global network of alliances, international institutions, and other embedded relationships out of a sense of charity. It has done so because those relationships provide forums through which the United States can exercise power at a bargain-basement price. Embedded relationships have allowed the United States to rally other nations to support American causes from the Korean War to the counter-ISIS campaign, and have reduced the transaction costs of collective action to meet common threats from international terrorism to piracy. They have provided institutional megaphones through which the United States can amplify its diplomatic voice and project its influence into key issues and regions around the globe. If these arrangements did not exist, the United States would find itself having to create them, or acting unilaterally at far greater cost. If realism is really about maximizing American power, true realists ought to be enthusiastic about relationships and institutions that serve that purpose. Realists should adopt the approach that every post–Cold War president has embraced: that the United States will act unilaterally in defense of its interests when it must, but multilaterally with partners whenever it can.
Finally, realism requires not throwing away what has worked in the past. One of the most astounding aspects of both contemporary academic realism and the Trumpian variant of that tradition is the cavalier attitude they display toward arrangements and partnerships that have helped produce a veritable golden age of international peace, stability, and liberalism since World War II, and that have made the United States the most influential and effective actor in the world in the process. Of course, there have been serious and costly conflicts over the past decades, and U.S. policy has always been thoroughly imperfect. But the last 70 years have been remarkably good ones for U.S. interests and the global order—whether one compares them with the 70 years before the United States adopted its global leadership role, or compares them with the violent disorder that would have emerged had America followed the nostrums peddled today under the realist label. A doctrine that stresses the importance of prudence and discretion, and that was originally conservative in its preoccupation with stability and order, ought not to pursue radical changes in American statecraft or embrace a “come what may” approach to the world. Rather, such a doctrine ought to recognize that true achievements are enormously difficult to come by—and that the most realistic approach to American strategy would thus be to focus on keeping a good thing going.
The story of Britain’s unknown neoconservatives
During the decade that followed, the prospects of “the sick man of Europe” were seemingly transformed. With the free market unleashed and the authority of the democratic government restored, inflation fell, growth resumed, and the unions were tamed. Britain became the laboratory for an experiment—privatization—that would transform not just its economy, but that of many countries throughout the world that came to look to it for inspiration.
More than any other Briton, one person was responsible for this about-turn: Margaret Thatcher. The foundations for what came to be known as the Thatcher revolution were laid in the four years she spent as leader of the Opposition before the Conservative Party she led was returned to power at the 1979 general election. During this period, much of the groundwork was done by a curious and unlikely triumvirate. Thatcher, the daughter of a shopkeeper and Methodist lay preacher from the provincial Middle England town of Grantham, was both the leader and the follower of the other two. They were Sir Keith Joseph, the scion of a wealthy Anglo-Jewish family, and Alfred Sherman, a former Communist working-class Jew from London’s East End whose parents had fled Czarist Russia.
Traditionally, the relationship between Jews and the Conservative Party had been one of mutual distrust. It was the Tories, for instance, who had attempted to shut the door to Jewish immigrants at the turn of the 20th century, while it was the Labour Party in which many of their sons and daughters would find a sympathetic home. An all-too-common mix of snobbery and anti-Semitism dominated the upper echelons of the Conservative Party, seemingly undisturbed by the fact that, by the 1930s, upward mobility began to enable some Jews to leave behind the socialist citadels of the inner cities and find a home in Tory-voting suburbia.
After the war, the association between the Tory Party and prewar appeasement, indifference verging on hostility to the birth of the state of Israel, and occasional manifestations of anti-Semitism among its grassroots membership meant that many Jews continued to shun it. There were only two Jews on the Tory benches in the House of Commons in the 25 years between 1945 and 1970—as against, at its peak, 38 Jewish Labour MPs in 1966. During the 1970s, this began to shift: Further demographic changes within the Jewish community, Labour’s drift toward anti-Zionism, and the more meritocratic bent of the Conservative Party, begun under Prime Minister Ted Heath (1970–74) and accelerated by Thatcher, dramatically increased the number of Jews voting Tory and sitting on the party’s benches in parliament.
If the Tory Party had historically been unwelcoming toward Jews, it had also had little time for intellectuals. While the notion of the Conservatives as the “stupid party,” as Britain’s only Jewish prime minister called it, was overblown, it was also true that many Tories regarded ideas and those who traded in them as suspect and a distraction from the party’s mission to govern the nation unencumbered by the kind of intellectual baggage that might hinder its ruthlessly successful pursuit of power.
Thatcher, Joseph, and Sherman would change all that.
When Thatcher unseated Heath as the Conservative Party’s leader in February 1975, the party was suffering an acute crisis of confidence. Heath had lost three of the four elections he had fought against Labour’s wily leader, Harold Wilson. The previous October, the Tories had received their lowest share of the vote since 1945.
These political problems were accompanied by—indeed, caused by, Thatcher was certain—a lack of self-belief. For three decades, the Tories had embraced the postwar consensus of Keynesian economics and a welfare state. In 1970, the party’s “Selsdon Manifesto” had promised to break with that ignoble history by freeing up the economy, reining in government, and clipping the wings of the nation’s powerful trade unions. But, barely two years in office, Heath’s government had buckled at the first sign of resistance and executed a less than gracious U-turn: caving in to the miners in the face of a strike and rolling back some newly introduced restrictions on the unions; ditching fiscal caution in an ill-fated “dash for growth”; and introducing wage and price controls. Its Industry Act, crowed the leader of Labour’s left, Tony Benn, was “spadework for socialism.” As members of the Heath government, Thatcher and Joseph—respectively responsible for the high-spending education and health departments—were implicated in this intellectual and political betrayal. But, unlike many of their colleagues, the two most economically conservative members of Heath’s Cabinet were determined it would be the last.
The son of a former lord mayor of London, Joseph was an improbable revolutionary by both background and temperament. Sherman would later note his ally’s “tendency to wilt under pressure” and aversion to conflict.
And yet Joseph was to be the man who lit the touch paper that, as Sherman put it, “sparked off the Thatcher revolution.”
Thatcher and Joseph shared a common attribute: the sense that they were both outsiders. Hers stemmed from her grocer’s-daughter upbringing, the snobbery and disdain she encountered at Oxford from both the upper-class grandees of the Conservative Association and the liberal intelligentsia that dominated its academic body, and later, her gender, as she sought a safe Tory seat.
His originated in his Judaism. In later life, Joseph suggested that the advantage of being Jewish was that to be successful, “you have to spark on all four cylinders.” To put it less positively, Jews faced greater barriers to achievement than others and so had to be twice as able. Despite his rapid rise through the Tory ranks once he had entered parliament in 1956, Joseph remained, in the words of one observer, “almost alien.” Nonetheless, Joseph was very much in the mainstream of postwar moderate Conservatism. He combined a liberal social outlook and concern for the poor with a belief in the importance of entrepreneurship.
Occasionally, as when the Conservatives lost power in 1964, Joseph would signal dissent with the leftward direction in which his party was drifting. In a series of speeches and articles, he bemoaned the Tories’ failure to free Britain from the collectivist constraints Labour had imposed upon it after the war, talking of the need to cut taxes further, give business greater freedom, and, perhaps most significantly for the future, raise the then virtually unheard-of prospect of privatization.
But for the most part he toed the party line, as did Thatcher. Neither indicated any personal misgivings or public signs of disagreement when Heath abandoned the free-market program on which the Conservative government had been elected in 1970.
Joseph’s weakness at this critical moment escaped neither the wrath nor the attention of Alfred Sherman. Sherman’s upbringing in the East End of London was one, he later suggested, in which “you were born a socialist, you didn’t have to become one.”
Struggling to assimilate against a backdrop of barely disguised official anti-Semitism, Sherman became a Communist. “When we deserted the God of our fathers,” he wrote, “we were bound to go whoring after strange gods, of which socialism in its various forms was a prominent choice.” At 17, he went to war in Spain. His turn from Marxism came after World War II, when he studied at the London School of Economics and came upon F.A. Hayek’s The Road to Serfdom. It “set him thinking”—and in 1948 he was expelled from the Communist Party for “deviationism.” In the unpromising terrain of 1950s socialist Israel, where he went to work as an economic advisor, he developed his fervent support for the free market. It was a cause he would vociferously promote on his return to Britain.
The two future collaborators in the Thatcher project first met when Sherman—at this point a journalist for the Daily Telegraph, the house journal of the Conservative Party—came to interview Joseph shortly after he had become a Cabinet minister in 1962. Sherman soon began to help write Joseph’s speeches, including those in which, before the Tories’ return to government in 1970, Joseph first began to tentatively break with the postwar consensus. Sherman was thus dismayed not only by the Heath government’s abandonment of its pre-election free-market pledges, but also by Joseph’s supposed connivance in this betrayal. He later labeled his friend “a lion in opposition and a lamb in government.”
But the shattering blow of the Tories’ ejection from office in 1974 at the hands of the unions brought the two men back together. “Keith,” Sherman bluntly told Joseph over lunch one day, “the trouble is that you agree with me but you haven’t got the backbone to say so.” While Sherman was a Conservative, his disdain for the establishment did not recognize party labels. The Tories, he believed, appeared to judge virtue by the measure of whether it won them elections. The free-market revolution that he wanted Joseph to lead was designed not simply to sweep away socialism, but to cleanse the Conservative Party of its postwar ideological sins. And so it was that, with Sherman acting as his confessor, Joseph underwent his very public recantation and conversion to Conservatism.
What Sherman would later dub “the London Spring” commenced on June 24, 1974, when Joseph delivered the first of a series of speeches eviscerating the Tories’ record and his own part in it. The introductory lines of this first speech, drafted by Sherman, represented the opening volley in what was to become a five-year assault on the postwar settlement:
This is no time to be mealy-mouthed. Since the end of the Second World War we have had altogether too much Socialism.…For half of that 30 years Conservative Governments, for understandable reasons, did not consider it practicable to reverse the vast bulk of the accumulating detritus of Socialism which on each occasion they found when they returned to office.
Just over two months later, on the eve of 1974’s second election, called by Labour’s Harold Wilson to boost his weak parliamentary position, Joseph returned to the fray once again, this time in a speech at Preston. He assailed the last Tory government for abandoning “sound money policies,” suggested that it had been debilitated by an unwarranted fear of unemployment, and warned that inflation was “threatening to destroy our society.” His solution—neither “easy nor enjoyable”—was to cut the deficit, gradually bear down on the money supply, and accept that there was a resultant risk of a temporary increase in unemployment.
This was the moment at which the Tories began to break with the principal tenet of Keynesianism—that government’s overriding goal should be to secure full employment. As Thatcher argued in her memoirs, it was “one of the very few speeches which have fundamentally affected a political generation’s way of thinking.” A decade later, when she had been prime minister for five years, the import of Joseph’s words in Preston was clearer still. By that point, Britain was being led by a woman whose government had broken decisively with the policies of its predecessors, placed the defeat of inflation above that of unemployment, and turned monetarism into its economic lodestar. Thatcher had determined that she would not, as Joseph had cautioned against, “be stampeded again” into a Heath-like surrender to Keynes.
But at the time, Thatcher’s response to the Tory defeat in February 1974 was publicly muted. Her pronouncements—“I think we shall finish up being the more radical party”—verged on the anodyne. But she did become a vice-chair of the new Centre for Policy Studies, the think tank that Joseph and Sherman had established to “question the unquestioned, think the unthinkable, [and] blaze a trail,” in Sherman’s words. Not for nothing would Geoffrey Howe describe Sherman as “a zealot of the right.” During this period, as she later acknowledged, Thatcher “learned a great deal” from Sherman and Joseph. She began to attend lunches and seminars at the free-market Institute of Economic Affairs think tank and, as the IEA’s co-founder, Lord Harris of High Cross, said, to “ponder our writing and our authors’ publications.”
That Joseph would lead while Thatcher followed was not, then, surprising. She had always regarded him as “the senior partner” in their close political friendship. Thatcher urged Joseph to challenge Heath for the Tory Party leadership and discouraged speculation that she herself might seek it. Then Joseph delivered an ill-advised speech on social policy in which he suggested that “the balance of our population, our human stock is threatened” by the birth rates of the poor. It led to a media furor and the abandonment of his still-embryonic campaign. Frustrated, Thatcher stepped into the breach. Two months later, she was elected leader.
In her campaign to take command of the Conservative Party, Thatcher sounded many of the same notes as Joseph: that voters believed too many Conservatives “had become Socialists already” and that Britain was moving inexorably in the direction of socialism, taking “two steps forward” under Labour, but only “half a step back” under the Tories. Nonetheless, she was under no illusions that her victory in the leadership election represented a “wholesale conversion” by the party to her and Joseph’s way of thinking. Over the next four years, the support and counsel of Joseph would prove invaluable.
Thatcher had, in the words of one of her Downing Street policy advisors, “no interest in ideas for their own sake,” but she did regard politics as a clash of opposing philosophies. “We must have an ideology,” she declared to the Conservative Philosophy Group, which was formed in the year she became party leader. “The other side have got an ideology they can test their policies against.” She thus looked to Joseph and Sherman to articulate her “beliefs, feelings, instincts, and intuitions into ideas, strategies, and policies,” in Sherman’s telling. They were the builders of the intellectual edifice for the instincts—that “profligacy was a vice” and government, like a prudent household, should live within its means—that, Thatcher proudly declared, she had learned from “the world in which I grew up.”
Many Tories regarded the very notion of a “battle of ideas” as dangerous nonsense. For others, it was the ideas themselves that were suspect. When Joseph presented a paper in April 1975 urging a break with the “path of consensus” and a much greater defense of “what some intellectuals disparagingly call ‘middle-class suburban values,’ a desire to enjoy economic independence, to be well thought of, patriotism,” it met with a furious response from the Tory Shadow Cabinet. Joseph’s call for the Conservatives to push an agenda of higher defense spending, an assault on union power, deep cuts in public expenditure, and measures to curb immigration and bolster the family was greeted with horror by his colleagues. But as Thatcher’s biographer, Charles Moore, has noted, “this startling paper furnished the main elements of what came to be called Thatcherism, both in specific policy and in general psychological terms.”
Meanwhile, memos, letters, and speeches poured forth from Sherman, invariably urging Thatcher and Joseph to go further and faster. With Sherman as his navigator and companion, Joseph himself assumed the role of outrider—“the licensed thinker scouting ahead in Indian country,” as future MP and Cabinet minister Oliver Letwin put it—helping to open up new territory for the Tory leader to occupy when she deemed it politically safe to do so. Her political antennae, much sharper and more finely attuned than those of Joseph or Sherman, proved critical to this creative mix. The two men drew fire from the Tory old guard, allowing Thatcher to rise above the fray and later make public pronouncements that frequently followed the Joseph-Sherman line.
Joseph drew the distinction clearly. He urged the Tories to reach for the “common ground.” He did not mean the centrist midpoint between the two main parties’ positions, which had been the Conservative approach since the end of the war. He meant the territory where a majority of the public found itself, often on the opposite side from the political establishment. As Sherman wrote to Thatcher, in trying to compete with Labour in the ephemeral center ground, the Tories had abandoned the defense of those values—“patriotism, the puritan ethic, Christianity, conventional family-based morality”—that most voters supported. More prosaically, he urged her to speak out on issues such as “national identity, law and order, and scrounging.” He thus provided her with an electoral and moral justification for pursuing a populist political strategy that dovetailed with her own instinctive convictions.
This son of Jewish immigrants would later speak of his disapproval of the term “Judeo-Christian values” and would insist that Thatcher should root her message in her own Methodist upbringing and the Tories’ close relationship with Britain’s Established Church. Thatcher proved more ecumenical. As her close friendship with Chief Rabbi Immanuel Jakobovits illustrated, she saw, and often remarked upon, the close harmony between Judaism and the nonconformist insistence on individual responsibility, community self-help, and the moral necessity of self-improvement and wealth creation imparted by her father. Not for nothing would the Sunday Telegraph later admiringly suggest during her premiership that Judaism had become “the new creed of Thatcherite Britain.”
Sherman’s early political convictions had both positive and negative ramifications. Thatcher said he brought a “convert’s zeal to the task of plotting out a new kind of free-market Conservatism.” What Sherman referred to as his “Communist decade,” he wrote, had taught him “to think big, to believe that, aligned with the forces of history, a handful of people with sufficient faith could move mountains.” His understanding of the left also allowed him to recognize, in a way neither Joseph nor Thatcher intuitively did, the need to cast Thatcherism as an anti-establishment, radical force. Combined with his assiduous wooing of disenchanted former Labour supporters, this helped Thatcher win some high-profile converts, such as the novelist Kingsley Amis, the writer Paul Johnson, and the academic John Vaizey.
The intellectual development of Thatcherism in the 1970s was, of course, the work of many hands. While not by any means exclusively so, many were Jewish and some came from outside the Tory fold. The political scientist Shirley Robin Letwin and her husband, the economist Bill Letwin, both American-born, began to offer advice and assistance with Thatcher’s speeches. While recoiling from her devotion to “Victorian values,” the economist Samuel Brittan was nonetheless an influential exponent of monetarism. His economic commentary in the Financial Times was the only newspaper column Thatcher never missed reading. Arthur Seldon, a founder of the IEA, was a supporter of the Liberal Party who hankered in vain for its return to its Gladstonian belief in limited government. He ensured the flame of free-market economics was not completely extinguished in the 1950s, helped introduce the ideas of Milton Friedman to Britain, and willingly assisted in Thatcher’s effort to smash the postwar settlement.
However, it was Joseph and Sherman who were the preeminent warriors in the battle of ideas. Joseph’s 1976 Stockton Lecture, “Monetarism Is Not Enough,” called for a squeeze on the money supply to bring down inflation, substantial cuts in taxes and spending, and “bold incentives and encouragements” to wealth-creators. It encapsulated the governing agenda and underlying philosophy of the Thatcher governments. Thatcher biographer Hugo Young believed that Joseph’s speeches during this time contained “everything that is distinctive about the economic and political philosophy” of Thatcherism. Joseph took “the moral case for capitalism” into the lion’s den of the campuses, delivering 150 speeches in three years on the virtues of the free market. Despite the frequent attempts of hard-left students to disrupt his appearances, Thatcher later concluded that Joseph’s work had been critical in restoring the right’s “intellectual self-confidence.” She said that “all that work with the intellectuals” helped lay the groundwork for her government’s later successes.
In the settling of scores that followed her dramatic defenestration in November 1990, Thatcher’s sense of betrayal was evident. Among the few who escaped her harsh words were Joseph and Sherman. In the first volume of her memoirs, which she dedicated to Joseph’s memory, Thatcher wrote simply: “I could not have become Leader of the Opposition, or achieved what I did as Prime Minister, without Keith. But nor, it is fair to say, could Keith have achieved what he did without …Alfred Sherman.”
Joseph and Sherman’s presence underlines the leading role played by Jews in the intellectual regeneration of British conservatism, a prominence akin to—and perhaps even greater than—that played by Jewish neoconservatives in the Reagan revolution.
Review of 'The Strange Death of Europe' By Douglas Murray
Since Christianity had shaped the “humanism of which Europe feels legitimately proud,” the ailing pontiff argued, the constitution should make some reference to Europe’s Christian patrimony. His appeal was met with accusations of bigotry. The pope had inflamed the post-9/11 atmosphere of “Islamophobia,” one “anti-racism” outfit said. Another group asked: What about the contributions made by the “tolerant Islam of al-Andalus”? Former French President Valéry Giscard d’Estaing spoke for the political class: “Europeans live in a purely secular political system, where religion does not play an important role.”
Douglas Murray recounts this episode early on in his fiery, lucid, and essential polemic. It epitomized the folly of European elites who would sooner discard the Continent’s civilizational heritage than show partiality for their own culture over others’. To Murray, this tendency is quite literally suicidal—hence the “death” in his title.
The book deals mainly with Western Europe’s disastrous experiment in admitting huge numbers of Muslim immigrants without bothering to assimilate them. These immigrants now inhabit parallel communities on the outskirts of most major cities. They reject mainstream values and not infrequently go boom. Murray’s account ranges from the postwar guest-worker programs to the 2015 crisis that brought more than a million people from the Middle East and Africa.
This is dark-night-of-the-soul stuff. The author, a director at London’s Henry Jackson Society (where I was briefly a nonresident fellow), has for more than a decade been among Europe’s more pessimistic voices on immigration. My classically liberal instincts primed me to oppose him at every turn. Time and again, I found myself conceding that, indeed, he has a point. This is in large part because I have been living in and reporting on Europe for nearly four years. Events of the period have vindicated Murray’s bleak vision and confounded his critics.
Murray is right: Time isn’t mellowing out Europe’s Muslims. “The presumption of those who believed in integration is that in time everybody who arrives will become like Europeans,” Murray writes. Yet it is the young who are usually the most fanatical. Second- and third-generation immigrants make up the bulk of the estimated 5,000 Muslims who have gone off to fight with the Islamic State.
The first large wave of Muslim immigrants to Britain arrived soon after World War II. Seven decades later, an opinion survey conducted in 2016 by the polling firm ICM found that half of Muslim Britons would proscribe homosexuality, a third would legalize polygamy, and a fifth would replace civil law with Shariah. A different survey, also conducted in 2016, found that 83 percent of young French Muslims describe their faith as “important or very important” to them, compared with 22 percent of young Catholics. I could go on with such polling data; Murray does for many pages.
He is also correct that all the various “integration” models have failed. Whether it is consensus-based social democracy in the Nordic countries, multiculturalism in Britain, or republican secularism in France, the same patterns of disintegration and social incohesion persist nearly everywhere. Different European governments have treated this or that security measure, economic policy, or urban-planning scheme as the integration panacea, to no avail.
Murray argues that these successive failures stem from a basic lack of political will. To prove the point he cites, among other things, female genital mutilation in the UK. Laws against the practice have been on the books for three decades. Even so, an estimated 130,000 British women have had their genitals cut, and not a single case has been successfully prosecuted.
Pusillanimity and retreat have been the norm among governments and cultural elites on everything from FGM to free speech to counterterrorism. The result has been that the “people who are most criticized both from within Muslim communities in Europe and among the wider population are in fact the people who fell hardest for the integration promises of liberal Europe.” It was Ayaan Hirsi Ali, the fierce Somali-born proponent of Enlightenment values and women’s equality, who had to escape Holland under a death threat, not her persecutors.
And Murray is right when he says that Europeans hadn’t staged a real debate on immigration until very recently. The author might be too quick to dismiss the salutary fiscal and social effects of economic growth and immigration’s role in promoting it. At various points he even suggests that Europeans should forgo economic as well as population growth if that is the price of admitting fewer migrants. He praises hermetically sealed Japan, but he elides the Japanese model’s serious economic, demographic, and even psychological disadvantages.
All this is secondary to Murray’s unanswerable argument that European elites had for years cordoned off immigration from normal political debate. As he writes, “whereas the benefits of mass immigration undoubtedly exist and everybody is made very aware of them, the disadvantages of importing huge numbers of people from another culture take a great deal of time to admit to.” In some cases, most notably the child-sex grooming conspiracy in Rotherham, England, the institutions have tried to actively suppress the truth. Writes Murray: “Instead of carrying out their jobs without fear or favor, police, prosecutors, and journalists behaved as though their job was to mediate between the public and the facts.”

Is it possible to imagine an alternative history, one in which Europe would absorb this many migrants from Islamic lands but suffer fewer and less calamitous harms? Murray’s surprising answer is yes. Had Europe retained its existential confidence over the course of the previous two centuries, things might have turned out differently. As it was, however, mass migration saw a “strong religious culture”—Islam—“placed into a weak and relativistic culture.”
In the book’s best chapters, Murray departs from the policy debate to attend to the sources of Europe’s existential insecurity. Germans bear much of the blame, beginning with 19th-century Bible scholarship that applied the methods of history, philology, and literary criticism to sacred scripture. That pulled the rug of theological certainty from under Europe’s feet, in Murray’s account, and then Darwin’s discoveries heightened the disorientation. Europeans next tried to substitute totalistic ideology for religion, with catastrophic results.
Finally, after World War II, they settled on human rights as the central meaning of Europe. But since Europeans could no longer believe, these rights were cut off from one of their main wellsprings: the Judeo-Christian tradition. The Catholic Church—having circumscribed the power of earthly kings across centuries and thereby “injected an anti-totalitarian vaccine into the European bloodstream,” as George Weigel has written in these pages—was scorned or ignored. Europeans forgot how they came to be free.
Somehow Europe must recover its vitality. But how? Murray is torn. On one hand, he sees that a rights-based civilization needs a theological frame, lest it succumb before a virile and energetic civilization like Islam. On the other, he thinks the leap of faith is impossible today. Murray can’t blame François, the professor-protagonist of Michel Houellebecq’s 2015 novel Submission. Faced with an Islamic takeover of France, François heads to a monastery, desperate to shake his spiritual torpor. But kneeling before the Virgin doesn’t do anything for him. Islam, with its simplicity and practicality (not least the offer of up to four nubile wives), is much harder to resist.
Murray wonders whether the answer lies in art. Maybe in beauty Europeans can recover the fulfillment and sense of mystery that their ancestors once found in liturgy—only without the cosmic truth claims. He laments that contemporary European art has “given up that desire to connect us to something like the spirit of religion,” though it is possible that the current period of crisis will engender a revival. In the meanwhile, Murray has suggested, even nonbelievers should go to church as a way to mark and show gratitude for Christianity’s foundational role in Europe.
He is onto something. Figure out the identity bit in the book’s subtitle—“Immigration, Identity, Islam”—and the other two will prove much easier to sort out.
A maestro’s morality
How is it possible that a man who made his conducting debut when Grover Cleveland was president should still be sufficiently well known and revered that most of his recordings remain in print to this day? Toscanini: Musician of Conscience, Harvey Sachs’s new biography, goes a long way toward defining what made Toscanini unique.1 A conductor himself, Sachs is also the author of, among other excellent books, a previous biography of Toscanini that was published in 1978. Since then, several large caches of important primary-source material, most notably some 1,500 of the conductor’s letters, have become available to researchers. Sachs’s new biography draws on this new material and other fresh research. It is vastly longer and more detailed than its predecessor and supersedes it in every way.
Despite its length and thoroughness, Toscanini: Musician of Conscience is not a pedant’s vade mecum. Clearly and attractively written, it ranks alongside Richard Osborne’s 1998 biography of Herbert von Karajan as one of the most readable biographies of a conductor ever published. For Toscanini, as Sachs shows us, had a volatile, immensely strong-willed character, one that in time caused him to clash not only with his colleagues but with the dangerous likes of Adolf Hitler and Benito Mussolini. The same fierce integrity that energized his conducting also led him to put his life at risk at a time when many of his fellow musicians were disinclined to go even slightly out of their way to push back against the Fascist tyrants of the ’30s.

Toscanini: Musician of Conscience does not devote much space to close analysis of Toscanini’s interpretative choices and technical methods. For the most part, Sachs shows us Toscanini’s art through the eyes of others, and the near-unanimity of the admiration of his contemporaries, whose praise is quoted in extenso, is striking, even startling. Richard Strauss, as distinguished a conductor as he was a composer, spoke for virtually everyone in the world of music when he said, “When you see that man conduct, you feel that there is only one thing for you to do: take your baton, break it in pieces, and never conduct again.”
Fortunately for posterity, Toscanini’s unflashy yet wondrously supple baton technique can be seen up close in the 10 concerts he gave with the NBC Symphony between 1948 and 1952 that were telecast live (most of which can now be viewed in part or whole on YouTube). But while his manual gestures, whose effect was heightened by the irresistible force of his piercing gaze, were by all accounts unfailingly communicative, Toscanini’s ability to draw unforgettable performances out of the orchestras that he led had at least as much to do with his natural musical gifts. These included an infallible memory—he always conducted without a score—and an eerily exact ear for wrong notes. Such attributes would have impressed orchestra players, a hard-nosed lot, even if they had not been deployed in the service of a personality so galvanizing that most musicians found it all but impossible not to do Toscanini’s musical bidding.
What he wanted was for the most part wholly straightforward. Toscanini believed that it was his job—his duty, if you will—to perform the classics with note-perfect precision, singing tone, unflagging intensity, and an overall feeling of architectural unity that became his trademark. When an orchestra failed to give of its best, he flew into screaming rages whose verbal violence would likely not be believed had secret tapes of them not been made. In one of his most spectacular tantrums, which has been posted on YouTube, he can be heard telling the bass players of the NBC Symphony that “you have no ears, no eyes, nothing at all…you have ears in—in your feet!”
Toscanini was able to get away with such behavior because his own gifts were so extraordinary that the vast majority of his players worshipped him. In the words of the English bassoonist Archie Camden, who played under Toscanini in the BBC Symphony from 1935 to 1939, he was “the High Priest of Music,” a man “almost of another world” whose artistic integrity was beyond question. And while his personal integrity was not nearly so unblemished—he was, as Sachs reports with unsalacious candor, a compulsive philanderer whose love letters to his mistresses are explicit to the point of pornography—there is nonetheless a parallel between the passionate conscientiousness of his music-making and his refusal to compromise with Hitler and Mussolini, both of whom were sufficiently knowledgeable about music to understand what a coup it would have been to co-opt the world’s greatest conductor.
Among the most valuable parts of Toscanini: Musician of Conscience are the sections in which Sachs describes Toscanini’s fractious relations with the German and Italian governments. Like many of his fellow countrymen, he had been initially impressed by Mussolini, so much so that he ran for the Italian parliament as a Fascist candidate in 1919. But he soon saw through Mussolini’s modernizing rodomontade to the tyrant within, and by the late ’20s he was known throughout Italy and the world as an unswerving opponent of the Fascist regime. In 1931 he was beaten by a mob of blackshirted thugs, after which he stopped conducting in Italy, explaining that he would not perform there so long as the Fascists were in power. Mussolini thereupon started tapping his telephone line, and seven years later the conductor’s passport was confiscated when he described the Italian government’s treatment of Jews as “medieval stuff” in a phone call. Had public and private pressure not been brought to bear, he might well have been jailed or murdered. Instead he was allowed to emigrate to the U.S. He did not return to Italy until after World War II.
If anything, Toscanini’s hatred for the Nazis was even more potent, above all because he was disgusted by their anti-Semitism. A philo-Semite who referred to the Jews as “this marvelous people persecuted by the modern Nero,” he wrote a letter to one of his mistresses in the immediate wake of the Anschluss that makes for arresting reading eight decades later:
My heart is torn in bits and pieces. When you think about this tragic destruction of the Jewish population of Austria, it makes your blood turn cold. Think of what a prominent part they’d played in Vienna’s life for two centuries! . . . Today, with all the great progress of our civilization, none of the so-called liberal nations is making a move. England, France, and the United States are silent!
Toscanini felt so strongly about the rising tide of anti-Semitism that he agreed in 1936 to conduct the inaugural concerts of the Palestine Symphony (later the Israel Philharmonic) as a gesture of solidarity with the Jews. In an even more consequential gesture, he had already terminated his relationship with the Bayreuth Festival, where he had conducted in 1930 and 1931, the first non-German conductor to do so. While the founder of the festival, Richard Wagner, ranked alongside Beethoven, Brahms, and Verdi at the top of Toscanini’s pantheon of musical gods, he was well aware that many of the members of the Wagner family who ran Bayreuth were close friends of Adolf Hitler, and he decided to stop conducting in Germany—Bayreuth included—when the Nazis came to power. Hitler implored him to return to the festival in a personal letter that praised him as “the great representative of art and of a people friendly to Germany.” Once again, though, there was to be no compromise: Toscanini never performed in Germany again, nor would he forgive those musicians, Wilhelm Furtwängler among them, who continued to do so.

Implicit throughout Sachs’s book is the idea that Toscanini the man and Toscanini the musician were, as his subtitle suggests, inseparable—that, in other words, his conscience drove him to oppose totalitarianism in much the same way that it drove him to pour his heart and soul into his work. He was in every sense of the word a driven man, one capable of writing in an especially revealing letter that “when I’m working I don’t have time to feel joy; on the contrary, I suffer without interruption, and I feel that I’m going through all the pain and suffering of a woman giving birth.”
Toscanini was not striking a theatrical pose when he wrote these melodramatic-sounding words. The rare moments of ecstasy that he experienced on the podium were more than offset by his obsessive struggle to make the mere mortals who sang and played for him realize, as closely as possible, his vision of artistic perfection. That was why he berated them, why he ended his rehearsals drenched with sweat, why he flogged himself as unsparingly as he flogged his musicians. It was, he believed, what he had been born to do, and he was willing to move heaven and earth in order to do it.
To read of such terrifying dedication is awe-inspiring—yet it is also strangely demoralizing. To be sure, there are still artists who drive themselves as relentlessly as did Toscanini, and who pull great art out of themselves with the same iron determination. But his quasi-religious consecration to music inevitably feels alien to the light-minded spirit of our own age, dominated as it is by pop culture. It is hard to believe that NBC, the network of Jimmy Fallon and Superstore, maintained for 17 years a full-time symphony orchestra that had been organized in 1937 for the specific purpose of allowing Toscanini to give concerts under conditions that he found satisfactory. A poll taken by Fortune that year found that 40 percent of Americans could identify Toscanini as a conductor. By 1954, the year in which he gave up conducting the NBC Symphony (which was then disbanded), the number was surely much higher.
Will there ever again be a time when high art in general and classical music in particular mean as much to the American people as they did in Toscanini’s heyday? Very likely not. But at least there will be Harvey Sachs’s fine biography—and, far more important, Toscanini’s matchlessly vivid recordings—to remind us of what we once were, what we have lost, and what Arturo Toscanini himself aspired to be and to do.
1 Liveright, 923 pages. Many of Toscanini’s best commercial American recordings, made with the NBC Symphony, the New York Philharmonic, and the Philadelphia Orchestra, were reissued earlier this year in a budget-priced box set called Arturo Toscanini: The Essential Recordings (RCA Red Seal, 20 CD’s) whose contents were chosen by Sachs and Christopher Dyment, another noted Toscanini scholar. Most of the recordings that he made in the ’30s with the BBC Symphony are on Arturo Toscanini: The HMV Recordings (Warner Classics, six CD’s).
A blockbuster movie gets the spirit right and the details wrong
But enough about Brexit; what about Christopher Nolan’s new movie about Dunkirk?
Dunkirk is undoubtedly a blockbuster with a huge cast—Nolan has splendidly used thousands of extras rather than computer cartooning to depict the vast numbers of Allied troops trapped on the beaches—and a superb score by Hans Zimmer. Kenneth Branagh is a stiff-upper-lipped rear-admiral whose rather clunking lines are all too obviously designed to tell the audience what’s going on; One Direction pop star Harry Styles is a British Tommy, and Tom Hardy is a Spitfire pilot who somehow shoots down two Heinkels while gliding, having run out of fuel about halfway through the movie. Mark Rylance, meanwhile, plays the brave skipper of a small boat taking troops off the beaches in the manner of Walter Pidgeon in Mrs. Miniver.
Yet for all the clichéd characterization, almost total lack of dialogue, complete lack of historical context (not even a cameo role for Winston Churchill), a ludicrous subplot in which a company of British soldiers stuck on a sinking boat do not use their Bren guns to defend themselves, problems with continuity (sunny days turn immediately into misty ones as the movie jumps confusingly through time), and Germans breaking into central Dunkirk whereas in fact they were kept outside the perimeter throughout the evacuation, Dunkirk somehow works well.
It works for the same reason that the 1958 film of the same name directed by Leslie Norman and starring Richard Attenborough and John Mills did. The story of the nine-day evacuation of the British Expeditionary Force from Dunkirk in late May and early June 1940 is a tale of such extraordinary heroism, luck, and intimate proximity to utter disaster that it would carry any film, even a bad one, and Nolan’s is emphatically not a bad one. Although the dogfights take place at ridiculously low altitudes, they are thrilling, and the fact that one doesn’t see a single German soldier until the closing scene, and then only two of them in silhouette, somehow works, too. See the film on the biggest screen you can, which will emphasize the enormity of the challenge faced by the Allies in getting over 336,000 troops off the beaches for the loss of only 40,000 killed, wounded and captured.
There is a scene, when the armada of small boats arrives at the beaches, that will bring a lump to the throat of any patriotic Briton; similarly, three swooping Spitfires are given a wonderfully evocative moment. The microcosm of the evacuation that Nolan concentrates on works well, despite another silly subplot in which a British officer with PTSD (played by Cillian Murphy) kills a young boy on Rylance’s small boat. That all the British infantry privates, not just Harry Styles, look like they sing in boy-bands doesn’t affect the power of seeing them crouch en masse under German attack in their greatcoats and helmets on the foam-flecked beaches.
On the tenth of May in 1940, Adolf Hitler invaded France, Belgium, and Holland, unleashing on the British and French armies the Blitzkrieg—a new all-arms tactic of warfare that left his enemies reeling. He also sent tanks through the forests of the Ardennes, which were considered impassable, and by May 20, some panzer units had already reached the English Channel. With the British and French in full retreat, on May 24 the Führer halted his tanks’ headlong advance for various sound military reasons—he wanted to give his men some rest, did not want to over-extend the German army, needed to protect against counter-attack, and wanted his infantry to catch up. From May 26 to June 3, the Allies used this pause to throw up a perimeter around the French port of Dunkirk, from whose pleasure beaches more than a quarter of a million British and more than 80,000 French troops embarked to cross the Channel to safety in Britain.
Protected by the Royal Air Force, which lost 144 pilots in the skies over Dunkirk, and by the French air force (which plays no part in this movie), and transported by the Royal Navy (which doesn’t seem to be able to use its guns against the Luftwaffe in this film, but which luckily did in real life), British and French troops made it to Dover, albeit without any heavy equipment, which they had to destroy on the beach. An allusion is made to that when Tom Hardy destroys the Spitfire he has (I must say quite unbelievably) landed on a beach in order to prevent its falling into German hands.
In response to a call from the British government, more than 700 private vessels were requisitioned, including yachts, paddle steamers, ferries, fishing trawlers, packet steamers and lifeboats. Even today when boating down the Thames it is possible to see small pleasure vessels sometimes only fifteen feet long with the plaque “Dunkirk 1940” proudly displayed on the cabins. That 226 were sunk by the Luftwaffe, along with six destroyers of the 220 warships that took part, shows what it meant to rise to what was afterwards called “the Dunkirk Spirit.” It was a spirit of defiance of tyranny that one glimpses regularly in this film, even if Nolan does have to pay obeisance to the modern demands for stories of cowardice alongside heroism, and the supposedly redemptive cowardice-into-heroism stories that Hollywood did not find necessary when it made Mrs. Miniver in 1942.
Nolan’s Dunkirk implies that it was the small boats that brought back the majority of the troops, whereas in fact the 39 destroyers and one cruiser involved in Operation Dynamo brought back the huge majority while the little ships did the crucial job of ferrying troops from the beaches to the destroyers, six of which were sunk, though none by U-boats (which the film wrongly suggests were present).
Where Nolan’s film commits a libel on the British armed services is in its tin ear for the Anglo-French relations of the time. In the movie, a British beach-master prevents French infantrymen from boarding a naval vessel, saying “This is a British ship. You get your own ships.” The movie later alleges that no Frenchmen were allowed to be evacuated until all the Britons were safely back home. This was not what happened. The French were brought across the Channel in Royal Navy vessels and small boats when their units arrived on the beaches.
There was no discrimination whatsoever, and to suggest there was injects false nationalist tension into what was in truth a model of good inter-Allied cooperation. Only much later, when the Nazi-installed Vichy government in France needed to create an Anglophobic myth of betrayal at Dunkirk, did such lies emerge. It is a shame that Nolan is now propagating them—especially since this film might be the only contact that millions of people have with the Dunkirk story for years, perhaps even a generation. At a time when schools simply do not teach the histories of anything so patriotism-inducing as Dunkirk, it was incumbent on Nolan to get this right.
In a touching scene at the end, one of the Tommies is depicted reading from a newspaper Churchill’s famous “We shall fight on the beaches” speech of June 4, 1940, with its admonition: “We must be very careful not to assign to this deliverance the attributes of a victory. Wars are not won by evacuations.” Churchill made no attempt to minimize the scale of what he called a “colossal military disaster,” but he also spoke, rightly, of the fact that it had been a “miracle of deliverance.” That is all that matters in this story.
So despite my annoyance at how many little details are off here—for example, Tom Hardy firing 75 seconds’ worth of ammunition when he would really have only had 14.7, or choppy weather when the Channel was really like a mill pond—I must confess that such problems are only for military history pedants like me. What Nolan has gotten right is the superb spirit of the British people in overcoming hatred, resentment, and fury with calmness, courage, and good humor.
Which brings us back to Brexit.
The Swoon has several symptoms: extreme praise, a disinclination to absorb contrary facts, a weakness for adulation, and a willingness to project one’s own beliefs and dispositions onto an ill-suited target, regardless of evidence. The first thing to know about the Swoon, though, is that it is well rooted in reality. John McCain is perhaps the most interesting non-presidential figure in Washington politics since Daniel Patrick Moynihan. Any piece of journalism that aims to assess him objectively should be required to include, as a stipulation, a passage like this one from Robert Timberg’s masterful book about Vietnam, The Nightingale’s Song.
“Do you want to go home?”
“Now, McCain, it will be very bad for you.”
The [chief jailer] gleefully led the charge as the guards, at [another guard’s] command, drove fists and knees and boots into McCain. Amid laughter and muttered oaths, he was slammed from one guard to another, bounced from wall to wall, knocked down, kicked, dragged to his feet, knocked back down, punched again and again in the face. When the beating was over, he lay on the floor, bloody, arms and legs throbbing, ribs cracked, several teeth broken off at the gum line.
“Are you ready to confess your crimes?” asked [the guard].
The ropes came next . . .
This scene is, of course, from McCain’s five years in a North Vietnamese prisoner of war camp. It helps to know that before this gruesome episode began—there were many more to come—McCain’s arms had been broken and gone untreated. It helps, too, to know that the point of the torture was to force McCain to leave the prison and return home to his father, the highest-ranking naval officer in the Pacific. In other words, they hung him by his broken arms because he refused to let them let him go.
Every reporter who’s done his homework knows this about McCain, and most civilians who meet him know it, too. This is the predicate for the Swoon. It began to afflict liberal journalists of the Boomer generation during the warm-up to his first run for president, against Governor George W. Bush, in the late 1990s. The reporter would be brought onto McCain’s campaign bus and receive a mock-gruff welcome from the candidate. No nervous handlers would be in evidence, of the kind that invariably attend other candidates during interviews.
And then it happens: In casual, preliminary conversation, McCain makes an indiscreet comment about a Senate colleague. “Is that off the record?” the reporter asks, and McCain waves his hand: “It’s the truth, isn’t it?” In a minute or two, the candidate, a former fighter pilot, drops the F bomb. Then, on another subject, he makes an offhanded reference to being “in prison.” The reporter, who went through four deferments in the late 1960s smoking weed with half-naked co-eds at an Ivy League school, feels the hot, familiar surge of guilt. As the interview winds down, the reporter sees an unexpected and semi-obscure literary work—the collected short stories of William Maxwell, let’s say—that McCain keeps handy for casual reading.
By the time he’s shown off the bus—after McCain has complimented a forgotten column the reporter wrote two years ago—the man is a goner. If I saw it once in my years writing about McCain, I saw it a dozen times. (I saw it happen to me!) Soon the magazine feature appears, with a headline like “The Warrior,” or “A Question of Honor,” or even “John McCain Walks on Water.” Those are all real headlines from his first presidential campaign. This really got printed, too: “It is a perilous thing, this act of faith in a faithless time—perilous for McCain and perilous for the people who have come to him, who must realize the constant risk that, sometimes, God turns out to be just a thunderstorm, and the gold just stones agleam in the sun.”
Judging from inquiries I’ve made over the years, the only person who knows what that sentence means is the writer of it, an employee of Esquire magazine named Charles Pierce. No liberal journalist got the Swoon worse than Pierce, and no one was left with a bitterer hangover when it emerged that McCain was, in nearly every respect, a conventionally conservative, generally loyal Republican—with complications, of course. The early Swooners had mistaken those complications (support for campaign-finance reform, for example, and his willingness to strike back at evangelical bullies like Jerry Falwell Sr.) as the essence of McCain. When events proved this not to be so, culminating in his dreary turn as the 2008 Republican presidential nominee—when he committed the ultimate crime in liberal eyes, midwifing the national career of Sarah Palin—it was only Republicans who were left to swoon.
So matters rested until this July, when McCain released the news that he suffers from a particularly aggressive form of brain cancer. Many appropriate encomiums rolled in, some from the original Swooners. But another complication arose. Desperate to pass a “motion to proceed” so that a vote could be taken on a lame and toothless “repeal” of Obamacare, Senate Republicans could muster only a tie vote. McCain announced he would rise from his hospital bed and fly to Washington to break the tie and vote for the motion to proceed.
Even conservatives who had long remained resistant to the Swoon succumbed. Even Donald Trump tweet-hailed McCain as a returning hero. His old fans from the left, those with long memories, wrote, or tweeted, more in sorrow than in anger. Over at Esquire, poor Charles Pierce reaffirmed that God had turned out to be just a thunderstorm again. “The ugliest thing to witness on a very ugly day in the United States Senate,” he wrote, “was what John McCain did to what was left of his legacy as a national figure.” A longtime Swooner in the Atlantic: “Senator McCain gave us a clearer idea of who he is and what he stands for.” Answers: a hypocrite, and nothing!
The old fans weren’t mollified by a speech McCain made after his vote, in which he sounded notes they had once thrilled to—he praised bipartisanship and cooperation across the aisle. Several critics in the press dismissed the speech with the same accusation that his conservative enemies had always leveled at McCain when he committed something moderate. He was pandering…to them! “McCain so dearly wants the press to think better of him for [this] speech,” wrote the ex-fan in the Atlantic. But the former Swooners were having none of it. Swoon me once, shame on me. Swoon me twice . . .
Then the next day in the wee hours, McCain voted against the actual bill to repeal Obamacare. Democrats were elated, and Republicans were forced to halt in mid-Swoon. His reasons for voting as he did were sound enough, but reasons seldom enter in when people are in thrall to their image of McCain. The people who had once loved him so, and who had suffered so cruelly in disappointment, were once more in love. Let’s let Pierce have the last word: “The John McCain the country had been waiting for finally showed up early Friday morning.” He had done what they wanted him to do; why he had done it was immaterial.
The condescension is breathtaking. Sometimes I think McCain is the most misunderstood man in Washington. True enough, he’s hard to pin down. He’s a screen onto which the city’s ideologues and party hacks project their own hopes and forebodings. Now, as he wages another battle in a long and eventful life, what he deserves from us is something simpler—not a swoon but a salute, offered humbly, with much reverence, affection, and gratitude.