No hatred has as rich and as lethal a history as anti-Semitism—“the longest hatred,” as the historian Robert Wistrich has dubbed it. Over the millennia, anti-Semitism has infected a multitude of peoples, religions, and civilizations, in the process inflicting a host of terrors on its Jewish victims. But while there is no disputing the impressive reach of the phenomenon, there is surprisingly little agreement about its cause or causes.
Indeed, finding a single cause would seem too daunting a task—the incidence of anti-Semitism is too frequent, the time span too broad, the locales too numerous, the circumstances too varied. No doubt that is why some scholars have come to regard every outbreak as essentially unique, denying that a straight line can be drawn from the anti-Semitism of the ancient world to that of today. Whether it is the attack on the Jews of Alexandria in 38 C.E. or the ones that took place 200 years earlier in ancient Jerusalem, whether it is the Dreyfus affair in 1890’s France or Kristallnacht in late-1930’s Germany—each incident is seen as the outcome of a distinctive mix of political, social, economic, cultural, and religious forces that preclude the possibility of a deeper or recurring cause.
A less extreme version of this same approach identifies certain patterns of anti-Semitism, but only within individual and discrete “eras.” In particular, a distinction is drawn between the religiously based hatred of the Middle Ages and the racially based hatred of the modern era. Responsibility for the anti-Semitic waves that engulfed Europe from the age of Constantine to the dawn of the Enlightenment is laid largely at the door of the Church and its offshoots, while the convulsions that erupted over the course of the next three centuries are viewed as the byproduct of the rise of virulent nationalism.
Obviously, separating out incidents or eras has its advantages, enabling researchers to focus more intensively on specific circumstances and to examine individual outbreaks from start to finish. But what such analyses may gain in local explanatory power they sacrifice in comprehensiveness. Besides, if every incident or era of anti-Semitism is largely distinct from every other, how to explain the cumulative ferocity of the phenomenon?
As if in response to this question, some scholars have attempted to offer more sweeping, trans-historical explanations. Perhaps the two best known are the “scapegoat” theory, according to which tensions within society are regulated and released by blaming a weaker group, often the Jews, for whatever is troubling the majority, and the “demonization” theory, according to which Jews have been cast into the role of the “other” by the seemingly perennial need to reject those who are ethnically, religiously, or racially different.
Clearly, in this sociological approach, anti-Semitism emerges as a Jewish phenomenon in name only. Rather, it is but one variant in a family of hatreds that includes racism and xenophobia. Thus, the specifically anti-Jewish violence in Russia at the turn of the 20th century has as much in common with the ethnic cleansing in Bosnia at the turn of the 21st as it does with the massacres of Jews in the Ukraine in the mid-1600’s. Taken to its logical conclusion, this theory would redefine the Holocaust—at the hands of some scholars, it has redefined the Holocaust—as humanity’s most destructive act of racism rather than as the most murderous campaign ever directed against the Jews.
Reacting to such universalizing tendencies a half-century ago, Hannah Arendt cited a piece of dialogue from “a joke which was told after the first World War”:
An anti-Semite claimed that the Jews had caused the war; the reply was: Yes, the Jews and the bicyclists. Why the bicyclists? asks the one. Why the Jews? asks the other.
George Orwell offered a similar observation in 1944: “However true the scapegoat theory may be in general terms, it does not explain why the Jews rather than some other minority group are picked on, nor does it make clear what they are the scapegoat for.”
Three decades ago, as a young dissident in the Soviet Union, I compiled underground reports on anti-Semitism for foreign journalists and Western diplomats. At the time, I firmly believed that the cause of the “disease” was totalitarianism, and that democracy was the way to cure it. Once the Soviet regime came to be replaced by democratic rule, I figured, anti-Semitism was bound to wither away. In the struggle toward that goal, the free world, which in the aftermath of the Holocaust appeared to have inoculated itself against a recurrence of murderous anti-Jewish hatred, was our natural ally, the one political entity with both the means and the will to combat the great evil.
Today I know better. This year, following publication of a report by an Israeli government forum charged with addressing the issue of anti-Semitism, I invited to my office the ambassadors of the two countries that have outpaced all others in the frequency and intensity of anti-Jewish attacks within their borders. The emissaries were from France and Belgium—two mature democracies in the heart of Western Europe. It was in these ostensible bastions of enlightenment and tolerance that Jewish cemeteries were being desecrated, children assaulted, synagogues scorched.
To be sure, the anti-Semitism now pervasive in Western Europe is very different from the anti-Semitism I encountered a generation ago in the Soviet Union. In the latter, it was nurtured by systematic, government-imposed discrimination against Jews. In the former, it has largely been condemned and opposed by governments (though far less vigilantly than it should be). But this only makes anti-Semitism in the democracies more disturbing, shattering the illusion—which was hardly mine alone—that representative governance is an infallible antidote to active hatred of Jews.
Another shattered illusion is even more pertinent to our search. Shocked by the visceral anti-Semitism he witnessed at the Dreyfus trial in supposedly enlightened France, Theodor Herzl, the founder of modern Zionism, became convinced that the primary cause of anti-Semitism was the anomalous condition of the Jews: a people without a polity of its own. In his seminal work, The Jewish State (1896), published two years after the trial, Herzl envisioned the creation of such a Jewish polity and predicted that a mass emigration to it of European Jews would spell the end of anti-Semitism. Although his seemingly utopian political treatise would turn out to be one of the 20th century’s most prescient books, on this point history has not been kind to Herzl; no one would seriously argue today that anti-Semitism came to a halt with the founding of the state of Israel. To the contrary, this particular illusion has come full circle: while Herzl and most Zionists after him believed that the emergence of a Jewish state would end anti-Semitism, an increasing number of people today, including some Jews, are convinced that anti-Semitism will end only with the disappearance of the Jewish state.
I first encountered this idea quite a long time ago, in the Soviet Union. In the period before, during, and after the Six-Day war of June 1967—a time when I and many others were experiencing a heady reawakening of our Jewish identity—the Soviet press was filled with scathing attacks on Israel and Zionism, and a wave of official anti-Semitism was unleashed to accompany them. To quite a few Soviet Jews who had been trying their best to melt into Soviet life, Israel suddenly became a jarring reminder of their true status in the “workers’ paradise”: trapped in a world where they were free neither to live openly as Jews nor to escape the stigma of their Jewishness. To these Jews, Israel came to seem part of the problem, not (as it was for me and others) part of the solution. Expressing what was no doubt a shared sentiment, a distant relative of mine quipped: “If only Israel didn’t exist, everything would be all right.”
In the decades since, and especially over the last three years, the notion that Israel is one of the primary causes of anti-Semitism, if not the primary cause, has gained much wider currency. The world, we are told by friend and foe alike, increasingly hates Jews because it increasingly hates Israel. Surely this is what the Belgian ambassador had in mind when he informed me during his visit that anti-Semitism in his country would cease once Belgians no longer had to watch pictures on television of Israeli Jews oppressing Palestinian Arabs.
The rise in viciously anti-Semitic content disseminated through state-run Arab media is quite staggering, and has been thoroughly documented. Arab propagandists, journalists, and scholars now regularly employ the methods and the vocabulary used to demonize European Jews for centuries—calling Jews Christ-killers, charging them with poisoning non-Jews, fabricating blood libels, and the like. In a region where the Christian faith has few adherents, a lurid and time-worn Christian anti-Semitism boasts an enormous following.
To take only one example: this past February, the Egyptian government, formally at peace with Israel, saw fit to broadcast on its state-run television a 41-part series based on the infamous Czarist forgery about a global Jewish conspiracy to dominate humanity, the Protocols of the Elders of Zion. To ensure the highest ratings, the show was first aired, in prime time, just as millions of families were breaking their traditional Ramadan fast; Arab satellite television then rebroadcast the series to tens of millions more throughout the Middle East.
In Europe, the connection between Israel and anti-Semitism is equally conspicuous. For one thing, the timing and nature of the attacks on European Jews, whether physical or verbal, have all revolved around Israel, and the anti-Semitic wave itself, which began soon after the Palestinians launched their terrorist campaign against the Jewish state in September 2000, reached a peak (so far) when Israel initiated Operation Defensive Shield at the end of March 2002, a month in which 125 Israelis had been killed by terrorists.
Though most of the physical attacks in Europe were perpetrated by Muslims, most of the verbal and cultural assaults came from European elites. Thus, the Italian newspaper La Stampa published a cartoon of an infant Jesus lying at the foot of an Israeli tank, pleading, “Don’t tell me they want to kill me again.” The frequent comparisons of Ariel Sharon to Adolf Hitler, of Israelis to Nazis, and of Palestinians to the Jewish victims of the Holocaust were not the work of hooligans spray-painting graffiti on the wall of a synagogue but of university educators and sophisticated columnists. As the Nobel Prize-winning author José Saramago declared of Israel’s treatment of the Palestinians: “We can compare it with what happened at Auschwitz.”
The centrality of Israel to the revival of a more generalized anti-Semitism is also evident in the international arena. Almost a year after the current round of Palestinian violence began, and after hundreds of Israelis had already been killed in buses, discos, and pizzerias, a so-called “World Conference against Racism” was held under the auspices of the United Nations in Durban, South Africa. It turned into an anti-Semitic circus, with the Jewish state being accused of everything from racism and apartheid to crimes against humanity and genocide. In this theater of the absurd, the Jews themselves were turned into perpetrators of anti-Semitism, as Israel was denounced for its “Zionist practices against Semitism”—the Semitism, that is to say, of the Palestinian Arabs.
Naturally, then, in searching for the “root cause” of anti-Semitism, the Jewish state would appear to be the prime suspect. But Israel, it should be clear, is not guilty. The Jewish state is no more the cause of anti-Semitism today than the absence of a Jewish state was its cause a century ago.
To see why, we must first appreciate that the always specious line between anti-Zionism and anti-Semitism has now become completely blurred: Israel has effectively become the world’s Jew. From Middle Eastern mosques, the bloodcurdling cry is not “Death to the Israelis,” but “Death to the Jews.” In more civilized circles, a columnist for the London Observer proudly announces that he does not read published letters in support of Israel that are signed by Jews. (That the complaints commission for the British press found nothing amiss in this statement only goes to show how far things have changed since Orwell wrote of Britain in 1945 that “it is not at present possible, indeed, that anti-Semitism should become respectable.”) When discussion at fashionable European dinner parties turns to the Middle East, the air, we have been reliably informed, turns blue with old-fashioned anti-Semitism.
No less revealing is what might be called the mechanics of the discussion. For centuries, a clear sign of the anti-Semitic impulse at work has been the use of the double standard: social behavior that in others passes without comment or with the mildest questioning becomes, when exhibited by Jews, a pretext for wholesale group denunciation. Such double standards are applied just as recklessly today to the Jewish state. It is democratic Israel, not any of the dozens of tyrannies represented in the United Nations General Assembly, that that body singles out for condemnation in over two dozen resolutions each year; it is against Israel—not Cuba, North Korea, China, or Iran—that the UN human-rights commission, chaired recently by a lily-pure Libya, directs nearly a third of its official ire; it is Israel whose alleged misbehavior provoked the only joint session ever held by the signatories to the Geneva Convention; it is Israel, alone among nations, that has lately been targeted by Western campaigns of divestment; it is Israel’s Magen David Adom, alone among ambulance services in the world, that is denied membership in the International Red Cross; it is Israeli scholars, alone among academics in the world, who are denied grants and prevented from publishing articles in prestigious journals. The list goes on and on.
The idea that Israel has become the world’s Jew and that anti-Zionism is a substitute for anti-Semitism is certainly not new. Years ago, Norman Podhoretz observed that the Jewish state “has become the touchstone of attitudes toward the Jewish people, and anti-Zionism has become the most relevant form of anti-Semitism.” And well before that, Dr. Martin Luther King, Jr. was even more unequivocal:
You declare, my friend, that you do not hate the Jews, you are merely “anti-Zionist.” And I say, let the truth ring forth from the high mountain tops, let it echo through the valleys of God’s green earth; when people criticize Zionism, they mean Jews—this is God’s own truth.
But if Israel is indeed nothing more than the world’s Jew, then to say that the world increasingly hates Jews because the world increasingly hates Israel means as much, or as little, as saying that the world hates Jews because the world hates Jews. We still need to know: why?
Here is the reasoning invoked by Haman, the infamous viceroy of Persia in the biblical book of Esther, to convince his king to order the annihilation of the Jews:
There is a certain people scattered and dispersed among the people in all the provinces of your kingdom, and their laws are different from those of other peoples, and the king’s laws they do not keep, so that it is of no benefit for the king to tolerate them. If it please the king, let it be written that they be destroyed. [emphasis added]
This is hardly the only ancient source pointing to the Jews’ incorrigible separateness, or their rejection of the majority’s customs and moral concepts, as the reason for hostility toward them. Centuries after Hellenistic values had spread throughout and beyond the Mediterranean, the Roman historian Tacitus had this to say:
Among the Jews, all things are profane that we hold sacred; on the other hand, they regard as permissible what seems to us immoral. . . . The rest of the world they confront with the hatred reserved for enemies. They will not feed or intermarry with gentiles. . . . They have introduced circumcision to show that they are different from others. . . . It is a crime among them to kill any newly born infant.
Philostratus, a Greek writer who lived a century later, offered a similar analysis:
For the Jews have long been in revolt not only against the Romans, but against humanity; and a race that has made its own life apart and irreconcilable, that cannot share with the rest of mankind in the pleasures of the table, nor join in their libations or prayers or sacrifices, are separated from ourselves by a greater gulf than divides us from Sura or Bactra of the more distant Indies.
Did the Jews actually reject the values that were dominant in the ancient world, or was this simply a fantasy of their enemies? While many of the allegations leveled at Jews were spurious—they did not ritually slaughter non-Jews, as the Greek writer Apion claimed—some were plainly grounded in fact. The Jews did oppose intermarriage. They did refuse to sacrifice to foreign gods. And they did emphatically consider killing a newborn infant to be a crime.
Some, perhaps many, individual Jews in those days opted to join the (alluring) Hellenist stream; most did not. Even more important, the Jews were the only people seriously to challenge the moral system of the Greeks. They were not an “other” in the ancient world; they were the “other”—an other, moreover, steadfast in the conviction that Judaism represented not only a different way of life but, in a word, the truth. Jewish tradition claims that Abraham was chosen as the patriarch of what was to become the Jewish nation only after he had smashed the idols in his father’s home. His descendants would continue to defy the pagan world around them, championing the idea of the one God and, unlike other peoples of antiquity, refusing to subordinate their beliefs to those of their conquerors.
The (by and large correct) perception of the Jews as rejecting the prevailing value system of the ancient world hardly justifies the anti-Semitism directed against them; but it does take anti-Semitism out of the realm of fantasy, turning it into a genuine clash of ideals and of values. With the arrival of Christianity on the world stage, that same clash, based once again on the charge of Jewish rejectionism, would intensify a thousandfold. The refusal of the people of the “old covenant” to accept the new came to be defined as a threat to the very legitimacy of Christianity, and one that required a mobilized response.
Branding the Jews “Christ killers” and “sons of devils,” the Church launched a systematic campaign to denigrate Christianity’s parent religion and its adherents. Accusations of desecrating the host, ritual murder, and poisoning wells would be added over the centuries, creating an ever larger powder keg of hatred. With the growing power of the Church and the global spread of Christianity, these potentially explosive sentiments were carried to the far corners of the world, bringing anti-Semitism to places where no Jewish foot had ever trod.
According to some Christian thinkers, persecution of the powerless Jews was justified as a kind of divine payback for the Jewish rejection of Jesus. This heavenly stamp of approval would be invoked many times through the centuries, especially by those who had tried and failed to convince the Jews to acknowledge the superior truth of Christianity. The most famous case may be that of Martin Luther: at first extremely friendly toward Jews—as a young man he had complained about their mistreatment by the Church—Luther turned into one of their bitterest enemies as soon as he realized that his efforts to woo them to his new form of Christianity would never bear fruit.
Nor was this pattern unique to the Christian religion. Muhammad, too, had hoped to attract the Jewish communities of Arabia, and to this end he initially incorporated elements of Judaism into his new faith (directing prayer toward Jerusalem, fasting on Yom Kippur, and the like). When, however, the Jews refused to accept his code of law, Muhammad wheeled upon them with a vengeance, cursing them in words strikingly reminiscent of the early Church fathers: “Humiliation and wretchedness were stamped upon them, and they were visited with the wrath of Allah. That was because they disbelieved in Allah’s revelation and slew the prophets wrongfully.”
In these cases, too, we might ask whether the perception of Jewish rejectionism was accurate. Of course the Jews did not drain the blood of children, poison wells, attempt to mutilate the body of Christ, or commit any of the other wild crimes of which the Church accused them. Moreover, since many teachings of Christianity and Islam stemmed directly from Jewish ones, Jews could hardly be said to have denied them. But if rejecting the Christian or Islamic world meant rejecting the Christian or Islamic creed, then Jews who clung to their own separate faith and way of life were, certainly, rejectionist.
This brings us to an apparent point of difference between pre-modern and modern anti-Semitism. For many Jews over the course of two millennia, there was, in theory at least, a way out of institutionalized discrimination and persecution: the Greco-Roman, Christian, and Muslim worlds were only too happy to embrace converts to their way of life. In the modern era, this choice often proved illusory. Both assimilated and non-assimilated Jews, both religious and secular Jews, were equally victimized by pogroms, persecutions, and genocide. In fact, the terrors directed at the assimilated Jews of Western Europe have led some to conclude that far from ending anti-Semitism, assimilation actually contributed to arousing it.
What accounts for this? In the pre-modern world, Jews and Gentiles were largely in agreement as to what defined Jewish rejectionism, and therefore what would constitute a reprieve from it: it was mostly a matter of beliefs and moral concepts, and of the social behavior that flowed from them. In the modern world, although the question of whether a Jew ate the food or worshiped the God of his neighbors remained relevant, it was less relevant than before. Instead, the modern Jew was seen as being born into a Jewish nation or race whose collective values were deeply embedded in the very fabric of his being. Assimilation, with or without conversion to the majority faith, might succeed in masking this bedrock taint; it could not expunge it.
While such views were not entirely absent in earlier periods, the burden of proof faced by the modern Jew to convince others that he could transcend his “Jewishness” was much greater than the one faced by his forebears. Despite the increasing secularism and openness of European society, which should have smoothed the prospects of assimilation, many modern Jews would find it more difficult to become real Frenchmen or true Germans than their ancestors would have found it to become Greeks or Romans, Christians or Muslims.
The novelty of modern anti-Semitism is thus not that the Jews were seen as the enemies of mankind. Indeed, Hitler’s observation in Mein Kampf that “wherever I went, I began to see Jews, and the more I saw, the more sharply they became distinguished in my eyes from the rest of humanity” sounds no different from the one penned by Philostratus 1,700 years earlier. No, the novelty of modern anti-Semitism is only that it was far more difficult—and sometimes impossible—for the Jew to stop being an enemy of mankind.
Was there any kernel of factual truth to this charge? It is demeaning to have to point out that, wherever and whenever they were given the chance, most modern Jews strove to become model citizens and showed, if anything, an exemplary talent for acculturation; the idea that by virtue of their birth, race, or religion they were implacable enemies of the state or nation was preposterous. So, too, with other modern libels directed against the Jews, which displayed about as much or as little truth content as ancient ones. The Jews did not and do not control the banks. They did not and do not control the media of communication. They did not and do not control governments. And they are not plotting to take over anything.
What some of them have indeed done, in various places and under specific circumstances, is to demonstrate—with an ardor and tenacity redolent perhaps of their long national experience—an attachment to great causes of one stripe or another, including, at times, the cause of their own people. This has had the effect (not everywhere, of course, but notably in highly stratified and/or intolerant societies) of putting them in a visibly adversary position to prevailing values or ideologies, and thereby awakening the never dormant dragon of anti-Semitism. Particularly instructive in this regard is the case of Soviet Jewry.
What makes the Soviet case instructive is, in no small measure, the fact that the professed purpose of Communism was to abolish all nations, peoples, and religions—those great engines of exclusion—on the road to the creation of a new world and a new man. As is well known, quite a few Jews, hoping to emancipate humanity and to “normalize” their own condition in the process, hitched their fates to this ideology and to the movements associated with it. After the Bolshevik revolution, these Jews proved to be among the most devoted servants of the Soviet regime.
Once again, however, the perception of ineradicable Jewish otherness proved as lethal as any reality. In the eyes of Stalin and his henchmen, the Jews, starting with the loyal Communists among them, were always suspect—“ideological immigrants,” in the telling phrase. But the animosity went beyond Jewish Communists. The Soviet regime declared war on the over 100 nationalities and religions under its boot; whole peoples were deported, entire classes destroyed, millions starved to death, and tens of millions killed. Everybody suffered, not only Jews. But, decades later, long after Stalin’s repression had given way to Khrushchev’s “thaw,” only one national language, Hebrew, was still banned in the Soviet Union; only one group, the Jews, was not permitted to establish schools for its children; only in the case of one group, the Jews, did the term “fifth line,” referring to the space reserved for nationality on a Soviet citizen’s identification papers, become a code for licensed discrimination.
Clearly, then, Jews were suspect in the Soviet Union as was no other group. Try as they might to conform, it turned out that joining the mainstream of humanity through the medium of the great socialist cause in the East was no easier than joining the nation-state in the West. But that is not the whole story, either. To scant the rest of it is not only to do an injustice to Soviet Jews as historical actors in their own right but to miss something essential about anti-Semitism, which, even as it operates in accordance with its own twisted definitions and its own mad logic, proceeds almost always by reference to some genuine quality in its chosen victims.
As it happens, although Jews were disproportionately represented in the ranks of the early Bolsheviks, the majority of Russian Jews were far from being Bolsheviks, or even Bolshevik sympathizers. More importantly, Jews would also, in time, come to play a disproportionate role in Communism’s demise. In the middle of the 1960’s, by which time their overall share of the country’s population had dwindled dramatically, Soviet Jews made up a significant element in the “democratic opposition.” A visitor to the Gulag in those years would have discovered that Jews were also prominent among political dissidents and those convicted of so-called “economic crimes.” Even more revealing, in the 1970’s the Jews were the first to challenge the Soviet regime as a national group, and to do so publicly, en masse, with tens of thousands openly demanding to leave the totalitarian state.
To that degree, then, the claim of Soviet anti-Semites that “Jewish thoughts” and “Jewish values” were in opposition to prevailing norms was not entirely unfounded. And, to that degree, Soviet anti-Semitism partook of the essential characteristic of all anti-Semitism. This hardly makes its expression any the less monstrous; it merely, once again, takes it out of the realm of fantasy.
And so we arrive back at today, and at the hatred that takes as its focus the state of Israel. That state—the world’s Jew—has the distinction of challenging two separate political/moral orders simultaneously: the order of the Arab and Muslim Middle East, and the order that prevails in Western Europe. The Middle Eastern case is the easier to grasp; the Western European one may be the more ominous.
The values ascendant in today’s Middle East are shaped by two forces: Islamic fundamentalism and state authoritarianism. In the eyes of the former, any non-Muslim sovereign power in the region—for that matter, any secular Muslim power—is anathema. Particularly galling is Jewish sovereignty in an area delineated as dar al-Islam, the realm where Islam is destined to enjoy exclusive dominance. Such a violation cannot be compromised with; nothing will suffice but its extirpation.
In the eyes of the secular Arab regimes, the Jews of Israel are similarly an affront, but not so much on theological grounds as on account of the society they have built: free, productive, democratic, a living rebuke to the corrupt, autocratic regimes surrounding it. In short, the Jewish state is the ultimate freedom fighter—an embodiment of the subversive liberties that threaten Islamic civilization and autocratic Arab rule alike. It is for this reason that, in the state-controlled Arab media as in the mosques, Jews have been turned into a symbol of all that is menacing in the democratic, materialist West as a whole, and are confidently reputed to be the insidious force manipulating the United States into a confrontation with Islam.
The particular dynamic of anti-Semitism in the Middle East orbit today may help explain why—unlike, as we shall see, in Europe—there was no drop in the level of anti-Jewish incitement in the region after the inception of the Oslo peace process. Quite the contrary. And the reason is plain: had Oslo succeeded in bringing about a real reconciliation with Israel or in facilitating the spread of political freedom, it would have frustrated the overarching aim of eradicating the Jewish “evil” from the heart of the Middle East and/or preserving the autocratic power of the Arab regimes.
And so, while in the 1990’s the democratic world, including the democratic society of Israel, was (deludedly, as it turned out) celebrating the promise of a new dawn in the Middle East, the schools in Gaza, the textbooks in Ramallah, the newspapers in Egypt, and the television channels in Saudi Arabia were projecting a truer picture of the state of feeling in the Arab world. It should come as no surprise that, in Egypt, pirated copies of Shimon Peres’s A New Middle East, a book heralding a messianic era of free markets and free ideas, were printed with an introduction in Arabic claiming that what this bible of Middle East peacemaking proved was the veracity of everything written in the Protocols of the Elders of Zion about a Jewish plot to rule the world.
As for Western Europe, there the reputation of Israel and of the Jews has undergone a number of ups and downs over the decades. Before 1967, the shadow of the Holocaust and the perception of Israel as a small state struggling for its existence in the face of Arab aggression combined to ensure, if not the favor of the European political classes, at least a certain dispensation from harsh criticism. But all this changed in June 1967, when the truncated Jewish state achieved a seemingly miraculous victory against its massed Arab enemies in the Six-Day war, and the erstwhile victim was overnight transformed into an aggressor. A possibly apocryphal story about Jean-Paul Sartre encapsulates the shift in the European mood. Before the war, as Israel lay diplomatically isolated and Arab leaders were already trumpeting its certain demise, the famous French philosopher signed a statement in support of the Jewish state. After the war, he reproached the man who had solicited his signature: “But you assured me they would lose.”
Decades before “occupation” became a household word, the mood in European chancelleries and on the Left turned decidedly hostile. There were, to be sure, venal interests at stake, from the perceived need to curry favor with the oil-producing nations of the Arab world to, in later years, the perceived need to pander to the growing Muslim populations in Western Europe itself. But other currents were also at work, as anti-Western, anti-“imperialist,” pacifist, and pro-liberationist sentiments, fanned and often subsidized by the USSR, took over the advanced political culture both of Europe and of international diplomacy. Behind the new hostility to Israel lay the new ideological orthodoxy, according to whose categories the Jewish state had emerged on the world scene as a certified “colonial” and “imperialist” power, a “hegemon,” and an “oppressor.”
Before 1967, anti-Zionist resolutions sponsored by the Arabs and their Soviet patrons in the United Nations garnered little or no support among the democracies. After 1967, more and more Western countries joined the chorus of castigation. By 1974, Yasir Arafat, whose organization openly embraced both terrorism and the destruction of a UN member state, was invited to address the General Assembly. The next year, that same body passed the infamous “Zionism-is-racism” resolution. In 1981, Israel’s strike against Iraq’s nuclear reactor was condemned by the entire world, including the United States.
Then, in the 1990’s, things began to change again. Despite the constant flow of biased UN resolutions, despite the continuing double standard, there were a number of positive developments as well: the Zionism-is-racism resolution was repealed, and over 65 member states either established or renewed diplomatic relations with Israel.
What had happened? Had Arab oil dried up? Had Muslims suddenly become a less potent political force on the European continent? Hardly. What changed was that, at Madrid and then at Oslo, Israel had agreed, first reluctantly and later with self-induced optimism, to conform to the ascendant ethos of international politics. Extending its hand to a terrorist organization still committed to its destruction, Israel agreed to the establishment of a dictatorial and repressive regime on its very doorstep, sustaining its commitment to the so-called peace process no matter how many innocent Jews were killed and wounded in its fraudulent name.
The rewards for thus conforming to the template of the world’s moralizers, cosmetic and temporary though they proved to be, flowed predictably not just to Israel but to the Jewish people as a whole. Sure enough, worldwide indices of anti-Semitism in the 1990’s dropped to their lowest point since the Holocaust. As the world’s Jews benefited from the increasing tolerance extended to the world’s Jew, Western organizations devoted to fighting the anti-Semitic scourge began cautiously to declare victory and to refocus their efforts on other parts of the Jewish communal agenda.
But of course it would not last. In the summer of 2000, at Camp David, Ehud Barak offered the Palestinians nearly everything their leadership was thought to be demanding. The offer was summarily rejected, Arafat started his “uprising,” Israel undertook to defend itself—and Europe ceased to applaud. For many Jews at the time, this seemed utterly incomprehensible: had not Israel taken every last step for peace? But it was all too comprehensible. Europe was staying true to form; it was the world’s Jew, by refusing to accept its share of blame for the “cycle of violence,” that was out of line. And so were the world’s Jews, who by definition, and whether they supported Israel or not, came rapidly to be associated with the Jewish state in its effrontery.
To Americans, the process I have been describing may sound eerily familiar. It should: Americans, too, have had numerous opportunities to see their nation in the dock of world opinion over recent years for the crime of rejecting the values of the so-called international community, and never more so than during the widespread hysteria that greeted President Bush’s announced plan to dismantle the tyrannical regime of Saddam Hussein. In dozens of countries, protesters streamed into the streets to voice their fury at this refusal of the United States to conform to what “everybody” knew to be required of it. To judge from the placards on display at these rallies, President Bush, the leader of the free world, was a worse enemy of mankind than the butcher of Baghdad.
At first glance, this too must have seemed incomprehensible. Saddam Hussein was one of the world’s most brutal dictators, a man who had gassed his own citizens, invaded his neighbors, defied Security Council resolutions, and was widely believed to possess weapons of mass destruction. But no matter: the protests were less about Iraqi virtue than about American vice, and the grievances aired by the assorted anti-capitalists, anti-globalists, radical environmentalists, self-styled anti-imperialists, and many others who assembled to decry the war had little to do with the possible drawbacks of a military operation in Iraq. They had to do, rather, with a genuine clash of values.
Insofar as the clash is between the United States and Europe—there is a large “European” body of opinion within the United States as well—it has been well diagnosed by Robert Kagan in his best-selling book, Of Paradise and Power. For our purposes, it is sufficient to remark on how quickly the initial “why-do-they-hate-us” debate in the wake of September 11, focusing on anti-American sentiment in the Muslim world, came to be overtaken by a “why-do-they-hate-us” debate centered on anti-American sentiment in “Old Europe.” Generally, the two hatreds have been seen to emanate from divergent impulses, in the one case a perception of the threat posed by Western freedoms to Islamic civilization, in the other a perception of the threat posed by a self-confident and powerful America to the postmodern European idea of a world regulated not by force but by reason, compromise, and nonjudgmentalism. In today’s Europe—professedly pacifist, postnationalist, anti-hegemonic—an expression like “axis of evil” wins few friends, and the idea of actually confronting the axis of evil still fewer.
Despite the differences between them, however, anti-Americanism in the Islamic world and anti-Americanism in Europe are in fact linked, and both bear an uncanny resemblance to anti-Semitism. It is, after all, with some reason that the United States is loathed and feared by the despots and fundamentalists of the Islamic world as well as by many Europeans. Like Israel, but in a much more powerful way, America embodies a different—a non-conforming—idea of the good, and refuses to abandon its moral clarity about the objective worth of that idea or of the free habits and institutions to which it has given birth. To the contrary, in undertaking their war against the evil of terrorism, the American people have demonstrated their determination not only to fight to preserve the blessings of liberty for themselves and their posterity, but to carry them to regions of the world that have proved most resistant to their benign influence.
In this positive sense as well, Israel and the Jewish people share something essential with the United States. The Jews, after all, have long held that they were chosen to play a special role in history, to be what their prophets called “a light unto the nations.” What precisely is meant by that phrase has always been a matter of debate, and I would be the last to deny the mischief that has sometimes been done, including to the best interests of the Jews, by some who have raised it as their banner. Nevertheless, over four millennia, the universal vision and moral precepts of the Jews have not only worked to secure the survival of the Jewish people themselves but have constituted a powerful force for good in the world, inspiring myriads to fight for the right even as in others they have aroused rivalry, enmity, and unappeasable resentment.
It is similar with the United States—a nation that has long regarded itself as entrusted with a mission to be what John Winthrop in the 17th century called a “city on a hill” and Ronald Reagan in the 20th parsed as a “shining city on a hill.” What precisely is meant by that phrase is likewise a matter of debate, but Americans who see their country in such terms certainly regard the advance of American values as central to American purpose. And, though the United States is still a very young nation, there can be no disputing that those values have likewise constituted an immense force for good in the world—even as they have earned America the enmity and resentment of many.
In resolving to face down enmity and hatred, an important source of strength is the lesson to be gained from contemplating the example of others. From Socrates to Churchill to Sakharov, there have been individuals whose voices and whose personal heroism have reinforced in others the resolve to stand firm for the good. But history has also been generous enough to offer, in the Jews, the example of an ancient people fired by the message of human freedom under God and, in the Americans, the example of a modern people who over the past century alone, acting in fidelity with their inmost beliefs, have confronted and defeated the greatest tyrannies ever known to man.
Fortunately for America, and fortunately for the world, the United States has been blessed by providence with the power to match its ideals. The Jewish state, by contrast, is a tiny island in an exceedingly dangerous sea, and its citizens will need every particle of strength they can muster for the trials ahead. It is their own people’s astounding perseverance, despite centuries of suffering at the hands of faiths, ideologies, peoples, and individuals who have hated them and set out to do them in, that inspires one with confidence that the Jews will once again outlast their enemies.
Terror is a choice.
Ari Fuld described himself on Twitter as a marketer and social media consultant “when not defending Israel by exposing the lies and strengthening the truth.” On Sunday, a Palestinian terrorist stabbed Fuld at a shopping mall in Gush Etzion, a settlement south of Jerusalem. The Queens-born father of four died from his wounds, but not before he chased down his assailant and neutralized the threat to other civilians. Fuld thus gave the full measure of devotion to the Jewish people he loved. He was 45.
The episode is a grim reminder of the wisdom and essential justice of the Trump administration’s tough stance on the Palestinians.
Start with the Taylor Force Act. The act, named for another U.S. citizen felled by Palestinian terror, stanched the flow of American taxpayer funds to the Palestinian Authority’s civilian programs. Though it is small consolation to Fuld’s family, Americans can breathe a sigh of relief that they are no longer underwriting the PA slush fund used to pay stipends to the family members of dead, imprisoned, or injured terrorists, like the one who murdered Ari Fuld.
No principle of justice or sound statesmanship requires Washington to spend $200 million—the amount of PA aid funding slashed by the Trump administration last month—on an agency that financially induces the Palestinian people to commit acts of terror. The PA’s terrorism-incentive budget—“pay-to-slay,” as Douglas Feith called it—ranges from $50 million to $350 million annually. Footing even a fraction of that bill is tantamount to the American government subsidizing terrorism against its citizens.
If we don’t pay the Palestinians, the main line of reasoning runs, frustration will lead them to commit still more and bloodier acts of terror. But U.S. assistance to the PA dates to the PA’s founding in the Oslo Accords, and Palestinian terrorists have shed American and Israeli blood through all the years since then. What does it say about Palestinian leaders that they would unleash more terror unless we cross their palms with silver?
President Trump likewise deserves praise for booting Palestinian diplomats from U.S. soil. This past weekend, the State Department revoked a visa for Husam Zomlot, the highest-ranking Palestinian official in Washington. The State Department cited the Palestinians’ years-long refusal to sit down for peace talks with Israel. The better reason for expelling them is that the label “envoy” sits uneasily next to the names of Palestinian officials, given the links between the Palestine Liberation Organization, President Mahmoud Abbas’s Fatah faction, and various armed terrorist groups.
Fatah, for example, praised the Fuld murder. As the Jerusalem Post reported, the “al-Aqsa Martyrs Brigades, the military wing of Fatah . . . welcomed the attack, stressing the necessity of resistance ‘against settlements, Judaization of the land, and occupation crimes.’” It is up to Palestinian leaders to decide whether they want to be terrorists or statesmen. Pretending that they can be both at once was the height of Western folly, as Ari Fuld no doubt recognized.
May his memory be a blessing.
The end of the water's edge.
It was the blatant subversion of the president’s sole authority to conduct American foreign policy, and the political class received it with fury. It was called “mutinous,” and the conspirators were deemed “traitors” to the Republic. Those who thought “sedition” went too far were still incensed over the breach of protocol and the reckless way in which the president’s mandate was undermined. Yes, times have certainly changed since 2015, when a series of Republican senators signed a letter warning Iran’s theocratic government that the Joint Comprehensive Plan of Action (aka, the Iran nuclear deal) was built on a foundation of sand.
The outrage that was heaped upon Senate Republicans for freelancing on foreign policy in the final years of Barack Obama’s administration has not been visited upon former Secretary of State John Kerry, though he arguably deserves it. In the publicity tour for his recently published memoir, Kerry confessed to conducting meetings with Iranian Foreign Minister Javad Zarif “three or four times” as a private citizen. When asked by Fox News Channel’s Dana Perino if Kerry had advised his Iranian interlocutor to “wait out” the Trump administration to get a better set of terms from the president’s successor, Kerry did not deny the charge. “I think everybody in the world is sitting around talking about waiting out President Trump,” he said.
Think about that. This is a former secretary of state who all but confirmed that he is actively conducting what the Boston Globe described in May as “shadow diplomacy” designed to preserve not just the Iran deal but all the associated economic relief and security guarantees it provided Tehran. The abrogation of that deal has put new pressure on the Iranians to liberalize domestically, withdraw their support for terrorism, and abandon their provocative weapons development programs—pressures that the deal’s proponents once supported.
“We’ve got Iran on the ropes now,” said former Democratic Sen. Joe Lieberman, “and a meeting between John Kerry and the Iranian foreign minister really sends a message to them that somebody in America who’s important may be trying to revive them and let them wait and be stronger against what the administration is trying to do.” This is absolutely correct because the threat Iran poses to American national security and geopolitical stability is not limited to its nuclear program. The Iranian threat will not be neutralized until it abandons its support for terror and the repression of its people, and that will not end until the Iranian regime is no more.
While Kerry’s decision to hold a variety of meetings with a representative of a nation hostile to U.S. interests is surely careless and unhelpful, it is not uncommon. During his 1984 campaign for the presidency, Jesse Jackson visited the Soviet Union and Cuba to raise his own public profile and lend credence to Democratic claims that Ronald Reagan’s confrontational foreign policy was unproductive. House Speaker Jim Wright’s trip to Nicaragua to meet with the Sandinista government was a direct repudiation of the Reagan administration’s support for the country’s anti-Communist rebels. In 2007, as Bashar al-Assad’s government was providing material support for the insurgency in Iraq, House Speaker Nancy Pelosi sojourned to Damascus to shower the genocidal dictator with good publicity. “The road to Damascus is a road to peace,” Pelosi insisted. “Unfortunately,” replied George W. Bush’s national security council spokesman, “that road is lined with the victims of Hamas and Hezbollah, the victims of terrorists who cross from Syria into Iraq.”
Honest observers must reluctantly conclude that the adage is wrong. American politics does not, in fact, stop at the water’s edge. It never has, and maybe it shouldn’t. Though it may be commonplace, American political actors who contradict the president in the conduct of their own foreign policy should be judged on the policies they are advocating. In the case of Iran, those who seek to convince the mullahs and their representatives that repressive theocracy and a terroristic foreign policy are dead-ends are advancing the interests not just of the United States but of all mankind. Those who provide this hopelessly backward autocracy with the hope that America’s resolve is fleeting are, as John Kerry might say, on “the wrong side of history.”
Michael Wolff is its Marquis de Sade. Released on January 5, 2018, Wolff’s Fire and Fury became a template for authors eager to satiate the growing demand for unverified stories of Trump at his worst. Wolff filled his pages with tales of the president’s ignorant rants, his raging emotions, his television addiction, his fast-food diet, his unfamiliarity with and contempt for Beltway conventions and manners. Wolff made shocking insinuations about Trump’s mental state, not to mention his relationship with UN ambassador Nikki Haley. Wolff’s Trump is nothing more than a knave, dunce, and commedia dell’arte villain. The hero of his saga is, bizarrely, Steve Bannon, who in Wolff’s telling recognized Trump’s inadequacies, manipulated him to advance a nationalist-populist agenda, and tried to block his worst impulses.
Wolff’s sources are anonymous. That did not stop the press from calling his accusations “mind-blowing” (Mashable.com), “wild” (Variety), and “bizarre” (Entertainment Weekly). Unlike most pornographers, he had a lesson in mind. He wanted to demonstrate Trump’s unfitness for office. “The story that I’ve told seems to present this presidency in such a way that it says that he can’t do this job, the emperor has no clothes,” Wolff told the BBC. “And suddenly everywhere people are going, ‘Oh, my God, it’s true—he has no clothes.’ That’s the background to the perception and the understanding that will finally end this, that will end this presidency.”
Nothing excites the Resistance more than the prospect of Trump leaving office before the end of his term. Hence the most stirring examples of Resistance Porn take the president’s all-too-real weaknesses and eccentricities and imbue them with apocalyptic significance. In what would become the standard response to accusations of Trumpian perfidy, reviewers of Fire and Fury were less interested in the truth of Wolff’s assertions than in the fact that his argument confirmed their preexisting biases.
Saying he agreed with President Trump that the book is “fiction,” the Guardian’s critic didn’t “doubt its overall veracity.” It was, he said, “what Mailer and Capote once called a nonfiction novel.” Writing in the Atlantic, Adam Kirsch asked: “No wonder, then, Wolff has written a self-conscious, untrustworthy, postmodern White House book. How else, he might argue, can you write about a group as self-conscious, untrustworthy, and postmodern as this crew?” Complaining in the New Yorker, Masha Gessen said Wolff broke no new ground: “Everybody” knew that the “president of the United States is a deranged liar who surrounded himself with sycophants. He is also functionally illiterate and intellectually unsound.” Remind me never to get on Gessen’s bad side.
What Fire and Fury lacked in journalistic ethics, it made up in receipts. By the third week of its release, Wolff’s book had sold more than 1.7 million copies. His talent for spinning second- and third-hand accounts of the president’s oddity and depravity into bestselling prose was unmistakable. Imitators were sure to follow, especially after Wolff estranged himself from the mainstream media by defending his innuendos about Haley.
It was during the first week of September that Resistance Porn became a competitive industry. On the afternoon of September 4, the first tidbits from Bob Woodward’s Fear appeared in the Washington Post, along with a recording of an 11-minute phone call between Trump and the white knight of Watergate. The opposition began panting soon after. Woodward, who like Wolff relies on anonymous sources, “paints a harrowing portrait” of the Trump White House, reported the Post.
No one looks good in Woodward’s telling other than former economics adviser Gary Cohn and—again bizarrely—the former White House staff secretary who was forced to resign after his two ex-wives accused him of domestic violence. The depiction of chaos, backstabbing, and mutual contempt between the president and high-level advisers who don’t much care for either his agenda or his personality was not so different from Wolff’s. What gave it added heft was Woodward’s status, his inviolable reputation.
“Nothing in Bob Woodward’s sober and grainy new book…is especially surprising,” wrote Dwight Garner at the New York Times. That was the point. The audience for Wolff and Woodward does not want to be surprised. Fear is not a book that will change minds. Nor is it intended to be. “Bob Woodward’s peek behind the Trump curtain is 100 percent as terrifying as we feared,” read a CNN headline. “President Trump is unfit for office. Bob Woodward’s ‘Fear’ confirms it,” read an op-ed headline in the Post. “There’s Always a New Low for the Trump White House,” said the Atlantic. “Amazingly,” wrote Susan Glasser in the New Yorker, “it is no longer big news when the occupant of the Oval Office is shown to be callous, ignorant, nasty, and untruthful.” How could it be, when the press has emphasized nothing but these aspects of Trump for the last three years?
The popular fixation with Trump the man, and with the turbulence, mania, frenzy, confusion, silliness, and unpredictability that have surrounded him for decades, serves two functions. It inoculates the press from having to engage in serious research into the causes of Trump’s success in business, entertainment, and politics, and into the crises of borders, opioids, stagnation, and conformity of opinion that occasioned his rise. Resistance Porn also endows Trump’s critics, both external and internal, with world-historical importance. No longer are they merely journalists, wonks, pundits, and activists sniping at a most unlikely president. They are politically correct versions of Charles Martel, the last line of defense preventing Trump the barbarian from enacting the policies on which he campaigned and was elected.
How closely their sensational claims and inflated self-conceptions track with reality is largely beside the point. When the New York Times published the op-ed “I am Part of the Resistance Inside the Trump Administration,” by an anonymous “senior official” on September 5, few readers seemed to care that the piece contained no original material. The author turned policy disagreements over trade and national security into a psychiatric diagnosis. In what can only be described as a journalistic innovation, the author dispensed with middlemen such as Wolff and Woodward, providing the Times the longest background quote in American history. That the author’s identity remains a secret only adds to its prurient appeal.
“The bigger concern,” the author wrote, “is not what Mr. Trump has done to the presidency but what we as a nation have allowed him to do to us.” Speak for yourself, bud. What President Trump has done to the Resistance is driven it batty. He’s made an untold number of people willing to entertain conspiracy theories, and to believe rumor is fact, hyperbole is truth, self-interested portrayals are incontrovertible evidence, credulity is virtue, and betrayal is fidelity—so long as all of this is done to stop that man in the White House.
Review of ‘Stanley Kubrick’ by Nathan Abrams
Except for Stanley Donen, every director I have worked with has been prone to the idea, first propounded in the 1950s by François Truffaut and his tendentious chums in Cahiers du Cinéma, that directors alone are authors, screenwriters merely contingent. In singular cases—Orson Welles, Michelangelo Antonioni, Woody Allen, Kubrick himself—the claim can be valid, though all of them had recourse, regular or occasional, to helping hands to spice their confections.
Kubrick’s variety of topics, themes, and periods testifies both to his curiosity and to his determination to “make it new.” Because his grades were not high enough (except in physics), this son of a Bronx doctor could not get into colleges crammed with returning GIs. The nearest he came to higher education was when he slipped into accessible lectures at Columbia. He told me, when discussing the possibility of a movie about Julius Caesar, that the great classicist Moses Hadas made a particularly strong impression.
While others were studying for degrees, solitary Stanley was out shooting photographs (sometimes with a hidden camera) for Look magazine. As a movie director, he often insisted on take after take. This gave him choices of the kind available on the still photographer’s contact sheets. Only Peter Sellers and Jack Nicholson had the nerve, and irreplaceable talent, to tell him, ahead of shooting, that they could not do a particular scene more than two or three times. The energy to electrify “Mein Führer, I can walk” and “Here’s Johnny!” could not recur indefinitely. For everyone else, “Can you do it again?” was the exhausting demand, and it could come close to being sadistic.
The same method could be applied to writers. Kubrick might recognize what he wanted when it was served up to him, but he could never articulate, ahead of time, even roughly what it was. Picking and choosing was very much his style. Cogitation and opportunism went together: The story goes that he attached Strauss’s Blue Danube to the opening sequence of 2001 because it happened to be playing in the sound studio when he came to dub the music. Genius puts chance to work.
Until academics intruded lofty criteria into cinema/film, the better to dignify their speciality, Alfred Hitchcock’s attitude covered most cases: When Ingrid Bergman asked for her motivation in walking to the window, Hitch replied, flatly, “Your salary.” On another occasion, told that some scene was not plausible, Hitch said, “It’s only a movie.” He did not take himself seriously until the Cahiers du Cinéma crowd elected to make him iconic. At dinner, I once asked Marcello Mastroianni why he was so willing to play losers or clowns. Marcello said, “Beh, cinema non è gran cosa” (cinema is no big deal). Orson Welles called movie-making the ultimate model-train set.
That was then; now we have “film studies.” After they moved in, academics were determined that their subject be a very big deal indeed. Comedy became no laughing matter. In his monotonous new book, the film scholar Nathan Abrams would have it that Stanley Kubrick was, in essence, a “New York Jewish intellectual.” Abrams affects to unlock what Stanley was “really” dealing with, in all his movies, never mind their apparent diversity. It is declared to be, yes, Yiddishkeit, and in particular, the Holocaust. This ground has been tilled before by Geoffrey Cocks, when he argued that the room numbers in the empty Overlook Hotel in The Shining encrypted references to the Final Solution. Abrams would have it that even Barry Lyndon is really all about the outsider seeking, and failing, to make his awkward way in (Gentile) Society. On this reading, Ryan O’Neal is seen as Hannah Arendt’s pariah in 18th-century drag. The movie’s other characters are all engaged in the enjoyment of “goyim-naches,” an expression—like menschlichkayit—he repeats ad nauseam, lest we fail to get the stretched point.
Theory is all when it comes to the apotheosis of our Jew-ridden Übermensch. So what if, in order to make a topic his own, Kubrick found it useful to translate its logic into terms familiar to him from his New York youth? In Abrams’s scheme, other mundane biographical facts count for little. No mention is made of Stanley’s displeasure when his 14-year-old daughter took a fancy to O’Neal. The latter was punished, some sources say, by having Barry’s voiceover converted from first person so that Michael Hordern would displace the star as narrator. By lending dispassionate irony to the narrative, it proved a pettish fluke of genius.
While conning Abrams’s volume, I discovered, not greatly to my chagrin, that I am the sole villain of the piece. Abrams calls me “self-serving” and “unreliable” in my accounts of my working and personal relationship with Stanley. He insinuates that I had less to do with Eyes Wide Shut than I pretend and that Stanley regretted my involvement. It is hard for him to deny (but convenient to omit) that, after trying for some 30 years to get a succession of writers to “crack” how to do Schnitzler’s Traumnovelle, Kubrick greeted my first draft with “I’m absolutely thrilled.” A source whose anonymity I respect told me that he had never seen Stanley so happy since the day he received his first royalty check (for $5 million) for 2001. No matter.
Were Abrams (the author also of a book as hostile to Commentary as this one is to me) able to put aside his waxed wrath, he might have quoted what I reported in my memoir Eyes Wide Open to support his Jewish-intellectual thesis. One day, Stanley asked me what a couple of hospital doctors, walking away with their backs to the camera, would be talking about. We were never going to hear or care what it was, but Stanley—at that early stage of development—said he wanted to know everything. I said, “Women, golf, the stock market, you know…”
“Couple of Gentiles, right?”
“That’s what you said you wanted them to be.”
“Those people, how do we ever know what they’re talking about when they’re alone together?”
“Come on, Stanley, haven’t you overheard them in trains and planes and places?”
Kubrick said, “Sure, but…they always know you’re there.”
If he was even halfway serious, Abrams’s banal thesis that, despite decades of living in England, Stanley never escaped the Old Country might have been given some ballast.
Now, as for Stanley Kubrick’s being an “intellectual.” If this implies membership in some literary or quasi-philosophical elite, there’s a Jewish joke to dispense with it. It’s the one about the man who makes a fortune, buys himself a fancy yacht, and invites his mother to come and see it. He greets her on the gangway in full nautical rig. She says, “What’s with the gold braid already?”
“Mama, you have to realize, I’m a captain now.”
She says, “By you, you’re a captain, by me, you’re a captain, but by a captain, are you a captain?”
As New York intellectuals all used to know, Karl Popper’s definition of bad science, and bad faith, involves positing a theory and then selecting only the data that appear to confirm it. The honest scholar makes it a matter of principle to seek out elements that might render his thesis questionable.
Abrams seeks to enroll Lolita in his obsessive Jewish-intellectual scheme by referring to Peter Arno, a New Yorker cartoonist whom Kubrick photographed in 1949. The caption attached to Kubrick’s photograph in Look asserted that Arno liked to date “fresh, unspoiled girls,” and Abrams says this “hint[s] at Humbert Humbert in Lolita.” Ah, but Lolita was published, in Paris, in 1955, six years later. And how likely is it, in any case, that Kubrick wrote the caption?
The film of Lolita is unusual for its garrulity. Abrams’s insistence on the sinister Semitic aspect of both Clare Quilty and Humbert Humbert supposedly drawing Kubrick like moth to flame is a ridiculous camouflage of the commercial opportunism that led Stanley to seek to film the most notorious novel of the day, while fudging its scandalous eroticism.
That said, in my view, The Killing, Paths of Glory, Barry Lyndon, and A Clockwork Orange were and are sans pareil. The great French poet Paul Valéry wrote of “the profundity of the surface” of a work of art. Add D.H. Lawrence’s “never trust the teller, trust the tale,” and you have two authoritative reasons for looking at or reading original works of art yourself and not relying on academic exegetes—especially when they write in the solemn, sometimes ungrammatical style of Professor Abrams, who takes time out to tell those of us at the back of his class that padre “is derived from the Latin pater.”
Abrams writes that I “claim” that I was told to exclude all overt reference to Jews in my Eyes Wide Shut screenplay, with the fatuous implication that I am lying. I am again accused of “claiming” to have given the name Ziegler to the character played by Sydney Pollack, because I once had a (quite famous) Hollywood agent called Evarts Ziegler. So I did. The principal reason for Abrams to doubt my veracity is that my having chosen the name renders irrelevant his subsequent fanciful digression on the deep, deep meanings of the name Ziegler in Jewish lore; hence he wishes to assign the naming to Kubrick. Pop goes another wished-for proof of Stanley’s deep and scholarly obsession with Yiddishkeit.
Abrams would be a more formidable enemy if he could turn a single witty phrase or even abstain from what Karl Kraus called mauscheln, the giveaway jargon of Jewish journalists straining to pass for sophisticates at home in Gentile circles. If you choose, you can apply, online, for screenwriting lessons from Nathan Abrams, who does not have a single cinematic credit to his name. It would be cheaper, and wiser, to look again, and then again, at Kubrick’s masterpieces.
Is American opera in terminal condition?
At the Met, distinguished singers and conductors, mostly born and trained in Europe, appeared in theatrically conservative big-budget productions of the popular operas of the 19th century, with a sprinkling of pre-romantic and modern works thrown in to leaven the loaf. City Opera, by contrast, presented younger artists—many, like Beverly Sills, born in this country—in a wider-ranging, more adventurously staged repertoire that often included new operas, some of them written by American composers, to which the public was admitted at what were then called “popular prices.”
Between them, the companies represented a feast for culture-consuming New Yorkers, though complaints were already being heard that their new theaters were too big. Moreover, neither the Met nor City Opera was having any luck at commissioning memorable new operas and thereby expanding and refreshing the operatic repertoire, to which only a handful of significant new works—none of them, then or since, premiered by either company—had been added since World War I.
A half-century later, the feast has turned to famine. In 2011, New York City Opera left Lincoln Center, declaring bankruptcy. It closed its doors forever two years later. The Met has weathered a nearly uninterrupted string of crises that climaxed earlier this year with the firing of James Levine, the company’s once-celebrated music director emeritus. He was accused in 2017 of molesting teenage musicians and was dismissed from all of his conducting posts in New York and elsewhere. Today the Met is in dire financial straits that threaten its long-term survival.
And while newer opera companies in such other American cities as Chicago, Houston, San Francisco, Santa Fe, and Seattle now offer alternative models of leadership, none has established itself as a potential successor either to the Met or the now-defunct NYCO.1
Is American opera as a whole in a terminal condition? Or are the collapse of the New York City Opera and the Met’s ongoing struggle to survive purely local matters of no relevance elsewhere? Heidi Waleson addresses these questions in Mad Scenes and Exit Arias: The Death of the New York City Opera and the Future of Opera in America.2 Waleson draws on her experience as the opera critic of the Wall Street Journal to speculate on the prospects for an art form that has never quite managed to set down firm roots in American culture.
In this richly informative chronicle of NYCO’s decline and fall, Waleson persuasively argues that what happened to City Opera (and, by extension, the Met) could happen to other opera companies as well. The days in which an ambitious community sought successfully to elevate itself into the first rank of world cities by building and manning an opera house are long past, and Mad Scenes and Exit Arias helps us understand why.

As Waleson reminds us, it was Fiorello LaGuardia, the New York mayor who played a central role in the creation of the NYCO, who dubbed the company “the people’s opera” when it was founded in 1943. According to LaGuardia, NYCO existed to perform popular operas at popular prices for a mass audience. In later years, it moved away from that goal, but the slogan stuck. Indeed, no opera company has ever formulated a clearer statement of its institutional mission.
Even after it moved to Lincoln Center in 1966, NYCO had an equally coherent and similarly appealing purpose: It was where you went to see the opera stars of tomorrow, foremost among them Sills and Plácido Domingo, in inexpensively but imaginatively staged productions of the classics. The company went out of its way to present modern operas, too, but it never did so at the expense of its central repertoire—and tickets to its performances cost half of what the Met charged. Well into the 21st century, City Opera stuck more or less closely to its redefined mission. Under Paul Kellogg, the general and artistic director from 1996 to 2007, it did so with consistent artistic success. But revenues declined throughout the latter part of Kellogg’s tenure, in part because younger New Yorkers were unwilling to become subscribers.
In those days, the Metropolitan Opera, NYCO’s next-door neighbor, was still one of the world’s most conservative opera houses. That changed when Peter Gelb became its general manager in 2006. Gelb was resolved to modernize the Met’s productions and, to a lesser extent, its repertoire, and he simultaneously sought to heighten its national profile by digitally simulcasting live performances into movie theaters throughout America.
Kellogg was frustrated by the chronic acoustic inadequacies of the New York State Theater and sought in vain to move City Opera to a three-theater complex that was to be built (but never was) on the World Trade Center site. He retired soon after Gelb came to the Met. Kellogg was succeeded by Gérard Mortier, a European impresario who was accustomed to working in state-subsidized theaters. Mortier made a pair of fateful decisions. First, he canceled City Opera’s entire 2008–2009 season while the interior of the State Theater underwent much-needed renovations. Then he announced a follow-up season of 20th-century operas that lacked audience appeal.
That follow-up season never happened, because Mortier resigned in 2008 and fled New York. He was replaced by George Steel, who had previously served for just three months as general manager of the Dallas Opera. Under Steel, NYCO slashed its schedule to ribbons in a futile attempt to get back on its financial feet after Mortier’s financially ruinous year-long hiatus. Then he mounted a series of productions of nonstandard repertory that received mixed reviews and flopped at the box office.
The combined effect of Gelb’s innovations and the inept leadership of Mortier and Steel all but obliterated City Opera’s reason for existing. Under Gelb, the Met’s repertory ranged from such warhorses as Rigoletto and Tosca to 20th-century masterpieces like Benjamin Britten’s A Midsummer Night’s Dream and Alban Berg’s Wozzeck, and tickets could be bought for as little as $20. With the Met performing a more interesting repertoire under a wider range of directors, and in part at “people’s prices,” City Opera no longer did anything that the Met wasn’t already doing on a far larger and better-financed scale. What, then, was its mission now? The truth was that it had none, and when the company went under in 2013, few mourned its passing.
As it happened, Gelb’s own innovations were a mere artistic Band-aid, for he was unwilling or unable to trim the Met’s bloated budget to any meaningful extent. He made no serious attempt to cut the company’s labor costs until a budget crisis in 2014 forced him to confront its unions, which he did with limited success. In addition, his new productions of the standard-repertory operas on which the Met relied to draw and hold older subscribers were felt by many to be trashily trendy.
The Met has had particular difficulty managing the reduced circumstances of 21st-century opera. Its 3,800-seat theater has an 80-foot-deep stage with a proscenium opening that measures 54 feet on each side. (Bayreuth, by contrast, seats 1,925, La Scala 2,030, and the Vienna State Opera 2,200.) As a result, it is all but impossible to mount low-to-medium-budget shows in the Metropolitan Opera House, even as the company finds itself no longer able to fill the house. Two decades ago, the Met earned 90 percent of its potential box-office revenue. By 2015 that figure had plummeted to 66 percent, forcing Gelb to raise ticket prices to an average of $158.50 per head. On Broadway, the average price of a ticket that season was $103.86.
Above all, Gelb was swimming against the cultural tide. Asked about the effects on audience development of the Met simulcasts, he admitted that three-quarters of the people who attended them were “over 65, and 30 percent of them are over 75.” As he explained: “Grand opera is in itself a kind of a dinosaur of an art form…. The question is not whether I think I’m doing a good job or not in trying to keep the [Metropolitan Opera] alive. It’s whether I’m doing a good job or not in the face of a cultural and social rejection of opera as an art form. And what I’m doing is fighting an uphill battle to try and maintain an audience in a very difficult time.”
Was that statement buck-passing defeatism, or a fair appraisal of the state of American opera? Other opera executives distanced themselves from Gelb’s remarks, and it was true—and still is—that smaller American companies have done a somewhat better job of attracting younger audiences than the top-heavy Met. But according to the National Endowment for the Arts, the percentage of U.S. adults who attend at least one operatic performance each year declined from 3.2 percent in 2002 to 2.1 percent in 2012. This problem, of course, is not limited to opera. As I wrote in these pages in 2010, the disappearance of secondary-school arts education and the rise of digital media may well be leading to “not merely a decline in public interest in the fine arts but the death of the live audience as a cultural phenomenon.”3

Does American opera have a future in an era of what Heidi Waleson succinctly describes as “flat ticket income and rising expenses”? In the last chapter of Mad Scenes and Exit Arias, she chronicles the activities of a group of innovative smaller troupes that are “rethinking what an opera company is, what it does, and who it serves.” Yet in the same breath, she acknowledges the possibility that “filling a giant theater for multiple productions of grand operas [is] no longer an achievable goal.”
If that is so, then it may be worth asking a different question: Did American opera ever have a past? It is true that opera in America has had a great and glorious history, but virtually the whole of that history consisted of American productions of 18th- and 19th-century European operas. By contrast, no opera by an American classical composer has ever entered the international major-house repertoire. Indeed, while new American operas are still commissioned and premiered at an impressive rate, few things are so rare as a second production of any of these works.
While a handful continue to be performed—John Adams’s Nixon in China (1987), André Previn’s A Streetcar Named Desire (1995), Mark Adamo’s Little Women (1998), and Jake Heggie’s Dead Man Walking (2000)—their success is a tribute to the familiarity of their subject matter and source material, not their musico-theatrical quality. As for the rest, the hard but inescapable truth is that with the exception of George Gershwin’s Porgy and Bess (1935), virtually all large-scale American operas have been purpose-written novelties that were shelved and forgotten immediately after their premieres.
The success of Porgy and Bess, which received its premiere not in an opera house but on Broadway, reminds us that American musical comedy, unlike American opera, is deeply rooted in our national culture, in much the same way that grand opera is no less deeply rooted in the national cultures of Germany and Italy, where it is still genuinely popular (if less so today than a half-century ago). By comparison with Porgy, Carousel, Guys and Dolls, or My Fair Lady, American opera as a homegrown form simply does not exist: It is merely an obscure offshoot of its European counterpart. Aaron Copland, America’s greatest composer, was not really joking when he wittily described opera as “la forme fatale,” and his own failed attempts to compose an audience-friendly opera that would be as successful as his folk-flavored ballet scores say much about the difficulties facing any composer who seeks to follow in his footsteps.
It is not that grand opera is incapable of appealing to American theatergoers. Even now, there are many Americans who love it passionately, just as there are regional companies such as Chicago’s Lyric Opera and San Francisco Opera that have avoided making the mistakes that closed City Opera’s doors. Yet the crises from which the Metropolitan Opera has so far failed to extricate itself suggest that in the absence of the generous state subsidies that keep European opera houses in business, large-house grand opera in America may simply be too expensive to thrive—or, ultimately, to survive. At its best, no art form is more thrilling or seductive. But none is at greater risk of following the dinosaurs down the cold road to extinction.
1 The “New York City Opera” founded in 2016 that now mounts operas in various New York theaters on an ad hoc basis is a brand-new enterprise that has no connection with its predecessor.
2 Metropolitan Books, 304 pages