The conscience of a neoconservative giant.
This year marks the centennial of the birth of Henry M. “Scoop” Jackson, one of the towering figures of American politics in the latter half of the 20th century and the avatar of neoconservatism. A Democrat representing the state of Washington in the U.S. Senate from 1953 until his sudden death in 1983, he deserves to be recalled not only because he merits honor but also because little of today’s politics would be comprehensible without understanding his 30 years in office.
I worked for him during his unsuccessful campaign for president in 1976 and got to see him up close for those months. Although he was entirely absorbed in politics from the time he reached adulthood, he was very unlike a politician. He was awkward on the stump and uneasy pressing the flesh. The reason, one could see, was that he had a modest ego. He was uncomfortable, too, amid the bevy of highfliers and self-promoters who make up the crew and groupies of any presidential campaign. I remember Sterling Munro, Jackson’s top Senate aide and the campaign’s chief of staff, battling long-distance with the candidate’s traveling entourage to get everyone to leave Scoop’s hotel room at night so he could get some rest. The senator himself, apparently, was too polite or unassuming to throw them out. Scoop’s aide Richard Perle kept a paper cocktail napkin displayed on his desk in the Senate Office Building. On it was scrawled in felt-tipped pen: “Richard, I was here. Where were you? I waited. Scoop.” Anyone familiar with the hierarchical norms on Capitol Hill between members and staff will recognize how extraordinary this missive was.
As he was humble, so was he unpretentious. Jackson was astonished that Jimmy Carter, his main opponent for the 1976 Democratic presidential nomination, made a campaign theme of being a born-again Christian. Scoop allowed in private that he was a regular churchgoer, but he did not believe that this was anyone else’s concern and wouldn’t have thought of speaking about it in public. Carter, who had pandered to segregationists when running for governor, then turned on a dime when he went national, and who embellished his resume with the claim to be a “nuclear physicist” and other falsehoods, carefully cultivated an image of virtue, while Scoop, a genuinely virtuous individual, never gave a thought to his image. As ABC’s Barrie Dunsmore, who covered both men and was no ideological fan of Scoop’s, put it: “Carter would carry his bags on the campaign trail and make a big thing of it. Jackson would carry his and yours and make nothing of it at all.”
Dunsmore also recalled that Scoop was “extraordinarily forgiving” toward journalists, such as himself, who wrote critically of Scoop. The same was true for staff. I was tasked to write a speech for a campaign event, my first such venture. Later I overheard Jackson asking a more senior aide to rework it, commenting with a laugh, “It’d make a fine doctoral dissertation, but it’s not a speech.” Had I not inadvertently eavesdropped, I would not have known of my failure. What I heard from Scoop’s lips was only thanks.
As Dunsmore’s anecdote about the luggage suggests, Scoop was abstemious with himself but generous to others. He drove a jalopy that was a laughingstock and gave all his honoraria to charity—mostly anonymously. Even after he married and fathered two children, and therefore spent a lot more on his own household than he had as a bachelor, he gave away as much as 44 percent of an income that, all told, never exceeded five figures. He also, as need arose, covered medical expenses and the like for a troubled sibling, this of course without the benefit of a tax deduction. In short, as political scientist Robert G. Kaufman, author of Henry M. Jackson: A Life in Politics (2003), a fine, definitive biography, puts it, Scoop “personified integrity and decency in all aspects of his life.”
He was also—and this must have flowed from the same source as his virtue—an utter Boy Scout. His usual order at cocktail parties was grapefruit juice. He remained single until age 49, by which time he had already served nine years in the Senate (following 12 in the House). Although he went on the occasional date, he saw fewer women than some of his married Senate colleagues did. Kaufman repeats an anecdote from Jackson’s bachelor years that captures his prudishness as well as his single-mindedness:
Martha Wright, from Duvall, Washington, in Scoop’s old district, was an aspiring musical comedy actress around Seattle. Then she went to New York in 1947 or 1948 and did very well. Scoop had known her around Seattle. He went up to New York to take her out. Martha had familiarity with show business mores, which were not strict about sex. Scoop wanted to take her back to her apartment. She was intrigued that straight arrow Scoop was interested. When they got to the apartment, Scoop said he wanted to watch the 11 o’clock news. He did and then left.
Like a good Scout, Scoop had the highest attendance rating in the Senate and maintained it at 99 percent even while campaigning for president. He was exceedingly diligent and a master of the intricacies of the chamber in which he served, winning designation by Ralph Nader’s study group as the “most effective senator.” His success, though, was not only a product of skill. As ill at ease as he often seemed with crowds or among strangers, he was warm and gregarious with those he knew, including colleagues and staff. When he died, the tributes from fellow senators were published in a book some 450 pages long. Eulogies in the Senate are common, but nothing like this effusion. And those who worked for him formed a group called Scoop’s Troops that continues to hold annual meetings in his honor decades after his death.
Jackson’s deeply ingrained desire to do the right thing, which made him a man of many virtues and something of a prig, may also have helped to shape his political views, which were liberal. In those innocent days before it absorbed an admixture of angry, arrogant 1960s leftism, liberalism was a vaguely defined creed that attempted to give political expression to the impulse to do good.
Scoop was the epitome of the liberalism of his day. The two cardinal programmatic ideas of that philosophy were devotion to the “common man” and the conviction that government had nigh limitless power to make people’s lives better.
The common man was an American concept more elastic than the European notion of class. It encompassed ordinary people whether of town or country, whether wage earners or self-employed. Scoop’s father, a Norwegian immigrant, served for 26 years as treasurer of his local branch of the Plasterers and Cement Masons’ Union, but he also worked for a time as a policeman and later as a private contractor, pouring concrete. In the formulaic categories of Marxism, that would have meant that he leapt from class to class. But little if anything changed in the Jacksons’ circumstances as he changed jobs, and in American terms, he remained a common man. His kind were at once the beneficiaries of the policies championed by liberalism and its intended constituency. This average American was also seen to incarnate a country whose essential goodness was beyond doubt.
As for the role of government, liberalism embraced private enterprise, not socialism, as the basic form of economy. It did, however, also believe that the Great Depression had proved that capitalism could not be left safely to its own devices. An activist government was essential to provide the protections and benefits that would make the good life possible for the common man and to assure that the country as a whole would prosper. It would also supervise land use, sponsor public utilities, and promote conservationism.
To these core tenets of liberalism we might add a third and fourth: a sense of fair play at home and active engagement abroad. Fair play entailed support for causes such as civil rights and civil liberties. Internationalism rested on the hard-won lessons that American isolation had helped bring about the Second World War and that the threat posed by predatory dictators was universal.
Jackson embraced each of these causes and was a leader in most. Organized labor powered the fight for the panoply of benefits for the common man—Medicare, Medicaid, unemployment insurance, minimum wage, workmen’s compensation, and the like—and, by the lights of the AFL-CIO’s Committee on Political Education, Jackson compiled the best voting record in the U.S. Senate.
In college, Jackson had joined the League for Industrial Democracy, a more opaque name for an organization that had originally called itself the Intercollegiate Socialist Society. If Jackson had a dalliance with socialism, it did not last long, but he was a lifelong believer in socialized medicine, public-works programs to provide jobs, ever greater federal aid for education, an expanded public role in utilities, and regulation of the energy industry. And such was his devotion to conservationism that he became the first politician to win the Sierra Club’s John Muir Award, its highest honor.
As for fair play, Jackson was a supporter of civil rights but not a leader in this field, perhaps because the state of Washington was home to few blacks. Indians were a more prominent minority, and Jackson’s first legislative accomplishment as a member of the House was the 1946 Indian Claims Commission Act, providing some legal remedy for long-ago broken treaty obligations. Regarding civil liberties, Jackson first made a name for himself as a senator by taking on Senator Joseph McCarthy in the Permanent Subcommittee on Investigations, the vessel for McCarthy’s scurrilous inquests.
If these causes appealed to Scoop reflexively, that was not true of internationalism. Early in World War II, he had declared himself “unalterably opposed to our country[’s] entering the European conflict.” But Pearl Harbor and Germany’s declaration of war against the United States made all that moot. Although already a member of the House of Representatives, Scoop quickly enlisted in the Army and served a few months until an order from President Franklin Roosevelt barred congressmen from service, sending Scoop and a few colleagues back to Capitol Hill. In his reversal about the war, Jackson was similar to many other Americans, liberal and conservative. But his turn against isolationism—reinforced at war’s end by a tour of Buchenwald two days after its liberation—was more impassioned and far-reaching than that of most others.
In the three years after the war, liberals were divided about how to keep the peace. The majority, including Jackson, followed President Truman in viewing Communism and the Soviet Union as presenting a totalitarian threat akin to Fascism and Nazi Germany. A minority, including Jackson’s future Senate colleague George McGovern, followed former Vice President Henry Wallace in perceiving the Soviet Union in a more benign light. Wallace said there was “a great difference between the fascist dictatorship, which tries to perpetuate itself for its own profit, power, and glory, and the dictatorship in the Soviet Union which has as its goal an economy of abundance for all its people and the eventual dissolution of the dictatorship” and which therefore has “no necessity to expand [its] borders, nor will [it] for many decades to come, except as [compelled by] external threats and pressures.”
With Truman’s surprise reelection victory in 1948 and Wallace’s pathetic third-party showing (2 percent of the popular vote), the debate was sealed. But its conclusion was reversed 20 or so years later when the Vietnam War discredited anti-Communism; and McGovern, still adhering to Wallace’s worldview, captured the 1972 Democratic presidential nomination, handily besting Jackson among others.
McGovern’s success, though focused on the issue of Vietnam, represented the triumph of a broader ideology whose roots were in the New Left of the 1960s. Initially, the New Left had disdained electoral politics, favoring revolution over mere reform within “the system.” And its more zealous adherents continued to embrace violence and other forms of chiliastic rebellion. But the mass of New Left sympathizers were less dogmatic and harkened to the call in 1968 to shave beards and shed love beads to go “neat and clean for Gene” McCarthy, the senator from Minnesota whose 1968 antiwar presidential campaign paved the way for McGovern’s in 1972.
This new movement, dubbing itself the “New Politics,” was at variance with almost every tenet of the old liberalism. It was contemptuous of the common man, exemplified by the protagonist of the era’s most popular sitcom: Archie Bunker, a racist, narrow-minded patriot. (Carroll O’Connor, the actor who so brilliantly rendered this caricature, did TV ads for the 1972 presidential campaign of New York City’s Mayor John V. Lindsay, who competed with McGovern for the left wing of the Democratic Party.) Instead, the New Politics championed youth and the college-educated elite, who, in coalition with blacks and Latinos, were seen as the true constituency for a better America. The labor movement, the old pillar of liberalism, said one of Eugene McCarthy’s top advisers, was not worth “the powder it would take to blow it to hell.”
As for government activism, the attitude was more ambivalent. Years later, the heirs of this New Politics were to embrace big government whole-heartedly. But early on, the attitude of the movement was less clear, perhaps precisely because the prosaic forms of social insurance were so closely identified with organized labor and traditional liberals such as Jackson. What excited New Politics adherents more were projects to aid the “underclass” such as “welfare rights.”
This movement turned the cause of civil rights on its head. Instead of the goal of a color-blind society espoused so eloquently by Martin Luther King Jr., it advocated preferential treatment for blacks. The rationale for this was to redress past discrimination. But the flimsiness of this reasoning was made clear by the proliferation of demographic categories that soon were added to the list of beneficiaries of reverse discrimination—amounting to more or less everyone except middle-aged white males. It reached the height of absurdity when the Democratic Party embraced the “McGovern reforms” that gave special advantage to minorities, women, and youth. The latter category meant newly eligible voters. But of course, by definition, these individuals could not possibly have been discriminated against in previous elections.
The old liberal concern for conservationism was supplanted by the new environmentalism, whose hallmark was wariness of economic growth (and, ipso facto, higher living standards for the common man). The sanctity of civil liberties gave way to the new “political correctness” in which some ideas could not be expressed and people who expressed them could not be hired.
Above all, the new liberalism defined itself by its approach to international issues. The old liberalism had included a philosophy called “liberal internationalism,” which, in a self-conscious antithesis of isolationism, held America’s security to depend on global security. Conversely, it believed that America’s role in the world—through its military strength, its alliances, its political influence, its prosperity and generosity—was to be the essential cornerstone of global security. The new liberalism, in contrast, saw America as being as much a part of the problem as the solution, and it thought the world and our country would be better off if it played a much more modest role abroad. “Come home, America” was the theme of McGovern’s presidential bid.
Although McGovern lost the 1972 general election in a landslide, the grip of his followers and his ideas on the Democratic Party only grew stronger over the following decade. True, Jimmy Carter, the next Democratic nominee, campaigned for president as a centrist, but no sooner had he won than he revealed his affinity with the new liberalism. McGovern told friends that Carter’s appointments to the policy-making positions at the State Department were “excellent…quite close to those I would have made myself.” When Carter was defeated for reelection in 1980, the party chose as its next standard-bearer his more liberal vice president, Walter Mondale, whose votes on international issues as a senator had met the approval of the new-liberal advocacy group, Americans for Democratic Action, 96 percent of the time.
During this era, the new liberalism waged political and ideological warfare against the old. On the ideological side, the centerpiece was the coinage of the term neoconservatism as an anathema cast on unreconstructed adherents of the old liberalism. This was at first received with indignation, but gradually those to whom the epithet was applied acquiesced, leaving the label liberal to those in the McGovern tradition. (Eventually, they brought it into such bad odor that they abandoned it in favor of progressive). On the political side, the new liberals sponsored candidates in Democratic primaries against incumbent representatives and senators of old-liberal stripe, defeating several and terrifying the rest.
The ideological and the political warfare differed from each other in a crucial respect. The targets of the former were intellectuals for whom the clash of ideas was meat and drink. The newly christened “neoconservatives” fought back with gusto, even joie de combat, not least in the pages of this magazine. But politicians are in a different position. Needing the support of many and diverse voters, their calling is not to duel but to placate, not to sharpen differences but to blur them. Of all the things a politician fears, few are more alarming than opposition within his or her own party.
Thus, faced with the threat of internecine challenge, incumbent Democratic officeholders, one after another, fell into line with the new liberalism. Consider for example, the leaders of the Coalition for a Democratic Majority (CDM), formed in the immediate aftermath of McGovern’s 1972 general election debacle to win the Democratic Party back to its old-liberal traditions, a mission that seemed well within reach, given the dimensions of McGovern’s defeat. Among its original officers were Representatives Jim Wright (Texas) and Tom Foley (Washington), each of whom rose to become speaker of the House. One of its two honorary co-chairmen was Sen. Hubert Humphrey, later succeeded in that title by Sen. Daniel Patrick Moynihan. Over the course of the next decade, however, the new liberalism proved surprisingly resilient, and gradually each of these four leading Democrats remade himself as a “liberal” on foreign-policy issues.
Something analogous happened with organized labor. The AFL-CIO under the leadership of George Meany and his successor, Lane Kirkland, had been the political muscle behind the old liberalism. So staunch was its anti-Communism that when the Democrats nominated McGovern in 1972, labor, despite Meany’s fierce clashes with President Richard Nixon throughout his first term, broke its tradition of supporting the Democrat and declared its neutrality. But a new-labor faction emerged, dovish on foreign policy, that forced the AFL-CIO to cease siding with the old liberalism against the new, and it eventually drove Kirkland from office.
Amid all this surrender under heavy fire, one single politician among the old liberals yielded not an inch. That was Scoop Jackson (who was, by the way, the other honorary co-chairman of CDM). This was true even though he knew that his stand might keep his party’s presidential nomination from his grasp. He lacked the ego of a politician, but he did not lack the ambition. He had been keenly disappointed when the weight of Texas’s electoral college votes led John F. Kennedy to select Lyndon B. Johnson as his vice president in 1960 despite numerous reports that Scoop had been JFK’s preferred choice. Scoop made a desultory run for the nomination in 1972 and might well have captured the prize in 1976—many judged him the front-runner at the outset—had he not been so abhorrent to new-liberal Democrats. Although he carried the Massachusetts and New York primaries, that abhorrence coupled with Scoop’s weaknesses as a campaigner brought him up short against Jimmy Carter.
After losing the showdown Pennsylvania primary to Carter in April, Scoop bowed out of the race and threw his support to the winner, even while other Democratic contenders battled the Georgian through the last primary, months later. Thus, after Carter won the general election, Scoop was poised to have a strong relationship with the new president, as behooves a senator of the same party. But Carter’s foreign-policy appointments, which so pleased McGovern, appalled Jackson.
Carter capped these by naming Paul Warnke as head of the Arms Control and Disarmament Agency and chief weapons negotiator. As chief defense-policy adviser to McGovern’s 1972 campaign, Warnke had authored a plan to reduce the entire U.S. military by one-third. This appointment was too much for Scoop, and he decided to fight Warnke’s nomination despite the tradition of allowing a new president a honeymoon. Warnke won confirmation in the Senate but only by a vote of 58 to 40. This implied that Scoop would be able to muster the votes to block any arms-control treaty that did not meet his approval since ratification requires a two-thirds vote.
Scoop continued to dog Carter over his defense and Cold War policies to the point of accusing the president, in one 1979 speech, of practicing “appeasement,” charging that the “explanations, extenuations, and excuses” for Soviet conduct put forward by his administration were “ominously reminiscent of Great Britain in the 1930s.”
Still, Scoop’s battles with Carter amounted to little more than a mild reprise of the titanic struggles he had waged with Republican presidents Richard Nixon and Gerald Ford and their brilliant foreign-policy strategist, Henry Kissinger, over détente. These were, arguably, the most perilous years of the Cold War. America was in the last agonizing throes of losing the war in Vietnam for which it had sacrificed 50,000 to 60,000 lives. The public had become deeply war-weary.
The Democrats had mostly gone over to the view that the core problem was America’s own paranoia and belligerency. That was the theme of McGovern’s candidacy, echoed a few years later by Jimmy Carter, who spoke of the “inordinate fear of Communism” that had led America to fight “fire with fire, never thinking that fire is better quenched with water.” In search of this peace-giving elixir, the Democrats pushed for yearly reductions in military spending, tight new restraints on the CIA, and the withdrawal from bases and commitments abroad.
And the Republicans? It is easy to forget that before the Reagan era, the GOP was not particularly hawkish. Its traditions were isolationist. America’s engagement with the world had been framed by Democrats: Franklin Roosevelt had led (some say dragged) us into World War II; Harry Truman’s liberal internationalism brought us into the Cold War; and John F. Kennedy and Lyndon Johnson got us into Vietnam. True, in 1964, Republican presidential candidate Barry Goldwater had made some thunderous pronouncements that the Democrats used against him with good effect, but during his long career in the Senate, Goldwater was never a leader on foreign or defense issues.
Now, the Republican administration unfolded the policy of détente, which meant an abatement of the Cold War. This would have been especially welcome to the U.S. at that difficult moment, but why would the Kremlin go along at a moment when the United States was least prepared to parry any Communist thrust? Indeed, there was precious little sign that the Soviet effort to gain the upper hand had slowed, much less stopped. Soviet support for guerrilla movements in Africa and Latin America intensified, as it did for radical regimes in the Middle East and Palestinian terrorists. To boot, Moscow itself was in the midst of an arduous build-up of its nuclear forces that promised to give it an advantage at every level of weaponry—conventional and nuclear, short-range and long-range.
The chief of naval operations, Admiral Elmo Zumwalt, claimed that Kissinger told him that America’s position was declining inexorably and that his job was to win from the Soviets the best terms he could for a graceful U.S. retreat. Kissinger strongly denied saying or believing any such thing. He explains in his memoirs that his goal was largely to “outmaneuver the ‘peace’ pressures” in order to protect America’s defense structure “from Congressional savaging.” In other words, Kissinger orchestrated a display of peacemaking, he says, in order to cut the ground from under the doves.
If this was truly his strategy, it was too clever by half. The underlying problem was the degree to which Vietnam had eaten away Americans’ faith in the worthiness of their own country and their understanding of the malignity of their foes. This new mood was neither Nixon’s nor Kissinger’s fault, but it could only be changed by confronting it. Americans needed to be reminded that whatever had gone wrong in Vietnam, even if the war itself had been a mistake, the larger Cold War of which it was a part was both just and necessary. To tell them, instead, that the Cold War was over or nearly over or on its way to being over would only make a bad situation worse.
Did the Republican administration claim such a thing? Kissinger concedes it did, but he blames Nixon. Here is his account:
Nixon’s penchant for hyperbole was unlikely to be restrained in an election year. He started out expressing the “hope” for a generation of peace. Soon he came to claim it as an “accomplishment.” And in the closing days of the 1972 election campaign he even escalated the goal to be a “century of peace.” His public-relations people were indefatigable in propounding these propositions—over my frequently expressed, but rather ineffective, dissent.
If Kissinger dissented, this was very much in private. In public he was scarcely less enthusiastic than his boss. Briefing the press at the close of the May 1972 Moscow summit about the strategic arms limitation (SALT) accords that had been signed along with a slew of less important agreements, Kissinger said: “Both governments decided that in an agreement of this kind, the stakes were larger than the simple technical issues…that what was at stake was a major step toward international stability, confidence among nations, and a turn in the pattern of postwar relations.”
What Kissinger called “simple technical issues” was the fact that the agreement, supposedly a five-year “interim” deal, granted the Soviets a substantially larger arsenal of nuclear missiles than the U.S. Fine, perhaps, if the Cold War was really over. Worrisome, if not.
The 1972 Moscow summit, at which the atmospherics were warmer than in any previous such meeting, was the high point of détente. But in the months that followed, Scoop introduced two amendments that took the wind out of Nixon’s and Kissinger’s sails and helped to refocus the country on essential truths of the Cold War. Rarely has a legislator had such a large hand in steering the ship of state—especially against the will of the executive.
The first Jackson Amendment concerned SALT itself, which comprised a treaty sharply limiting defensive missiles (antimissile missiles) and an executive agreement, the unequal “interim agreement,” limiting offensive weapons. Executive agreements ordinarily do not require Congressional assent, but for arcane legal reasons, this one did. Scoop took aim at the inequality and the fact that the interim agreement was intended to be superseded by a long-term or permanent treaty yet to be negotiated. His amendment to the motion of assent said simply that the Congress “urges and requests the president to seek a future treaty that…would not limit the United States to levels of intercontinental strategic forces inferior to the limits provided for the Soviet Union.”
Who could be against that? Well, J. William Fulbright, for one, chairman of the Senate Foreign Relations Committee, who complained, “It raises questions about the prudence and propriety of the original agreement,” and 34 other doves who voted against the amendment in the Senate. It was harder for the administration to oppose it, and after Scoop had toned down some earlier language, Nixon acquiesced in the amendment, thus assuring its passage by a wide margin. If Zumwalt was right that Kissinger’s strategy amounted to acceptance of second place, this provision stopped that process in its tracks.
But even if Zumwalt had entirely misconstrued Kissinger, the amendment opened up a momentous discussion. Why had the Soviets sought a preponderance of large ICBMs, the most destructive and destabilizing weapons? The United States had been first to build atomic weapons and ICBMs. Then, during the Kennedy administration, Washington concluded that it had enough missiles to hit every conceivable enemy target, and it ceased to add more. Secretary of Defense Robert McNamara expressed confidence that Moscow would do likewise as soon as it caught up. But when the Soviet Union reached parity in ICBMs, it just kept building more, and now it had insisted on locking in that advantage (and some others, such as the size of these missiles) in SALT. How could such behavior be reconciled with the new relationship promised by détente?
The second and more famous Jackson Amendment concerned immigration from the Soviet Union. It denied trade benefits to any “non-market economy country” that did not allow its citizens to emigrate. In practice it affected primarily Jews seeking to emigrate from the Soviet Union. Jackson first introduced this amendment in 1972, but the trade bill to which it was attached was not enacted until 1974. While it was in prospect, the Kremlin allowed emigration to swell greatly. Then, when it passed, the numbers were squeezed back to their pre–Jackson Amendment level before rising again a few years later. Although this variance sparked a debate about the wisdom of the measure, the Soviet Jews themselves were not ambivalent. As their leading light, Natan Sharansky, put it:
For many Jews in the Soviet Union Jackson became the savior of their lives. Hundreds of thousands of Jews could join their people in freedom. Thousands of non-Jews who wanted to live in freedom and not in the Soviet Union could do it. And they attribute it first of all to that noble position of Senator Jackson.
Apart from the individuals who benefited, this amendment, like Jackson’s SALT amendment, raised some freighted questions. What kind of country treated its citizens as captives? How would such a country treat us? And why, if the Soviet regime was prepared to lay to rest its conflict with us, would it continue to give so little quarter to its own people?
These questions were pressed home by Aleksandr Solzhenitsyn, the writer-cum-prophet who had laid bare the Soviet gulag system, when he first arrived in America in 1975. President Gerald Ford refused to invite Solzhenitsyn to the White House for fear of roiling the Kremlin, so Jackson and George Meany, Washington’s two doughtiest anti-Communists, provided his welcome—Jackson hosting him in the Senate and Meany for a public lecture.
Embarrassed by such displays of pusillanimity and even more so by Moscow’s continued aggressiveness, the Ford administration let it be known in 1976 that it would no longer use the word détente. Then Jimmy Carter pursued his own, still more conciliatory version of this policy until the Soviet invasion of Afghanistan in 1979 wrote finis to the whole era and paved the way for the election of Ronald Reagan in 1980. Jackson had been accused of killing détente; in truth, it was the Soviets themselves who did that. But Jackson’s Amendments, his criticisms of Nixon, Ford, and Carter, and his other actions, such as hosting Solzhenitsyn, did perhaps more than anything else to help America regain perspective on its enemies and thus on itself and to recover from the bout of self-laceration brought on by Vietnam.
Scoop served on Reagan’s transition team, then turned down appointment as secretary of defense. He won reelection in 1982 preaching his old liberal domestic policies and campaigning against Reaganomics—before dying suddenly less than a year into his new term. His legacy found expression within the Reagan administration where a number of Jackson’s followers helped shape policy. Jeane Kirkpatrick, who had been Scoop’s representative (along with Ben J. Wattenberg) to the Democratic platform committee in 1976, became Reagan’s ambassador to the United Nations and the chief exponent of his foreign-policy philosophy. Richard Perle became assistant secretary of defense, formulating positions on armaments and arms control, while Elliott Abrams, who had also worked in Scoop’s Senate office, became assistant secretary of state and point man for Central America policy. A number of other Democrats closely affiliated with Scoop—Max Kampelman, Paul Nitze, Eugene V. Rostow, Richard Schifter, to name a few—also took on important roles.
If neoconservatives made an impact on Reagan’s administration, so did he on them. On the eve of Reagan’s inauguration, most or all of these old liberals were still Democrats and still liberals in the old sense, except that they had surrendered the label. But under Reagan several things drew them closer to conservatism and the Republican Party. One was the success of Reaganomics, which restored the U.S. economy to strength with high growth and low inflation after Jimmy Carter had thrown up his hands at the challenge. A second was the transformation of the labor movement from a bastion of patriotism to a vehicle for leftism akin to European labor unions; labor had once tied the neocons to the world of Keynesianism and the Democratic Party, but no more. Above all was their appreciation of Reagan himself, who steered America to victory in the Cold War and became the hero who succeeded Jackson in their hearts. This did not lead to automatic acceptance of all positions, but it did lead these old liberals to revisit conservative ideas with fresh eyes.
In addition to Reagan, Scoop had one other heir, less momentous a historical figure, but politically a closer copy: Senator Joseph Lieberman of Connecticut, who by chance is serving his last year in the Senate during Scoop’s centennial. Lieberman of course is a Democrat. He is not quite as liberal as Scoop was on domestic issues, but he’s liberal nonetheless, and the difference may be chalked up to the fact that the whole spectrum on domestic economic and welfare issues shifted somewhat rightward after registering the failure of Lyndon Johnson’s War on Poverty programs and the resonance of Reagan’s attacks on big government.
It was, however, on international issues that Lieberman truly made his mark. Here, although the enemies of America had changed (Islamists and terrorists replacing Communists), the essential choice remained the same: to confront the challenge squarely or to dance around it. Once again, as had been true ever since the new liberals of the 1970s took over the party, Democrats mostly favored dancing. But in this battle the Republicans were more staunch thanks to the legacy of Reagan. Like Scoop when his party went AWOL in the Cold War, Lieberman found himself entirely out of step with other Democrats. And the new liberals, still up to their old methods, challenged him in the primaries and won, forcing him to run as an independent, which he did successfully. But disapproval from fellow Democrats, however shrill, did not seem to faze Lieberman any more than Scoop had been fazed by his detractors. Lieberman did not command the Senate as Scoop once did, perhaps simply for want of the committee assignments to make that possible. But by virtue of his boldness and dedication to the cause, he made himself a leader in the war against terror. One hopes he will make himself heard in retirement. Like Scoop's, his voice will be missed in the Senate.
During the frightening years in the 1970s when what the Communists called the “correlation of forces” seemed to be shifting in their favor and when the Vietnam debacle had sapped American self-confidence, Moynihan coined the phrase “the freedom party” to describe the dwindling contingent who held to the conviction that the fight against Communism was tantamount to the defense of civilization itself. Scoop Jackson was the undisputed leader of the freedom party. How had he come to that role? In other words, what made Scoop, Scoop?
It had taken the confrontation with Nazism to teach him the full evil of predatory totalitarianism. He absorbed the lesson deeply, and when he saw that the Soviet regime was of cognate character, he became its most implacable opponent. Most storied figures in the battle against Communism were intellectuals, former Communists such as Whittaker Chambers or veterans of other precincts of the left, such as George Orwell. Scoop was neither a former leftist, unless one counts his inconsequential college membership in the League for Industrial Democracy, nor an intellectual. He was, however, more intellectually serious than most politicians.
He used the Subcommittee on National Security and International Operations, which he chaired, to convene hearings whose purpose was not to prepare legislation but to give an official platform to some of the leading scholars on Soviet affairs or related international issues, such as Richard Pipes, James Schlesinger, Robert Conquest, Suzanne Massie, Bernard Lewis, James Billington, and Albert Wohlstetter. He would also fly to London periodically for a few days devoted to briefings from such Britain-based scholars as Conquest, Malcolm Mackintosh, Leonard Schapiro, Leopold Labedz, R.V. Jones, and Walter Laqueur.
In addition to this intellectual side, the other thing that separated Scoop from the pack of politicians was his courage. Richard Perle quotes Scoop’s private comment on Moynihan’s defection from the CDM camp to become a conventional new-liberal Democrat: “The trouble with Pat is that he is afraid of the New York Times.” Perle adds: “Scoop just didn’t care what the New York Times had to say.” This is probably an exaggeration, but Scoop didn’t care enough to trim his sails—which amounts to the same thing. This combination of guts and gravitas has rarely graced the Senate. With the departure of Lieberman, it may, alas, be years before the like of Scoop Jackson appears in that chamber again.
1. Had we “listened to some of the things that Henry Wallace said,” wrote McGovern, “we might have avoided Korea [!] and the Vietnam War.”
‘Scoop’ Jackson at One Hundred
Must-Reads from Magazine
Terror is a choice.
Ari Fuld described himself on Twitter as a marketer and social media consultant “when not defending Israel by exposing the lies and strengthening the truth.” On Sunday, a Palestinian terrorist stabbed Fuld at a shopping mall in Gush Etzion, a settlement south of Jerusalem. The Queens-born father of four died from his wounds, but not before he chased down his assailant and neutralized the threat to other civilians. Fuld thus gave the full measure of devotion to the Jewish people he loved. He was 45.
The episode is a grim reminder of the wisdom and essential justice of the Trump administration’s tough stance on the Palestinians.
Start with the Taylor Force Act. The act, named for another U.S. citizen felled by Palestinian terror, stanched the flow of American taxpayer funds to the Palestinian Authority’s civilian programs. Though it is small consolation to Fuld’s family, Americans can breathe a sigh of relief that they are no longer underwriting the PA slush fund used to pay stipends to the family members of dead, imprisoned, or injured terrorists, like the one who murdered Ari Fuld.
No principle of justice or sound statesmanship requires Washington to spend $200 million—the amount of PA aid funding slashed by the Trump administration last month—on an agency that financially induces the Palestinian people to commit acts of terror. The PA’s terrorism-incentive budget—“pay-to-slay,” as Douglas Feith called it—ranges from $50 million to $350 million annually. Footing even a fraction of that bill is tantamount to the American government subsidizing terrorism against its citizens.
If we don’t pay the Palestinians, the main line of reasoning runs, frustration will lead them to commit still more and bloodier acts of terror. But U.S. assistance to the PA dates to the PA’s founding in the Oslo Accords, and Palestinian terrorists have shed American and Israeli blood through all the years since then. What does it say about Palestinian leaders that they would unleash more terror unless we cross their palms with silver?
President Trump likewise deserves praise for booting Palestinian diplomats from U.S. soil. This past weekend, the State Department revoked a visa for Husam Zomlot, the highest-ranking Palestinian official in Washington. The State Department cited the Palestinians’ years-long refusal to sit down for peace talks with Israel. The better reason for expelling them is that the label “envoy” sits uneasily next to the names of Palestinian officials, given the links between the Palestine Liberation Organization, President Mahmoud Abbas’s Fatah faction, and various armed terrorist groups.
Fatah, for example, praised the Fuld murder. As the Jerusalem Post reported, the “al-Aqsa Martyrs Brigades, the military wing of Fatah . . . welcomed the attack, stressing the necessity of resistance ‘against settlements, Judaization of the land, and occupation crimes.’” It is up to Palestinian leaders to decide whether they want to be terrorists or statesmen. Pretending that they can be both at once was the height of Western folly, as Ari Fuld no doubt recognized.
May his memory be a blessing.
The end of the water's edge.
It was the blatant subversion of the president’s sole authority to conduct American foreign policy, and the political class received it with fury. It was called “mutinous,” and the conspirators were deemed “traitors” to the Republic. Those who thought “sedition” went too far were still incensed over the breach of protocol and the reckless way in which the president’s mandate was undermined. Yes, times have certainly changed since 2015, when a series of Republican senators signed a letter warning Iran’s theocratic government that the Joint Comprehensive Plan of Action (aka, the Iran nuclear deal) was built on a foundation of sand.
The outrage that was heaped upon Senate Republicans for freelancing on foreign policy in the final years of Barack Obama’s administration has not been visited upon former Secretary of State John Kerry, though he arguably deserves it. In the publicity tour for his recently published memoir, Kerry confessed to conducting meetings with Iranian Foreign Minister Javad Zarif “three or four times” as a private citizen. When asked by Fox News Channel’s Dana Perino if Kerry had advised his Iranian interlocutor to “wait out” the Trump administration to get a better set of terms from the president’s successor, Kerry did not deny the charge. “I think everybody in the world is sitting around talking about waiting out President Trump,” he said.
Think about that. This is a former secretary of state who all but confirmed that he is actively conducting what the Boston Globe described in May as “shadow diplomacy” designed to preserve not just the Iran deal but all the associated economic relief and security guarantees it provided Tehran. The abrogation of that deal has put new pressure on the Iranians to liberalize domestically, withdraw their support for terrorism, and abandon their provocative weapons development programs—pressures that the deal’s proponents once supported.
“We’ve got Iran on the ropes now,” said former Democratic Sen. Joe Lieberman, “and a meeting between John Kerry and the Iranian foreign minister really sends a message to them that somebody in America who’s important may be trying to revive them and let them wait and be stronger against what the administration is trying to do.” This is absolutely correct because the threat Iran poses to American national security and geopolitical stability is not limited to its nuclear program. The Iranian threat will not be neutralized until it abandons its support for terror and the repression of its people, and that will not end until the Iranian regime is no more.
While Kerry’s decision to hold a variety of meetings with a representative of a nation hostile to U.S. interests is surely careless and unhelpful, it is not uncommon. During his 1984 campaign for the presidency, Jesse Jackson visited the Soviet Union and Cuba to raise his own public profile and lend credence to Democratic claims that Ronald Reagan’s confrontational foreign policy was unproductive. House Speaker Jim Wright’s trip to Nicaragua to meet with the Sandinista government was a direct repudiation of the Reagan administration’s support for the country’s anti-Communist rebels. In 2007, as Bashar al-Assad’s government was providing material support for the insurgency in Iraq, House Speaker Nancy Pelosi sojourned to Damascus to shower the genocidal dictator in good publicity. “The road to Damascus is a road to peace,” Pelosi insisted. “Unfortunately,” replied George W. Bush’s National Security Council spokesman, “that road is lined with the victims of Hamas and Hezbollah, the victims of terrorists who cross from Syria into Iraq.”
Honest observers must reluctantly conclude that the adage is wrong. American politics does not, in fact, stop at the water’s edge. It never has, and maybe it shouldn’t. Though it may be commonplace, American political actors who contradict the president in the conduct of their own foreign policy should be judged on the policies they are advocating. In the case of Iran, those who seek to convince the mullahs and their representatives that repressive theocracy and a terroristic foreign policy are dead-ends are advancing the interests not just of the United States but of all mankind. Those who provide this hopelessly backward autocracy with the hope that America’s resolve is fleeting are, as John Kerry might say, on “the wrong side of history.”
Michael Wolff is the Marquis de Sade of Resistance Porn. Released on January 5, 2018, Wolff’s Fire and Fury became a template for authors eager to satiate the growing demand for unverified stories of Trump at his worst. Wolff filled his pages with tales of the president’s ignorant rants, his raging emotions, his television addiction, his fast-food diet, his unfamiliarity with and contempt for Beltway conventions and manners. Wolff made shocking insinuations about Trump’s mental state, not to mention his relationship with UN ambassador Nikki Haley. Wolff’s Trump is nothing more than a knave, dunce, and commedia dell’arte villain. The hero of his saga is, bizarrely, Steve Bannon, who in Wolff’s telling recognized Trump’s inadequacies, manipulated him to advance a nationalist-populist agenda, and tried to block his worst impulses.
Wolff’s sources are anonymous. That did not deter the press from calling his accusations “mind-blowing” (Mashable.com), “wild” (Variety), and “bizarre” (Entertainment Weekly). Unlike most pornographers, he had a lesson in mind. He wanted to demonstrate Trump’s unfitness for office. “The story that I’ve told seems to present this presidency in such a way that it says that he can’t do this job, the emperor has no clothes,” Wolff told the BBC. “And suddenly everywhere people are going, ‘Oh, my God, it’s true—he has no clothes.’ That’s the background to the perception and the understanding that will finally end this, that will end this presidency.”
Nothing excites the Resistance more than the prospect of Trump leaving office before the end of his term. Hence the most stirring examples of Resistance Porn take the president’s all-too-real weaknesses and eccentricities and imbue them with apocalyptic significance. In what would become the standard response to accusations of Trumpian perfidy, reviewers of Fire and Fury were less interested in the truth of Wolff’s assertions than in the fact that his argument confirmed their preexisting biases.
Saying he agreed with President Trump that the book is “fiction,” the Guardian’s critic didn’t “doubt its overall veracity.” It was, he said, “what Mailer and Capote once called a nonfiction novel.” Writing in the Atlantic, Adam Kirsch asked: “No wonder, then, Wolff has written a self-conscious, untrustworthy, postmodern White House book. How else, he might argue, can you write about a group as self-conscious, untrustworthy, and postmodern as this crew?” Complaining in the New Yorker, Masha Gessen said Wolff broke no new ground: “Everybody” knew that the “president of the United States is a deranged liar who surrounded himself with sycophants. He is also functionally illiterate and intellectually unsound.” Remind me never to get on Gessen’s bad side.
What Fire and Fury lacked in journalistic ethics, it made up in receipts. By the third week of its release, Wolff’s book had sold more than 1.7 million copies. His talent for spinning second- and third-hand accounts of the president’s oddity and depravity into bestselling prose was unmistakable. Imitators were sure to follow, especially after Wolff alienated himself from the mainstream media by defending his innuendos about Haley.
It was during the first week of September that Resistance Porn became a competitive industry. On the afternoon of September 4, the first tidbits from Bob Woodward’s Fear appeared in the Washington Post, along with a recording of an 11-minute phone call between Trump and the white knight of Watergate. The opposition began panting soon after. Woodward, who like Wolff relies on anonymous sources, “paints a harrowing portrait” of the Trump White House, reported the Post.
No one looks good in Woodward’s telling other than former economics adviser Gary Cohn and—again bizarrely—the former White House staff secretary who was forced to resign after his two ex-wives accused him of domestic violence. The depiction of chaos, backstabbing, and mutual contempt between the president and high-level advisers who don’t much care for either his agenda or his personality was not so different from Wolff’s. What gave it added heft was Woodward’s status, his inviolable reputation.
“Nothing in Bob Woodward’s sober and grainy new book…is especially surprising,” wrote Dwight Garner at the New York Times. That was the point. The audience for Wolff and Woodward does not want to be surprised. Fear is not a book that will change minds. Nor is it intended to be. “Bob Woodward’s peek behind the Trump curtain is 100 percent as terrifying as we feared,” read a CNN headline. “President Trump is unfit for office. Bob Woodward’s ‘Fear’ confirms it,” read an op-ed headline in the Post. “There’s Always a New Low for the Trump White House,” said the Atlantic. “Amazingly,” wrote Susan Glasser in the New Yorker, “it is no longer big news when the occupant of the Oval Office is shown to be callous, ignorant, nasty, and untruthful.” How could it be, when the press has emphasized nothing but these aspects of Trump for the last three years?
The popular fixation with Trump the man, and with the turbulence, mania, frenzy, confusion, silliness, and unpredictability that have surrounded him for decades, serves two functions. It inoculates the press from having to engage in serious research into the causes of Trump’s success in business, entertainment, and politics, and into the crises of borders, opioids, stagnation, and conformity of opinion that occasioned his rise. Resistance Porn also endows Trump’s critics, both external and internal, with world-historical importance. No longer are they merely journalists, wonks, pundits, and activists sniping at a most unlikely president. They are politically correct versions of Charles Martel, the last line of defense preventing Trump the barbarian from enacting the policies on which he campaigned and was elected.
How closely their sensational claims and inflated self-conceptions track with reality is largely beside the point. When the New York Times published the op-ed “I am Part of the Resistance Inside the Trump Administration,” by an anonymous “senior official” on September 5, few readers bothered to care that the piece contained no original material. The author turned policy disagreements over trade and national security into a psychiatric diagnosis. In what can only be described as a journalistic innovation, the author dispensed with middlemen such as Wolff and Woodward, providing the Times the longest background quote in American history. That the author’s identity remains a secret only adds to its prurient appeal.
“The bigger concern,” the author wrote, “is not what Mr. Trump has done to the presidency but what we as a nation have allowed him to do to us.” Speak for yourself, bud. What President Trump has done to the Resistance is driven it batty. He’s made an untold number of people willing to entertain conspiracy theories, and to believe rumor is fact, hyperbole is truth, self-interested portrayals are incontrovertible evidence, credulity is virtue, and betrayal is fidelity—so long as all of this is done to stop that man in the White House.
Review of ‘Stanley Kubrick’ by Nathan Abrams
Except for Stanley Donen, every director I have worked with has been prone to the idea, first propounded in the 1950s by François Truffaut and his tendentious chums in Cahiers du Cinéma, that directors alone are authors, screenwriters merely contingent. In singular cases—Orson Welles, Michelangelo Antonioni, Woody Allen, Kubrick himself—the claim can be valid, though all of them had recourse, regular or occasional, to helping hands to spice their confections.
Kubrick’s variety of topics, themes, and periods testifies both to his curiosity and to his determination to “make it new.” Because his grades were not high enough (except in physics), this son of a Bronx doctor could not get into colleges crammed with returning GIs. The nearest he came to higher education was when he slipped into accessible lectures at Columbia. He told me, when discussing the possibility of a movie about Julius Caesar, that the great classicist Moses Hadas made a particularly strong impression.
While others were studying for degrees, solitary Stanley was out shooting photographs (sometimes with a hidden camera) for Look magazine. As a movie director, he often insisted on take after take. This gave him choices of the kind available on the still photographer’s contact sheets. Only Peter Sellers and Jack Nicholson had the nerve, and irreplaceable talent, to tell him, ahead of shooting, that they could not do a particular scene more than two or three times. The energy to electrify “Mein Führer, I can walk” and “Here’s Johnny!” could not recur indefinitely. For everyone else, “Can you do it again?” was the exhausting demand, and it could come close to being sadistic.
The same method could be applied to writers. Kubrick might recognize what he wanted when it was served up to him, but he could never articulate, ahead of time, even roughly what it was. Picking and choosing was very much his style. Cogitation and opportunism went together: The story goes that he attached Strauss’s Blue Danube to the opening sequence of 2001 because it happened to be playing in the sound studio when he came to dub the music. Genius puts chance to work.
Until academics intruded lofty criteria into cinema/film, the better to dignify their speciality, Alfred Hitchcock’s attitude covered most cases: When Ingrid Bergman asked for her motivation in walking to the window, Hitch replied, fatly, “Your salary.” On another occasion, told that some scene was not plausible, Hitch said, “It’s only a movie.” He did not take himself seriously until the Cahiers du Cinéma crowd elected to make him iconic. At dinner, I once asked Marcello Mastroianni why he was so willing to play losers or clowns. Marcello said, “Beh, cinema non è gran cosa” (cinema is no big deal). Orson Welles called movie-making the ultimate model-train set.
That was then; now we have “film studies.” After they moved in, academics were determined that their subject be a very big deal indeed. Comedy became no laughing matter. In his monotonous new book, the film scholar Nathan Abrams would have it that Stanley Kubrick was, in essence, a “New York Jewish intellectual.” Abrams affects to unlock what Stanley was “really” dealing with, in all his movies, never mind their apparent diversity. It is declared to be, yes, Yiddishkeit, and in particular, the Holocaust. This ground has been tilled before by Geoffrey Cocks, when he argued that the room numbers in the empty Overlook Hotel in The Shining encrypted references to the Final Solution. Abrams would have it that even Barry Lyndon is really all about the outsider seeking, and failing, to make his awkward way in (Gentile) Society. On this reading, Ryan O’Neal is seen as Hannah Arendt’s pariah in 18th-century drag. The movie’s other characters are all engaged in the enjoyment of “goyim-naches,” an expression—like menschlichkayit—he repeats ad nauseam, lest we fail to get the stretched point.
Theory is all when it comes to the apotheosis of our Jew-ridden Übermensch. So what if, in order to make a topic his own, Kubrick found it useful to translate its logic into terms familiar to him from his New York youth? In Abrams’s scheme, other mundane biographical facts count for little. No mention is made of Stanley’s displeasure when his 14-year-old daughter took a fancy to O’Neal. The latter was punished, some sources say, by having Barry’s voiceover converted from first person so that Michael Hordern would displace the star as narrator. The change, by lending dispassionate irony to the narrative, proved a pettish fluke of genius.
While conning Abrams’s volume, I discovered, not greatly to my chagrin, that I am the sole villain of the piece. Abrams calls me “self-serving” and “unreliable” in my accounts of my working and personal relationship with Stanley. He insinuates that I had less to do with Eyes Wide Shut than I pretend and that Stanley regretted my involvement. It is hard for him to deny (but convenient to omit) that, after trying for some 30 years to get a succession of writers to “crack” how to do Schnitzler’s Traumnovelle, Kubrick greeted my first draft with “I’m absolutely thrilled.” A source whose anonymity I respect told me that he had never seen Stanley so happy since the day he received his first royalty check (for $5 million) for 2001. No matter.
Were Abrams (the author also of a book as hostile to Commentary as this one is to me) able to put aside his waxed wrath, he might have quoted what I reported in my memoir Eyes Wide Open to support his Jewish-intellectual thesis. One day, Stanley asked me what a couple of hospital doctors, walking away with their backs to the camera, would be talking about. We were never going to hear or care what it was, but Stanley—at that early stage of development—said he wanted to know everything. I said, “Women, golf, the stock market, you know…”
“Couple of Gentiles, right?”
“That’s what you said you wanted them to be.”
“Those people, how do we ever know what they’re talking about when they’re alone together?”
“Come on, Stanley, haven’t you overheard them in trains and planes and places?”
Kubrick said, “Sure, but…they always know you’re there.”
If he was even halfway serious, Abrams’s banal thesis that, despite decades of living in England, Stanley never escaped the Old Country might have been given some ballast.
Now, as for Stanley Kubrick’s being an “intellectual.” If this implies membership in some literary or quasi-philosophical elite, there’s a Jewish joke to dispense with it. It’s the one about the man who makes a fortune, buys himself a fancy yacht, and invites his mother to come and see it. He greets her on the gangway in full nautical rig. She says, “What’s with the gold braid already?”
“Mama, you have to realize, I’m a captain now.”
She says, “By you, you’re a captain, by me, you’re a captain, but by a captain, are you a captain?”
As New York intellectuals all used to know, Karl Popper’s definition of bad science, and bad faith, involves positing a theory and then selecting only whatever data help to furnish its validity. The honest scholar makes it a matter of principle to seek out elements that might render his thesis questionable.
Abrams seeks to enroll Lolita in his obsessive Jewish-intellectual scheme by referring to Peter Arno, a New Yorker cartoonist whom Kubrick photographed in 1949. The caption attached to Kubrick’s photograph in Look asserted that Arno liked to date “fresh, unspoiled girls,” and Abrams says this “hint[s] at Humbert Humbert in Lolita.” Ah, but Lolita was published, in Paris, in 1955, six years later. And how likely is it, in any case, that Kubrick wrote the caption?
The film of Lolita is unusual for its garrulity. Abrams’s insistence on the sinister Semitic aspect of both Clare Quilty and Humbert Humbert supposedly drawing Kubrick like moth to flame is a ridiculous camouflage of the commercial opportunism that led Stanley to seek to film the most notorious novel of the day, while fudging its scandalous eroticism.
That said, in my view, The Killing, Paths of Glory, Barry Lyndon, and Clockwork Orange were and are sans pareil. The great French poet Paul Valéry wrote of “the profundity of the surface” of a work of art. Add D.H. Lawrence’s “never trust the teller, trust the tale,” and you have two authoritative reasons for looking at or reading original works of art yourself and not relying on academic exegetes—especially when they write in the solemn, sometimes ungrammatical style of Professor Abrams, who takes time out to tell those of us at the back of his class that padre “is derived from the Latin pater.”
Abrams writes that I “claim” that I was told to exclude all overt reference to Jews in my Eyes Wide Shut screenplay, with the fatuous implication that I am lying. I am again accused of “claiming” to have given the name Ziegler to the character played by Sydney Pollack, because I once had a (quite famous) Hollywood agent called Evarts Ziegler. So I did. The principal reason for Abrams to doubt my veracity is that my having chosen the name renders irrelevant his subsequent fanciful digression on the deep, deep meanings of the name Ziegler in Jewish lore; hence he wishes to assign the naming to Kubrick. Pop goes another wished-for proof of Stanley’s deep and scholarly obsession with Yiddishkeit.
Abrams would be a more formidable enemy if he could turn a single witty phrase or even abstain from what Karl Kraus called mauscheln, the giveaway jargon of Jewish journalists straining to pass for sophisticates at home in Gentile circles. If you choose, you can apply, online, for screenwriting lessons from Nathan Abrams, who does not have a single cinematic credit to his name. It would be cheaper, and wiser, to look again, and then again, at Kubrick’s masterpieces.
Is American opera in terminal condition?
At the Met, distinguished singers and conductors, mostly born and trained in Europe, appeared in theatrically conservative big-budget productions of the popular operas of the 19th century, with a sprinkling of pre-romantic and modern works thrown in to leaven the loaf. City Opera, by contrast, presented younger artists—many, like Beverly Sills, born in this country—in a wider-ranging, more adventurously staged repertoire that often included new operas, some of them written by American composers, to which the public was admitted at what were then called “popular prices.”
Between them, the companies represented a feast for culture-consuming New Yorkers, though complaints were already being heard that their new theaters were too big. Moreover, neither the Met nor City Opera was having any luck at commissioning memorable new operas and thereby expanding and refreshing the operatic repertoire, to which only a handful of significant new works—none of them, then or since, premiered by either company—had been added since World War I.
A half-century later, the feast has turned to famine. In 2011, New York City Opera left Lincoln Center, declaring bankruptcy. It closed its doors forever two years later. The Met has weathered a nearly uninterrupted string of crises that climaxed earlier this year with the firing of James Levine, the company’s once-celebrated music director emeritus. He was accused in 2017 of molesting teenage musicians and was dismissed from all of his conducting posts in New York and elsewhere. Today the Met is in dire financial straits that threaten its long-term survival.
And while newer opera companies in such other American cities as Chicago, Houston, San Francisco, Santa Fe, and Seattle now offer alternative models of leadership, none has established itself as a potential successor either to the Met or to the now-defunct NYCO.1
Is American opera as a whole in a terminal condition? Or are the collapse of the New York City Opera and the Met’s ongoing struggle to survive purely local matters of no relevance elsewhere? Heidi Waleson addresses these questions in Mad Scenes and Exit Arias: The Death of the New York City Opera and the Future of Opera in America.2 Waleson draws on her experience as the opera critic of the Wall Street Journal to speculate on the prospects for an art form that has never quite managed to set down firm roots in American culture.
In this richly informative chronicle of NYCO’s decline and fall, Waleson persuasively argues that what happened to City Opera (and, by extension, the Met) could happen to other opera companies as well. The days in which an ambitious community sought successfully to elevate itself into the first rank of world cities by building and manning an opera house are long past, and Mad Scenes and Exit Arias helps us understand why.

As Waleson reminds us, it was Fiorello LaGuardia, the New York mayor who played a central role in the creation of the NYCO, who dubbed the company “the people’s opera” when it was founded in 1943. According to LaGuardia, NYCO existed to perform popular operas at popular prices for a mass audience. In later years, it moved away from that goal, but the slogan stuck. Indeed, no opera company has ever formulated a clearer statement of its institutional mission.
Even after it moved to Lincoln Center in 1966, NYCO had an equally coherent and similarly appealing purpose: It was where you went to see the opera stars of tomorrow, foremost among them Sills and Plácido Domingo, in inexpensively but imaginatively staged productions of the classics. The company went out of its way to present modern operas, too, but it never did so at the expense of its central repertoire—and tickets to its performances cost half of what the Met charged. Well into the 21st century, City Opera stuck more or less closely to its redefined mission. Under Paul Kellogg, the general and artistic director from 1996 to 2007, it did so with consistent artistic success. But revenues declined throughout the latter part of Kellogg’s tenure, in part because younger New Yorkers were unwilling to become subscribers.
In those days, the Metropolitan Opera, NYCO’s next-door neighbor, was still one of the world’s most conservative opera houses. That changed when Peter Gelb became its general manager in 2006. Gelb was resolved to modernize the Met’s productions and, to a lesser extent, its repertoire, and he simultaneously sought to heighten its national profile by digitally simulcasting live performances into movie theaters throughout America.
Kellogg was frustrated by the chronic acoustic inadequacies of the New York State Theater and sought in vain to move City Opera to a three-theater complex that was to be built (but never was) on the World Trade Center site. He retired soon after Gelb came to the Met. Kellogg was succeeded by Gérard Mortier, a European impresario who was accustomed to working in state-subsidized theaters. Mortier made a pair of fateful decisions. First, he canceled City Opera’s entire 2008–2009 season while the interior of the State Theater underwent much-needed renovations. Then he announced a follow-up season of 20th-century operas that lacked audience appeal.
That follow-up season never happened, because Mortier resigned in 2008 and fled New York. He was replaced by George Steel, who had previously served for just three months as general manager of the Dallas Opera. Under Steel, NYCO slashed its schedule to ribbons in a futile attempt to get back on its financial feet after Mortier’s financially ruinous year-long hiatus. Steel then mounted a series of productions of nonstandard repertory that received mixed reviews and flopped at the box office.
The combined effect of Gelb’s innovations and the inept leadership of Mortier and Steel all but obliterated City Opera’s reason for existing. Under Gelb, the Met’s repertory ranged from such warhorses as Rigoletto and Tosca to 20th-century masterpieces like Benjamin Britten’s A Midsummer Night’s Dream and Alban Berg’s Wozzeck, and tickets could be bought for as little as $20. With the Met performing a more interesting repertoire under a wider range of directors, and in part at “people’s prices,” City Opera no longer did anything that the Met wasn’t already doing on a far larger and better-financed scale. What, then, was its mission now? The truth was that it had none, and when the company went under in 2013, few mourned its passing.
As it happened, Gelb’s own innovations were a mere artistic Band-aid, for he was unwilling or unable to trim the Met’s bloated budget to any meaningful extent. He made no serious attempt to cut the company’s labor costs until a budget crisis in 2014 forced him to confront its unions, which he did with limited success. In addition, his new productions of the standard-repertory operas on which the Met relied to draw and hold older subscribers were felt by many to be trashily trendy.
The Met has had particular difficulty adjusting to the reduced circumstances of 21st-century opera. Its 3,800-seat theater has an 80-foot-deep stage with a proscenium opening that measures 54 feet on each side. (Bayreuth, by contrast, seats 1,925, La Scala 2,030, and the Vienna State Opera 2,200.) As a result, it is all but impossible to mount low-to-medium-budget shows in the Metropolitan Opera House, even as the company finds itself increasingly unable to fill the house. Two decades ago, the Met earned 90 percent of its potential box-office revenue. That figure plummeted to 66 percent by 2015, forcing Gelb to raise ticket prices to an average of $158.50 per head. On Broadway, the average price of a ticket that season was $103.86.
Above all, Gelb was swimming against the cultural tide. Asked about the effects on audience development of the Met simulcasts, he admitted that three-quarters of the people who attended them were “over 65, and 30 percent of them are over 75.” As he explained: “Grand opera is in itself a kind of a dinosaur of an art form…. The question is not whether I think I’m doing a good job or not in trying to keep the [Metropolitan Opera] alive. It’s whether I’m doing a good job or not in the face of a cultural and social rejection of opera as an art form. And what I’m doing is fighting an uphill battle to try and maintain an audience in a very difficult time.”
Was that statement buck-passing defeatism, or a fair appraisal of the state of American opera? Other opera executives distanced themselves from Gelb’s remarks, and it was true—and still is—that smaller American companies have done a somewhat better job of attracting younger audiences than the top-heavy Met. But according to the National Endowment for the Arts, the percentage of U.S. adults who attend at least one operatic performance each year declined from 3.2 percent in 2002 to 2.1 percent in 2012. This problem, of course, is not limited to opera. As I wrote in these pages in 2010, the disappearance of secondary-school arts education and the rise of digital media may well be leading to “not merely a decline in public interest in the fine arts but the death of the live audience as a cultural phenomenon.”3

Does American opera have a future in an era of what Heidi Waleson succinctly describes as “flat ticket income and rising expenses”? In the last chapter of Mad Scenes and Exit Arias, she chronicles the activities of a group of innovative smaller troupes that are “rethinking what an opera company is, what it does, and who it serves.” Yet in the same breath, she acknowledges the possibility that “filling a giant theater for multiple productions of grand operas [is] no longer an achievable goal.”
If that is so, then it may be worth asking a different question: Did American opera ever have a past? It is true that opera in America has had a great and glorious history, but virtually the whole of that history consisted of American productions of 18th- and 19th-century European operas. By contrast, no opera by an American classical composer has ever entered the international major-house repertoire. Indeed, while new American operas are still commissioned and premiered at an impressive rate, few things are so rare as a second production of any of these works.
While a handful continue to be performed—John Adams’s Nixon in China (1987), André Previn’s A Streetcar Named Desire (1995), Mark Adamo’s Little Women (1998), and Jake Heggie’s Dead Man Walking (2000)—their success is a tribute to the familiarity of their subject matter and source material, not their musico-theatrical quality. As for the rest, the hard but inescapable truth is that with the exception of George Gershwin’s Porgy and Bess (1935), virtually all large-scale American operas have been purpose-written novelties that were shelved and forgotten immediately after their premieres.
The success of Porgy and Bess, which received its premiere not in an opera house but on Broadway, reminds us that American musical comedy, unlike American opera, is deeply rooted in our national culture, in much the same way that grand opera is no less deeply rooted in the national cultures of Germany and Italy, where it is still genuinely popular (if less so today than a half-century ago). By comparison with Porgy, Carousel, Guys and Dolls, or My Fair Lady, American opera as a homegrown form simply does not exist: It is merely an obscure offshoot of its European counterpart. Aaron Copland, America’s greatest composer, was not really joking when he wittily described opera as “la forme fatale,” and his own failed attempts to compose an audience-friendly opera that would be as successful as his folk-flavored ballet scores say much about the difficulties facing any composer who seeks to follow in his footsteps.
It is not that grand opera is incapable of appealing to American theatergoers. Even now, there are many Americans who love it passionately, just as there are regional companies such as Chicago’s Lyric Opera and San Francisco Opera that have avoided making the mistakes that closed City Opera’s doors. Yet the crises from which the Metropolitan Opera has so far failed to extricate itself suggest that in the absence of the generous state subsidies that keep European opera houses in business, large-house grand opera in America may simply be too expensive to thrive—or, ultimately, to survive. At its best, no art form is more thrilling or seductive. But none is at greater risk of following the dinosaurs down the cold road to extinction.
1 The “New York City Opera” founded in 2016 that now mounts operas in various New York theaters on an ad hoc basis is a brand-new enterprise that has no connection with its predecessor.
2 Metropolitan Books, 304 pages