Everybody hates it. And it’s not going anywhere.
In May, President Barack Obama responded to critics of his counterterrorism policies by declaring that he was ready to work with Congress “to refine, and ultimately repeal” the authorization for the use of military force that served as the legal foundation and operational genesis of the war on terror. “Our systematic effort to dismantle terrorist organizations must continue,” he said. “But this war, like all wars, must end. That’s what history advises. That’s what our democracy demands.”
The muted and skeptical reaction to the president’s speech cast doubt on the implication that this war is, in any crucial way, “like all wars.” The next day, Time asked, “Can Obama End the War on Terror?” CBS News headlined its story “Should President Obama end the war on terror?” The questions lingered and piled up; a week after the speech, the Nation chimed in: “Will Obama End the Long War on Terror?” Can? Should? Will? Suddenly, the president who claims to have been “elected to end wars, not start them” appeared powerless to do anything but perpetuate this one.
What explains the resilience and persistence of this particular war? It isn’t lack of a suitably peace-minded commander in chief, for Obama is a wartime president who longs to be a peacetime president. It isn’t bureaucratic inertia or a recalcitrant Congress, for they didn’t stop the president’s quest to end the wars in Iraq and Afghanistan. And it isn’t lack of a scandal sparking public outrage, for the recent revelations about the National Security Agency’s prying have left Americans wondering if the government is reading their emails.
That the war on terror has been criticized from a variety of political angles obscures its key fact: Through its success and the underappreciated conceptual wisdom of its design, the war on terror has quietly forged a coalition that spans the ideological spectrum. In the end, this elastic term that Western politicians have time and again resolved to stop using has become the durable bipartisan consensus underpinning security in the 21st century.
How did this happen? To answer that question, we must discard the conventional narratives and examine the surprising political and intellectual alliance that has sustained the fight against terrorism, just as that alliance faces a newly energized coalition against it.
The War on the Term ‘War on Terror’
A ubiquitous complaint about the war on terror from the beginning has been its name. The common refrain, employed by those on the right as well as the left, is that terrorism is a tactic, not an enemy. Obama himself has said this. After his victory in the 2004 Senate election in Illinois, Obama responded to a question about terrorism by saying: “Ultimately, terrorism is a tactic. It’s not—we’re not fighting terrorists; we’re fighting people who engage in terrorism.”
While it’s certainly true that terrorism is a tactic, Obama’s statement seemed to deny the existence of the entire category of “terrorist” in favor of “people who engage in terrorism.” If that seems confused, it’s nothing compared with trying to build national policy around such word games, as Obama discovered when he became president. In 2010, the president sent top counterterrorism adviser John Brennan (now director of the CIA) to give a speech on the eve of the White House’s unveiling of its National Security Strategy. It turned out there were several terms the president wanted deleted from the political lexicon. Brennan said:
The president’s strategy is absolutely clear about the threat we face. Our enemy is not “terrorism” because terrorism is but a tactic. Our enemy is not “terror” because terror is a state of mind and as Americans we refuse to live in fear. Nor do we describe our enemy as “jihadists” or “Islamists” because jihad is a holy struggle, a legitimate tenet of Islam, meaning to purify oneself or one’s community, and there is nothing holy or legitimate or Islamic about murdering innocent men, women, and children….
Instead, the president’s strategy is clear and precise. Our enemy is al-Qaeda and its terrorist affiliates. For it was al-Qaeda who attacked us so viciously on 9/11 and whose desire to attack the United States, our allies, and our partners remains undiminished. And…it is its affiliates who have taken up al-Qaeda’s call to arms against the United States in other parts of the world.
The knock on “war on terror” is that it is too broad. But Brennan’s garbled address is a perfect example of what can happen when the attempt is made to narrow the target. It is logical to object that terrorism is “just” a tactic. It’s also correct for Brennan to say that “terror” is a state of mind, and the commander in chief cannot plausibly order the armed forces to pursue and rid the world of a state of mind. So who or what, then, is the enemy?
Contra Senator-elect Obama circa 2004, Brennan identifies the enemy as terrorists—but not all terrorists, just those affiliated with al-Qaeda. He is opposed to the terms “terrorism” and “terror.” But by drawing a distinction between al-Qaeda-affiliated terrorists and other terrorists, the administration can’t call it the “war on terrorists” either—even though it is including “terrorists” as a subcategory of the enemy. If they are not “terrorists,” what are they? They are certainly jihadists and usually Islamists. But Brennan objects to both those terms as well.
So Brennan, here speaking as the administration’s chief adviser on the subject, wants to remove all mention of the enemy’s actions (“terrorism”), motivations (“jihad”), and ideology (“Islamism”). This is an absurd approach to national security, and yet it emerged as the official strategy of the American government because, following the old adage, the administration could neither live with the “war on terror” nor without it.
To criticize Brennan for an unwillingness to use certain words, however, is not to suggest that substituting those words for “terrorism” would solve the problem. If the United States were to launch a “war on jihad,” that would be no less broad—and no more edifying—than a “war on terror.” Similarly, there is no strategic clarity to be gained in going to war against Islamism. Not all Islamists are terrorists, and such a posture would put us at war with Islamist states or states run by Islamist ruling parties. A “war on Islamism” would, in addition, be unjustifiably close to a “war on Islam,” something the administration of George W. Bush went to great pains to avoid (and rightly so).
But some critics of the war on terror think Bush was wrong to make that distinction. This is where right and left diverge in their critiques of the war, each in their own destructive way.
The Rightist War on the ‘War on Terror’
In the aftermath of the September 11, 2001, attacks, two events that seemed terrifying and inevitable didn’t materialize. The first was a follow-up mass-casualty terrorist attack on the U.S. homeland. The second was an extensive and violent backlash against America’s sizable Muslim population in a fit of misdirected rage. It was reassuring that such a backlash didn’t happen, but the misdirected rage was there nonetheless.
The 9/11 attacks posed a discomfiting challenge to the tolerance on which liberal democracies pride themselves. The motive and justification for the attacks, and the new age of atrocity they ushered in, rested on a violent, and far too widely accepted, interpretation of Islamic law. And so some sought to inculcate a general suspicion of Islam, and thus of Muslims, in the public consciousness. They were aided in their quest by the undeniable tendency of Muslim extremists to use the institution of the local mosque as an organizational structure to spread their violent ideology.
President Bush anticipated and attempted to counteract this by visiting mosques in the aftermath of the attacks and speaking of Islam as a religion of peace. Bush understood that if the West was going to be at war with devout and stateless Muslims, it must be at pains not to be at war with Islam itself. Some on the right took the other tack, hoping for widespread Islamic reform from within. “I am at war with Islam,” the former Muslim dissident Ayaan Hirsi Ali told an interviewer in 2007, “but I am not at war with Muslims.”
Others, however, wanted to be at war with both, and continue to want this. In June 2012, a handful of congressional Republicans led by Representative Michele Bachmann sent a letter of inquiry to the office of the State Department inspector general. As far as they were concerned, then-Secretary of State Hillary Clinton had been too friendly to the transnational Islamist group known as the Muslim Brotherhood, a high-ranking member of which was poised to win the presidency of Egypt. (The figure, Mohamed Morsi, was ousted in a military coup a year later.)
The letter requested an investigation into the reason for the leniency with which the State Department apparently viewed the Brotherhood. But Bachmann’s group had a theory. According to a briefing book published by the Center for Security Policy, the letter alleged, “the Department’s Deputy Chief of Staff, Huma Abedin, has three family members—her late father, her mother, and her brother—connected to Muslim Brotherhood operatives and/or organizations.” Abedin is a longtime aide to Hillary Clinton and is married to former congressman Anthony Weiner. The unstated accusation was that Abedin was acting as a sleeper agent for the Muslim Brotherhood.
This guilt-by-family-association campaign against Abedin was an extreme case, but it was and is not an isolated phenomenon. Also troubling are the attempts to pass so-called anti-Sharia legislation aimed at curbing the application of Islamic law in American legal disputes. The courts should not, of course, substitute narrow religious doctrine for traditional jurisprudence. Nor are they in danger of doing so. And accommodating religious concerns without undermining the rule of law is something to be embraced, not discouraged.
Nor can defenders claim that the anti-Sharia legislation leaves the traditional sphere of private worship untouched. One prominent anti-Sharia organization, the Society of Americans for National Existence, put together model draft legislation to guide state legislators in banning Sharia. One such draft stated that “it shall be a felony punishable by 20 years in prison to knowingly act in furtherance of, or to support, the adherence to Sharia” and that “the president of the United States of America shall immediately declare that all non-US citizen Sharia-adherents are Alien Enemies under Chapter 3 of Title 50 of the US Code and shall be subject to immediate deportation.”
Such is the reasoning of those who think “the war on terror” a weasel term—and think that the right war, “the war on Islam,” permits the effective outlawing of a faith.
The Leftist War on the ‘War on Terror’
While extreme voices on the right wanted those who simply shared the faith of Islamist terrorists to be treated like criminals, the mainstream left wanted those who prosecuted the war against terrorists to be treated like criminals.
“The thing to do,” posited liberal writer Matthew Yglesias in 2007, then writing for the Atlantic, “is to impeach Bush and Cheney on a dual docket.” Yglesias’s tone was cheerful and nonchalant. He was offering this advice because, he said, the question of whether to impeach George W. Bush “needs to enter the mainstream conversation.”
What did Bush do to earn a purge of the executive branch? A better question is: In the minds of Democrats, what didn’t Bush do? The political scribe Ben Smith noted in 2011 that “more than half of Democrats, according to a neutral survey, said they believed Bush was complicit in the 9/11 terror attacks.” The accusations hurled at Bush while he was in office descended into irrationality and lost their power to shock. But what was surprising was the degree to which the questions and accusations persisted well after Bush left office. In March of this year, CNN ran a segment with the tagline, “Legacy of Iraq War: Should Bush officials be tried for war crimes?”
Far more troubling was the Obama administration’s attitude that it had a responsibility to make that determination. And it would be a mistake to suggest that the purge of the Bush administration would have stopped at the president and vice president. During the Bush administration, career prosecutors at the Justice Department were tapped to investigate cases of possible misconduct at the CIA, which used some of the more controversial methods of interrogation of detainees, such as waterboarding. The attorneys found that one prosecution was warranted and it went forward; they determined that the other prosecutions should not be pursued.
But the Obama team, which strode into office denouncing the moral turpitude of its predecessor, wasn’t so sure. In an unprecedented move, Attorney General Eric Holder ordered the cases reopened in search of a sacrificial lamb—the sort of hyperpartisan prosecutorial stunt that is blessedly rare in healthy democracies. No prosecutions were filed in the end, but the cases dragged on for three years before Holder put the issue to rest.
The detention center at the U.S. Naval base at Guantanamo Bay provided another opportunity to make an example out of somebody from the Bush administration. In her book The Obamas, Jodi Kantor describes a meeting Obama held early in his first term with leading civil-liberties advocates in the hope that a private meeting with the president would placate them enough to not criticize him publicly, as they (and he) did to Bush. According to Kantor, ACLU Executive Director Anthony Romero pushed the president to prosecute somebody—anybody—from the previous White House for the stories of abuse coming out of Gitmo. “Hunt one head and hunt it famously and bring it down to ensure we don’t make the same mistakes again,” he pleaded.
But the new president couldn’t even guarantee closing or emptying the prison. Aside from having to overcome public and congressional opposition, there was the issue of its plain wisdom. According to Kantor, Obama worried that releasing the wrong detainee could result in a terrorist attack against America. “When I was a senator running for office, I talked very firmly about what I thought was right based on the information I had,” Obama was quoted as saying in that meeting. “Now I’m the president of all the people, and the decisions I make have to be from that perspective based on the information I now have.”
This phenomenon, in which the president suddenly embraced the logic of his predecessor’s policies, soon became the norm.
The War on the ‘War on Terror’
Inside Iraq and Afghanistan
Public appetite for the wars in Iraq and Afghanistan has plummeted, and a recent ABC News/Washington Post poll found that only 28 percent of Americans think the Afghanistan war was worth fighting. Politically, however, the two wars have been treated in vastly different ways. Barack Obama built his career in part by denouncing the Iraq war as a “dumb war” cooked up by Bush administration advisers intent on distracting the American public from their domestic policies. The war in Afghanistan, however, was a different story. In 2009, Obama said the Afghanistan campaign “is not a war of choice” but “a war of necessity,” adding: “This is not only a war worth fighting. This is fundamental to the defense of our people.”
Afghanistan was “the good war,” as this mind-set became known. Even über-libertarian Congressman Justin Amash defended the decision to invade Afghanistan. Making sense of this disparity was easy: Those responsible for 9/11 were based in Afghanistan and shielded by the country’s governing Taliban regime. Diverting resources to deposing Saddam Hussein and rebuilding Iraq was effortlessly demagogued as a foolish distraction. The full picture is more complicated.
The common picture of the Iraq war divides the conflict into a conventional ground war between America and Iraq’s armed forces, which ended with the toppling of Saddam’s regime and his army’s swift defeat, and the nation-building era that followed it as the country fell into sectarian civil war fed by the opportunism and terror-tourism of al-Qaeda. In reality, however, the campaign in Iraq was a combination of a conventional ground war—which included the war’s aftermath and occupation—and the war on terror, with both taking place simultaneously.
After the initial military victory over the Iraqi armed forces, the task of rebuilding Iraq was directed on the ground by the Coalition Provisional Authority, led for the first year by L. Paul Bremer III. Bremer took over in May 2003, and within weeks made two significant decisions. His first CPA order initiated the so-called de-Baathification process, in which tens of thousands of members of Saddam Hussein’s Baath Party were removed from their positions throughout Iraq’s fragile civil-society institutions. The second order disbanded the Iraqi army. Iraq’s state institutions were thrown into chaos, and the recently routed soldiers, now unemployed and nursing grievances, were free to settle scores and join the insurgency. This vacuum of power, combined with the thousands of weapons caches hidden around the country, created a security nightmare for coalition forces.
That security nightmare deepened when al-Qaeda joined the fight. Led initially by the Jordanian militant Abu Musab al-Zarqawi, al-Qaeda in Iraq became the focal point for the American war on terror in Iraq. The American invasion did not bring Zarqawi to Iraq; by all accounts he was already there, first in Iraqi Kurdistan as early as 2002 and then involved in establishing terrorist cells in and around Baghdad early the following year. He officially pledged loyalty to Osama bin Laden in 2004. Zarqawi became obsessed with triggering a full-blown civil war between Iraqi Sunnis and Shiites, an effort that reached its pinnacle with the 2006 bombing of the Golden Mosque in Samarra, one of the most important holy sites to Shia Muslims.
Al-Qaeda in Iraq was a foreign body, hostile to Iraq’s native Shiite majority while directed operationally by the Jordanian Zarqawi, loyal to the Saudi Bin Laden, and led spiritually by a Palestinian cleric. It was also, as Peter Bergen, a prolific chronicler of al-Qaeda, has noted, “made up largely of foreigners at its inception,” before recruiting sympathetic Iraqis to its cause. This came into stark relief in 2006 when the Iraqi body politic began expelling the al-Qaeda infection. Fed up with al-Qaeda’s brutal rule over Iraq’s Anbar province, tribal sheikhs were ready to fight back. American forces gave them the protection, firepower, and cooperation essential to turn the tide. The Anbar “Awakening,” as the sheikhs termed it, served as a model for counterinsurgency throughout Iraq after George W. Bush’s troop surge put enough boots on the ground to export the Awakening to other troubled provinces. The clinching factor in restoring order to Iraq was the effort to reform the coalition’s counterinsurgency doctrine led by General David Petraeus, who would later take top command of the theater and implement the successful strategy.
That success had to be carried out against the headwinds of protest from congressional Democrats, notably Barack Obama, and some Republicans, such as current Defense Secretary Chuck Hagel, at the time a Republican senator from Nebraska. But if war critics were mistaken to see perpetual violence as the new natural order in Iraq, supporters of the war effort made the reverse error. They thought victory over Iraqi insurgents and especially al-Qaeda vindicated the Iraq war as an advantageous front on which to fight the war on terror. The criticism that the Iraq war was a reckless distraction from the more noble and just war in Afghanistan isn’t quite right; it would be more accurate to say that many supporters of the Iraq war learned the wrong lesson from the American successes in Afghanistan.
A common defense of the Iraq war is that even when al-Qaeda in Iraq was at its strongest, the group was really laying the ground for its own demise by falling prey to the “flypaper” trap. There is truth to this: As previously noted, AQI was a foreign entity that ramped up its terrorist activity and recruitment in Iraq after the invasion. Killing terrorists in Iraq (and elsewhere) was preferable to playing defense and waiting for them to organize attacks on American soil. Far preferable to creating terrorist “flypaper,” however, is keeping terrorists on the run, because flypaper itself can also serve as a recruitment center. To understand why this is so, it’s instructive to look at what the U.S. accomplished in Afghanistan.
The rise of disconnected and decentralized jihadist operations is a new and troubling threat, but it’s one born of necessity, even desperation, on the part of anti-American holy warriors. As Bergen writes in The Longest War: The Enduring Conflict Between America and al-Qaeda (Free Press, 486 pages):
After the fall of the Taliban, al-Qaeda of necessity had to adopt a flatter structure because the group had been flattened by the American assault on Afghanistan and it would subsequently never resurrect its network of Afghan-era, large-scale training camps that had churned out thousands of graduates every year. But al-Qaeda and its affiliated groups continued to try to build organizational structures from Iraq to Pakistan, as it was those structures that gave them the ability to carry out large-scale operations.
The reason is simple: The Internet can spread hate (and even provide crude bomb-making recipes), and local Islamist organizations can galvanize recruits, but complex terrorist capabilities require comprehensive training. Keeping the enemy dispersed is preferable because, even for potential suicide bombers, there is strength in numbers. For this reason, the flypaper analogy is, at best, a way of making lemonade out of lemons. In addition, those who warned that nation-building in Iraq would be much more difficult than in past experiences were right—but not because of an Arab distaste for democracy. Rather, the war on terror persists because the modern era of transnational terrorism fundamentally alters the character of projects such as nation-building.
Consider the Arab Spring. During the Cold War, part of the reason for the West’s confidence was the presence of myriad inspirational opposition figures threatening authoritarian regimes from within. Long before the Soviet Union had a figure like Mikhail Gorbachev reforming the empire from the top down, it had dissidents working from the ground up, like Aleksandr Solzhenitsyn, Lech Walesa and the Solidarity trade union, Andrei Sakharov and his wife Yelena Bonner, refuseniks such as Natan Sharansky, and dissidents from the world of the arts like famed soprano Galina Vishnevskaya.
Whether or not the Arab Spring has turned to an Arab Winter, as many claim, it’s far from clear the forces of democracy and liberty have any true champion on either side of several of the Middle East’s civil wars. In conflict after conflict, aging and teetering dictators have warned the West with some variation of après moi, le déluge—a sentiment that would have been dismissed as ridiculous on its face coming from, say, Leonid Brezhnev but with which the White House now seems to more or less agree, having promised to arm Syrian rebels attempting to overthrow Bashar al-Assad and then gotten cold feet when officials looked at who might receive those weapons.
And it’s not just the possibility of empowering local terrorists in Syria that has the West sweating out this round of popular rebellion. In February, British Foreign Secretary William Hague went public with concerns that some foreign jihadists joining the Syrian rebellion carried European passports, and if they survived they could return to Europe battle-hardened, trained for holy war, and running with some pretty bad company. By late September, Reuters was reporting that foreign fighters were so plentiful that there were entirely non-Syrian brigades. Where once jihadists flocked to Afghanistan and to Chechnya, they are now, reportedly, flocking from those locales to Syria.
Some of these groups are working with al-Qaeda affiliates and some aren’t. How does that fit into the administration’s paradigm that our “enemy is al-Qaeda and its terrorist affiliates,” strictly speaking? Does the administration mean to say that jihadists coming from Afghanistan—where we are still fighting the “good war”—and joining in alliance with al-Qaeda in Syria, but not joining al-Qaeda de jure, are not our enemy?
Administration officials might respond that there is near unanimity among analysts that the al-Qaeda affiliate in Syria, Jabhat al-Nusra, is the most effective armed group and the clear leader of the opposition to Bashar al-Assad’s forces. This shows, they might argue, that defeating al-Qaeda and its affiliates should be the defined goal of this war.
The problem with this line of reasoning is not only the inexactness and inherent elasticity of the term “affiliate” but also the inconvenient fact that on the other side of the Syrian civil war is another terrorist group that has joined in battle against American troops in a war zone and that is arguably more dangerous than al-Qaeda: Lebanon’s Hezbollah.
Hezbollah was created in the early 1980s in Lebanon in an Iranian attempt to unify Shiite resistance groups. In its early years, it fought the Israeli army’s anti-terror efforts in south Lebanon, and the terrorist group was also behind the 1983 bombing of the U.S. Marine barracks in Beirut, which killed 241 American service members. The Iranian government’s role in creating, funding, and directing Hezbollah has been key to the group’s resilience, though it has been aided greatly by its location: Lebanon is a weakly governed state manifestly incapable of reining in Hezbollah that borders on Hezbollah’s other major state ally, Syria.
Hezbollah has much more recent American blood on its hands, however. As terrorism expert Matthew Levitt documents in his authoritative and immensely important new book Hezbollah: The Global Footprint of Lebanon’s Party of God (Georgetown University Press, 426 pages), the group was involved in lethal attacks on Americans in Iraq as well as training and recruiting anti-coalition forces there. While Hezbollah has not carried out an attack on American soil, the group has, Levitt details, organized and directed extensive criminal activity in America to raise funds. So, the inevitable question: Are we at war with Hezbollah?
The Illogic of the War on the ‘War on Terror’
As the ground underneath the president’s feet has shifted, the conventional wisdom has shifted with it. In May, after Obama delivered his speech calling for an end to the war on terror, the Economist—that bastion of conventional thinking and twice an endorser of Barack Obama—nodded approvingly, noting the president’s “solemnity” and dismissing Republican criticism as inspired not by principle or genuine concern for the world’s security but simply because Republicans were “bent on finding ways to attack a president who they despise.”
Yet on September 28, the Economist was singing a different tune. “The West thought it was winning the battle against jihadist terrorism,” it intoned ominously. “It should think again.” After reviewing the recent gains of terrorist groups, it handed down its judgment: “How much should Western complacency be blamed for this stunning revival? Quite a bit.”
The magazine was startled out of its own complacency by the terror-filled weekend of September 21–22, which perfectly illustrated the imprudence of the high-minded alternatives to the war on terror. On September 21, gunmen from the al-Shabab terror group stormed an Israeli-owned mall in Kenya and killed more than 60 people. That same day more than 100 were killed in suicide attacks in Iraq, most of them attendees at a funeral. The following day, the Pakistani Taliban bombed a Peshawar church, killing 85 in the worst attack on Christians in Pakistan’s history.
Some attacks can be tied to al-Qaeda affiliates. Others cannot. Yet it’s worth pointing out that while the Obama administration makes semantic distinctions between terrorists of various stripes (while preferring not to call them terrorists, if possible), the administration’s anti-terror policies make no such distinction. Barack Obama’s war on al-Qaeda necessitates an intelligence and homeland-security structure that attempts to discriminate between terrorists and civilians but cannot afford—and in most cases, simply isn’t able—to discriminate between al-Qaeda-based threats and others.
So the war on terror persists, even with a Nobel peace laureate in the White House who has been eager to end the country’s military commitments. It does so because the war on terror defies its caricature as an ideologically narrow, hawkish neoconservative project.
President Obama is celebrated as a realist; indeed, when the president decided on a modest increase in aid to the Syrian rebels, one that seemed intended to level the playing field but not tip the scales of the civil war, Tufts University professor and Foreign Policy blogger Daniel Drezner called it “brutally realpolitik,” fully in line with the administration’s “pretty realist policy towards Syria.”
The realists’ fondness for Obama is requited. Yet it should not surprise anyone that a realist approach to managing the war on terror so closely resembles its neoconservative variant. To take the definition offered by the Stanford Encyclopedia of Philosophy, realism stresses the “competitive and conflictual side” of international politics, and realists “consider the principal actors in the international arena to be states, which are concerned with their own security, act in pursuit of their own national interests, and struggle for power.” If realism is primarily concerned with order and stability among nation-states, the war on terror, in which states often at odds with one another cooperate to prevent the spread of destabilizing violence, is a logical approach to security policy.
It is for the same reason that multilateralists can’t quite quit the war on terror. In Every Nation for Itself: Winners and Losers in a G-Zero World, Eurasia Group President Ian Bremmer writes of the efforts to stop transnational terrorism: “Persuading states around the world to share the costs and burdens that come with a uniform screening standard for people and packages that travel by air or sea has never been more important, but when it’s every nation for itself, this will be more difficult than ever to accomplish.” Stewart M. Patrick of the Council on Foreign Relations lamented in 2011 that the war on terror has been “caricatured” as a unilateral adventure. “But a more positive, if unsung aspect of this struggle has been its multilateral ethos,” Patrick wrote. “In the decade since 9/11, the international community has shown remarkable cohesiveness and solidarity in its effort to protect innocent people from terrorist attacks, despite significant challenges that remain. Much of this cooperation has occurred under the radar, through quiet, everyday multilateral and bilateral cooperation among law enforcement agencies, intelligence services, and militaries.”
Liberal interventionists may be an endangered species in Washington these days, but those who remain all seem to be working for the Obama administration. Samantha Power, one of the leading proponents of humanitarian intervention based on the international community’s “responsibility to protect” at-risk populations, is now the administration’s ambassador to the United Nations—a Cabinet-level position. Another such interventionist, Susan Rice, is the president’s national-security adviser. Both have framed military action in the Middle East as extensions of the president’s anti-terror policies and essential actions to shore up international law.
Advocates of the rule of law here at home are also defenders of the war on terror. Arguing against ending the war on terror at a debate in New York City 10 years after the 9/11 attacks, former CIA director Michael Hayden made the case that the war on terror is not only necessary and lawful but can actually protect Americans’ civil liberties. He reviewed the incident on Christmas Day 2009 in which a Nigerian national named Umar Farouk Abdulmutallab, aka the “underwear bomber,” tried to set off a bomb on a plane headed to Detroit. He was arrested and interrogated briefly before being told he had the right to remain silent. Which he did. Mirandizing the underwear bomber, Hayden declared, was widely understood to have been a dreadful mistake. As a result, Attorney General Eric Holder was among those floating the idea of passing legislation to make the rules governing interrogation of Mirandized suspects more “malleable.”
Hayden’s response was blunt: “I don’t want to make Miranda more malleable. Miranda defends me. Defends you. Defends your rights. And we’re forced to contort the law enforcement approach when we attempt to make it answer and deal with questions it was never designed to deal with.”
In late June of last year, the U.S. Court of Appeals for the Fifth Circuit released a decision written by Judge Edward C. Prado. The court had before it a rather weighty question, which it dispatched without fanfare: The judges ruled that the Iraq war was not over yet.
The defendant was a lieutenant colonel charged in 2010 with abusing his authority running an Iraqi base in Anbar province in 2003 and 2004. Under the Wartime Suspension of Limitations Act, the clock on the statute of limitations would not begin ticking while the country was still at war. The defendant argued that the war ended when President Bush announced the end of major combat operations in 2003, in which case the statute of limitations had run out on his case. The court did not buy it.
That the end of the Iraq war—a fairly conventional land war—can be disputed is an indication of just how open-ended the war on terror appears to be by comparison. It’s easy to understand why President Obama would be tempted to narrow the war’s aim retroactively in the face of public exhaustion and with the successful elimination of the enemy’s avatar, Osama bin Laden. In Iraq, David Petraeus famously told a reporter: “Tell me how this ends.” The variation for the war on terror might be, “Tell me when this ends.”
While rallying the public to this new war barely a week after the September 11, 2001 attacks, George W. Bush was clear: By definition, it can end only when we win. “Our war on terror begins with al-Qaeda, but it does not end there,” the president said. “It will not end until every terrorist group of global reach has been found, stopped and defeated.” Bush warned that this would be a war unlike other wars, both in duration and form. In doing so, he touched on the most important reason the war on terror is different from other conflicts. For most of this country’s history, Americans had an advantage over the rest of the Western world: They stood remote from enemies and rivals and were able to make policy in an unprecedented geopolitical vacuum.
Even when the U.S. rose to superpower status, the conflicts it engaged in were far from home. The age of transnational Islamist terrorism changed all that. It ushered Americans into a new era and shut the door behind them. The war on terror was an attempt to confront this new reality. It remains just, if imperfect, and superior to its alternatives. And despite the rhetorical and political assaults upon it from the right and the left and the center, the war on terror is likely to remain the policy the United States continues to pursue until the combatants on the other side are brought to heel—or give up the fight.
The Failed War on the ‘War on Terror’
Terror is a choice.
Ari Fuld described himself on Twitter as a marketer and social media consultant “when not defending Israel by exposing the lies and strengthening the truth.” On Sunday, a Palestinian terrorist stabbed Fuld at a shopping mall in Gush Etzion, a settlement south of Jerusalem. The Queens-born father of four died from his wounds, but not before he chased down his assailant and neutralized the threat to other civilians. Fuld thus gave the full measure of devotion to the Jewish people he loved. He was 45.
The episode is a grim reminder of the wisdom and essential justice of the Trump administration’s tough stance on the Palestinians.
Start with the Taylor Force Act. The act, named for another U.S. citizen felled by Palestinian terror, stanched the flow of American taxpayer funds to the Palestinian Authority’s civilian programs. Though it is small consolation to Fuld’s family, Americans can breathe a sigh of relief that they are no longer underwriting the PA slush fund used to pay stipends to the family members of dead, imprisoned, or injured terrorists, like the one who murdered Ari Fuld.
No principle of justice or sound statesmanship requires Washington to spend $200 million—the amount of PA aid funding slashed by the Trump administration last month—on an agency that financially induces the Palestinian people to commit acts of terror. The PA’s terrorism-incentive budget—“pay-to-slay,” as Douglas Feith called it—ranges from $50 million to $350 million annually. Footing even a fraction of that bill is tantamount to the American government subsidizing terrorism against its citizens.
If we don’t pay the Palestinians, the main line of reasoning runs, frustration will lead them to commit still more and bloodier acts of terror. But U.S. assistance to the PA dates to the PA’s founding in the Oslo Accords, and Palestinian terrorists have shed American and Israeli blood through all the years since then. What does it say about Palestinian leaders that they would unleash more terror unless we cross their palms with silver?
President Trump likewise deserves praise for booting Palestinian diplomats from U.S. soil. This past weekend, the State Department revoked a visa for Husam Zomlot, the highest-ranking Palestinian official in Washington. The State Department cited the Palestinians’ years-long refusal to sit down for peace talks with Israel. The better reason for expelling them is that the label “envoy” sits uneasily next to the names of Palestinian officials, given the links between the Palestine Liberation Organization, President Mahmoud Abbas’s Fatah faction, and various armed terrorist groups.
Fatah, for example, praised the Fuld murder. As the Jerusalem Post reported, the “al-Aqsa Martyrs Brigades, the military wing of Fatah . . . welcomed the attack, stressing the necessity of resistance ‘against settlements, Judaization of the land, and occupation crimes.’” It is up to Palestinian leaders to decide whether they want to be terrorists or statesmen. Pretending that they can be both at once was the height of Western folly, as Ari Fuld no doubt recognized.
May his memory be a blessing.
The end of the water's edge.
It was the blatant subversion of the president’s sole authority to conduct American foreign policy, and the political class received it with fury. It was called “mutinous,” and the conspirators were deemed “traitors” to the Republic. Those who thought “sedition” went too far were still incensed over the breach of protocol and the reckless way in which the president’s mandate was undermined. Yes, times have certainly changed since 2015, when a series of Republican senators signed a letter warning Iran’s theocratic government that the Joint Comprehensive Plan of Action (aka, the Iran nuclear deal) was built on a foundation of sand.
The outrage that was heaped upon Senate Republicans for freelancing on foreign policy in the final years of Barack Obama’s administration has not been visited upon former Secretary of State John Kerry, though he arguably deserves it. In the publicity tour for his recently published memoir, Kerry confessed to conducting meetings with Iranian Foreign Minister Javad Zarif “three or four times” as a private citizen. When asked by Fox News Channel’s Dana Perino if Kerry had advised his Iranian interlocutor to “wait out” the Trump administration to get a better set of terms from the president’s successor, Kerry did not deny the charge. “I think everybody in the world is sitting around talking about waiting out President Trump,” he said.
Think about that. This is a former secretary of state who all but confirmed that he is actively conducting what the Boston Globe described in May as “shadow diplomacy” designed to preserve not just the Iran deal but all the associated economic relief and security guarantees it provided Tehran. The abrogation of that deal has put new pressure on the Iranians to liberalize domestically, withdraw their support for terrorism, and abandon their provocative weapons development programs—pressures that the deal’s proponents once supported.
“We’ve got Iran on the ropes now,” said former Democratic Sen. Joe Lieberman, “and a meeting between John Kerry and the Iranian foreign minister really sends a message to them that somebody in America who’s important may be trying to revive them and let them wait and be stronger against what the administration is trying to do.” This is absolutely correct because the threat Iran poses to American national security and geopolitical stability is not limited to its nuclear program. The Iranian threat will not be neutralized until it abandons its support for terror and the repression of its people, and that will not end until the Iranian regime is no more.
While Kerry’s decision to hold a variety of meetings with a representative of a nation hostile to U.S. interests is surely careless and unhelpful, it is not uncommon. During his 1984 campaign for the presidency, Jesse Jackson visited the Soviet Union and Cuba to raise his own public profile and lend credence to Democratic claims that Ronald Reagan’s confrontational foreign policy was unproductive. House Speaker Jim Wright’s trip to Nicaragua to meet with the Sandinista government was a direct repudiation of the Reagan administration’s support for the country’s anti-Communist rebels. In 2007, as Bashar al-Assad’s government was providing material support for the insurgency in Iraq, House Speaker Nancy Pelosi sojourned to Damascus to shower the genocidal dictator in good publicity. “The road to Damascus is a road to peace,” Pelosi insisted. “Unfortunately,” replied George W. Bush’s national security council spokesman, “that road is lined with the victims of Hamas and Hezbollah, the victims of terrorists who cross from Syria into Iraq.”
Honest observers must reluctantly conclude that the adage is wrong. American politics does not, in fact, stop at the water’s edge. It never has, and maybe it shouldn’t. Though the practice may be commonplace, American political actors who contradict the president by conducting their own foreign policy should be judged on the policies they advocate. In the case of Iran, those who seek to convince the mullahs and their representatives that repressive theocracy and a terroristic foreign policy are dead ends are advancing the interests not just of the United States but of all mankind. Those who give this hopelessly backward autocracy hope that America’s resolve is fleeting are, as John Kerry might say, on “the wrong side of history.”
Michael Wolff is its Marquis de Sade. Released on January 5, 2018, Wolff’s Fire and Fury became a template for authors eager to satiate the growing demand for unverified stories of Trump at his worst. Wolff filled his pages with tales of the president’s ignorant rants, his raging emotions, his television addiction, his fast-food diet, his unfamiliarity with and contempt for Beltway conventions and manners. Wolff made shocking insinuations about Trump’s mental state, not to mention his relationship with UN ambassador Nikki Haley. Wolff’s Trump is nothing more than a knave, dunce, and commedia dell’arte villain. The hero of his saga is, bizarrely, Steve Bannon, who in Wolff’s telling recognized Trump’s inadequacies, manipulated him to advance a nationalist-populist agenda, and tried to block his worst impulses.
Wolff’s sources are anonymous. That did not stop the press from calling his accusations “mind-blowing” (Mashable.com), “wild” (Variety), and “bizarre” (Entertainment Weekly). Unlike most pornographers, he had a lesson in mind. He wanted to demonstrate Trump’s unfitness for office. “The story that I’ve told seems to present this presidency in such a way that it says that he can’t do this job, the emperor has no clothes,” Wolff told the BBC. “And suddenly everywhere people are going, ‘Oh, my God, it’s true—he has no clothes.’ That’s the background to the perception and the understanding that will finally end this, that will end this presidency.”
Nothing excites the Resistance more than the prospect of Trump leaving office before the end of his term. Hence the most stirring examples of Resistance Porn take the president’s all-too-real weaknesses and eccentricities and imbue them with apocalyptic significance. In what would become the standard response to accusations of Trumpian perfidy, reviewers of Fire and Fury were less interested in the truth of Wolff’s assertions than in the fact that his argument confirmed their preexisting biases.
Saying he agreed with President Trump that the book is “fiction,” the Guardian’s critic didn’t “doubt its overall veracity.” It was, he said, “what Mailer and Capote once called a nonfiction novel.” Writing in the Atlantic, Adam Kirsch asked: “No wonder, then, Wolff has written a self-conscious, untrustworthy, postmodern White House book. How else, he might argue, can you write about a group as self-conscious, untrustworthy, and postmodern as this crew?” Complaining in the New Yorker, Masha Gessen said Wolff broke no new ground: “Everybody” knew that the “president of the United States is a deranged liar who surrounded himself with sycophants. He is also functionally illiterate and intellectually unsound.” Remind me never to get on Gessen’s bad side.
What Fire and Fury lacked in journalistic ethics, it made up in receipts. By the third week of its release, Wolff’s book had sold more than 1.7 million copies. His talent for spinning second- and third-hand accounts of the president’s oddity and depravity into bestselling prose was unmistakable. Imitators were sure to follow, especially after Wolff alienated himself from the mainstream media by defending his innuendos about Haley.
It was during the first week of September that Resistance Porn became a competitive industry. On the afternoon of September 4, the first tidbits from Bob Woodward’s Fear appeared in the Washington Post, along with a recording of an 11-minute phone call between Trump and the white knight of Watergate. The opposition began panting soon after. Woodward, who like Wolff relies on anonymous sources, “paints a harrowing portrait” of the Trump White House, reported the Post.
No one looks good in Woodward’s telling other than former economics adviser Gary Cohn and—again bizarrely—the former White House staff secretary who was forced to resign after his two ex-wives accused him of domestic violence. The depiction of chaos, backstabbing, and mutual contempt between the president and high-level advisers who don’t much care for either his agenda or his personality was not so different from Wolff’s. What gave it added heft was Woodward’s status, his inviolable reputation.
“Nothing in Bob Woodward’s sober and grainy new book…is especially surprising,” wrote Dwight Garner at the New York Times. That was the point. The audience for Wolff and Woodward does not want to be surprised. Fear is not a book that will change minds. Nor is it intended to be. “Bob Woodward’s peek behind the Trump curtain is 100 percent as terrifying as we feared,” read a CNN headline. “President Trump is unfit for office. Bob Woodward’s ‘Fear’ confirms it,” read an op-ed headline in the Post. “There’s Always a New Low for the Trump White House,” said the Atlantic. “Amazingly,” wrote Susan Glasser in the New Yorker, “it is no longer big news when the occupant of the Oval Office is shown to be callous, ignorant, nasty, and untruthful.” How could it be, when the press has emphasized nothing but these aspects of Trump for the last three years?
The popular fixation with Trump the man, and with the turbulence, mania, frenzy, confusion, silliness, and unpredictability that have surrounded him for decades, serves two functions. It inoculates the press from having to engage in serious research into the causes of Trump’s success in business, entertainment, and politics, and into the crises of borders, opioids, stagnation, and conformity of opinion that occasioned his rise. Resistance Porn also endows Trump’s critics, both external and internal, with world-historical importance. No longer are they merely journalists, wonks, pundits, and activists sniping at a most unlikely president. They are politically correct versions of Charles Martel, the last line of defense preventing Trump the barbarian from enacting the policies on which he campaigned and was elected.
How closely their sensational claims and inflated self-conceptions track with reality is largely beside the point. When the New York Times published the op-ed “I am Part of the Resistance Inside the Trump Administration,” by an anonymous “senior official” on September 5, few readers bothered to care that the piece contained no original material. The author turned policy disagreements over trade and national security into a psychiatric diagnosis. In what can only be described as a journalistic innovation, the author dispensed with middlemen such as Wolff and Woodward, providing the Times the longest background quote in American history. That the author’s identity remains a secret only adds to its prurient appeal.
“The bigger concern,” the author wrote, “is not what Mr. Trump has done to the presidency but what we as a nation have allowed him to do to us.” Speak for yourself, bud. What President Trump has done to the Resistance is driven it batty. He’s made an untold number of people willing to entertain conspiracy theories, and to believe rumor is fact, hyperbole is truth, self-interested portrayals are incontrovertible evidence, credulity is virtue, and betrayal is fidelity—so long as all of this is done to stop that man in the White House.
Review of 'Stanley Kubrick' By Nathan Abrams
Except for Stanley Donen, every director I have worked with has been prone to the idea, first propounded in the 1950s by François Truffaut and his tendentious chums in Cahiers du Cinéma, that directors alone are authors, screenwriters merely contingent. In singular cases—Orson Welles, Michelangelo Antonioni, Woody Allen, Kubrick himself—the claim can be valid, though all of them had recourse, regular or occasional, to helping hands to spice their confections.
Kubrick’s variety of topics, themes, and periods testifies both to his curiosity and to his determination to “make it new.” Because his grades were not high enough (except in physics), this son of a Bronx doctor could not get into colleges crammed with returning GIs. The nearest he came to higher education was when he slipped into accessible lectures at Columbia. He told me, when discussing the possibility of a movie about Julius Caesar, that the great classicist Moses Hadas made a particularly strong impression.
While others were studying for degrees, solitary Stanley was out shooting photographs (sometimes with a hidden camera) for Look magazine. As a movie director, he often insisted on take after take. This gave him choices of the kind available on the still photographer’s contact sheets. Only Peter Sellers and Jack Nicholson had the nerve, and irreplaceable talent, to tell him, ahead of shooting, that they could not do a particular scene more than two or three times. The energy to electrify “Mein Führer, I can walk” and “Here’s Johnny!” could not recur indefinitely. For everyone else, “Can you do it again?” was the exhausting demand, and it could come close to being sadistic.
The same method could be applied to writers. Kubrick might recognize what he wanted when it was served up to him, but he could never articulate, ahead of time, even roughly what it was. Picking and choosing was very much his style. Cogitation and opportunism went together: The story goes that he attached Strauss’s Blue Danube to the opening sequence of 2001 because it happened to be playing in the sound studio when he came to dub the music. Genius puts chance to work.
Until academics intruded lofty criteria into cinema/film, the better to dignify their speciality, Alfred Hitchcock’s attitude covered most cases: When Ingrid Bergman asked for her motivation in walking to the window, Hitch replied, fatly, “Your salary.” On another occasion, told that some scene was not plausible, Hitch said, “It’s only a movie.” He did not take himself seriously until the Cahiers du Cinéma crowd elected to make him iconic. At dinner, I once asked Marcello Mastroianni why he was so willing to play losers or clowns. Marcello said, “Beh, cinema non e gran’ cosa” (cinema is no big deal). Orson Welles called movie-making the ultimate model-train set.
That was then; now we have “film studies.” After they moved in, academics were determined that their subject be a very big deal indeed. Comedy became no laughing matter. In his monotonous new book, the film scholar Nathan Abrams would have it that Stanley Kubrick was, in essence, a “New York Jewish intellectual.” Abrams affects to unlock what Stanley was “really” dealing with, in all his movies, never mind their apparent diversity. It is declared to be, yes, Yiddishkeit, and in particular, the Holocaust. This ground has been tilled before by Geoffrey Cocks, when he argued that the room numbers in the empty Overlook Hotel in The Shining encrypted references to the Final Solution. Abrams would have it that even Barry Lyndon is really all about the outsider seeking, and failing, to make his awkward way in (Gentile) Society. On this reading, Ryan O’Neal is seen as Hannah Arendt’s pariah in 18th-century drag. The movie’s other characters are all engaged in the enjoyment of “goyim-naches,” an expression—like menschlichkayit—he repeats ad nauseam, lest we fail to get the stretched point.
Theory is all when it comes to the apotheosis of our Jew-ridden Übermensch. So what if, in order to make a topic his own, Kubrick found it useful to translate its logic into terms familiar to him from his New York youth? In Abrams’s scheme, other mundane biographical facts count for little. No mention is made of Stanley’s displeasure when his 14-year-old daughter took a fancy to O’Neal. The latter was punished, some sources say, by having Barry’s voiceover converted from first person so that Michael Hordern would displace the star as narrator. By lending dispassionate irony to the narrative, it proved a pettish fluke of genius.
While conning Abrams’s volume, I discovered, not greatly to my chagrin, that I am the sole villain of the piece. Abrams calls me “self-serving” and “unreliable” in my accounts of my working and personal relationship with Stanley. He insinuates that I had less to do with Eyes Wide Shut than I pretend and that Stanley regretted my involvement. It is hard for him to deny (but convenient to omit) that, after trying for some 30 years to get a succession of writers to “crack” how to do Schnitzler’s Traumnovelle, Kubrick greeted my first draft with “I’m absolutely thrilled.” A source whose anonymity I respect told me that he had never seen Stanley so happy since the day he received his first royalty check (for $5 million) for 2001. No matter.
Were Abrams (the author also of a book as hostile to Commentary as this one is to me) able to put aside his waxed wrath, he might have quoted what I reported in my memoir Eyes Wide Open to support his Jewish-intellectual thesis. One day, Stanley asked me what a couple of hospital doctors, walking away with their backs to the camera, would be talking about. We were never going to hear or care what it was, but Stanley—at that early stage of development—said he wanted to know everything. I said, “Women, golf, the stock market, you know…”
“Couple of Gentiles, right?”
“That’s what you said you wanted them to be.”
“Those people, how do we ever know what they’re talking about when they’re alone together?”
“Come on, Stanley, haven’t you overheard them in trains and planes and places?”
Kubrick said, “Sure, but…they always know you’re there.”
If Stanley was even halfway serious, Abrams’s banal thesis that, despite decades of living in England, he never escaped the Old Country might have been given some ballast.
Now, as for Stanley Kubrick’s being an “intellectual.” If this implies membership in some literary or quasi-philosophical elite, there’s a Jewish joke to dispense with it. It’s the one about the man who makes a fortune, buys himself a fancy yacht, and invites his mother to come and see it. He greets her on the gangway in full nautical rig. She says, “What’s with the gold braid already?”
“Mama, you have to realize, I’m a captain now.”
She says, “By you, you’re a captain, by me, you’re a captain, but by a captain, are you a captain?”
As New York intellectuals all used to know, Karl Popper’s definition of bad science, and bad faith, involves positing a theory and then selecting only whatever data help to furnish its validity. The honest scholar makes it a matter of principle to seek out elements that might render his thesis questionable.
Abrams seeks to enroll Lolita in his obsessive Jewish-intellectual scheme by referring to Peter Arno, a New Yorker cartoonist whom Kubrick photographed in 1949. The caption attached to Kubrick’s photograph in Look asserted that Arno liked to date “fresh, unspoiled girls,” and Abrams says this “hint[s] at Humbert Humbert in Lolita.” Ah, but Lolita was published, in Paris, in 1955, six years later. And how likely is it, in any case, that Kubrick wrote the caption?
The film of Lolita is unusual for its garrulity. Abrams’s insistence on the sinister Semitic aspect of both Clare Quilty and Humbert Humbert supposedly drawing Kubrick like moth to flame is a ridiculous camouflage of the commercial opportunism that led Stanley to seek to film the most notorious novel of the day, while fudging its scandalous eroticism.
That said, in my view, The Killing, Paths of Glory, Barry Lyndon, and Clockwork Orange were and are sans pareil. The great French poet Paul Valéry wrote of “the profundity of the surface” of a work of art. Add D.H. Lawrence’s “never trust the teller, trust the tale,” and you have two authoritative reasons for looking at or reading original works of art yourself and not relying on academic exegetes—especially when they write in the solemn, sometimes ungrammatical style of Professor Abrams, who takes time out to tell those of us at the back of his class that padre “is derived from the Latin pater.”
Abrams writes that I “claim” that I was told to exclude all overt reference to Jews in my Eyes Wide Shut screenplay, with the fatuous implication that I am lying. I am again accused of “claiming” to have given the name Ziegler to the character played by Sydney Pollack, because I once had a (quite famous) Hollywood agent called Evarts Ziegler. So I did. The principal reason for Abrams to doubt my veracity is that my having chosen the name renders irrelevant his subsequent fanciful digression on the deep, deep meanings of the name Ziegler in Jewish lore; hence he wishes to assign the naming to Kubrick. Pop goes another wished-for proof of Stanley’s deep and scholarly obsession with Yiddishkeit.
Abrams would be a more formidable enemy if he could turn a single witty phrase or even abstain from what Karl Kraus called mauscheln, the giveaway jargon of Jewish journalists straining to pass for sophisticates at home in Gentile circles. If you choose, you can apply online for screenwriting lessons from Nathan Abrams, who does not have a single cinematic credit to his name. It would be cheaper, and wiser, to look again, and then again, at Kubrick’s masterpieces.
Is American opera in terminal condition?
At the Met, distinguished singers and conductors, mostly born and trained in Europe, appeared in theatrically conservative big-budget productions of the popular operas of the 19th century, with a sprinkling of pre-romantic and modern works thrown in to leaven the loaf. City Opera, by contrast, presented younger artists—many, like Beverly Sills, born in this country—in a wider-ranging, more adventurously staged repertoire that often included new operas, some of them written by American composers, to which the public was admitted at what were then called “popular prices.”
Between them, the companies represented a feast for culture-consuming New Yorkers, though complaints were already being heard that their new theaters were too big. Moreover, neither the Met nor City Opera was having any luck at commissioning memorable new operas and thereby expanding and refreshing the operatic repertoire, to which only a handful of significant new works—none of them, then or since, premiered by either company—had been added since World War I.
A half-century later, the feast has turned to famine. In 2011, New York City Opera left Lincoln Center, declaring bankruptcy. It closed its doors forever two years later. The Met has weathered a nearly uninterrupted string of crises that climaxed earlier this year with the firing of James Levine, the company’s once-celebrated music director emeritus. He was accused in 2017 of molesting teenage musicians and was dismissed from all of his conducting posts in New York and elsewhere. Today the Met is in dire financial straits that threaten its long-term survival.
And while newer opera companies in such other American cities as Chicago, Houston, San Francisco, Santa Fe, and Seattle now offer alternative models of leadership, none has established itself as a potential successor either to the Met or the now-defunct NYCO.
Is American opera as a whole in a terminal condition? Or are the collapse of the New York City Opera and the Met’s ongoing struggle to survive purely local matters of no relevance elsewhere? Heidi Waleson addresses these questions in Mad Scenes and Exit Arias: The Death of the New York City Opera and the Future of Opera in America. Waleson draws on her experience as the opera critic of the Wall Street Journal to speculate on the prospects for an art form that has never quite managed to set down firm roots in American culture.
In this richly informative chronicle of NYCO’s decline and fall, Waleson persuasively argues that what happened to City Opera (and, by extension, the Met) could happen to other opera companies as well. The days in which an ambitious community sought successfully to elevate itself into the first rank of world cities by building and manning an opera house are long past, and Mad Scenes and Exit Arias helps us understand why.

As Waleson reminds us, it was Fiorello LaGuardia, the New York mayor who played a central role in the creation of the NYCO, who dubbed the company “the people’s opera” when it was founded in 1943. According to LaGuardia, NYCO existed to perform popular operas at popular prices for a mass audience. In later years, it moved away from that goal, but the slogan stuck. Indeed, no opera company has ever formulated a clearer statement of its institutional mission.
Even after it moved to Lincoln Center in 1966, NYCO had an equally coherent and similarly appealing purpose: It was where you went to see the opera stars of tomorrow, foremost among them Beverly Sills and Plácido Domingo, in inexpensively but imaginatively staged productions of the classics. The company went out of its way to present modern operas, too, but it never did so at the expense of its central repertoire—and tickets to its performances cost half of what the Met charged. Well into the 21st century, City Opera stuck more or less closely to its redefined mission. Under Paul Kellogg, the general and artistic director from 1996 to 2007, it did so with consistent artistic success. But revenues declined throughout the latter part of Kellogg’s tenure, in part because younger New Yorkers were unwilling to become subscribers.
In those days, the Metropolitan Opera, NYCO’s next-door neighbor, was still one of the world’s most conservative opera houses. That changed when Peter Gelb became its general manager in 2006. Gelb was resolved to modernize the Met’s productions and, to a lesser extent, its repertoire, and he simultaneously sought to heighten its national profile by digitally simulcasting live performances into movie theaters throughout America.
Kellogg was frustrated by the chronic acoustic inadequacies of the New York State Theater and sought in vain to move City Opera to a three-theater complex that was to be built (but never was) on the World Trade Center site. He retired soon after Gelb came to the Met. Kellogg was succeeded by Gérard Mortier, a European impresario who was accustomed to working in state-subsidized theaters. Mortier made a pair of fateful decisions. First, he canceled City Opera’s entire 2008–2009 season while the interior of the State Theater underwent much-needed renovations. Then he announced a follow-up season of 20th-century operas that lacked audience appeal.
That follow-up season never happened, because Mortier resigned in 2008 and fled New York. He was replaced by George Steel, who had previously served for just three months as general manager of the Dallas Opera. Under Steel, NYCO slashed its schedule to ribbons in a futile attempt to get back on its financial feet after Mortier’s financially ruinous year-long hiatus. Steel then mounted a series of productions of nonstandard repertory that received mixed reviews and flopped at the box office.
The combined effect of Gelb’s innovations and the inept leadership of Mortier and Steel all but obliterated City Opera’s reason for existing. Under Gelb, the Met’s repertory ranged from such warhorses as Rigoletto and Tosca to 20th-century masterpieces like Benjamin Britten’s A Midsummer Night’s Dream and Alban Berg’s Wozzeck, and tickets could be bought for as little as $20. With the Met performing a more interesting repertoire under a wider range of directors, and in part at “people’s prices,” City Opera no longer did anything that the Met wasn’t already doing on a far larger and better-financed scale. What, then, was its mission now? The truth was that it had none, and when the company went under in 2013, few mourned its passing.
As it happened, Gelb’s own innovations were a mere artistic Band-Aid, for he was unwilling or unable to trim the Met’s bloated budget to any meaningful extent. He made no serious attempt to cut the company’s labor costs until a budget crisis in 2014 forced him to confront its unions, which he did with limited success. In addition, his new productions of the standard-repertory operas on which the Met relied to draw and hold older subscribers were felt by many to be trashily trendy.
The Met has had particular difficulty adjusting to the reduced circumstances of 21st-century opera. Its 3,800-seat theater has an 80-foot-deep stage with a proscenium opening that measures 54 feet on each side. (Bayreuth, by contrast, seats 1,925, La Scala 2,030, and the Vienna State Opera 2,200.) As a result, it is all but impossible to mount low-to-medium-budget shows in the Metropolitan Opera House, even as the company finds itself increasingly unable to fill the house. Two decades ago, the Met earned 90 percent of its potential box-office revenue. That figure plummeted to 66 percent by 2015, forcing Gelb to raise ticket prices to an average of $158.50 per head. On Broadway, the average price of a ticket that season was $103.86.
Above all, Gelb was swimming against the cultural tide. Asked about the effects on audience development of the Met simulcasts, he admitted that three-quarters of the people who attended them were “over 65, and 30 percent of them are over 75.” As he explained: “Grand opera is in itself a kind of a dinosaur of an art form…. The question is not whether I think I’m doing a good job or not in trying to keep the [Metropolitan Opera] alive. It’s whether I’m doing a good job or not in the face of a cultural and social rejection of opera as an art form. And what I’m doing is fighting an uphill battle to try and maintain an audience in a very difficult time.”
Was that statement buck-passing defeatism, or a fair appraisal of the state of American opera? Other opera executives distanced themselves from Gelb’s remarks, and it was true—and still is—that smaller American companies have done a somewhat better job of attracting younger audiences than the top-heavy Met. But according to the National Endowment for the Arts, the percentage of U.S. adults who attend at least one operatic performance each year declined from 3.2 percent in 2002 to 2.1 percent in 2012. This problem, of course, is not limited to opera. As I wrote in these pages in 2010, the disappearance of secondary-school arts education and the rise of digital media may well be leading to “not merely a decline in public interest in the fine arts but the death of the live audience as a cultural phenomenon.”3

Does American opera have a future in an era of what Heidi Waleson succinctly describes as “flat ticket income and rising expenses”? In the last chapter of Mad Scenes and Exit Arias, she chronicles the activities of a group of innovative smaller troupes that are “rethinking what an opera company is, what it does, and who it serves.” Yet in the same breath, she acknowledges the possibility that “filling a giant theater for multiple productions of grand operas [is] no longer an achievable goal.”
If that is so, then it may be worth asking a different question: Did American opera ever have a past? It is true that opera in America has had a great and glorious history, but virtually the whole of that history consisted of American productions of 18th- and 19th-century European operas. By contrast, no opera by an American classical composer has ever entered the international major-house repertoire. Indeed, while new American operas are still commissioned and premiered at an impressive rate, few things are so rare as a second production of any of these works.
While a handful continue to be performed—John Adams’s Nixon in China (1987), André Previn’s A Streetcar Named Desire (1995), Mark Adamo’s Little Women (1998), and Jake Heggie’s Dead Man Walking (2000)—their success is a tribute to the familiarity of their subject matter and source material, not their musico-theatrical quality. As for the rest, the hard but inescapable truth is that with the exception of George Gershwin’s Porgy and Bess (1935), virtually all large-scale American operas have been purpose-written novelties that were shelved and forgotten immediately after their premieres.
The success of Porgy and Bess, which received its premiere not in an opera house but on Broadway, reminds us that American musical comedy, unlike American opera, is deeply rooted in our national culture, in much the same way that grand opera is no less deeply rooted in the national cultures of Germany and Italy, where it is still genuinely popular (if less so today than a half-century ago). By comparison with Porgy, Carousel, Guys and Dolls, or My Fair Lady, American opera as a homegrown form simply does not exist: It is merely an obscure offshoot of its European counterpart. Aaron Copland, America’s greatest composer, was not really joking when he wittily described opera as “la forme fatale,” and his own failed attempts to compose an audience-friendly opera that would be as successful as his folk-flavored ballet scores say much about the difficulties facing any composer who seeks to follow in his footsteps.
It is not that grand opera is incapable of appealing to American theatergoers. Even now, there are many Americans who love it passionately, just as there are regional companies such as Chicago’s Lyric Opera and San Francisco Opera that have avoided making the mistakes that closed City Opera’s doors. Yet the crises from which the Metropolitan Opera has so far failed to extricate itself suggest that in the absence of the generous state subsidies that keep European opera houses in business, large-house grand opera in America may simply be too expensive to thrive—or, ultimately, to survive. At its best, no art form is more thrilling or seductive. But none is at greater risk of following the dinosaurs down the cold road to extinction.
1 The “New York City Opera” founded in 2016 that now mounts operas in various New York theaters on an ad hoc basis is a brand-new enterprise that has no connection with its predecessor.
2 Metropolitan Books, 304 pages.